A brief note: you can now sign up and read the newsletter directly here: newsletter.tableautim.com
There's a social media trend at the moment where people share photos from 2016 alongside where they are now. On the left is me after work, picking up my puppy Toby from daycare. On the right is me today with my wife Bree and the kids.

If you'd told me a decade ago that I'd be pouting whilst wearing a rainbow satchel round my neck, I'd have laughed in your face. But here I am.
The other thing I was doing in 2016 was building my career as a BI developer. I was two years in at that point, and as I mentioned last week, most of my career has been spent upstream in Alteryx, serving Tableau. That was true back then too. For some time now, dashboards have been the summit of what I'll call the Analytics Quest.
KPI cards, fancy visuals, containers, perfect spacing, rounded corners, tips and tricks that rely on clever calculations or data shaped just so. Plenty of that, but you won't see the workflow that feeds it, the dbt pipeline behind it, the SQL wizardry required to make it happen, or the data model that drives it.
I don't think my experience is unusual. I think it's just the bit nobody talks about.
Why?
Confidentiality. The real work is often proprietary. You can anonymise a dashboard and share viz best practice easily enough. A data pipeline is harder to share without exposing business logic or infrastructure, which is often a competitive advantage.
Imposter syndrome. Some of my proudest moments in data have come from threading together solutions where none looked possible. So why haven't I talked about it more? I assumed my prep work was messy or hacky. I didn't realise until later that everyone's pipeline work is messy. The world's not perfect, and neither is your data.
The community formed around the output. I speak from a heavily biased Tableau perspective here. Tableau has Viz of the Day (VOD), Workout Wednesday, Iron Viz. The upstream doesn't have that same culture of sharing. Nearly all of my VODs were underpinned by an Alteryx flow that brought those visualisations to life: crops, music, boat races, travel and politics. I built a framework to create what I thought was beautiful work without recognising the beauty of what enabled it.
So we stayed quiet. But here's the thing: the upstream is where the real value sits. Anyone can learn to drag fields onto a canvas. But understanding where data comes from, why it breaks, and how to fix it? That's a career moat. It's what gets you a seat at the table. If I were mentoring a young analyst today, I'd tell them to start with data engineering and architecture before BI development. Not because dashboards don't matter, but because the upstream is where you build the instincts that make everything else easier. What about those of us working in the industry now?
The BI platforms never caught up.
The community has always celebrated the output, and so have the tools. Tableau Public is a gallery of polished visualisations. Sigma celebrates “Apps”. That's not to say these vendors don't have products that serve the upstream.
Sigma has a strong data modelling component and capabilities that bring CI/CD into a BI tool, alongside a UX that enables the workflows they call “Apps” (more on that in the future). Power BI and Power Query put data transformation front and centre in the user experience, and for Excel users especially, it was transformational: repeatable ETL without VBA.
Tableau Prep is, by a country mile, one of the best-designed data prep tools I've ever used. Intuitive, visual, genuinely enjoyable. But there's no Tableau Public for Prep. No Iron Viz equivalent. The tool exists, but the culture of celebrating this work isn't the same. Other than passionate community members creating their own spaces, I've never seen a Prep flow on the LinkedIn Tableau feed. It doesn't even have a sub-menu on the Tableau homepage any more; it used to, but now it's as if the product doesn't exist.

I have so many good things to say about Tableau Prep. It's such a joy to use and has quickly matched Alteryx for most of the powerful capabilities around multi-row calculations.
In many ways, we always end up back at Alteryx. Their Grand Prix is one of the few community events that celebrates upstream work and actually showcases it. It's telling that the exception comes from a tool built entirely around data preparation, not bolted onto a BI platform. Yet Alteryx's biggest challenge has been nailing the platform offering, and to be fair, dbt (now merged with Fivetran) faces the same challenge.
I've also often thought an ecosystem of industry-standard data models was already possible with the tools we have today. Vendors could be saving industries countless hours by offering boilerplate starter setups: for example, an end-to-end insurance underwriting boilerplate, with domains, data models, and governance frameworks mapped onto the vendor's tool, making it possible to stand up solutions in weeks rather than months. Retail, manufacturing, consumer goods: you could go on, and even drill into the niches. I get that no two businesses are the same, but that's exactly why boilerplates work. They're a starting point: heavily customisable, but already mapped onto how the offering platform works, lowering the burden on enablement and support.
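To make that less abstract, here's a minimal sketch of what the skeleton of such a boilerplate might look like. Everything in it is hypothetical: the domain, entity names, grains, and structure are mine for illustration, not any vendor's, and a real offering would map these definitions onto the platform's own modelling layer.

```python
from dataclasses import dataclass, field

@dataclass
class Entity:
    """One table/object in the boilerplate data model."""
    name: str
    grain: str                                   # what a single row represents
    keys: list[str] = field(default_factory=list)

@dataclass
class Domain:
    """A business domain with a governance owner and its entities."""
    name: str
    owner: str                                   # who is accountable for the data
    entities: list[Entity] = field(default_factory=list)

# Hypothetical starting point for an insurance underwriting boilerplate.
# A customer would rename, extend, and prune these to fit their business.
underwriting = Domain(
    name="Underwriting",
    owner="Head of Underwriting",
    entities=[
        Entity("Policy", grain="one row per policy version",
               keys=["policy_id", "version_no"]),
        Entity("Risk", grain="one row per insured risk on a policy",
               keys=["policy_id", "risk_id"]),
        Entity("Quote", grain="one row per quote issued",
               keys=["quote_id"]),
    ],
)

for e in underwriting.entities:
    print(f"{underwriting.name}.{e.name}: {e.grain}")
```

The point isn't the code; it's that the grains, keys, and ownership get decided up front, which is exactly the part that takes months when every team starts from a blank page.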
Things are changing.
This matters because the landscape is shifting. I can't scroll my feed without seeing Snowflake or Databricks. Databases and the workloads they enable are the hot thing again. For BI developers, this poses a real challenge: we likely face one of the toughest transitions of the coming years. The challenge isn't moving our skill sets further up the stack; it's reminding everyone that we've actually been doing this for years, and that, given the right tools and some refocused investment in our enablement, we're well placed to solve this problem going forward. The risk to the industry is real.
The risk to BI vendors is pretty straightforward. If every tool can connect to the same semantic model and render the same metrics, what's left to compete on? Chart types? Formatting? That's a race to the bottom. And when tools become interchangeable, so do the people who specialise in them.
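To make the interchangeability point concrete, here's a minimal sketch of a shared, tool-agnostic metric definition. The structure and field names are hypothetical, loosely in the spirit of how semantic layers define metrics, and not taken from any real spec:

```python
# Hypothetical shared metric definition. If this lives in a common
# semantic model, any BI tool that reads it computes the same figure;
# every name below is illustrative, not from a real specification.
average_order_value = {
    "name": "average_order_value",
    "description": "Total revenue divided by number of completed orders",
    "source": "fact_orders",                              # hypothetical table
    "numerator": {"agg": "sum", "column": "revenue"},
    "denominator": {"agg": "count_distinct", "column": "order_id"},
    "filters": [{"column": "status", "equals": "completed"}],
    "time_grains": ["day", "week", "month"],
}
```

Once the logic lives upstream like this, the chart on top is a commodity, and the differentiation moves upstream with it.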
The risk for the BI industry is more immediate: you either move up the stack or you become a technician executing someone else's requirements. AI is accelerating this shift. Tools are getting better at generating charts and dashboards in seconds; with ThoughtSpot or Tableau Pulse, AI is barely needed beyond summarising insights. And the analysis piece is closer than I thought possible: just this weekend, I tried Claude co-Work and was impressed with how far this technology has come in just a year. When AI can build the dashboard and perform exploratory analysis, the human value shifts to deciding what's worth building in the first place, to understanding data quality, lineage, and business logic, and, critically, to enabling the actions people should take once they arrive at an insight.
The analysts who thrive won't be the ones who build dashboards fastest or know the deep intricacies of BI tools. In fact, they might not even be called analysts, because they'll serve more as curators of the analytics experience. They'll be the ones who know whether the data feeding the experience is valid and trustworthy and, if not, how to make it so. And this work isn't only technical: there's governance, infrastructure, maintenance, and the broader responsibility of owning the experience (in a way, like product owners) and making architecture decisions about which tools belong in your stack and which don't. All of it moves upstream.
The irony.
Last week, I talked about my brand “Tableau Tim”, but my actual edge has always been what happens before and after Tableau. The industry has caught up to something I stumbled into by accident thirteen years ago, or at least that's what I tell myself so I can sound ahead of the curve.
The big jump for me is getting far more familiar with the tools, platforms, and core principles that underpin successful implementations. I've been using BI-focused tools to solve upstream challenges; it's time to explore the now-mature ecosystem of well-designed tools built for upstream work.
The upstream is the work. It always has been. In this spirit, next week, I’ll be drilling down into the announcement last year about the Open Semantic Interchange.
If this landed in your inbox, thank you for signing up. Feel free to forward it on. I'll be creating a new email inbox for responses to my newsletter shortly. I'm also considering launching an audio-only version in podcast feeds so you can listen rather than read; let me know if that's something you'd like. If you read this on LinkedIn, please share your comments, challenge my thinking, and/or add your own perspective. If you do, we can grow together.