First and foremost, this edition is a little late. I'm sorry. I looked at what I wrote last week, and right before I hit publish, I had a change of heart and split the newsletter into two parts, which meant I had to rewrite both halves. So this week you get two posts. Let's get into it!
Two years ago at the Tableau Conference (Apr 30, 2024), Salesforce showcased what they were calling the next generation of Tableau. The demos were slick. It even had a tagline: the so-called 4th Wave. They showcased a semantic layer that would give data real business meaning, agentic AI that could surface insights proactively, and a composable visualisation layer to address the fragmentation and the sea of unused assets. The vision, on paper, was genuinely exciting. Salesforce wanted you to know they'd heard you as an analyst and were working on something to solve that problem.
Then TC25 came around. And if you were paying attention, you might have noticed something: it was largely the same vision, re-presented. Some features had shipped (just about). Others had slipped (and still haven't rolled out). The product had also picked up new names along the way.
We're now approaching TC26, a full two years, 735 days, since the original demo, and I sincerely hope (for the brand's sake) that this year we will see real progress towards that vision. To illustrate with one small example: the vision they showed us for the Workspace in April 2024 is shown below on the left. Nearly two years on, the right is what Workspaces look like today. It's a far cry from what we were promised; all the same, you might argue that they have shipped the Workspace.

The vision vs. the reality today.
I use the Workspace as an example because it’s the part of the vision I liked most, but I could repeat the exercise above with many other parts of Tableau Next, and the same would apply.
It's both crucial and fair to point out that a ton of features have shipped in that time, but it's equally fair to say the vision we were pitched still hasn't been delivered. That's not because the vision was bad. It's because the problem was harder than the vision let on. And I think the gap between a compelling roadmap and the ability to actually execute against it tells us something important about how this product came to be.
The Venn Diagram Problem
Here's how I think about what's really going on with Tableau Next, with the caveat that hindsight is on my side.
Picture a Venn diagram. On one side, you've got Salesforce customers. On the other, Tableau customers. And in the middle, a small sliver of overlap where organisations happen to be both.

Now look at the left side of that diagram. It's huge. Millions of Salesforce customers without a BI SKU on their Salesforce invoice. That's what Salesforce sees as the prize. Take the Tableau brand, the most recognisable name in analytics, and use it to sell analytics (and compute through consumption while you're at it) into that enormous existing customer base. In theory, it's a logical, safe bet. You've acquired the best brand in BI. You've got a massive customer base that doesn't have analytics as a capability they purchase through you. Connect the two, and you've unlocked a huge revenue opportunity that you can drive with the Salesforce and Tableau brand.
There's a second layer too. It's not just about a new offering to customers. It's about keeping Salesforce data inside the Salesforce ecosystem. Rather than seeing that data exported to hyperscalers where customers are spending their compute credits elsewhere, Tableau Next becomes a vehicle for data gravity. Keep the analytics close to the CRM, and you keep the customer locked in. Genie, wait, Data Cloud, sorry, I mean Data 360, is designed for this very purpose. But that's not product vision. That's protectionism dressed up as innovation.
But even setting that aside, if we assume this is indeed the calculation being made, the bet makes three assumptions that I don't think hold up.
First, it assumes the industry is standing still, that the analytics market will wait patiently while you retool an existing product for a new audience.
Second, it doesn't validate whether the analyst persona actually translates to Salesforce's customer base. The people Tableau was built for are analysts, consultants, and data teams working across dozens of data sources. They chose Tableau because it was the best tool for exploring and visualising data, full stop. That's a very different user from a Salesforce admin who needs a dashboard on top of their CRM.
And third, perhaps most critically, it doesn't factor in the cost of the pivot itself. To market Tableau Next hard enough to win over that untapped Salesforce base, you risk alienating the core community of analysts who are the reason the brand has value in the first place. And you're doing this at exactly the moment those analysts are asking for the opposite: innovation to keep Tableau competitive against the rest of the industry.
In short, while Salesforce looks inward to serve its existing customer base, its analyst community is looking outward and asking why Tableau isn't keeping pace.
Why Vision Matters More Than Ever
Here's how this ties back into vision. When Salesforce acquired Tableau, having breadth and capability was a strong moat. The market rewarded platforms that could do a bit of everything. If your BI tool could handle dashboards, data prep, governance, collaboration, and embedded analytics, you had a compelling pitch for enterprise procurement. So the Venn diagram play made even more sense in that context: take a broad, capable platform and push it into a massive new customer base. Simple, right? No. That world is gone.
The centre of gravity has shifted back to the warehouse, accelerated by the investment pouring into AI. The compute, the logic, the business definitions: the real action is happening closer to the data, not in the front-end tool that sits on top of it. And the tools that are gaining traction aren't the ones trying to own the entire stack. They're the ones that picked a specific workflow or outcome and committed to it with conviction.
Tools like Sigma, Omni, Count, and Hex. They don't try to be everything; each has picked its lane.

Hex, as an example, is built on a specific bet: that the process of analysis is just as valuable to share as the polished end result. At its core, it's a computational notebook. You write SQL, Python, or both, mix in visuals and narrative, and work through your analysis in one continuous flow. Then, without switching tools, that same notebook becomes an interactive app you can hand to a stakeholder.
It's an opinionated product. It isn't trying to be a drag-and-drop BI tool or a general-purpose dashboard builder. It picked a lane: code-first, notebook-driven data work, productised and polished. For people who think and work that way, it's a remarkably good fit. And that shows. Hex users tend to be very happy because the tool was designed for their workflow, not for everyone else's.

Sigma takes a different approach entirely. It plants itself directly on top of your cloud warehouse and stays there. No data extraction, no proprietary engine in the middle. You get a familiar, spreadsheet-like interface, but every action writes back to Snowflake or Databricks under the hood. Full warehouse-scale analysis, no SQL required.
It's a product that knows exactly what it is, and crucially, what it's not. Sigma isn't trying to replicate what the warehouse does. It's designed to complement it. It leans hard into that partnership, and increasingly frames itself around "Apps", blurring the line between analysis and interactive data products. The philosophy is clear: the warehouse is the source of truth, and Sigma is just the best window into it.
What these tools share is product self-awareness. These tools understand their role in a larger ecosystem. They don't try to own the entire stack. They play their part brilliantly and trust the ecosystem to handle the rest. That clarity is the product vision. And it's what the large BI platforms are struggling to articulate.
I spend a lot of time with organisations that have invested heavily in large, capable platforms. Tableau, Power BI, the full enterprise BI stack. And what I consistently see is that most clients aren't using the full capabilities of these platforms, let alone the tools within them. Not even close. The breadth is there. The features are there. But the adoption isn't. These organisations have incredibly powerful tools that sit largely untouched because they try to do everything for everyone, and in doing so, they end up feeling bloated and become a huge enablement challenge. It's not uncommon for vendors to pair you up with an enablement partner just so you can grasp some of what the platform has to offer. The sheer scope becomes a barrier rather than an enabler.
Big monoliths aren't just unattractive to clients anymore. They're actively the thing people are trying to move away from. The analytics buyers I talk to aren't looking for one tool that does everything. They're looking for the right tool that does their thing brilliantly, and plays nicely with everything else. In today's market, breadth doesn't translate into value. It translates into lock-in. A sea of unused capability that exists primarily to justify the increasing licence cost and make it harder to leave.
The Engine Hiding in Plain Sight
This brings me back to Tableau. And this is the part that I find genuinely frustrating, because I think Tableau is sitting on something the market needs. It just doesn't seem to know it.
When I look at my own video analytics, the pattern is clear: when I mention Tableau, most people still think of Desktop. The drag-and-drop. The visualisation layer. That's the brand association. And it's understandable. Desktop is what made Tableau famous, but it's also an increasingly small part of the platform.
In a world where the warehouse is the centre of gravity, being known primarily as a visualisation tool is a liability, not an asset. And here's the irony: underneath the desktop app, underneath the Cloud interface, underneath all the front-facing infrastructure, Tableau has spent years building an extraordinarily capable set of APIs. The REST API, the Metadata API, the Hyper API, the Extensions API, the Embedding API, VizQL. Layer by layer, Tableau has assembled one of the deepest API surfaces in the analytics space.
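To make that API surface concrete, here's a minimal sketch of what programmatic Tableau already looks like: sign in with the REST API, then ask the Metadata API (a GraphQL endpoint) which published data sources exist and what tables feed them. The server URL, PAT name, and site are placeholders, and the endpoint shapes follow Tableau's public documentation as I understand it; verify against the current reference before relying on this.

```python
SERVER = "https://my-tableau-cloud.example.com"  # placeholder server URL
API_VERSION = "3.22"                             # any recent REST API version

def signin_payload(pat_name: str, pat_secret: str, site: str) -> dict:
    """JSON body for POST {SERVER}/api/{API_VERSION}/auth/signin
    using a personal access token (PAT)."""
    return {
        "credentials": {
            "personalAccessTokenName": pat_name,
            "personalAccessTokenSecret": pat_secret,
            "site": {"contentUrl": site},
        }
    }

# GraphQL query for the Metadata API, served at {SERVER}/api/metadata/graphql:
DATASOURCES_QUERY = """
{
  publishedDatasources {
    name
    projectName
    upstreamTables { name }
  }
}
"""

def fetch_datasources(pat_name: str, pat_secret: str, site: str) -> dict:
    """Sign in, then run the GraphQL query. Needs `pip install requests`
    plus a real server and PAT, so it is defined here but not called."""
    import requests
    resp = requests.post(
        f"{SERVER}/api/{API_VERSION}/auth/signin",
        json=signin_payload(pat_name, pat_secret, site),
        headers={"Accept": "application/json"},
    )
    token = resp.json()["credentials"]["token"]  # session token for later calls
    meta = requests.post(
        f"{SERVER}/api/metadata/graphql",
        json={"query": DATASOURCES_QUERY},
        headers={"X-Tableau-Auth": token},
    )
    return meta.json()
```

A few dozen lines and you have lineage-aware metadata for an entire site, with no front end involved. That's the depth that sits behind the Desktop-first brand image.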

VizQL in particular is still, in my view, one of the most powerful query and visualisation languages out there. But it's notably weak in the browser. It was built for a desktop-first world, and it shows. The VizQL Data Service feels like a plugin layer on top, helping it play nicely in the browser and in applications, but underneath, the engine still struggles when you really push it.
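The headless idea is already visible in the VizQL Data Service's request shape: you name a published data source by LUID and describe the fields and aggregations you want, and no workbook is involved at all. Here's a sketch of the query body; the field names follow the public VDS docs as I understand them, so treat the exact shape as an assumption and check the current spec.

```python
def vds_query_body(datasource_luid: str) -> dict:
    """Request body for POST {server}/api/v1/vizql-data-service/query-datasource.
    The "Category"/"Sales" fields are illustrative stand-ins for whatever
    exists in your published data source."""
    return {
        "datasource": {"datasourceLuid": datasource_luid},
        "query": {
            "fields": [
                {"fieldCaption": "Category"},                  # dimension to group by
                {"fieldCaption": "Sales", "function": "SUM"},  # measure to aggregate
            ]
        },
    }
```

That's a query API, not a dashboard API, which is exactly why it matters to the argument that follows.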
Imagine if, instead of retooling the entire Tableau brand to chase a different customer base, Salesforce invested in re-engineering Tableau for the web and AI era: truly performant in the browser, exposed as modern APIs, and ready for the kinds of composable, programmatic workflows the industry and the wider open-source data community have been asking for. That would be investing in the ecosystem. That would be playing to Tableau's genuine strengths. Instead, this incredible API surface sits largely landlocked behind a developer skill set; discoverability and usability for non-developers are almost nonexistent.
Interestingly, Tableau actually mentioned the concept of "Headless BI" at TC24. It appeared briefly in the Devs on Stage showcase: the idea that developers could build custom applications on top of Tableau's engine without using the standard front end. But it was positioned as a niche capability rather than a strategic direction. A footnote, not a headline. I think that's a mistake. And I think Tableau should seriously consider a headless BI SKU. Not just as a product move, but as a brand move. A way to reposition Tableau around the part of its stack that the industry actually needs right now.
Now, yes, a headless BI SKU is a developer-first proposition. And I've just spent several paragraphs arguing that the APIs are landlocked behind a developer skill set. So let me be clear about why I don't think that's a contradiction.
Look at where the energy in the Tableau ecosystem already is. The most exciting things happening right now, LaDataViz, BizTory, the growing MCP server community, are all happening at the API layer. That's where the momentum is. That's where people are pushing boundaries and building things that genuinely extend what Tableau can do. The Tableau Exchange has also been quietly getting richer, but it's remarkable how hard Salesforce has made it for developers to push innovation. One simple example: expecting developers to become partners, as in the pre-2020 analytics era, is a broken model.
Now, a headless SKU doesn't solve the accessibility problem on its own, but it puts Tableau's investment where the momentum already is, rather than pouring resources into a monolithic front-end rebuild for a customer base that may never arrive. And headless doesn't just mean raw APIs. It means the engine, the embed capability, and the drag-and-drop UI as components you can place on top of your own data engine or application. Some of those are developer-facing, but the embed layer and the visualisation components are things that analysts and business users benefit from directly once someone has wired them up. The headless SKU serves the builders, and the builders serve the analysts. They've already shown that they respond to user feedback and build capabilities faster than Salesforce can.
And here's the obvious thing: Tableau already knows this. The Marketplace was called out in that original 2024 vision. Salesforce clearly recognised that opening the platform to developers and partners was part of the future. The best play is to lean into that instinct fully. Open Tableau up. Let developers and their customers take the product to the next frontier and solve the challenges that Salesforce can't solve alone from the inside. That's how you build an ecosystem. That's how you stay relevant. Tableau's most capable, most differentiated assets are its APIs and its engine, and right now those assets have no commercial identity of their own. Giving them one would be the single clearest signal Tableau could send that it understands where the market is heading. More than that, it would be a chance to free itself from the Desktop-shaped corner it's currently unaware it's stuck in.
Imagine a version of Tableau where you don't buy Cloud, Server, or Desktop. You buy the engine. The ability to query data using VizQL, render visualisations programmatically, build data models, and connect to whatever sources you want, all through Tableau's underlying capability, without ever touching the traditional front-end. Let customers plug that into their own applications, workflows, and ecosystems. Make the capability the product, not the interface. That would position Tableau exactly where the market is heading: as a best-in-class component in a larger ecosystem, rather than trying to be the entire ecosystem itself.
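To make the idea tangible, here's a purely hypothetical sketch, and I want to stress that none of these classes, methods, or endpoints exist today, of what a headless Tableau SDK might feel like if the engine, rather than the interface, were the product.

```python
# Hypothetical only: imagining "the engine as the product".
from dataclasses import dataclass, field

@dataclass
class HeadlessViz:
    """A VizQL-style query plus a render target, no Tableau front end involved."""
    datasource: str
    dimensions: list = field(default_factory=list)
    measures: list = field(default_factory=list)

    def to_vizql(self) -> str:
        # Stand-in for a real VizQL statement; purely illustrative syntax.
        dims = ", ".join(self.dimensions)
        meas = ", ".join(f"SUM({m})" for m in self.measures)
        return f"SELECT {dims}, {meas} FROM [{self.datasource}] GROUP BY {dims}"

viz = HeadlessViz("Superstore", dimensions=["Region"], measures=["Sales"])
print(viz.to_vizql())
# prints: SELECT Region, SUM(Sales) FROM [Superstore] GROUP BY Region
```

The point isn't the syntax. The point is that the unit you buy, version, and embed is the query-and-render capability itself, droppable into any application, rather than a seat in front of a Tableau interface.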
Coming Up in Part 2
So if the strategy is wrong, the question becomes: how did we get here? How does a product as beloved as Tableau end up being reshaped around a customer base that isn't its own?
The answer, I think, has less to do with Salesforce specifically and more to do with a fundamental shift in how modern software gets built. When you can measure everything, a strong product vision becomes optional. And when vision becomes optional, something important gets lost. On Friday, I want to dig into what that something is and why it matters more than ever.
