

This editorial appeared in the June 12th, 2025, issue of the Topline newsletter.
Want all of the latest go-to-market insights without the wait? Subscribe to Topline and get the full newsletter, including bonus commentary and takeaways, delivered to your inbox every Thursday.
A year ago, OpenAI announced that 92% of the Fortune 500 were using its AI. Look beyond the Fortune 500 and you see the same thing: 90% of US companies are using or exploring AI. In SaaS, it's 98%. Virtually everyone has AI somewhere in their stack.
We've never seen technology adoption at this pace, which is why it's one of the defining stories of this platform shift.
But there's another number we need to talk about.
Gartner just revealed that 85% of AI initiatives fail to deliver their promised value. More than eight out of ten companies are striking out.
The knee-jerk reaction is to blame the AI. Maybe it's not as good as promised. Maybe we've been sold a fantasy.
Wrong! The AI is fine.
The problem is us. Our data is a mess — and data is the lifeblood of AI. Our track record with digital transformations is even worse — they succeed only 12% of the time.
Feed world-class AI garbage data. Wrap it in broken processes. And you get where we are today.
What about the 15% of companies that are winning? They're not doing anything magical. They just understood something the others missed: AI amplifies what you already are. If your data is clean and your processes work, AI makes you unstoppable. If not, AI just helps you fail faster.
The path from the 85% to the 15% isn't complicated. But it does require something most companies avoid: fixing the fundamentals before chasing the future.
The AI age is analogous to the computer age of the 1970s-1990s. A “productivity paradox” emerged when companies panic-spent billions on computers only to watch output flat-line. Economist Robert Solow’s quip resonates: “You can see the computer age everywhere but in the productivity statistics.”
Go-to-market leaders today cannot assume that purchasing powerful AI tools will lead to productivity gains; there is more to it. They would be wise to study McKinsey’s famous case study of Wal-Mart’s experience in the computer-age productivity paradox.
Until the early 1980s, a Wal-Mart checkout looked like everyone else’s: price tags on every item, cashiers punching numbers into electromechanical registers, nightly paper tallies, and buyers re-ordering by phone or fax. Store managers did manual shelf counts and mailed weekly reports to HQ in Bentonville. Wal-Mart was well positioned to bring on new technology and modernize these workflows.
Between 1983 and 1987 the retailer blanketed every checkout with UPC scanners and spent US $24 million (~US $70 million today) on a private satellite network that funneled real-time sales data to its HQ. But there was a problem: the company wasn’t set up to actually leverage the data.
→ No productivity gains
Then, in 1991, executives poured roughly US $4 billion (a lot by any standard!) into Retail Link, an extranet that let suppliers see the same barcode feed and place their own replenishment orders. The assumption was that if the data bypassed HQ and went directly to suppliers, those suppliers would act faster. To everyone’s surprise, productivity stayed flat, and Wal-Mart’s own sales-per-employee curve was still hugging the industry average well into the mid-1990s.
→ No productivity gains
Wal-Mart had the data, then it had the infrastructure, but the payoff arrived only after the company overhauled its entire workflow around the tech: cross-dock warehouses, vendor-managed inventory, and algorithmic reorder points that finally turned raw scans into instant shelf replenishment.
→ Huge productivity gains
From 1995 to 1999 the chain’s sales-per-employee leapt by 22%, and rivals forced to copy its playbook added another 28%, a ripple effect McKinsey credits with one-third of the total productivity surge in U.S. general-merchandise retail. In other words: the costs hit the balance sheet in the 1980s, but the productivity dividend didn’t land until the workflows caught up a decade later.
It wasn’t enough for Wal-Mart to simply buy the game-changing technology available to it. The company needed to actually build and optimize around that technology.
The reality is that AI works, so long as a company is AI-ready. Teams with disciplined processes, clean data, and simple hand-offs saw their revenue engines thrive. Accenture categorized companies by AI-readiness, and those it deemed “re-invention ready” achieved “Scary Good” results: layering thoughtful AI tooling on top of a strong foundation produced 2.5x revenue growth and a 3x jump in individual productivity within this cohort.
Today, AI can draft an email, score a lead, and summarize a sales call in 30 seconds, but this is only effective if your CRM isn’t a junk drawer. If it is, AI simply creates errors and noise faster and buries it all deeper. Gartner, in polite language, calls this “multiplying the chaos.” Data from the Cloud Security Alliance shows that organizations with poor data hygiene and no process discipline see AI projects fail twice as often as they succeed.
In other words, the tech doesn’t merely expose operational cracks: it blows them wide open. Companies that survived the pre-AI era through sheer hustle, and perhaps excessive ZIRP-era funding, may have been just functional enough to survive until now, but add autonomous prospecting or auto-generated sequences to that mix and the wheels will come off fast.
The explosion in AI tooling also happens to be coming at a time when tech stacks are already excessively large. In 2015, the typical B2B SaaS company ran eight SaaS apps. The 2023 median is 80, and some organizations now break 200. On average, employees spend 1.8 hours per day ping-ponging between tabs (McKinsey), and only 29 percent are happy with their toolset, down from 40 percent two years ago (Reworked).
There is nothing inherently wrong with an elaborate tech stack if it is implemented well, but if it isn’t, layering on automation-rich AI tools is going to compound the hygiene problem.
Poor Hygiene + Tool Sprawl + Digital Fatigue
Not good.
This widening delta between efficient, AI-ready organizations and the rest is almost certainly the final nail in the coffin for a generation of ZIRP-era startups that never cleaned up their act. Meanwhile, good organizations will continue to do what they do: be good.
I get it. Everyone's racing to adopt AI. Nobody wants to be Blockbuster watching Netflix eat their lunch.
But adoption for adoption's sake is theater. The goal isn't to use AI — it's to win with it.
AI needs something specific from us to work: clean, structured data. In GTM, that means mapping every step from lead to renewal. Capturing every interaction that matters. Building the discipline of data hygiene into your culture.
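To make that concrete, here is a minimal sketch of the kind of hygiene gate that belongs in front of any AI tooling: a check that flags duplicate contacts and incomplete records before an AI tool ever touches them. Everything here is a hypothetical illustration (the field names, the sample records, and the `hygiene_report` helper), not any particular CRM's schema or API.

```python
# Illustrative sketch only: the fields (email, stage, owner, last_touch)
# and the sample records below are hypothetical, not a real CRM's schema.
from collections import Counter

REQUIRED_FIELDS = ("email", "stage", "owner", "last_touch")

def hygiene_report(records):
    """Flag duplicate emails and records missing required fields."""
    emails = Counter(r.get("email", "").strip().lower() for r in records)
    duplicates = {e: n for e, n in emails.items() if e and n > 1}
    incomplete = [r for r in records if any(not r.get(f) for f in REQUIRED_FIELDS)]
    return duplicates, incomplete

if __name__ == "__main__":
    # Hypothetical sample data showing both failure modes.
    crm = [
        {"email": "ana@acme.com", "stage": "demo", "owner": "rj", "last_touch": "2025-06-01"},
        {"email": "Ana@Acme.com", "stage": "demo", "owner": "rj", "last_touch": "2025-06-02"},
        {"email": "li@initech.io", "stage": "", "owner": "mk", "last_touch": "2025-05-20"},
    ]
    dupes, incomplete = hygiene_report(crm)
    print(f"{len(dupes)} duplicated email(s), {len(incomplete)} incomplete record(s)")
```

The specifics will differ with every stack; the point is that a check like this runs before the AI tooling is switched on, not after.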
Skip this work and your expensive new AI tools can't tell signal from noise. But they'll still do something. And that's the problem, because AI is an accelerant. Pour it on a well-oiled revenue engine and it compounds every advantage. Pour it on chaos and you'd better run for cover.
Wal-Mart learned this the hard way. They had barcode data for years before they figured out how to use it. Once they rebuilt their operations around that data, they became unstoppable. But that transformation took time and money most startups don't have.
The companies winning with AI understand this sequence: First, fix your fundamentals. Then, and only then, add the accelerant.
CEO @ VEN