Forty-two percent.
That is the share of companies that abandoned the majority of their AI initiatives this year, according to S&P Global Market Intelligence's 2025 survey of over 1,000 enterprises. Last year, the figure was seventeen percent.
The data paints a stark picture: two-thirds of organizations cannot move even half their AI pilots into production, and 46% of proof-of-concept projects are scrapped before they ever deliver a single result.
For private equity professionals watching portfolio companies chase AI transformation, these statistics raise an uncomfortable question: how do you distinguish the initiatives that will generate returns from those that will quietly drain resources?
The answer, it turns out, rarely lies in the technology itself.
The real barrier to scale
Informatica's CDO Insights 2025 survey of 600 chief data officers found that 43% cite data quality, completeness, and readiness as the leading obstacle preventing AI initiatives from reaching production. Not algorithms. Not computing power. Not talent.
When AI fails, technology is almost never the problem. The data is.
This has practical implications for anyone evaluating a business—whether for acquisition, growth investment, or operational improvement. Data readiness is not merely a technical prerequisite for AI. It is a leading indicator of operational maturity, and one of the most reliable signals available during due diligence.
What data readiness reveals
In our experience, a business that struggles to produce clean customer records often has broader gaps worth understanding. Fragmented systems point to inconsistent processes. Missing historical data signals blind spots in management information. When a company cannot answer basic questions about customer profitability or working capital trends, the issue is rarely just data—it is visibility into the business itself.
The reverse is equally instructive. Companies that have built solid data infrastructure—even without AI in mind—tend to demonstrate the operational discipline that supports scalable growth. They segment customers reliably. They see margin erosion before it becomes a crisis. Their reporting scales with the business rather than breaking under complexity.
Data readiness closely tracks how a management team thinks about its business. Companies that treat data as a strategic asset tend to make better decisions across the board; the rest tend to repeat the same mistakes.
These patterns tend to surface during any serious evaluation. Management teams that lack clean data on past performance rarely have reliable forecasting capabilities either. When sales, operations, and finance each maintain their own version of truth, expect misalignment on priorities and slower decision-making. Poor data quality usually reflects poor process discipline—and decisions made on bad data create more problems, which create more bad data.
Three layers of data readiness
Data readiness is not a single checkbox. It unfolds across three layers, and each one matters.
Availability
The starting point is whether the data exists at all. In many SMEs, it does not—at least not in any accessible form.
Customer profitability has never been calculated because no one mapped revenue to the actual cost of serving each account. Contracts sit buried in email inboxes, and pricing history is impossible to reconstruct—deals were negotiated case by case, nothing recorded systematically. Operational metrics were tracked diligently for a specific project, then quietly abandoned when priorities shifted.
Companies also accumulate noise. Hundreds of reports pile up over the years—each one useful at some point, now just obscuring what matters. The information exists somewhere, but surfacing it means digging through legacy outputs, mapping what connects to what, and piecing together a coherent picture. We have spent more hours than we'd care to admit reconciling financial data from scattered exports and half-forgotten reports before any analysis could begin. That work is tedious. It is also unavoidable.
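The mechanics of closing that gap are often simpler than the effort of assembling the inputs. As a rough illustration, a customer profitability view can be built from two tables that frequently do not exist in usable form: revenue by customer and an estimate of the cost to serve each account. The sketch below uses hypothetical file and column names; the real work lies in producing those inputs in the first place.

```python
import pandas as pd

# Revenue by customer, e.g. an export from the accounting system
revenue = pd.read_csv("revenue_by_customer.csv")        # columns: customer_id, revenue

# Cost-to-serve, e.g. assembled from logistics, support and delivery records;
# in many SMEs this table does not exist yet and has to be built first
cost = pd.read_csv("cost_to_serve.csv")                 # columns: customer_id, cost_to_serve

profitability = (
    revenue.merge(cost, on="customer_id", how="left")   # accounts missing cost data stay visible as NaN
    .assign(margin=lambda df: df["revenue"] - df["cost_to_serve"])
    .sort_values("margin")                              # least profitable accounts surface first
)

print(profitability.head(10))
```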
Structure
Data that exists in isolation creates its own problems. The CRM tracks customer interactions, the ERP manages orders, the accounting software handles financials—but none of them share a common identifier. The practical result: the CRM shows a customer as highly engaged while the ERP flags them sixty days overdue. Nobody holds the complete picture, and teams waste hours reconciling conflicting records by hand.
This fragmentation is not accidental. ERP and other software vendors optimize for implementation, not for ongoing data architecture. The partnership needed to restructure data flows, redesign management reports, or build bridges between systems is less common than businesses expect. Once a system goes live, the work of making data actually useful falls elsewhere.
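As a concrete illustration of the reconciliation work this creates, the sketch below matches customer records between a CRM and an ERP on a normalized company name, because no shared identifier exists. The file names, column names, and matching rule are assumptions for illustration; real matching usually needs more care, but the shape of the work is the same.

```python
import re
import pandas as pd

def normalize_name(name: str) -> str:
    """Crude matching key: lowercase, strip punctuation and common legal suffixes."""
    name = re.sub(r"[^a-z0-9 ]", " ", str(name).lower())
    name = re.sub(r"\b(ltd|gmbh|srl|inc|llc|sa)\b", " ", name)
    return re.sub(r"\s+", " ", name).strip()

crm = pd.read_csv("crm_accounts.csv")      # columns: account_name, last_activity
erp = pd.read_csv("erp_customers.csv")     # columns: customer_name, days_overdue

crm["match_key"] = crm["account_name"].map(normalize_name)
erp["match_key"] = erp["customer_name"].map(normalize_name)

# An outer join keeps records that exist in only one system visible,
# which is usually where the surprises are
combined = crm.merge(erp, on="match_key", how="outer", indicator=True)

print(combined["_merge"].value_counts())   # how many records matched, and how many did not
```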
Quality
The final layer is accuracy and completeness. Research published in Harvard Business Review found that only 3% of companies' data meets basic quality standards, and that 47% of newly created data records contain at least one critical error.
The issues tend to be familiar. Duplicate entries split a single customer's history across multiple records. Naming conventions vary by who made the entry. Exports that should be consistent change format month to month—columns shift, date formats flip, fields appear or disappear. What could be an automated process becomes manual cleanup every time.
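Much of that cleanup can be scripted once the problems are named explicitly. The sketch below, with hypothetical file and column names, shows the kind of normalization layer that turns a recurring manual fix into a repeatable step: map known column aliases, parse dates permissively, and flag what still fails.

```python
import pandas as pd

# Columns get renamed between monthly exports; one explicit mapping per known
# variant keeps the cleanup in one place instead of in someone's head
COLUMN_ALIASES = {
    "cust_name": "customer",
    "customer name": "customer",
    "inv_date": "invoice_date",
    "invoice dt": "invoice_date",
    "amt": "amount",
    "total amount": "amount",
}

def load_export(path: str) -> pd.DataFrame:
    df = pd.read_csv(path)
    df.columns = [COLUMN_ALIASES.get(c.strip().lower(), c.strip().lower()) for c in df.columns]

    # Date formats flip between exports; parse permissively and mark failures as NaT
    df["invoice_date"] = pd.to_datetime(df["invoice_date"], errors="coerce", dayfirst=True)

    # Collapse duplicate rows before they split one customer's history
    return df.drop_duplicates()

monthly = pd.concat([load_export(p) for p in ["export_jan.csv", "export_feb.csv"]], ignore_index=True)
print(monthly[monthly["invoice_date"].isna()])   # rows whose dates could not be parsed need manual review
```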
The path forward
Most failed AI initiatives never get far enough to test what the technology can do. They stall earlier—at the point where ambition meets the reality of undocumented processes, disconnected systems, and years of accumulated data debt. The gap between strategy and execution is not a technology problem. It is a readiness problem.
The path forward requires honesty about where things stand across all three layers. Invest in data infrastructure before investing in AI models. Recognize that the unglamorous groundwork of cleaning, structuring, and connecting data is not a preliminary step to rush through. It is the work that makes everything else possible.
Portfolio companies that succeed with AI will not be the ones with the most advanced tools or the most ambitious strategies. They will be the ones that got the foundation right first—or worked with partners who helped them build it before the pressure to show results made shortcuts inevitable.
Sources
- S&P Global Market Intelligence, "Voice of the Enterprise: AI & Machine Learning, Use Cases 2025" (survey of 1,006 respondents, North America and Europe)
- Informatica, "CDO Insights 2025: Racing Ahead on GenAI and Data Investments While Navigating Potential Speed Bumps" (survey of 600 CDOs)
- Nagle, T., Redman, T.C., and Sammon, D., "Only 3% of Companies' Data Meets Basic Quality Standards", Harvard Business Review, September 2017
Sophia Rizopoulou is Business Performance Management Associate at Fortivis, applying financial analysis and quantitative modeling to portfolio company operations. Trained in economics and computer science at Bocconi University, she builds performance dashboards and data infrastructure that turn fragmented information into usable insights for management teams.
