Gartner: $1.5 Trillion AI Spend in 2025 Faces Data Quality Barriers
Enterprise data leaders identify data quality as the top barrier to AI success, with 73% citing it above other issues, according to a Demand Gen Report analysis.
Gartner Highlights $1.5 Trillion AI Investment Amid Quality Concerns
Companies spent $1.5 trillion on artificial intelligence in 2025, according to Gartner, yet 73% of enterprise data leaders rank data quality as the primary barrier to AI success, ahead of model accuracy, compute costs, and talent. A further 60% of companies report little to no value from their AI investments, a sign that poor data underlies these challenges.
The Complexity of Enterprise Marketing Stacks
Enterprise marketing teams manage stacks that span multiple systems: leads flow from paid campaigns, content syndication, webinars, online forms, tradeshows, and telemarketing into a marketing automation platform (MAP), which in turn connects to CRM instances, data warehouses, analytics platforms, consent management systems, and AI models. In this setup, bad data propagates across the stack, corrupting segments, scores, pipeline forecasts, and AI training data as it moves through 10 to 15 systems and compounds errors silently.
B2B contact data decays at roughly 30% per year. In one study of over 1,200 business contacts, 70% experienced at least one change within 12 months, such as a job title update or a new email address, and 94% of organizations suspect their customer and prospect data is inaccurate. According to Demand Gen Report, the average enterprise CRM carries a 25% critical error rate on contact records, compounding the problem.
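The decay figures above compound quickly across years. As a rough illustration (an assumption of this sketch, not a model from the article), treating the ~30% annual decay rate as a constant per-year probability of change shows how quickly an untouched database goes stale:

```python
# Illustrative sketch: compounding the ~30%/year B2B contact decay rate
# cited above. Assumes a constant annual decay rate applied
# independently each year; real decay is lumpier.

def stale_fraction(years: int, annual_decay: float = 0.30) -> float:
    """Fraction of records expected to have at least one change
    after `years`, given a constant annual decay rate."""
    surviving = (1 - annual_decay) ** years  # records still accurate
    return 1 - surviving

for y in (1, 2, 3):
    print(f"After {y} year(s): ~{stale_fraction(y):.0%} of records stale")
```

Under this simple model, roughly half of an untouched database is stale within two years and about two-thirds within three.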
How AI Amplifies Data Problems
The SiriusDecisions “1-10-100 rule” holds that it costs $1 to verify a record at entry, $10 to clean it later, and $100 if it is ignored. In modern stacks, where data syncs in real time across MAPs, CRMs, data stores, analytics platforms, and consent layers, that cost multiplies per system. Bad data costs the average organization $12.9 million annually, per Gartner, and MIT Sloan estimates a 15–25% revenue impact. AI exacerbates the damage by amplifying errors without human intervention, as when AI lead scoring models act on outdated information. Forrester has identified data quality as the primary factor limiting B2B GenAI adoption; Gartner predicts that through 2026, organizations will abandon 60% of AI projects that lack AI-ready data; and a Sales Hacker survey found that 41% of predictive lead scoring initiatives failed due to CRM data issues.
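The arithmetic behind the 1-10-100 rule is worth making concrete. A minimal sketch (the per-system multiplier is an assumption drawn from the article's point about real-time sync, not a formula SiriusDecisions publishes):

```python
# Illustrative sketch of the 1-10-100 rule: $1 to verify at entry,
# $10 to clean later, $100 if ignored. The downstream cost is scaled
# by the number of connected systems (MAP, CRM, warehouse, analytics,
# consent layer) the bad record syncs into -- an assumption here.

COST_VERIFY_AT_ENTRY = 1    # dollars per record
COST_CLEAN_LATER = 10
COST_IF_IGNORED = 100

def ignored_cost(bad_records: int, systems: int) -> int:
    """Cost of leaving bad records uncorrected as they replicate
    across `systems` connected platforms."""
    return bad_records * COST_IF_IGNORED * systems

# A batch of 1,000 bad records in a 12-system stack:
print(1000 * COST_VERIFY_AT_ENTRY)   # verify at entry: $1,000
print(ignored_cost(1000, 12))        # ignored: $1,200,000
```

The gap between the two figures, three orders of magnitude, is the article's core argument for verifying data at the point of entry rather than remediating downstream.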
US B2B marketing data spending is growing at just 0.5%, per eMarketer, against 36% year-over-year growth in AI tool spending, a mismatch Demand Gen Report highlights. BCG’s 10-20-70 framework advises allocating 10% of resources to algorithms, 20% to technology, and 70% to people and processes, including data governance and quality, to address these foundational problems. Meanwhile, 59% of organizations do not measure data quality at all, leaving them unable to assess their data infrastructure.