
7 Factors That Drive Returns on AI Investments - And Why Your CFO Should Be in the Room

A survey of 1,006 executives reveals that AI value is a management problem, not a technology problem - and most companies are solving the wrong one.

TL;DR
A new survey of 1,006 global executives found that 90% report at least moderate AI value, but how they get there matters more than the tools they buy. The seven drivers: define value clearly, pursue both product and process gains, use all AI types (not just generative), adopt a structured framework, involve your CFO, train everyone, and follow a maturity model that moves from pilots to formal reporting.

U.S. companies spent an estimated $37 billion on generative AI alone in 2025. Boards and executives are starting to ask a pointed question: where are the returns? And the consequences for leaders who can't answer are getting real - 71% of global CIOs said their AI budgets would be frozen or cut if value couldn't be demonstrated within two years.

A new research study published in Harvard Business Review by Thomas Davenport and Laks Srinivasan - "7 Factors That Drive Returns on AI Investments, According to a New Survey" - set out to understand how companies are actually faring. Based on a survey of 1,006 global senior executives and interviews with 12 enterprise AI leaders, the results were more positive than headlines suggest: 45% of respondents said they're getting a great deal of value from AI, and another 45% reported moderate value. Only about 9% said they're getting minimal returns.

SMB Takeaway

But here's the part that matters for Houston businesses: the companies getting the most from AI are not the ones with the most advanced technology. What separates them is how clearly leaders define value, who they hold accountable for delivering it, and how seriously they treat measurement. That's a management challenge, not a technology challenge. And it's one where a 50-person business has just as much ability to get it right as a Fortune 500 company.

Survey at a glance: 1,006 executives surveyed; 90% report AI value; 7 factors identified. How executives rate AI value: 45% a great deal of value, 45% moderate value, 9% minimal value. (Source: Davenport & Srinivasan, HBR survey of 1,006 executives, 2026.)
🎯
Factor 1: Be Clear on What Type of Value You're Trying to Achieve
Value is in the eye of the beholder - and that's exactly the problem.

This sounds obvious, but the survey revealed a telling disconnect. About 14% of respondents said they're getting a great deal of value from AI but only slight ROI. Another 9% reported moderate value but substantial ROI. Those two groups are clearly measuring different things.

The value vs. ROI disconnect: how companies define "value" from AI differs dramatically from financial ROI. High value, slight ROI (14% of respondents): transformation-focused - "AI is doing what we want," but ROI is still building. Moderate value, substantial ROI (9% of respondents): financially focused - strong ROI numbers but modest perceived impact. (Source: Davenport & Srinivasan, HBR survey of 1,006 executives, 2026.)

Two groups measuring "value" in completely different ways - both are valid, but leaders need to pick one and be explicit.

The researchers put it simply: value should be understood as "AI is doing what we want it to do." Some organizations pursue short-term financial returns. Others are playing a longer game, building a technology foundation that positions them for future transformation. Both approaches can work, but you have to pick one and be explicit about it.

For a Houston-area business, the practical question is: what does "working" look like for your AI spend? If you're a CPA firm using AI to speed up document review, the metric might be hours saved per engagement. If you're a construction company using AI for project estimation, the metric might be bid accuracy or win rate. The worst position to be in? Spending money on AI tools without a clear answer to "what are we trying to accomplish here?"
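To make "what does working look like" concrete, here is a minimal sketch of turning hours saved into a dollar figure an owner can weigh against a tool's cost. The function name and every number are illustrative assumptions, not figures from the survey.

```python
# Hypothetical illustration: translate "hours saved" into dollars so an
# AI tool's monthly cost can be weighed against a concrete outcome.
# All numbers below are made-up examples, not survey data.

def monthly_ai_value(hours_saved_per_week: float,
                     loaded_hourly_rate: float,
                     monthly_tool_cost: float) -> dict:
    """Return gross value, net value, and simple ROI for one AI tool."""
    gross = hours_saved_per_week * 4.33 * loaded_hourly_rate  # ~4.33 weeks/month
    net = gross - monthly_tool_cost
    roi = net / monthly_tool_cost
    return {"gross_value": round(gross, 2),
            "net_value": round(net, 2),
            "roi": round(roi, 2)}

# Example: a CPA firm saving 6 hours/week of document review at a loaded
# rate of $85/hour, on a tool costing $500/month.
result = monthly_ai_value(6, 85.0, 500.0)
```

The point is not the arithmetic - it's that the metric ("hours saved per engagement") is chosen and written down before the tool is bought, so "value" has one agreed meaning.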

📦
Factor 2: Seek Value in Both Products and Processes
Internal process improvements are the default. Customer-facing AI may be the bigger opportunity.

Most organizations default to using AI for internal process improvements - automating repetitive tasks, speeding up data entry, generating internal reports. That's where most businesses start, and it makes sense. But the survey found that high-value organizations are also focused on embedding AI into their customer-facing products and services.

Where should AI investment flow?
  • Internal processes (where most companies focus): automating repetitive tasks, speeding up data entry and reporting, reducing operational costs, immediate and measurable ROI.
  • Customer-facing products (where high performers also focus): AI-enhanced client deliverables, competitive differentiation, protecting market share, longer-term strategic value.
(Source: Davenport & Srinivasan, HBR survey of 1,006 executives, 2026.)

The study highlighted executives at companies like Schneider Electric, where internal AI delivers more immediate financial returns while customer-facing AI represents a longer-term play focused on capturing market share. Each requires different approaches to measuring success and different timelines for seeing results.

There's also a defensive angle that doesn't get enough attention. One medical device executive made the point that AI capabilities in products protect market share - customers expect it, and companies that don't offer it will lose business to competitors who do. That's not a revenue gain. It's a revenue preservation strategy. And it's just as valid a reason to invest.

The two-track test for your business:
  • Track 1 - Where can AI cut costs? Document review and processing, data entry and reconciliation, internal report generation, IT support ticket routing, scheduling and resource allocation. Outcome: immediate cost savings.
  • Track 2 - Where can AI improve service? Faster research for client deliverables, AI-powered analytics in presentations, smarter project estimates and bids, enhanced client reporting, AI features in your product or service. Outcome: competitive differentiation and market share.
Only pursuing one track leaves money on the table. High performers pursue both.

Ask both questions: "Where can AI save us money?" and "Where can AI improve what we deliver to clients?"

For SMBs in Houston and Katy, think about this in two tracks. Track one is the obvious one: where can AI make your internal operations faster or cheaper? Track two requires more imagination: can AI improve the service you deliver to clients? A law firm using AI to deliver faster research to clients, or a wealth management firm using AI-powered analytics in client presentations - those aren't just efficiency plays. They're competitive differentiation.

🧰
Factor 3: Use All the Tools in the AI Toolbox
Generative AI gets the headlines. Analytical and rule-based AI generate the most value.

This finding should make every business owner pause and reconsider their assumptions. When asked which type of AI produces the most value, 50% of survey respondents pointed to analytical AI - things like dynamic pricing, customer targeting, and predictive analytics. Rule-based AI came in a close second at 40%, covering applications like insurance underwriting, anti-money-laundering systems, and robotic process automation. Generative AI? Just 9%. Agentic AI came in at 2%.

Which type of AI produces the most value? (% of respondents selecting each type): analytical AI, 50%; rule-based AI, 40%; generative AI, 9%; agentic AI, 2%. (Source: Davenport & Srinivasan, HBR survey of 1,006 executives, 2026.)

Most businesses get the most value from analytical and rule-based AI - not generative AI.

Generative AI dominates every news cycle and conference stage. It's the category most people think of when they hear "AI." But for businesses focused on actual returns, the older, less glamorous forms of AI are pulling significantly more weight.

That said, the study did find that companies adopting agentic AI - the newest category, where AI systems can take independent actions - are 22% more likely to report achieving a great deal of overall AI value. Agentic adoption appears to be an indicator of organizational maturity, even though the technology itself isn't yet the primary value driver.

SMB Takeaway

Don't get so fixated on ChatGPT-style tools that you ignore the analytical and automation capabilities that might deliver faster, more concrete returns. A manufacturing company using predictive analytics for equipment maintenance, or an oil and gas services firm using rule-based AI for compliance documentation - those might not make for exciting LinkedIn posts, but they make for better financial results.

📋
Factor 4: Adopt a Structured Framework for Achieving Value
An AI playbook, a product orientation, or a stage-gate approach - pick one and follow it.

Companies that extract real value from AI tend to have a structured process for moving initiatives from idea to production to measured outcomes. The survey found multiple approaches that work: one financial services company uses a custom "AI playbook" that guides business lines from use-case exploration through responsible deployment. An electrical utility applies a stage-gate method borrowed from R&D. The most common approach among the executives interviewed was a digital product orientation - managing each AI initiative as a product with defined stakeholders, timelines, and accountability.

One executive credited the product orientation as the single most important factor in achieving value, because it creates a structure for proposing expected benefits, reviewing results over time, adjusting the approach, and holding stakeholders accountable for what gets delivered.

Three proven frameworks for AI value - pick one structured approach and follow it from idea to measured outcome:
  • AI playbook (custom internal guide): explore use cases → deploy responsibly → measure value.
  • Stage-gate (borrowed from R&D): concept → build gate → test → deploy gate.
  • Product orientation (the most common approach): conceive and propose → build and ship → review and iterate.
Any framework requires data readiness: 55% of organizations cite unready data as their biggest obstacle to AI value. (Source: Davenport & Srinivasan, HBR survey of 1,006 executives, 2026.)

Three approaches that work - the product orientation was cited most frequently among executives interviewed.

The framework has to include a data readiness component too. The survey found that 55% of respondents cited unready data as an obstacle to getting value from AI. That's consistent with other research we've seen. You can have the best framework in the world, but if your data is scattered across disconnected systems with no quality controls, AI can't perform.
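As a rough illustration of where "getting your data in order" can start, here is a hedged sketch of a pre-AI readiness check on exported records. The field names, sample rows, and the idea of keying duplicates on email are assumptions for the example, not a prescribed method.

```python
# Minimal sketch of a pre-AI data readiness check on a CSV export.
# Field names and sample records are illustrative assumptions.
import csv
import io

def readiness_report(rows: list[dict], required: list[str]) -> dict:
    """Flag incomplete records and duplicate keys before feeding data to AI."""
    incomplete = [r for r in rows
                  if any(not r.get(f, "").strip() for f in required)]
    seen, dupes = set(), 0
    for r in rows:
        key = r.get("email", "").strip().lower()  # dedupe on email, lowercased
        if key in seen:
            dupes += 1
        seen.add(key)
    return {
        "records": len(rows),
        "incomplete": len(incomplete),
        "duplicates": dupes,
        "complete_pct": round(100 * (1 - len(incomplete) / len(rows)), 1),
    }

sample = io.StringIO(
    "name,email,phone\n"
    "Acme Supply,ops@acme.example,281-555-0100\n"
    "Acme Supply,OPS@acme.example,\n"        # duplicate email, missing phone
    "Bayou Services,info@bayou.example,281-555-0101\n"
)
report = readiness_report(list(csv.DictReader(sample)), ["name", "email", "phone"])
```

Even a simple completeness-and-duplicates pass like this surfaces the "scattered, unquality-controlled data" problem before any AI tool is purchased.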

💡

For Houston SMBs: Start With the Data

Over half of organizations in this survey said unready data is blocking their AI results. Before investing in AI tools, get your data house in order - clean records, connected systems, and solid cybersecurity protecting it all. A managed IT services provider can assess your infrastructure readiness before you spend a dollar on AI.

Learn about CinchOps data infrastructure services →
💰
Factor 5: Involve the CFO and Finance Function in Certifying Value
When the CFO owns AI value accountability, 76% of organizations report high returns. Under CIOs and CTOs that figure falls to 53%, and under functional executives to 32%.

This is maybe the most striking finding in the entire study. Most organizations assign AI value accountability to their chief data/analytics/AI officer (38%) or individual functional executives (35%). Only 2% assign it to the CFO. But when the CFO is responsible for achieving AI value, 76% of organizations reported getting a great deal of value from their investment. Compare that to 53% under CIOs or CTOs, and just 32% under functional line executives.

The CFO effect - % of organizations reporting a "great deal" of AI value, by accountability owner: CFO, 76% (only 2% of organizations assign accountability there); CIO/CTO, 53%; functional executives, 32% (35% assign). (Source: Davenport & Srinivasan, HBR survey of 1,006 executives, 2026.)

Only 2% of organizations assign AI accountability to the CFO - but those that do see dramatically higher returns.

Why the massive gap? Finance leaders bring measurement rigor, organizational credibility, and the authority to hold other departments accountable for results. They're trained to validate claims, question projections, and demand evidence. When the person asking "did this AI project actually deliver?" is the same person who controls the budget, the answers tend to get a lot more honest.

The study highlighted DBS Bank in Singapore, where the finance organization partners with technology executives to certify AI value. Each business unit's CFO validates their respective results, which then roll up to an aggregate company-wide number that gets published in the bank's annual report.

For a small business, this doesn't mean hiring a CFO just for AI. It means whoever manages your books - your controller, your outsourced accounting firm, your bookkeeper - should be part of the conversation when you're evaluating AI results. Is the tool actually saving the hours we projected? Did the automation reduce the cost we expected? If nobody with financial discipline is asking those questions, the answers tend to be vague and self-serving.
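The kind of question a controller should be asking can be put in concrete terms. Here is a minimal sketch, with made-up tool names, figures, and a made-up 10% tolerance, of a projected-vs-actual check on each AI tool's savings.

```python
# Hedged sketch of the projected-vs-actual review a controller or
# bookkeeper might run per AI tool. Names, figures, and the 10%
# tolerance are illustrative assumptions, not from the survey.

def validate_projection(tool: str, projected: float, actual: float,
                        tolerance: float = 0.10) -> str:
    """Flag any tool whose actual savings miss projection by more than tolerance."""
    if projected <= 0:
        return f"{tool}: no projection on record - cannot validate"
    shortfall = (projected - actual) / projected
    if shortfall > tolerance:
        return f"{tool}: MISSED projection by {shortfall:.0%} - review"
    return f"{tool}: on target ({actual:,.0f} vs {projected:,.0f} projected)"

reviews = [
    validate_projection("Document review AI", projected=24000, actual=26500),
    validate_projection("Scheduling assistant", projected=12000, actual=8000),
]
```

The mechanics are trivial; the discipline of recording a projection up front and comparing it to actuals is what keeps the answers from being vague and self-serving.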

Who should be validating your AI results? For SMBs, bring financial oversight into every AI evaluation: a controller or bookkeeper validates cost savings and budget impact; an outsourced accounting firm certifies ROI projections against actual outcomes; a financial advisor or CFO aggregates value across all AI investments. The core question: did the tool actually deliver the hours saved, cost reduced, or revenue gained we projected? Financial discipline in AI evaluation produces 76% high-value outcomes versus 32% without it.
🎓
Factor 6: Train Both Users and Executives on AI
58% of organizations haven't trained employees on AI tools. The barrier isn't employee resistance - it's leadership inaction.

The training gap runs in two directions, and both matter. According to the survey, 58% of organizations have not trained their employees on AI productivity tools and workflows. Separately, 29% acknowledged that their leadership team lacks sufficient understanding to drive AI value creation. Organizations that invest in both employee skills and leadership fluency see a 23-percentage-point advantage in value realization compared to those that don't.

The training gap: 58% of organizations have employees untrained on AI tools; 29% say leadership lacks AI understanding. The myth: employee resistance to AI, cited by just 13%. Employees aren't resisting - they're waiting for direction, training, and leadership. Training both employees and leadership yields a 23-percentage-point advantage in value realization. (Source: Davenport & Srinivasan, HBR survey of 1,006 executives, 2026.)

Here's the detail that changes the narrative: only 13% cited workforce resistance as a barrier to AI value. Employees aren't pushing back. They're waiting. Waiting for clear direction from leadership, waiting for training that makes the tools useful in their specific roles, waiting for someone to remove the other barriers like missing frameworks and messy data.

SMB Takeaway

Training doesn't have to be expensive or complicated. A few focused sessions showing employees how AI applies to their actual daily tasks - paired with leadership visibly using the tools in meetings and decisions - closes the adoption gap faster than any enterprise training program.

📊
Factor 7: Follow an AI Economic Value Maturity Model
A six-stage progression from unmeasured pilots to formal reporting - with two major inflection points along the way.

The researchers developed a maturity model based on three components: putting AI into actual production (not just running pilots), assessing the value of those production use cases before and after implementation, and aggregating and reporting that value across the organization. Based on survey responses, they identified six stages with dramatically different outcomes.

The stages and the percentage of organizations in each that reported getting a "great deal" of AI value tell a clear story:

  • Stage 0 - Unmeasured Pilots (3% of organizations): Running AI experiments with no outcome measurement. Only 4% in this stage report high value.
  • Stage 1 - Production Without Assessment (11%): AI is in production but nobody's evaluating the business impact. Just moving past pilots jumps the number to 18% reporting high value.
  • Stage 2 - Pre-Implementation Assessment (17%): Building ROI projections and business cases before deployment, but not validating outcomes afterward. Only a modest improvement to 20%.
  • Stage 3 - Post-Implementation Assessment (30%): Measuring individual use cases after deployment. This is the first major inflection point - 44% report high value, more than double Stage 2. But companies seem to get stuck here, spending a median of six years at this stage.
  • Stage 4 - Aggregated Annual Assessment (21%): Rolling up AI value across the portfolio annually and making results available internally. Another significant jump to 58% reporting high value.
  • Stage 5 - Formal Reporting (16%): Reporting AI value to boards, investors, or public markets. This is the second and largest inflection point - 85% report high value.

Two inflection points: measuring after deployment (Stage 3) and formal reporting (Stage 5) produce the biggest jumps in value.

Two things jump out. First, the biggest single improvement comes from measuring AI outcomes after deployment - not before. Business cases and projections are fine, but the real value signal comes from asking "what actually happened?" Second, organizations that formally aggregate and report AI value see dramatically better results, likely because the discipline of reporting forces accountability and honest assessment.

For a 50-person business in Sugar Land or Cypress, you're not going to publish AI results in an annual report. But you can absolutely track what your AI tools deliver, review those numbers quarterly with your leadership team, and compare total AI spend against measurable outcomes. That's Stages 3-4, and the data shows a massive difference in results.
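Moving from Stage 3 to Stage 4 is mostly bookkeeping. Here is a small sketch, with hypothetical tools and figures, of rolling per-tool measurements into one quarterly portfolio number a leadership team can review against total AI spend.

```python
# Illustrative Stage 3 -> Stage 4 move: aggregate per-tool results into
# one quarterly number. Tool names and dollar figures are hypothetical.

quarterly_results = [
    {"tool": "Contract review AI", "spend": 1800, "measured_value": 5200},
    {"tool": "Predictive maintenance", "spend": 2400, "measured_value": 7900},
    {"tool": "Chat assistant", "spend": 900, "measured_value": 600},
]

total_spend = sum(r["spend"] for r in quarterly_results)
total_value = sum(r["measured_value"] for r in quarterly_results)

summary = {
    "total_spend": total_spend,
    "total_value": total_value,
    "portfolio_roi": round((total_value - total_spend) / total_spend, 2),
    # Tools whose measured value fell below what they cost this quarter:
    "underperformers": [r["tool"] for r in quarterly_results
                        if r["measured_value"] < r["spend"]],
}
```

A quarterly summary like this gives the leadership review both the aggregate number and the list of tools that deserve a harder look.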

🛠️
How CinchOps Can Help
Practical AI guidance built on 30+ years of IT experience.

CinchOps is a managed IT services provider based in Katy, Texas, serving small and mid-sized businesses across the Houston metro area. CinchOps specializes in cybersecurity, network security, managed IT support, VoIP, and SD-WAN for businesses with 10-200 employees.

This survey confirms what we've seen across 30 years of technology deployments: the tool is never the hard part. Getting it to work inside a real business - with messy data, untrained staff, and unclear goals - is where most investments succeed or fail. Here's where we help Houston SMBs with their AI strategy:

  • Data infrastructure readiness: We audit your data environment, identify gaps in quality and connectivity, and build the foundation that AI tools need to perform. 55% of organizations in this survey said unready data is their biggest AI obstacle - our managed IT team fixes that problem before it costs you money
  • Cybersecurity for AI implementations: AI tools create new data flows and new attack surfaces. We ensure your AI adoption doesn't become a security liability
  • Business process automation: We help design and implement the structured frameworks this survey identifies as a key value driver - connecting AI tools to actual workflows with defined outcomes
  • Managed IT support for AI environments: As you scale AI across your business, we keep the underlying infrastructure stable, secure, and performing at the level these tools demand
  • CTO/CIO advisory services: We provide the strategic technology guidance that helps you define value targets, select the right AI tools, and measure results with discipline

The path to AI value is a management challenge, not a technology challenge. We've been helping Houston businesses work through exactly that kind of challenge for decades. If you're spending money on AI and not sure what you're getting back, that's a conversation worth having.

Frequently Asked Questions

What type of AI delivers the most business value for small businesses?

According to the Davenport-Srinivasan survey of 1,006 executives, analytical AI like predictive analytics and customer targeting delivers the most value (cited by 50% of respondents), followed by rule-based AI and robotic process automation (40%). Generative AI was cited by only 9% as the most valuable type. Small businesses should consider analytical and automation tools alongside generative AI rather than focusing exclusively on chatbot-style applications.

How can a small business measure AI return on investment?

The survey's maturity model shows that the biggest improvement in AI value comes from measuring outcomes after deployment, not just building business cases beforehand. For SMBs, this means tracking specific metrics - hours saved, error rates reduced, revenue influenced - for each AI tool in production. Companies that aggregate these measurements and review them quarterly see dramatically better results than those that deploy AI tools without ongoing evaluation.

Why do most AI pilot programs fail to produce returns?

The survey identified several factors: organizations run experiments without measuring outcomes (Stage 0 of the maturity model, where only 4% report high value), they focus on generative AI alone while ignoring more valuable analytical and rule-based AI applications, and 55% cite unready data as a direct obstacle. Additionally, 58% of organizations haven't trained employees on AI tools, and 29% acknowledge their leadership lacks the understanding to drive AI value - creating an adoption gap from both directions.

Who should be accountable for AI results in a small business?

The survey found that organizations where the CFO or finance function owns AI value accountability report dramatically higher results - 76% achieving high value compared to 53% under CIOs and 32% under functional line managers. For SMBs without a dedicated CFO, this means whoever manages financial oversight should be involved in evaluating AI results. The discipline of financial review forces honest assessment of whether AI investments are actually delivering measurable returns.

Is employee resistance a major barrier to AI adoption?

No. Only 13% of survey respondents cited workforce resistance as a barrier to AI value. Employees are not resisting AI tools - they're waiting for effective direction from leadership, practical training on how to use tools in their specific roles, and the removal of other obstacles like missing value frameworks and unclean data. The training gap is the real problem: 58% of organizations haven't trained employees on AI, and investing in both employee and leadership AI training produces a 23-percentage-point improvement in value realization.

100% Free

Know Your Business Security Score

Get a FREE comprehensive security assessment for your Houston area business. Understand vulnerabilities across your network, applications, DNS, and more.


Take Your IT to the Next Level!

Book A Consultation for a Free Managed IT Quote

281-269-6506