In a rapidly evolving enterprise AI landscape, organisations are moving decisively beyond pilots toward real-world deployment. Across industries, CIOs are asking, “How can I ensure my organisation’s IT is structured, funded, and aligned to business outcomes?”
Speaking with iTNews Asia on how CIOs can overcome inherent challenges and successfully plot their AI implementation, Kumar Mitra, Managing Director & GM, CAP & ANZ, ISG, Lenovo, shares his take on several areas: how AI is becoming a foundational enterprise capability, how to reshape infrastructure, how to improve workforce productivity, and the long-term IT strategy CIOs should pursue.
Mitra highlighted that the convergence of AI inferencing, employee productivity, and AI scaling is today marking a critical turning point in enterprise adoption. Enterprises are no longer focused solely on training models. Instead, the emphasis has shifted to real-world deployment, where AI delivers value through continuous, day-to-day use.
“Inferencing is where AI delivers its most immediate value through real-time insights and decision support that directly impact employee productivity,” he explained.
According to Mitra, the transformation is being driven by a combination of economic realities and technological readiness. CIOs are now prioritising use cases tied to measurable business outcomes, whether improving productivity, streamlining operations, or enabling growth.
“Technology maturity has provided the foundation for economic realisation. At the same time, economic pressure is forcing sharper discipline around investments,” he added.
A structural shift, not a temporary trend
While short-term pressures may have accelerated AI adoption, Mitra emphasised that the shift is long-term and structural.
“CIOs are redesigning IT around AI as a foundational capability, embedding it into workflows, decision-making, and operations rather than treating it as a standalone initiative,” he said.
This transformation is also redefining IT operating models, governance frameworks, and investment strategies, moving away from fixed cycles toward continuous value delivery.
Inferencing redefines infrastructure design
As inferencing becomes the dominant cost and workload driver, infrastructure strategies must also evolve accordingly.
“AI workloads will be persistent, widely used, and distributed. Infrastructure must prioritise efficiency, scalability, and reliability from day one,” Mitra said.
Hybrid architectures are emerging as essential, balancing centralised model training with distributed inferencing across data centers, edge environments, and devices. He said this approach helps manage costs, latency, and data sovereignty requirements.
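The placement logic Mitra describes can be sketched as a simple routing rule. This is a minimal illustration, not Lenovo's method: the `Workload` fields and tier names are hypothetical stand-ins for the cost, latency, and sovereignty factors he cites.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    max_latency_ms: int   # tightest latency the use case tolerates
    data_sovereign: bool  # data must stay in-country / on-premises
    sustained: bool       # persistent inferencing vs. occasional batch work

def place_workload(w: Workload) -> str:
    """Pick a placement tier for an AI workload.

    The order of checks mirrors the priorities in the article:
    sovereignty first, then latency, then the cost of sustained use.
    """
    if w.data_sovereign:
        return "on-prem data centre"   # regulated data never leaves site
    if w.max_latency_ms < 50:
        return "edge / device"         # real-time inferencing near users
    if w.sustained:
        return "regional data centre"  # steady inferencing, predictable cost
    return "central cloud"             # bursty training or batch jobs
```

A real orchestrator would weigh these factors continuously rather than in a fixed order, but the sketch shows why a single centralised tier cannot satisfy all three constraints at once.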
Hybrid-by-default: The new enterprise standard
Mitra emphasised that centralised cloud strategies alone are no longer sufficient, particularly in Asia Pacific markets with strict latency, privacy, and data sovereignty requirements. “Long-term planning must move toward a hybrid-by-default model,” he said.
Industries such as financial services and healthcare must also navigate strict regulatory requirements, making localised data processing critical. “The edge-to-cloud fabric ensures resilience, reduces dependency on connectivity, and optimises workload placement based on use-case needs,” Mitra added.
While the benefits are clear, operational challenges remain significant. Despite growing adoption, Mitra said managing distributed inferencing at scale remains a major hurdle.
CIOs must address complexities in orchestration, model lifecycle management, and governance, while ensuring compliance with data sovereignty and responsible AI standards. Many organisations struggle to move beyond pilots due to gaps in operational readiness.
“Treating inferencing as a core operational capability is critical to scaling AI with predictability and control,” he said.
Rethinking how ROI is measured
Traditional ROI metrics are proving insufficient in measuring AI success. “ROI must be framed around repeatable business outcomes, not isolated cost savings,” Mitra noted.
Metrics such as decision speed, task completion time, and adoption rates are becoming more relevant indicators of success, especially for AI-enabled devices operating at the edge. “Productivity scales when employees trust and routinely use AI as part of their workflow,” he added.
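The metrics Mitra names are simple ratios once the underlying data is captured. A minimal sketch, with hypothetical figures, of how a team might compute two of them:

```python
def adoption_rate(active_users: int, licensed_users: int) -> float:
    """Share of licensed employees who routinely use the AI tool."""
    return active_users / licensed_users

def decision_speedup(baseline_minutes: float, assisted_minutes: float) -> float:
    """How much faster a task completes with AI assistance."""
    return baseline_minutes / assisted_minutes

# Illustrative only: 600 active users of 1,000 licences,
# and a task dropping from 30 to 12 minutes with AI support.
print(adoption_rate(600, 1000))    # 0.6
print(decision_speedup(30, 12))    # 2.5
```

The hard part in practice is not the arithmetic but instrumenting workflows so that "active use" and "task completion" are measured consistently, which is the operational readiness gap Mitra points to.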
Employee productivity has rapidly emerged as a top CIO priority, reflecting a deeper shift in how organisations measure AI success. The focus, Mitra said, should move beyond automation towards human-AI collaboration, where employees and AI systems operate in a continuous loop of perception, reasoning, and action.
“The goal is to redesign how work gets done, enabling employees to focus on high-value activities while AI handles routine, data-intensive tasks,” he explained.
Why AI projects still fail to scale
Despite strong expectations, only a fraction of AI proofs-of-concept reach production. “The gap is less about ambition and more about operational readiness,” Mitra said.
Key barriers include lack of production-grade data, insufficient AI operations skills, and weak governance frameworks. “Many early projects focused on experimentation rather than outcomes, leading to too many POCs without defined success metrics,” he noted.
Mitra also pointed to a growing imbalance in enterprise AI strategies. Organisations often over-engineer pilots while under-investing in data readiness, governance, and operational skills.

Most AI failures are not model failures; they stem from weak data discipline and immature operating models. This disconnect leads to high costs and limited scalability.
- Kumar Mitra, Managing Director & GM, CAP & ANZ, ISG, Lenovo
Successful enterprises, he noted, are focusing on building strong operational foundations rather than overly complex architectures.
Looking ahead, Mitra recommends CIOs treat AI as a business growth capability rather than as an IT project. “Organisations must align growth intent, enterprise scale, and trust-by-design as a single strategy. The winners will be those that embed AI into core workflows with clear outcomes, turning it into a durable operational capability that delivers sustained business impact.”