When innovation gets stuck: Humanity at the core

Published On: February 25, 2026 | Last Updated: March 6, 2026

By Gil Bashe, Chair, Global Health and Purpose, FINN Partners; author of Healing the Sick Care System: Why People Matter; and Health Tech World Correspondent

Vendor lock-in sounds technical, yet its consequences are deeply human.

It occurs when an organisation becomes so dependent on a technology platform that moving away becomes costly, risky or disruptive.

Data becomes difficult to move. Workflows grow tightly bound to proprietary systems. Security and governance rely on one environment.

When that dependency limits the ability to adapt, evolve or respond to risk, lock-in stops being a technical issue.

It becomes a strategic and human vulnerability.

That is why the latest State of Cloud Computing Survey from Parallels deserves attention across the health ecosystem.

The global study, based on responses from more than 600 IT leaders across North America, Europe and Asia, found that 94 percent fear vendor lock-in.

The report describes “deeper structural pressures including vendor lock-in risk, operational fatigue and a growing gap between AI marketing and real-world priorities.”

With digital platforms now central to how health systems function, these findings reach far beyond technology procurement.

They signal a broader reassessment of digital infrastructure.

At nearly the same time, Harvard Business Review published Why Great Innovations Fail to Scale.


The article argues that promising innovations often falter not because they lack value, but because organisations fail to build the leadership alignment and collaboration needed to carry them from pilot into durable practice.

Together, these signals point to a deeper truth. Innovation does not fail for lack of intelligence.

It fails when the leadership structure and shared commitment required to sustain it are incomplete.

Human Freedom is a Tipping Point for IT Security

The Parallels findings reflect a quiet shift in mindset.

As Prashant Ketkar, Chief Technology and Product Officer at Parallels, observed, “Last year, organisations were focused on escaping rising costs.

“This year, they are focused on avoiding regret.”

In health systems, regret is not merely financial. It appears when platforms become so embedded that they cannot evolve, when security depends on systems that cannot adapt, and when support weakens or direction becomes uncertain.

More than half of surveyed leaders worry about future support, and nearly half about unclear product direction.

When digital platforms underpin clinical operations, cybersecurity and data governance, uncertainty becomes risk.

Vendor lock-in, once viewed as a procurement issue, is now understood as a security issue.

Security is no longer defined only by protection against breaches. It is defined by the freedom to adapt safely over time.

This shift explains the move toward hybrid and multi-cloud strategies.

Leaders seek control, flexibility and resilience.

Within the health ecosystem, where data represents identity and trust, the ability to evolve without losing control is foundational. Freedom, in this context, is a form of security.

There is another signal in the data that deserves a plain-language explanation.

Not long ago, many organisations invested in artificial intelligence because it was new and exciting.

Having the technology itself was often seen as the value. That mindset is changing.

The survey shows only a minority are willing to pay more for AI.

Leaders are now asking a grounded question: does this help us run better, safer and more reliably? If the answer is no, interest fades quickly.

In simple terms, organisations are no longer buying AI because it is impressive. They adopt it when it is useful, when it stabilises operations and when its impact can be measured.

This is the moment when AI moves from hype to proven infrastructure.

Equally important, AI is not being positioned as a replacement for people. It is used to detect problems earlier, automate routine tasks and reduce administrative burden.

In complex environments such as health systems, judgment, coordination and trust remain human responsibilities.

AI strengthens systems when it supports people. It weakens them when it attempts to replace them.

Scaling Innovation Requires a Human Connection

The Harvard Business Review perspective deepens the story.

Innovation often stalls because organisations underestimate the difficulty of scaling across silos.

The article highlights the need for connectors: people and structures that bridge functions, align priorities and embed innovation into daily practice.

Technology alone does not scale innovation. People do.

The Parallels data reveal why this matters.

Operational fatigue is widespread. Many organisations spend significant time managing infrastructure, and staffing strain is often the largest hidden cost.

Fatigue is not simply an efficiency issue. It is a security issue.

When teams are stretched, monitoring weakens, patching slows and vulnerability grows.

In mission-critical environments such as health systems, resilience depends on both technology and human capacity.

Artificial intelligence sits at the epicentre of this tension.

Organisations are prioritising AI to detect issues earlier, automate routine maintenance and reduce repetitive administrative work.

This reflects a shift from experimentation toward operational value.

AI is extending human capability, not replacing it.

Concern persists that AI may displace jobs.

In health systems already under workforce strain, this anxiety is well placed. The reality is more balanced.

AI cannot build trust, bridge departments or navigate the complexity of clinical, legal and regulatory decisions. Those responsibilities remain human.

The greater risk is not job loss. It is poorly integrated AI that adds complexity instead of relief.

The message for leaders across the health ecosystem is clear. Technology lock-in must be managed carefully, balancing operational strength with long-term flexibility.

Platforms must preserve adaptability, data sovereignty and resilience.

Artificial intelligence must be judged by whether it reduces burden and strengthens operational stability.

Scaling innovation requires investment in the people who connect technology, operations, security and governance, and sustained development of human capability.

Technology alone does not create trust. People do. Leadership does. Culture does.

Secure, adaptable and sustainable systems reinforce confidence.

Systems that trap organisations, strain teams or obscure responsibility erode it.

The health ecosystem is entering a more disciplined phase of digital transformation. Early enthusiasm surrounding cloud and AI is giving way to realism.

Leaders are prioritising resilience over speed, sustainability over novelty and freedom over dependency.

These are strategic choices that determine whether innovation becomes lasting infrastructure or temporary disruption.

Innovation does not fail because it lacks imagination. It fails when the structures needed to sustain it are neglected.

Imagination is the nucleus of innovation.

The next chapter of digital health will belong to leaders who understand that resilience, security and human connection must scale together.
