What separates AI leaders from laggards?
The US health industry has stepped into a new era in which competitive advantage will favor organizations that can AI-enable their workforce better than their competitors. The transformational allure of AI is nearly irresistible. In my last blog, I referenced analyst firm Frost & Sullivan's prediction that AI could improve health outcomes by 30-40% and reduce the cost of treatment by as much as 50% over the next 7-10 years. It's clear there's a lot of potential in this space, but how to unlock those benefits and put them to work is far less obvious.
According to a recent Forbes Insights report, AI is here, the organizations most likely to emerge as market leaders are those that can build and execute a company-wide AI strategy. These leaders approach AI as it applies to today's business and clinical challenges. They consider AI a mission-critical tool they'll need to master organization-wide to survive and thrive in the new fee-for-value health economy. Gartner analysts tell us that realizing the full benefits of AI requires a holistic, company-wide AI strategy; companies without one typically end up using AI in only a few limited use cases.
Beyond having a company-wide AI strategy, there's a less obvious success factor separating AI leaders from laggards. This HBR analysis concludes that the number one reason AI underperforms in organizations is what we already know eats strategy for breakfast: culture. The best-laid strategies often flop when the corporate culture needed for the workforce to execute them doesn't permeate the whole organization.
This raises the question: What special culture is needed for AI to succeed, and how does it differ from the culture of safety, innovation, and collaboration we meticulously built over the last decade? In the Forbes Insights survey of 305 global executives across multiple industries, AI leaders intentionally infused an "AI-ready" culture, centered on data, across their entire workforce, from the C-suite to frontline teams.
Why AI underperforms without an “AI-ready” culture
What’s so different about AI that it requires a completely different culture?
Here's my take: AI isn't like any technology innovation wave we've seen before. It's so radically different that it creates two entirely new categories: one of technology and one of collaboration.
Because AI works more like we do, it isn't like any previous technology innovation, such as social, mobile, cloud, or analytics. Those innovations are tools that come with a learning curve: because they are less like us, they require humans to adapt to the way the tools work. Adoption of such technologies always comes at the expense of human productivity, which is why broad use can take up to a decade. AI introduces a new technology category because it's a tool that's more like us and adapts to the way we work, rather than the other way around. AI can adapt to our natural workstyles and flows, and even work on our behalf in ways that scale our productivity and expertise. Because AI adapts to the way we work, it turns the traditional adoption model upside down: the more we use AI, the more it learns how to help us be more productive and take on more complex tasks.
AI also ushers in a completely new category of collaboration. Current organizational cultures are designed around human-to-human collaboration and have leveraged each technology innovation wave to advance how humans collaborate with each other by removing time and location boundaries. AI takes us beyond this paradigm with human-to-machine collaboration. AI doesn't just enable humans to interact and collaborate better with other humans; it also enables humans and machines to work together as allies. The human-machine relationship is potentially symbiotic, with each benefiting from the other's complementary strengths. This type of collaboration opens the door for AI to speak and understand the language and policies of employees and to internalize the values and business practices of the organization in the same way the workforce does. Somewhat ironically, AI can actually make us more human in the way we serve consumers, patients, and each other. For example, AI may soon "free" clinicians to spend more time listening and empathizing simply by offloading many of the administrative tasks that consume over half (51%) of a nurse's workload and nearly a fifth (16%) of physician activities.
3 steps to get your culture AI-ready
What does an AI-ready culture look like? It's a corporate culture that's:
- Empowering, inclusive, and experimental. C-suite leaders consistently communicate that AI is mission-critical to future success, that AI is for everyone, and that AI's purpose is to amplify and scale the impact of the entire workforce. To dispel myths and get the workforce excited about the possibilities of AI, every employee is expected to be part of the AI transformation and to think of AI as a force they can influence rather than a force that will influence them. Every employee is empowered to use AI to amplify their impact and eliminate routine, repetitive activities, freeing them to focus on higher-value work that can't be automated.
- Rigorously data-driven. AI is only as powerful as the quality and completeness of the data it runs on. To make better decisions, act more effectively, and deliver better customer experiences, every department needs to adopt rigorous data practices that break down data silos, turn data into data supply chains, and turn those supply chains into value. Leaders and teams in every part of the business reframe their business problems as computational problems that AI teams can understand and solve.
- Responsible. The organization's journey to AI must be rooted in humility, with a commitment to AI solutions that reflect ethical principles deeply rooted in its values and that comply with government rules and regulations. Each step in the journey may produce something unexpected, so teams must verify that every AI solution follows these six responsible AI principles: fair, reliable and safe, private and secure, inclusive, transparent, and accountable.
Check out AI Business School for a leadership crash course on AI strategy, culture, and responsibility.
Learn how BayCare is fast-tracking its cultural transformation from Ed Rafalski, PhD, MPH, FACHE, Chief Strategy and Marketing Officer, BayCare.
Questions about getting started with AI? Ask Priya Gore.