Before an organization resists AI, a person does.
This distinction matters more than it might appear. When we say that organizations are slow to adopt AI, or that firms are struggling to integrate it meaningfully, we are using institutional language to describe something that is, at its core, a human experience. Organizations do not feel threatened. Executives do — and by executives, I mean anyone who carries strategic authority and resource control, whether that is a CEO or a division head running a business unit.
Understanding what happens inside these individuals — before any committee is formed or pilot program is launched — is the more honest starting point.
Fear comes first — and it is strategically rational.
Executives are not paid to manage day-to-day tasks. They are paid to think and act strategically — to read markets, allocate resources, build relationships, and make high-stakes judgments about where value will come from next. These are analytical, relational, and judgment-based competencies, developed over decades, and they form the foundation of an executive's professional identity and organizational legitimacy.
AI does not merely challenge these competencies at the margins. It disrupts the underlying assumptions on which they rest. When AI reshapes how customers live — what they expect, how they decide, what they are willing to pay for — it does not just change the competitive landscape. It invalidates the mental models executives have used to navigate it. The fear that surfaces in these individuals is therefore not irrational anxiety. It is a strategically accurate signal that the ground has shifted beneath them.
The deeper problem is capital depreciation — across all three dimensions simultaneously.
To understand why this fear so reliably produces low efficacy, it helps to think like an entrepreneurship scholar rather than a psychologist. Executives, like entrepreneurs, accumulate capital over the course of their careers — and AI is depreciating all three forms of it at once: financial, human, and social.
Their financial capital — the ROI logic, resource allocation frameworks, and investment intuitions they have refined over years — was built for a market structure that AI is actively dismantling. The analytical models that once reliably predicted customer behavior are losing explanatory power as AI creates entirely new consumption patterns and lifestyle possibilities that no historical data anticipated.
Their human capital — the expertise, specialized knowledge, and hard-won experience that justified their authority — is being structurally devalued. When AI can perform sophisticated analysis, synthesize complex information, and generate strategic options faster and at lower cost, the scarcity premium on an executive's cognitive labor collapses. The competence that made them indispensable becomes, at best, a starting point rather than a destination.
Their social capital — the networks, reputations, and influence relationships that amplified their decisions — is losing its currency in a landscape being redrawn by AI-native players who did not build their positions through the same institutional channels. The relationships that once opened doors now lead to rooms that matter less.
What makes this moment distinctively destabilizing is that these three depreciations are happening simultaneously, leaving executives with no stable form of accumulated value to anchor their confidence or credibility while they rebuild. This is not a skills gap. It is a capital crisis.
The challenge executives face is therefore entrepreneurial before it is managerial.
The executives navigating this most effectively are not those who have learned to use AI tools — they are those who have recognized that their situation now resembles that of an entrepreneur more than a seasoned executive. They are, in effect, starting over with depleted assets in a market they do not yet fully understand, and they are choosing to engage that reality honestly rather than defend against it institutionally.
That reframing — from executive protecting accumulated capital to entrepreneur rebuilding it — is the psychological and strategic shift that separates those who are genuinely moving forward from those who are presiding over the performance of doing so.
AI will not disrupt most organizations from the outside. It will stall inside them, in the gap between what executives have built and what the new landscape will no longer reward.
The triple depreciation of executive capital is a phenomenon I am actively exploring in my research. Have you observed this pattern — in your own experience or in executives around you? I would welcome your perspective.