AI Didn't Cause These Layoffs. Leadership Did.

It was a Tuesday morning in late 2025, and I was sitting across from a VP of Engineering. The company was growing fast enough that nobody had taken the time to question the hiring plan. The VP had his spreadsheet open, showing forty-seven open headcount requisitions across six teams. I asked a question that, looking back, I should have asked far earlier: "Which of your open roles are tied to revenue-generating work?"

After a thoughtful pause, he finally replied, "All of them. We need all of them."

He wasn't completely wrong, but he wasn't completely right either. The answer was instinctive. And he wasn't wrong because he was careless; he had shown, over and over in our time together, that he was one of the most thoughtful engineering leaders I've worked with. He was wrong because the system he operated in had stopped requiring the question to be asked, because business was good. Product management due diligence around budgets, headcount and deliverables was almost non-existent. Capital had been cheap since 2021, with the industry eager to return to pre-COVID prominence. Boards measured engineering maturity by headcount, using it as a proxy for customer need and product fit. Product roadmaps were built on aspiration and excitement, funded through investment rounds rather than revenue-backed capacity planning and customer demand. Hiring forty-seven people felt like building for the company's growth. Large teams signaled to staff that this was the place to be, in the hope of staving off a highly competitive job market and securing much-needed skills. All of it deferred the much bigger question.

I've seen this pattern at multiple organizations, and for varying economic reasons. I spent time reflecting on that Tuesday morning conversation while watching the news as tech companies laid off 45,000 workers in the first three months of 2026 alone. Roughly 20% of those cuts were attributed to AI, automation or some other efficiency; other companies were more brazen, stating that 100% of their layoffs related to the rise of AI. Companies are telling the press, their shareholders, and their employees that artificial intelligence made these roles unnecessary. At the same time, there is a fear in the boardroom that companies not embracing AI will fall behind, lose shareholder confidence and evaporate into the annals of history.

That narrative is convenient. It is also, in the ways that matter most, a fiction.

The Overhire Nobody Wanted to Name

The hiring patterns between 2019 and 2022 were driven primarily by capital conditions: low interest rates, investor pressure to recover after COVID, and pressure to demonstrate growth. The mistake many leaders made was the persistent conflation of headcount with capability. I watched it happen across multiple organizations. Teams doubled in size not because the work demanded it, but because the aspirations did. In many planning sessions I saw collections of desires appropriated into roadmaps, then staffed to match. In other organizations the roadmap expanded to ensure every member of every team was 'busy'. Ironically, this extra capacity was never directed at fixing technical debt or shoring up foundations; it was all net new feature development. The business plan and data didn't exist, and everything ran on 'intuition'. The organizations manifested hope, assuming the gap between what was promised and what was achievable would be closed by the additional hiring. The honest conversations about scope, priority and market trends grew quieter, lost amongst the noise of activity and onboarding new hires.

What made the overhiring so durable was the fundamental lack of incentive to question it. Engineering leaders got larger teams and bigger titles. Bigger teams often attracted better talent, and product leaders got their roadmaps funded. Finance approved headcount because the revenue plan assumed the features the new hires would theoretically deliver. The whole machinery became self-reinforcing: a loop where hiring justified itself because spending justified itself because growth justified itself. And if you raised the question, if you asked whether the team really needed to grow by 30% when the product only supported 10%, or even a reduction, you were labelled the person who "didn't think big enough", who "wasn't a team player", or who "didn't understand what we are building". I know because I've been that person.

Here's the part that sits uncomfortably with those of us who have led engineering organizations. We participated in it. Not maliciously. Not cynically. But who wouldn't take additional headcount? We graciously took it when it was offered. We built teams around the work we hoped would materialize, or the priorities we hoped leaders would finally allow us to address. We knew that, on some future Tuesday morning, our spreadsheets wouldn't quite add up. We trusted that those running the company knew what needed to be done. The comeuppance arrived at the end of 2025, when AI forced an awkward question: "How efficient are we, and would AI help?"

The Baseline Nobody Wanted to Measure

The accountability gap that made the overhiring possible was culturally protected in many organizations. There was a genuine reluctance, sometimes fear, to look closely at what teams were actually producing relative to their cost. Developer productivity became a subject you discussed at conferences. Senior leaders weren't going to rock the boat when the status quo meant a more leisurely work environment, more time for fun, or a better work-life balance. Engineering leaders resisted measurement frameworks because some had been burned by bad ones: lines of code, story points used as performance metrics, utilization percentages that rewarded busyness over outcomes. Others resisted because of the cultural pressure to create a 'great place to work' in the hope of attracting talent. The resistance was understandable. But the alternative was worse: no accountability at all.

The result was a widening gap between perceived activity and actual output, invisible by design. Standups showed motion. Sprint velocities held steady. Dashboards were green. Yet for many, delivery timelines stayed flat, quality didn't improve, revenue increases couldn't be attributed to new features, and nobody could explain why. Teams doubled, so why didn't the velocity? The instruments to explain it often didn't exist, or had been deliberately avoided. Metrics like DORA (Deployment Frequency, Lead Time for Changes, Change Failure Rate, Mean Time to Recovery) existed as industry standards. Some organizations hadn't adopted any way of measuring productivity; those that had often adopted them cosmetically, tracking numbers on a dashboard nobody reviewed with rigor, and missed the feedback loop entirely. Most organizations I work with are fearful of implementing any framework that might shine a light on performance, cycle time or value.

Then AI arrived in the conversation. Not as a tool that replaced engineers, but as a question that exposed them.

When senior leaders started asking "What could our teams accomplish with AI assistance?", they were inadvertently asking a question nobody had wanted to ask for years. Very few could answer, in any quantifiable way, the all-important prerequisite: "What are our teams actually accomplishing now?" You cannot evaluate a productivity multiplier without a baseline. And the moment you establish a baseline, you discover the gap between what you're paying for and what you're getting. AI didn't create that gap. It made it impossible to keep ignoring.

The Cover Story That Serves Everyone Except the People Being Cut

Attributing layoffs to AI is a narrative that serves almost every stakeholder. It justifies smaller severance on the basis that AI was somehow inevitable, or worse, blames the downsizing on individuals who failed to integrate AI into their work and remain relevant. Boards get a forward-looking story: "We're investing in AI and right-sizing the engineering teams accordingly." Investors hear efficiency, and executives get political cover for corrections they should have made long before. The raw reality is that no board will push back on a leader who says they can now achieve more with less.

Ironically, it's a win/win for the leader. Cutting headcount while genuinely accelerating development and reducing costs through AI is a valid achievement. But the more surreptitious sleight of hand, in which reducing headcount changes absolutely nothing other than the financial burden, is rewarded just the same. Features still get delivered, customers are still served, operating expenditure falls. Either way, the cost reduction becomes another achievement the leader adds to their resume as they climb the corporate ladder, oblivious to the wake of destruction behind them. In many organizations, no frameworks measured efficiency and velocity before AI. The idea that these same leaders will now magically report on efficiencies and capacity plan while adopting AI is disingenuous.

The employees, meanwhile, are told their roles were made redundant by technology. Some will believe it, and some will spend the next year trying to become "AI-proof": upskilling in prompt engineering and LLM training, furiously building home labs to work out how to stay relevant in an ever-changing tech space, building portfolios of AI-augmented projects, all while chasing a moving target that was never the actual problem. Looking back across our careers, many of us will realize the best engineers we've worked with weren't the ones who adopted every new tool the fastest. They were the ones who understood what they were building and why it mattered, and could make educated decisions about where the true optimizations lay. No amount of AI fluency will compensate for an organization that never prioritized what it actually needed.

HBR research from late 2025 found that companies were laying people off based on AI's potential: vendor promises of increased returns, plus the peer pressure not to fall behind while shareholders invest elsewhere. They are not reducing their workforces because of demonstrated performance. Forrester reported a 55% employer regret rate on AI-related restructuring. The organizations cutting headcount aren't doing it because AI proved certain roles unnecessary. They're doing it because AI gave them a reason to do what the balance sheet had been suggesting for two years. Look at the operating expenditures of any Fortune 500 company: they have been out of control, and now comes the reckoning. According to the Hackett Group, 62% of the Fortune 500 have reported struggling with ever-increasing SG&A expenses over the last five years. Since 2023, thanks to lower inflation, expenses have begun climbing more slowly than revenue, but the picture is still grim.

I've been the leader who inherited an organization that was larger than it needed to be. I've made the calls to restructure teams. It is genuinely hard. But it's harder, and less honest, when you wrap a headcount rationalization in a technology narrative that lets everyone pretend this was inevitable rather than avoidable.

The Cycle, and What Breaks It

Here is the part of the story that concerns me most: this cycle is predictable. We've seen it before. The same leaders who presided over the overhiring will begin rebuilding headcount within 12 to 24 months. Not because AI created new roles, but because the cuts will go too deep in some places and not deep enough in others. The data simply doesn't exist for many of these organizations to act with precision; the decisions coming will be decisions made to support a narrative. You can't right-size an engineering org without understanding what each team produces. You can't understand what each team produces if you spent the last five years avoiding that question.

The structural problems aren't going away. The absence of capacity planning tied to revenue will continue. The reluctance to measure developer productivity with rigor and fairness won't change. The conflation of headcount with maturity will persist. Layoffs don't suddenly provide the tools, training and data points, nor create leaders competent to understand what the tools are telling them. Shamefully, what they will create is a pause, a sigh of relief for those at the top. The stock price will recover. Eighteen months later, the hiring plans come back, built on the same assumptions that created the surplus in the first place.

I've watched this cycle multiple times in my career now, and no industry is immune. The second time was hardest to stomach because I recognized it earlier: the growth-phase hiring frenzy, the avoidance of hard measurement, the quiet dread when the revenue didn't follow the headcount. I still couldn't convince enough people to break the pattern before the correction arrived.

The organizations that will break it, and some will, are the ones installing now the disciplines that should have been there all along. Engineering org design tied to actual capacity, not aspirational roadmaps. Developer productivity metrics that are fair, transparent, and used for planning rather than punishment. Workforce planning that treats headcount as a capital allocation decision, with the same rigor applied to every other investment. Visibility into what teams produce, measured in outcomes delivered rather than hours logged, features shipped or sprints completed.

These aren't revolutionary ideas. DORA metrics have been available for a decade and multiple capacity planning frameworks exist. The problem has never been a lack of tools and frameworks. It was a lack of willingness to use them to drive accountability at all layers of the engineering organization from the CTO down. Using them meant having conversations that were easier to avoid.

I think about that VP of Engineering regularly. He was eventually part of a layoff himself. He told me he wasn't surprised, which might be the saddest part of this whole story. He'd seen the pattern forming but couldn't effect change. He didn't have the organizational leverage, skills or muscle memory, and he hadn't developed the data-driven vocabulary to describe it.

The vocabulary exists now. The question is whether the leaders who survived this round will use it to build something more durable, or simply wait for cheap capital to return and let the spreadsheet fill up again. We owe it to our engineers and our teams to have the right staffing and standards to stop this constant boom-and-bust cycle that brings stress and layoffs every two to three years.

This isn't AI's job to answer. This is ours.

References

  • Harvard Business Review (2025). Research on AI-motivated workforce reductions and the gap between AI potential and demonstrated organizational performance.
  • Forrester Research. 55% employer regret rate on AI-related restructuring decisions.
  • Hackett Group. Analysis of Fortune 500 SG&A expense trends (2018-2023); 62% of Fortune 500 companies reporting sustained increases in selling, general and administrative expenses over a five-year period.
  • DORA (DevOps Research and Assessment). Industry-standard metrics: Deployment Frequency, Lead Time for Changes, Change Failure Rate, Mean Time to Recovery.

Ready to break the cycle?

I work with engineering organizations on the capacity planning, productivity measurement, and org design that prevents the next overhire-then-cut correction.
