The Blog Post That Erased $30 Billion from IBM

Anthropic published a blog post on Monday. Not a product launch, not a partnership announcement, not a keynote at a major conference. Just a simple blog post explaining that Claude Code can read COBOL.

IBM proceeded to drop 13%, its worst single-day loss since October 2000, with twenty-five years of stock resilience gone in an afternoon because one AI company quietly updated the world on what its coding tool can do.

Here is what actually happened, and why it matters more than the stock price suggests.

1. We All Knew This Day Was Coming

Nobody in technology is surprised that COBOL is finally meeting its match. The writing has been on the wall for years, and AI was always going to get here eventually. The debate was never if; it was when.

What nobody predicted was how it would actually arrive. We imagined a moment of reckoning — a dramatic product launch, a CEO on stage, a press cycle with gravity proportional to what was being disrupted, something that signalled to the world that a $30 billion industry was about to be restructured. Instead we got a blog post with the energy of a minor feature enhancement, casual, almost blasé, tucked between other announcements. “By the way, Claude Code can now help you modernise COBOL. Here is a playbook. Have fun.”

That casualness is itself the signal. When the death blow to fifty years of mainframe dependency reads like a changelog entry, it tells you something profound about the pace at which AI is normalising disruption. The technology has gotten so capable so fast that genuinely historic announcements are being made in the same tone as a library update. COBOL’s day of reckoning came. It just did not bother to dress up for the occasion.

2. Which Businesses Feel Safe Now?

That question is worth sitting with, because if a blog post can erase $30 billion from IBM in an afternoon, the question every board should be asking is not “is this bad for IBM?” but “what is our equivalent of COBOL?” Every industry has one: the process that has not changed because it was too expensive to understand, the system that has not been replaced because the analysis cost was prohibitive, the business model that persisted not because it was good but because the complexity protecting it was real and formidable.

AI is not just threatening COBOL. It is threatening complexity itself as a competitive moat. Legal firms built on the impenetrability of case law, consulting practices built on the opacity of enterprise systems, insurance actuarial models built on proprietary data interpretation, compliance functions built on regulatory complexity — any organisation whose value proposition includes “we understand the incomprehensible so you do not have to” should be reading Monday’s news very carefully.

I wrote about this dynamic in a different context earlier this year. The Death Star Paradox explores why AI first-mover advantage is not a gradient but a cliff. The organisations that move first do not just get ahead; they make the response irrelevant. Monday was a live demonstration of that thesis. Anthropic did not outcompete IBM’s COBOL tools. They made IBM’s COBOL tools feel like they belonged to a different era, and the same technology, framed by a different narrative, landed with completely different force.

3. The Language That Refuses to Die

COBOL is 67 years old, designed in 1959 via a public-private partnership that included the Pentagon and IBM, with the goal of creating a universal, plain-English programming language for business applications. Most of the developers who wrote it have retired, and most universities stopped teaching it years ago. And yet COBOL handles roughly 95% of ATM transactions in the United States, with hundreds of billions of lines of it running in production every single day, powering banks, airlines, and government systems on every continent.

The developers who built these systems encoded decades of business logic, regulatory compliance, and institutional knowledge directly into the code, often with no comments and no documentation. The only way to understand what a COBOL system actually does is to read it, trace it, and map it: a process that takes teams of specialists months before a single line of replacement code gets written. That analysis cost is exactly why COBOL never got replaced.
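The first pass of that tracing work is mechanical enough to sketch. The toy example below is not Anthropic’s tooling and covers only a sliver of real COBOL analysis; it simply shows what "mapping" means at its most basic: extracting which programs a source file CALLs and which paragraphs it PERFORMs into a dependency graph.

```python
import re

# Toy first pass at COBOL dependency mapping: extract the programs a
# source file CALLs and the paragraphs it PERFORMs. Real tooling must
# also handle dynamic CALLs, COPY books, JCL, and far messier syntax.
CALL_RE = re.compile(r"\bCALL\s+'([A-Z0-9-]+)'", re.IGNORECASE)
PERFORM_RE = re.compile(r"\bPERFORM\s+([A-Z0-9-]+)", re.IGNORECASE)

def map_dependencies(sources):
    """sources: dict of program name -> COBOL source text.
    Returns program name -> {'calls': set, 'performs': set}."""
    graph = {}
    for name, text in sources.items():
        graph[name] = {
            "calls": set(CALL_RE.findall(text)),
            "performs": set(PERFORM_RE.findall(text)),
        }
    return graph

demo = {
    "PAYROLL": """
        PERFORM READ-EMPLOYEE
        CALL 'TAXCALC' USING WS-GROSS WS-TAX
        PERFORM WRITE-PAYSLIP
    """
}
print(map_dependencies(demo))
```

Scaling this idea from a regex to something that understands copybooks, dynamic calls, and fifty years of dialect drift is exactly the months-long specialist work the announcement claims to compress.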

4. The MIPS Tax Nobody Talks About

Here is something the financial press almost never covers when it writes about mainframes. IBM does not sell mainframe capacity the way cloud providers sell compute. IBM prices mainframe usage in MIPS (millions of instructions per second), and that pricing model has had profound consequences for the institutions running it.

MIPS pricing means that every workload you run on a mainframe is metered: every transaction, every batch job, every new product feature. As your business grows, your IBM bill grows with it, not because you bought more hardware but because you used more of the hardware you already own. The mainframe also scales only vertically, so you cannot add nodes the way you add cloud instances. When you hit the ceiling, you do not hit a queue or a slowdown; you hit an outage. Burst protection was therefore not a nice-to-have on mainframe estates but an architectural necessity, because the alternative was a production outage triggered by demand spikes you could not absorb. Financial institutions spent years engineering around a constraint that simply does not exist on modern horizontally scaled infrastructure.

The consequences of MIPS pricing for customer-facing products have been quietly catastrophic. I have spoken to technology leaders at major financial institutions who made deliberate decisions to restrict what products they offer to retail customers specifically to manage MIPS consumption. Think about that for a moment: a bank limiting its own product portfolio not because of regulation, not because of market demand, not because of engineering constraints, but because launching a new feature would push its IBM bill past a threshold its CFO had approved. That is the hidden tax the mainframe imposed on an entire generation of financial innovation, and it is one that COBOL modernisation, done properly, finally removes. When your transaction processing runs on commodity cloud compute, burst protection comes standard, you pay for what you use, you scale horizontally, and nobody in your product team has to ask whether a new feature is worth the MIPS.
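The mechanics of that tax can be sketched in a few lines. Every figure below is invented for illustration; real IBM pricing is contract-specific, tiered, and today usually expressed in MSUs rather than raw MIPS. The point is the shape of the model: a metered, peak-based bill means a new feature has a visible, recurring price before a single server is bought.

```python
# Hypothetical illustration of the MIPS tax (all numbers invented):
# a metered mainframe bill grows with usage of hardware you already own.

def peak_mips(txn_per_sec: float, m_instr_per_txn: float) -> float:
    """Peak MIPS demand: transactions/sec times millions of instructions each."""
    return txn_per_sec * m_instr_per_txn

def monthly_bill(txn_per_sec: float,
                 m_instr_per_txn: float = 2.0,
                 rate_per_mips: float = 1500.0) -> float:
    """Monthly software bill under a metered, peak-based MIPS model."""
    return peak_mips(txn_per_sec, m_instr_per_txn) * rate_per_mips

baseline = monthly_bill(500)      # existing workload at 500 txn/sec peak
with_feature = monthly_bill(650)  # a new feature adds 150 txn/sec at peak
print(with_feature - baseline)    # the incremental monthly bill the CFO sees
```

Under these made-up numbers, one successful retail feature adds a six-figure monthly line item, which is why product portfolios ended up shaped by MIPS budgets.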

5. What Claude Code Actually Does

Anthropic’s announcement is technically precise. Claude Code can map dependencies across thousands of lines of legacy code, document workflows that have never been written down, identify migration risks, and surface institutional knowledge that would take human analysts months to find. The key claim is that with AI, teams can modernise their COBOL codebase in quarters instead of years, and that single sentence is what sent IBM’s stock into freefall.

If the analysis phase collapses from months to days, the entire economic argument for leaving COBOL alone collapses with it. The reason banks, governments, and airlines kept paying IBM billions was not that they loved mainframes — it was that the alternative required an enormous, expensive, risky analysis programme before any actual migration work could even begin. Remove that barrier and the calculation changes entirely.
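A toy break-even calculation makes the hinge visible. Every figure here is hypothetical and chosen only to show the shape of the decision: when the up-front analysis cost dominates, staying on the mainframe wins; collapse that cost and the same migration flips to positive.

```python
# Hypothetical migration economics (every figure invented for illustration):
# leaving COBOL alone wins only while the up-front analysis cost is huge.

def migration_case(analysis_cost: float,
                   migration_cost: float = 40e6,
                   annual_mainframe_spend: float = 25e6,
                   annual_cloud_spend: float = 8e6,
                   horizon_years: int = 5) -> float:
    """Net saving over the horizon: what you stop paying for the
    mainframe, minus what the migration (analysis + execution) costs."""
    savings = (annual_mainframe_spend - annual_cloud_spend) * horizon_years
    return savings - (analysis_cost + migration_cost)

# Months of specialist analysis priced at, say, $60m: migration loses money.
print(migration_case(analysis_cost=60e6))  # negative: leave COBOL alone
# AI collapses analysis to a fraction of that: the same migration now pays.
print(migration_case(analysis_cost=5e6))   # positive: the calculation flips
```

Nothing else in the model changed between the two cases; only the analysis cost moved, which is why one sentence in a blog post was enough to reprice a revenue stream.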

6. IBM’s Uncomfortable Position

Here is the part that does not make it into most of the coverage. IBM has been saying this themselves since 2023, having built watsonx Code Assistant for Z specifically to help organisations understand and modernise their COBOL estates. Their own CEO said in mid-2025 that it had wide adoption across their customer base. Nobody moved IBM’s stock 13% when IBM said it.

What moved the stock is that Anthropic said it. A company the market has decided represents the future announced it was disrupting something the market has decided represents the past, and the technical merits became almost irrelevant once that narrative took hold. That is the uncomfortable truth IBM is sitting with today. It is not that their technology is inferior; it is that the market no longer grants them the credibility to define what modern looks like. When an AI startup and a 113-year-old technology company make the same claim, the market weights them very differently.

7. The Architectural Sin Nobody Named

The mainframe did more than create a pricing problem. It created an architectural pathology that infected an entire industry and quietly persisted for fifty years. When everything runs on a single box, you stop thinking in systems, stop thinking in domains, stop asking which parts of your business logic belong together, which data belongs to which bounded context, which services should be decoupled from which. You just throw it all on the mainframe and call it an architecture. It is not an architecture. It is fly-tipping with a Service Level Agreement.

The core banking platforms that emerged from the mainframe era inherited this thinking wholesale: monolithic systems that encode every conceivable banking function into a single codebase, with a data model built for batch processing in the 1970s, sold to banks as enterprise architecture when they are really just mainframe thinking with a modern price tag. These platforms have been extraordinarily difficult to displace not because they are good but because replacing them requires untangling the same kind of complexity that makes COBOL modernisation so expensive.

The insidious thing is that the architectural pattern itself became normalised: everything on one box, no domain boundaries, no service separation, no independent scalability. Entire generations of banking technologists grew up thinking this was how enterprise systems were supposed to work, that you built the big thing and managed the big thing and that was the job. It was never the job. It was the compromise you made when the alternative was too expensive to contemplate. With that excuse now weakening, there is no reason left to defend it. Engineers should be waking up to what actually comes after the mainframe: domain-driven design, clear service boundaries, independent scalability, systems built around how the business actually works rather than around what a 1970s box could physically accommodate. Stop fly-tipping on a single box and calling it enterprise architecture. The mainframe deserved our respect. It does not deserve our imitation.

8. The Question That Still Needs Answering

Anthropic released a Code Modernisation Playbook alongside the announcement, and it is detailed, technically credible, and genuinely useful for organisations thinking about where to start. What it does not contain is a completed end-to-end migration of a production core banking system in a regulated environment, validated against the original system and signed off by an external auditor.

That is the proof that matters. The analysis phase getting faster is real, but what happens after the analysis — the data architecture redesign, the regulatory validation, the transaction integrity verification, the performance engineering — that work is still hard. A better map of the territory does not flatten the territory. The organisations that respond to this announcement by treating it as a solved problem will learn that lesson expensively, while the organisations that respond by running careful pilots on bounded parts of their estate, building genuine modernisation competency, and treating AI as an accelerant rather than a replacement for rigorous engineering will be in a fundamentally stronger position three years from now.

9. The Real Signal in the Noise

IBM lost 13% on Monday and 27% in February, its worst monthly performance since 1968. That is not a market making a precise technical assessment of what Claude Code can and cannot do to mainframe revenue. That is a market expressing something it has believed for a while and finally found a reason to act on: that the era of complexity as a competitive moat is ending, and that organisations whose entire value proposition depends on being the only ones who can navigate the obscure, the legacy, and the deliberately impenetrable are facing a structural repricing.

The mainframe era produced extraordinary engineering. It also produced an architectural culture that mistook consolidation for design, confused vertical scale with resilience, and let pricing models constrain what products banks could build for their customers. That era is ending, and not because of a blog post; the blog post just made it impossible to pretend otherwise. And it did it in the most devastating way possible: casually, without drama, in the same tone you would use to announce a new keyboard shortcut. That is how you know it is real.

Andrew Baker is Chief Information Officer at Capitec Bank. He writes about enterprise architecture, banking technology, and the future of financial services technology at andrewbaker.ninja.
