David Ha on LinkedIn: “Today's LLMs are the 'Mainframe Computers' of our generation"
I was on Bloomberg News today discussing Sakana AI, and shared my view that today’s LLMs are our generation’s “Mainframe Computers”. We are still in the very early stages of AI, and it is inevitable, given market competition and global innovation (especially from those innovating under resource constraints), that this technology will become a million times more efficient.
The Future of AI Technology
There is a narrative established in Silicon Valley that AI is a “Winner Takes All” technology, and that scaling up existing models and consuming ever greater resources will require (and even justify) the largest investments of our generation, in order to “win” the AI race.
In contrast, I believe that AI is not a “winner takes all” technology. LLMs will be commoditized, become vastly more efficient, and be made widely available in all countries. Ultimately, there will be thousands, if not millions, of AI models used by everyone. Just as early clunky mainframe computers evolved into modern computing, how we use AI in a few years (or even a year from now) will look very different from today’s ‘clunky’ LLMs.

Spot on, David Ha! AI is evolving fast, and efficiency will be the real game-changer. Exciting times ahead!
The Evolution of AI
Reinforcement learning is general-purpose learning, I suppose. Agreed. History’s tech leaps weren’t made by lone giants, but by ecosystems:
- 1960s-70s: IBM’s System/360 standardized the hardware, but it took operating systems like Unix to make software portable
- 1990s: Linux required Apache to conquer the web
- 2020s: LLMs will demand collaborative networks of models and data

We stand at the Cambrian explosion of composite AI:
- Swarm Architectures: Like TCP/IP linking isolated computers, new protocols will fuse specialized models (code+math+design)
- Lego-Brick Innovation: Sakana AI proves resource-constrained teams can build better components than Big Tech’s monoliths
- Emergent Intelligence: When Mistral 7B talks to AlphaFold3 over a decentralized mesh, that’s where the magic happens
The Breakthrough in AI
The real breakthrough isn’t bigger models, but interoperability layers:
- Model handshake protocols (AI’s HTTPS)
- Provenance tracking
- Cross-domain reinforcement learning
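No such handshake protocol exists yet, so here is a minimal sketch of what one might negotiate before two specialized models collaborate; every name and field below (ModelCard, handshake, the modality tags) is hypothetical, chosen only to illustrate the idea of capability negotiation:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class ModelCard:
    """Hypothetical capability descriptor a model advertises to peers."""
    name: str
    modalities: set        # e.g. {"code", "math", "text"}
    max_tokens: int        # context budget the model can commit to


def handshake(a: ModelCard, b: ModelCard) -> Optional[dict]:
    """Negotiate a shared session, loosely analogous to a TLS handshake:
    intersect the advertised modalities and adopt the smaller token budget.
    Returns None when the models share no modality at all."""
    shared = a.modalities & b.modalities
    if not shared:
        return None
    return {
        "peers": (a.name, b.name),
        "modalities": sorted(shared),
        "token_budget": min(a.max_tokens, b.max_tokens),
    }


coder = ModelCard("code-specialist", {"code", "text"}, 8192)
prover = ModelCard("math-specialist", {"math", "text"}, 4096)

# Both models can exchange plain text, capped at the smaller context window.
session = handshake(coder, prover)
print(session)
```

A real protocol would also need authentication and the provenance tracking mentioned above, but the core design choice stands: models agree on a least common denominator before exchanging anything.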

The PC revolution wasn’t won by faster chips, but by BIOS standards letting components collaborate.
We’re rebuilding that playbook in AI. Thank you, good points.
Transformation in AI Technology
Resonates with my perception of AI technological transformation as a move from Bronze Age to Iron Age.
If history has shown us anything at all, it is this: don’t you think the main beneficiaries will be the top 1-2% most educated individuals, while the rest of the population remains mostly passive end users, much as they are with TV or YouTube? My fear is that, for more people to really leverage AI professionally, a tremendous quantum leap in secondary education will be required.
In Feb 2024, this article discussed a similar take https://www.v1.co/story/future-road-to-personal-ai — Including the mainframe analogy
David, your analogy is spot on. If we don’t learn to set aside our selfishness, maybe it’s time for the machines to take the wheel; they might do a better job for all of us.
Efficiency is the “first fuel”. ITU-T Recommendation L.1318 defines the Q factor, a fundamental metric expressing integrated-circuit energy efficiency: https://www.itu.int/rec/T-REC-L.1318/en