
10,000 hours compared to what, exactly?

Compilers already save me millions of hours across every program I write, because I am not crafting assembly code by hand.

Computers save billions of hours compared to doing it by hand with an abacus or pen and paper.

Human productivity has always depended on the tools we have access to. Even if agents become that much more productive, so will the humans who use them as tools.

——

Projecting a doubling rate over so many generations is no different from saying the earth is flat just because it feels flat in your local space. There is no reason to believe exponential doubling holds for 10 generations from now.

There is just one example in all of history that sustained that over 10+ generations: Moore's law. Doubling for so many generations at a near-constant rate is close to impossible; from the time of the ancient tale of the grain and the chessboard, people have constantly struggled to grasp the power of the exponential.
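To make the compounding concrete, here is a minimal sketch (plain Python; the 10-generation horizon and the 2x-per-generation rate are illustrative assumptions, not figures from the thread):

    # Illustrative only: how a constant per-generation doubling compounds.
    factor_per_generation = 2          # assumed doubling each generation
    generations = 10                   # assumed projection horizon
    print(factor_per_generation ** generations)   # -> 1024x after 10 generations

    # The chessboard tale: one grain on the first square, doubling each square.
    total_grains = sum(2 ** square for square in range(64))
    print(total_grains)                # -> 18,446,744,073,709,551,615 grains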

——

I would say current LLM approaches are closer to the end of the cycle than the beginning, with perhaps 1-2 generations left at best.

The outlay on funding is already at $50-100B/year per major player. No organization can spend $500B/year on anything. I don't see any large-scale collaboration, in the private sector or in public (along the lines of the space station or a fusion reactor), forming to allocate the resources needed to reach, say, a 3rd generation from now.
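For a rough sense of how quickly the outlay runs away, a small sketch (the starting budget and the per-generation cost multiplier are assumptions for illustration, not claims about any specific lab):

    # Illustrative only: yearly outlay if each generation costs about 2x the last.
    budget = 100e9        # assumed current outlay, $/year (upper end of $50-100B)
    cost_multiplier = 2   # assumed growth in outlay per model generation

    for generation in range(1, 4):
        budget *= cost_multiplier
        print(f"gen +{generation}: ~${budget / 1e9:.0f}B/year")
    # gen +1: ~$200B/year, gen +2: ~$400B/year, gen +3: ~$800B/year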

Comparing it with semiconductor tech, the only other exponential example we have: the budget for foundries and research grew similarly, and today a leading-edge foundry runs roughly $10-20B, but the figure doesn't keep growing at the same pace.

Constraints on capital availability, and the risk of deploying it, are why we have so few fab players left, and why the remaining players can afford to stagnate and not invest seriously.


Great writeup. There are other scaling axes of course, around data (even synthetic data) and improving AI at the 'software' layer (smarter design, training efficiencies, inference speed-ups). Progress in those might make the currently-unthinkable orders of magnitude, $500B and beyond, not as necessary?
manquer OP
Thank you. I am not sure those dimensions will deliver the kind of generational boosts needed to keep the exponential going.

I could be quite wrong of course, but it is not a certain bet that we will get fundamental breakthroughs from them.

There will always be specific areas that see major improvements.

In the semiconductor industry, low-power processors, multi-core dies, etc. produced some results when core innovation slowed down during 2008-2018, i.e. before the current EUV-breakthrough-driven generations of chip advances.

The history of EUV lithography and ASML's success is an unlikely tale; it came only after two decades of public and industry-consortium funding of work that was abandoned multiple times.

Breakthroughs will happen eventually, but each wave (we are on the fourth one for AI?) stagnates after the initial rapid progress.
