LLM training costs arise primarily from commodity costs (GPUs and other compute, as well as electricity), not locally provided services, so PPP is not the right statistic to use here. You should use nominal GDP instead. According to Wikipedia[0], the median country's nominal GDP (Cyprus) is more like $39B. Still much larger than training costs, but much lower than your PPP GDP number.
[0] https://en.wikipedia.org/wiki/List_of_countries_by_GDP_(nomi...
The median country GDP is approximately $48.8 billion, which corresponds to Uganda at position 90 ($48.769 billion).
The largest economy (the US) has a GDP of $27.7 trillion.
The smallest economy (Tuvalu) has a GDP of $62.3 million.
The $48 billion figure is the middle point: half of all countries have larger GDPs and half have smaller.
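For intuition, "median" here just means sorting all ~190 country GDPs and taking the middle entry. A minimal sketch, using a made-up list seeded with the numbers quoted in this thread rather than the real Wikipedia table:

    from statistics import median

    # Illustrative only: GDP figures in billions of USD taken from this thread
    # plus made-up filler values -- not the actual Wikipedia data.
    gdp_billions = [27_700, 4_200, 1_100, 48.8, 39.0, 12.5, 0.062]

    # median() sorts the list and returns the middle entry.
    print(f"Median GDP: ${median(gdp_billions)}B")  # -> Median GDP: $48.8B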
$100 billion is the best available estimate of how much OpenAI took in investment to build ChatGPT.
A top-of-the-line consumer desktop, the Mac Pro, costs $7,000. The commonly acknowledged first non-mechanical computer, the ENIAC, cost $400,000, which adjusted for inflation is $6,594,153 (see note). Will AI models follow the same pricing trajectory? Probably not, but they no longer cost anywhere close to $100 billion.
Note: 1946 CPI = 19.5, 2025 CPI = 321.465, which makes for an inflation factor of about 16.49x.
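A quick check of that inflation arithmetic, using only the CPI values in the note (not an official BLS calculator result):

    # Adjust the ENIAC's 1946 price to 2025 dollars via the CPI ratio above.
    eniac_price_1946 = 400_000
    cpi_1946 = 19.5
    cpi_2025 = 321.465

    inflation_factor = cpi_2025 / cpi_1946            # ~16.49
    eniac_price_2025 = eniac_price_1946 * inflation_factor

    print(f"Inflation factor: {inflation_factor:.2f}x")        # 16.49x
    print(f"ENIAC in 2025 dollars: ${eniac_price_2025:,.0f}")  # ~$6,594,154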
> Will AI models follow the same pricing trajectory?
If the article is correct, and this is the best way to make them, their price will explode.
Maybe the comparison checks out if you use the number of days required for training as your GDP timeframe instead of a full year.
https://en.wikipedia.org/wiki/List_of_countries_by_GDP_(PPP)
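Rough sketch of that framing with assumed numbers: the ~$39B median nominal GDP from above, plus a hypothetical 100-day training run costing $100M (neither the run length nor the cost comes from the article):

    # Compare a hypothetical training run's cost to what the median country
    # produces over the same number of days. All inputs are assumptions.
    median_country_gdp = 39e9    # ~$39B/year, the Cyprus figure quoted above
    training_days = 100          # hypothetical run length
    training_cost = 100e6        # hypothetical run cost, $100M

    gdp_over_run = median_country_gdp * training_days / 365
    print(f"GDP over {training_days} days: ${gdp_over_run / 1e9:.1f}B")   # ~$10.7B
    print(f"Training cost vs. that: {training_cost / gdp_over_run:.1%}")  # ~0.9%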
Models are expensive, but they're not that expensive.