So, to help you understand how they can all be true: market cap is governed by something other than what a business is worth.
As an aside, here's a fun article that embarrasses Wall Street. [0]
I guess that means don't take investment advice from me ;) I've done OK buying indices though.
Plenty of companies have screwed up execution, and the market has correctly noticed and penalized them for that.
P.S. I did not have access to the internet in 2006, so I guess the skepticism was normal at the time.
Also, if they are so good, it's best not to level the playing field by sharing them with your competitors.
Also "chip makers with the best chips" == Nvidia, there aren't many others. And Alphabet does more than just produce TPUs.
Google is saving a ton of money by making TPUs, which will pay off in the future when AI is better monetized, but so far no one is making a massive profit directly from foundation models. It's a long-term play.
Also, I'd argue Nvidia is massively overvalued.
Google, which makes AI chips with barely-adequate software, is worth 2.0T
AMD, which also makes AI chips with barely-adequate software, is worth 0.2T
Google made a few decisions with TPUs that might have made business sense at the time but, with hindsight, haven't helped adoption. They tied TPUs closely to their 'TensorFlow 1' framework (which was kinda hard to use), then released 'TensorFlow 2', which was incompatible enough that it was just as easy to switch to PyTorch, which has TPU support in theory but not in practice.
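To be concrete about the "in theory" part: here's roughly what PyTorch-on-TPU looks like via the torch_xla package. This is just a sketch, assuming a TPU VM with torch and torch_xla installed, not something I'd call a drop-in workflow.

```python
# Rough sketch of "PyTorch on TPU" via the torch_xla (XLA backend) package.
# Assumes a TPU VM with torch and torch_xla installed.
import torch
import torch_xla.core.xla_model as xm

device = xm.xla_device()                 # TPU device instead of "cuda"
model = torch.nn.Linear(128, 10).to(device)
x = torch.randn(32, 128, device=device)

loss = model(x).sum()
loss.backward()
xm.mark_step()                           # ask XLA to compile and run the pending graph
```

Even this happy path needs a separate package and extra ceremony compared to .to("cuda"), which is a big part of why "in theory but not in practice".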
They also decided TPUs would be Google Cloud only. That might make sense if they need water cooling or have special power requirements. But it turns out the sort of big corporations that have multi-cloud setups, and a workload where a 1.5x improvement in performance-per-dollar is worth pursuing, aren't big open source contributors. And understandably, the academics and enthusiasts who are giving their time away for free aren't eager to pay Google for the privilege.
Perhaps Google's market cap already reflects the value of being a second-place AI chipmaker?
TPUs very much have software support, which is why SSI etc. use TPUs.
P.S. Google gives their TPUs away for free at https://sites.research.google/trc/about/, which I've used for the past 6 months now
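For what it's worth, the basic workflow on one of those TPU VMs is pretty painless as long as you stay inside JAX. A minimal sanity check looks something like this (a sketch; the exact device list depends on the slice you're allocated):

```python
# Minimal check that JAX sees the TPU cores on a TPU VM.
import jax
import jax.numpy as jnp

print(jax.devices())                      # e.g. a list of TpuDevice entries

x = jnp.ones((8192, 8192), dtype=jnp.bfloat16)
y = (x @ x).block_until_ready()           # dispatched to the TPU by default
print(y.shape, y.dtype)
```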
JAX has a harsher learning curve than PyTorch in my experience. Perhaps it's worth it (yay FP!) but it doesn't help adoption.
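To make the "harsher learning curve" concrete, the things that trip up PyTorch users are explicit PRNG keys, pure functions under jit, and gradients as a function transform rather than .backward(). A toy sketch:

```python
# Toy sketch of the JAX idioms that feel foreign coming from PyTorch.
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(0)               # no global RNG; keys are threaded by hand
key, sub = jax.random.split(key)
w = jax.random.normal(sub, (128, 10))

@jax.jit                                  # compiled; side effects and in-place mutation won't fly
def loss_fn(w, x):
    return jnp.sum(jnp.tanh(x @ w))

grad_fn = jax.grad(loss_fn)               # differentiates w.r.t. the first argument
x = jnp.ones((32, 128))
print(loss_fn(w, x), grad_fn(w, x).shape)
```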
> They don't really use pytorch from what I see on the outside from their research works
Of course not, there is no outside world at Google - if internal tooling exists for a problem, their culture effectively mandates using that before anything else, no matter the difference in quality. This basically explains the whole TF1/TF2 debacle, which understandably left a bad taste in people's mouths. In any case, while they don't use PyTorch, the rest of us very much do.
> P.S. Google gives their TPUs away for free at https://sites.research.google/trc/about/, which I've used for the past 6 months now
Right, and in order to use them effectively you basically have to use JAX. Most researchers don't have the advantage of free compute, so Google is effectively trying to buy mindshare rather than winning on quality. This is fine, but it's worth repeating because it biases the discussion heavily - many proponents of JAX just so happen to be on TRC or to have been given TPU credits via some other mechanism.
You're conflating price with intrinsic value with market analysis. All different things.
Or rather, there would be if TPUs were that good in practice. From the other comments it sounds like TPUs are difficult to use for a lot of workloads, which probably leads to the real explanation: no one wants to use them as much as Google does, so selling them at a premium price, as I mentioned above, won't get them many buyers.
If interested in further details:
1) TPUs are a serious competitor to Nvidia chips for Google's needs, but per the article they are not nearly as flexible as a GPU (dependence on precompiled workloads, the need to keep the PEs in the systolic array highly utilized; see the sketch after this list). Thus for broad ML market usage, they may not be competitive with Nvidia GPUs/racks/clusters.
2) Chip makers with the best chips are not valued at 1-3.5T; per other comments to OC, only Nvidia and Broadcom are worth this much. These are not just "chip makers", they are (the best) "system makers", driving the designs for the chips and interconnect required to go from a diced piece of silicon to a data center consuming MWs. This part is much harder, and it is why Google (who design the TPU) still has to work with Broadcom to integrate their solution. Indeed every hyperscaler is designing chips and software for their needs, but every hyperscaler works with companies like Broadcom or Marvell to actually create a complete, competitive system. Side note: Marvell has deals with Amazon, Microsoft and Meta to mostly design these systems, and they are worth "only" 66B. So, you can't just design chips to be valuable, you have to design systems. The complete systems have to be the best, wanted by everyone (Nvidia, Broadcom), in order to be in the Ts; otherwise you're in the Bs (Marvell).
4) I see two problems with selling TPUs: customers and margins. If you want to sell someone a product, it needs to match their use case; currently the use case only matches Google's needs, so who are the customers? Maybe you want to capture the hyperscalers / big AI labs, whose use case is likely similar to Google's. If so, margins would have to be thin, otherwise they would just work directly with Broadcom/Marvell (and they all do). If Google wants everyone currently on CUDA/Nvidia as a customer, then you massively change the purpose of the TPU, and even of Google.
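On the "dependence on precompiled workloads" point in 1): XLA traces and compiles for concrete input shapes, so anything with dynamic shapes pays a recompile. A rough illustration of the effect (a sketch; timings are illustrative, not measured):

```python
# jit'd/XLA code is compiled per concrete input shape, so a new shape
# triggers a new compilation.
import time
import jax
import jax.numpy as jnp

@jax.jit
def step(x):
    return jnp.tanh(x @ x.T)

for shape in [(128, 512), (128, 512), (256, 512)]:   # 1st: compile, 2nd: cached, 3rd: recompile
    x = jnp.ones(shape)
    t0 = time.time()
    step(x).block_until_ready()
    print(shape, f"{time.time() - t0:.4f}s")
```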
To wrap up: even if the TPU is good (and it is good for Google), it wouldn't be "almost as good a business as every other part of their company", because the value added isn't FROM Google in the form of a good chip design (TPU). Instead the value is added TO Google, in the form of specific compute that is cheap and fast, FROM relatively simple ASICs (TPU chips) stitched together into massively complex systems (TPU superpods).
Sorry that got a bit long-winded, hope it's helpful!
https://www.tomshardware.com/tech-industry/artificial-intell...
"Nvidia to consume 77% of wafers used for AI processors in 2025: Report...AWS, AMD, and Google lose wafer share."
My take is that "sell access to TPUs on Google Cloud" is the nice side effect.
1. TPUs are a serious competitor to Nvidia chips.
2. Chip makers with the best chips are valued at 1-3.5T.
3. Google's market cap is 2T.
4. It is correct for Google to not sell TPUs.
I have heard the whole "it's better to rent them" thing, but if they're actually good, selling them is almost as good a business as every other part of the company.