AMD stubbornly refuses to recognise the huge numbers of low- or medium-budget researchers, hobbyists, and open source developers.
This ignorance of how software development is done has resulted in them losing out on a multi-trillion-dollar market.
It's incredible to me how obstinate certain segments of the industry (such as hardware design) can be.
AMD is doing just fine; Oracle just announced an AI cluster with up to 131,072 of AMD's new MI355X GPUs.
AMD needs to focus on bringing the rack-scale MI400 to market as quickly as possible; those hobbyists will always find something to complain about instead of spending money.
We're talking about the majority of open source developers (I'm one of them). If researchers don't get access to hardware X, they write their paper using hardware Y (Nvidia). AMD isn't doing fine, because most low-level AI research is done purely on CUDA.
I am really sympathetic to the complaints. It would just be incredibly useful to have competition and options further down the food chain. But the argument that this is a core strategic mistake makes no sense to me.
AMD has demonstrably not even acknowledged that they needed to play catch-up for a significant chunk of the last 20 years. The mistake isn't a recent one.
There are plenty of research institutions that can easily spend >$250k on computational resources. Many routinely spend multiples of that amount.
They'll be fine.
Look at China. A couple of years ago, many assumed nobody in China was doing good AI research, but the thing is, there's good AI research coming from basically everywhere, even South America. You can't assume that institutions can spend >$250k on computational resources.
Many can, but many can't.
AMD is very far behind, and their earnings are so low that even at a nonsensical P/E ratio, their market cap is still less than a tenth of Nvidia's. No, they are not doing anywhere near fine.
Are hobbyists the reason for this? I’m not sure. However, what AMD is doing is clearly failing.
If you design software for N00000 customers, it can't be shit, because you can't hold that many people's hands; it's just not possible. Setting out to design software for a wide variety of users forces you to make it not suck, or you'll drown in support requests that you cannot possibly handle.
This guy gets it. Absolutely no one cares about the hobby market, because it's absolutely not how software development is done (nor is it how software is paid for).
AI research used to be fringe and not well funded.
Back in those days, 99.9% of hardware was Xeon.
If you don't need 8, then that's exactly why we offer 1x MI300X VMs.
We see it now with 8x UBB, and it will get worse with direct liquid cooling and larger power requirements. The MI300X is 700 W. The MI355X is 1,200 W. The MI450 will be even more.
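To put those TDPs in context, here's a back-of-the-envelope sketch of what the GPUs alone on an 8-GPU UBB draw. This counts only the per-GPU TDPs quoted above; CPUs, NICs, fans, and PSU losses are ignored, and the MI450 is omitted since its TDP isn't public.

```python
# Rough GPU-only power draw for an 8-GPU universal baseboard (UBB),
# using the TDP figures quoted above.
GPUS_PER_UBB = 8

tdp_watts = {
    "MI300X": 700,
    "MI355X": 1200,
}

for gpu, tdp in tdp_watts.items():
    board_kw = GPUS_PER_UBB * tdp / 1000
    print(f"{gpu}: {GPUS_PER_UBB} x {tdp} W = {board_kw:.1f} kW (GPUs only)")
```

At roughly 9.6 kW of GPU power per MI355X baseboard before anything else in the chassis, it's easy to see why direct liquid cooling and datacenter-grade power feeds are becoming mandatory, and why this class of hardware is drifting out of hobbyist reach.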
Certainly AMD should make some consumer-grade stuff, but they won't stop on the enterprise side either. Your only option for supercomputer-level compute will be to rent it.
That said, I am confident that Nvidia will continue to serve those of us who want our own hardware.
But the 355 has FP4/FP6 added in, which, until UDNA comes out, likely won't get emulated.
It is fine if you don't need the features of newer hardware, but if you do… then desktop won't help you.