For AI chips... also probably not, unless AMD can compete with CUDA (or CUDA becomes irrelevant)
And for AI, CUDA is already becoming less relevant. Most of the big players use chips of their own design: Google has its TPUs, Amazon has in-house designs, Apple has its own CPU/GPU line and doesn't even support anything Nvidia at this point, Microsoft does its own thing for Azure, etc.
You are basically making the "Intel will stay big because Intel is big" argument, but for Nvidia. Except, of course, that stopped being true for Intel. They are still largish, but a lot of data centers are transitioning to ARM CPUs, they lost Apple as a customer, and there are now some decent Windows laptops using ARM CPUs as well.
I think AMD could do it, but they choose not to. If you look at their most recent lineup of cards (various SKUs of the 9070 and 9060), they are not so much better than Nvidia at each price point that they are a must-buy. They even released an outright bad card a few weeks ago (the 9060 8 GB). I assume the rationale is that even if they could somehow dominate the gamer market, that is peanuts compared to the potential in AI.
Meanwhile, on Windows it has been hit and miss with their SDKs and shader tooling; does anyone remember RenderMonkey?
So NVidia it is.
They played that part beautifully for Intel over the past decades.
because the team they've had for the last decade is clearly incompetent.