Except now, Apple, with its shared VRAM/RAM model, offers better deals than Nvidia, especially past 24GB of VRAM (for inference at least).
A Macbook or Mac Mini with 32GB as a whole system is now cheaper than a 24GB Nvidia card.
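A minimal sketch of what that unified pool buys you in practice, assuming PyTorch with the Metal/MPS backend on an Apple Silicon Mac (the 32GB figure and the 4 GiB allocation are just illustrative numbers):

```python
# Sketch: on Apple Silicon, RAM and VRAM are one pool, so GPU
# allocations for inference draw from the same memory the OS reports.
import subprocess
import torch

# Total unified memory in bytes (macOS-specific sysctl key).
total_bytes = int(subprocess.check_output(["sysctl", "-n", "hw.memsize"]).strip())
print(f"Unified memory visible to both CPU and GPU: {total_bytes / 2**30:.0f} GiB")

if torch.backends.mps.is_available():
    device = torch.device("mps")
    # Allocate ~4 GiB of fp16 weights on the GPU; on a 32GB machine this
    # works because GPU memory is carved out of the same pool as RAM.
    weights = torch.empty(2 * 1024**3, dtype=torch.float16, device=device)
    print(f"Allocated {weights.numel() * 2 / 2**30:.1f} GiB on {device}")
else:
    print("MPS backend not available; not an Apple Silicon machine?")
```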
That's an interesting point about unified memory. But I assume most people will use their graphics card for video games rather than machine learning inference. And most non-console games are written for Windows and assume non-unified memory.