BoredPositron parent
I wonder what we will do with all that compute if no one is going to use it...even if we achieve AGI it won't run on this generation of hardware. Are we just gonna brute force architectures or wtf is going on?
I assume there's at least one generation of resale value. There are a lot of smaller companies that could use GPU capacity and would if it were cheaper. I wonder how the math works out though -- are power requirements such that the actual costs aren't worth it? How long before the chips just get thrown away?
Cheap cloud gaming.
Those GPUs will finally push pixels again :)
> Cheap cloud gaming.
> Those GPUs will finally push pixels again :)
Have they ever? I wonder if, say, an H100 even supports graphics APIs.
H100s have certain tradeoffs that make gaming not as energy efficient.
Funnily enough, H100s are already old hardware, and soon will all get fully depreciated.
Billions and billions of depreciated assets!
Cloud gaming sucks due to latency, not price.
Which is something we already knew a decade ago.
Just like we knew self-driving AIs are not reliable.
The magical thinking was assuming they would just get better.
Cloud gaming can have lower latency than local consoles. https://www.eurogamer.net/digitalfoundry-2022-geforce-now-rt...
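The claim comes down to frame-time arithmetic: a high-framerate cloud rig can spend so much less time per frame that it absorbs the network round trip. A rough sketch, with illustrative numbers I'm assuming rather than measured figures:

```python
# Back-of-envelope input-to-photon latency budgets.
# All numbers below are assumptions for illustration, not benchmarks.

def frame_ms(fps: float) -> float:
    """Duration of one frame in milliseconds."""
    return 1000 / fps

# Local 30fps console: assume ~4 frames of pipeline latency
# (input sampling, game logic, render, display scan-out).
local_console_ms = 4 * frame_ms(30)      # ~133 ms

# Cloud rig rendering at 120fps: assume ~3 frames of pipeline latency,
# plus ~10 ms encode/decode and ~15 ms network RTT to a nearby server.
cloud_ms = 3 * frame_ms(120) + 10 + 15   # 50 ms

print(f"local 30fps console: ~{local_console_ms:.0f} ms")
print(f"cloud 120fps stream: ~{cloud_ms:.0f} ms")
```

Under these assumptions the stream comes out well ahead, which matches the Digital Foundry result linked above; against a local 120fps PC the network and codec overhead would flip the comparison back.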
Played Forza Motorsport 5 on Xbox Cloud Gaming and could not notice it. Of course I am the most non-serious player ever, but I really tried to notice, and by my standards it was fine.
maybe I'm just oversensitive to it, but I can't even stand playing when the TV isn't in game mode. can't imagine waiting for controller input to make a server roundtrip
Not even an issue anymore. I have no issue playing FPS games like call of duty multiplayer via GeForce Now and can be decently competitive. I do live close to the servers though.
They won’t. High-end AI GPUs get stomped by mid-tier gaming GPUs on pixels.
There will definitely be takers though. I can see the scientific community, for example, loving some cheap GPUs.