Nvidia thinks that Moore's Law is dead. https://arstechnica.com/gaming/2022/09/do-expensive-nvidia-g...

Intel, by contrast, says that Moore's Law is still alive. But Intel is technologically behind, and it is easier to improve when there is someone to learn from, so maybe there is a wall that they haven't yet hit.

Regardless, it is a very different law than when I was young, when code just magically got faster each year. Now we can run more and more things at once, but the timing for an individual single-threaded computation hasn't really improved in the last 15 years.


> but the timing for an individual single-threaded computation hasn't really improved in the last 15 years

The clock speeds haven't really gone up anymore, but computation has still gotten considerably faster. From an i7-2700K (2011) to an i7-13700K, single-core benchmark scores went up 131%.

https://cpu.userbenchmark.com/Compare/Intel-Core-i7-2700K-vs...

First, that's the kind of change we used to get in 2 years. Having it happen over a decade is barely noticeable compared to where we used to be.
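
To put rough per-year numbers on that (a back-of-the-envelope sketch: the 131% figure above spread over the ~12 years between those chips, versus the old rule of thumb of doubling roughly every 2 years):

    # Back-of-the-envelope comparison of annual improvement rates.
    # Assumptions: 131% total single-core gain over ~12 years (2011-2023),
    # versus the old pattern of roughly doubling every 2 years.
    old_pace = 2 ** (1 / 2) - 1       # ~41% per year when performance doubled every 2 years
    new_pace = 2.31 ** (1 / 12) - 1   # ~7% per year for a 131% gain over 12 years
    print(f"old pace:    ~{old_pace:.0%} per year")
    print(f"recent pace: ~{new_pace:.0%} per year")

Roughly 7% a year versus roughly 40% a year, which is why a decade of it feels like what two years used to.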

Second, over that time period we've had a lot of changes in tooling. Some make code faster. Most make code slower. (Examples include the spread of containerization and the adoption of slow languages like Python.) The result is that programs that do equivalent things might wind up actually faster or slower, no matter what a CPU benchmark shows.

Yeah, the original Pentium was released at 60 MHz in 1993, and ten years later there were 3 GHz Pentium 4s. The '90s were pretty nuts.
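
For a sense of scale, a quick toy calculation with those two data points (60 MHz in 1993, 3 GHz in 2003):

    # How often clock speed doubled between the two figures quoted above.
    from math import log2

    doublings = log2(3000 / 60)                  # ~5.6 doublings from 60 MHz to 3 GHz
    print(f"doubling time: ~{10 / doublings:.1f} years")

Clock speed alone was doubling roughly every 1.8 years back then.
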
If you read the article, it is clear Jensen Huang is commenting only on the price aspect. It used to be the case that as density went up, price went down. Huang is saying that price is no longer going down, and he is probably right. But density is still going up; it is not even clear that it is slowing down.

Nvidia is trying to reset expectations of consumers.

Jensen aims to charge more for more GPU computing power into the future.

This is because Nvidia has close to monopoly power and is thus able to break Moore's Law single-handedly.

Hell, they are able to sell graphics cards (well, not Nvidia themselves, but still) in excess of 2k bucks. They'd be stupid not to try to keep prices that high now that the market seemingly accepts them.

They’re losing a lot of customer love.

IMO if Intel stays in the game it'll be sorted out within a few years.

Nvidia may have good software but people like paying less money.

Paying less will win out.

The 4090 sold strong at first, but the rest of Nvidia’s 40-series offerings haven’t sold nearly as well. It’s still not clear if their attempt to reset expectations toward higher prices will work out in the long term, but in the short term it’s been a bit of a failure.

They might be putting themselves in a similar position to Intel's, though (maybe worse, since I don't recall Intel ever being as greedy..) if their competitors eventually catch up.

Unlike in gaming, in the data center initial cost + performance per watt are the only things that really matter (besides software; Nvidia has a huge moat there). And relative to how much Nvidia is charging per GPU, total power costs are close to zero. So 4 ‘worse’ but much cheaper chips might be a better deal than buying an A/H100 etc.

> Unlike in gaming, in the data center initial cost + performance per watt are the only things that really matter

That's not true from a hardware perspective either. You can't just plug 4 worse cards into the same rack. The savings on the graphics cards become less significant if you need to double or quadruple all the other hardware to increase the number of racks. A 1U blade can easily cost $10,000 without a graphics card.
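
As a rough sketch of that tradeoff (every number below is made up purely for illustration; none of them are real A/H100, host, or electricity prices):

    # Toy total-cost comparison: GPUs + host machines + electricity.
    # All prices, wattages and GPU counts are hypothetical placeholders.
    def total_cost(gpu_price, gpus_needed, gpus_per_host, host_price,
                   watts_per_gpu, kwh_price=0.10, years=3):
        hosts = -(-gpus_needed // gpus_per_host)    # ceiling division
        hardware = gpus_needed * gpu_price + hosts * host_price
        energy_kwh = gpus_needed * watts_per_gpu / 1000 * 24 * 365 * years
        return hardware + energy_kwh * kwh_price

    # One hypothetical "big" GPU vs four cheaper ones assumed to match its
    # aggregate throughput (that equivalence is itself an assumption).
    big   = total_cost(gpu_price=30_000, gpus_needed=100, gpus_per_host=8,
                       host_price=10_000, watts_per_gpu=700)
    cheap = total_cost(gpu_price=5_000,  gpus_needed=400, gpus_per_host=8,
                       host_price=10_000, watts_per_gpu=350)
    print(f"big GPUs:   ${big:,.0f}")
    print(f"cheap GPUs: ${cheap:,.0f}")

Under these invented numbers the extra hosts and electricity claw back a noticeable share of the GPU savings; with a pricier host or a smaller GPU price gap the comparison can flip either way.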

You can plug in multiple GPUs on a computer. AWS has fleets with 4 and 8, see https://docs.aws.amazon.com/dlami/latest/devguide/gpu.html for pricing.

For anything datacenter related, customers are very sensitive to price per performance. And datacenters are happy to oblige.

You're missing the point GP made, though. He wants to replace one good card with 4 worse ones. It's not like that rack has 3 or 7 additional slots just sitting unused; they're already taken by the existing setup. And in the link you provided, 5 out of 6 offerings are still Nvidia GPUs.

Of course data centers are happy to oblige customer demands, but initial cost per GPU and performance per watt are certainly not the only relevant factors.

I can build a GPU-less microATX tower for $600, and the 3 sq ft of extra space costs $400. Somebody is overcharging you for 1U blades.

But does Nvidia have their own AMD-like foe waiting around the corner to fight them for the crown? I'm not sure there is anyone in the near future capable of sustaining such a fight...

I don't understand the part about the market accepting the price, or rather, I find it hard to believe that it's sustainable. I've played PC games my whole life and used to enjoy building and re-building my gaming PC every so often. Paying 2k for a single piece of hardware just doesn't seem like the right choice anymore. Makes more sense to buy a console (or two) these days.

With little progress in performance, they're going to be competing with the secondary market, a.k.a. used devices.

Interesting take.

OTOH it can be said that GPUs just look like they kept improving longer than CPUs because their workload is vastly more amenable to parallelism.

Nvidia and Intel are in very different markets and their product ranges barely overlap.

And, well, Nvidia is almost a monopoly at this point, so they have barely any incentive to continue innovating as opposed to trying to extract as much money as possible from their clients.

On the other hand, look at what happened with CPUs over the last few years: huge improvements in efficiency (including from Intel).

> hasn't really improved in the last 15 years.

I don’t think that’s even close to being the case for most use cases. Increasing complexity/bloat has obscured it to a large degree, though.

Dennard scaling is dead (https://en.wikipedia.org/wiki/Dennard_scaling), which is why things stopped magically getting faster: chips started getting too hot.
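
A minimal sketch of why, using the standard dynamic-power relation for CMOS logic, P ≈ C·V²·f (the 0.7 shrink factor below is illustrative, not a real process number):

    # Dynamic switching power of CMOS logic: P ~ C * V^2 * f.
    # Under classic Dennard scaling, shrinking by a linear factor k cut both
    # C and V, so frequency could rise by 1/k while power density stayed flat.
    def dynamic_power(c, v, f):
        return c * v * v * f

    k = 0.7  # illustrative linear shrink factor per generation
    base    = dynamic_power(1.0, 1.0, 1.0)   # normalised baseline transistor
    dennard = dynamic_power(k, k, 1 / k)     # C and V shrink, f rises: 0.49x power in 0.49x area
    post    = dynamic_power(k, 1.0, 1 / k)   # V no longer scales: same power in ~half the area
    print(f"power per transistor: baseline {base:.2f}, Dennard {dennard:.2f}, post-Dennard {post:.2f}")

Once voltage stopped scaling, each shrink packed roughly the same per-transistor power into about half the area, so clock speeds had to stop climbing and the transistor budget went into more cores instead.
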
> Intel is technologically behind, and it is easier to improve when there is someone to learn from

This is interesting to me. How did they end up like this?

By being so far ahead of everyone else that they thought they didn't have to do anything anymore.

On desktop, anyway. Mobile is a different story: back in the mid-2000s they had everything needed to dominate the market for the next 10+ years (e.g. the fastest ARM chips), yet chose not to, due to reasons..

They made BK CEO and hired that Murthy guy.
