I think it will be a good thing actually. Engineers, no longer having the luxury of assuming that users have high end system specs, will be forced to actually write fast and efficient software. No more bloated programs eating up RAM for no reason.

The problem is that higher performing devices will still exist. Those engineers will probably keep using performant devices and their managers will certainly keep buying them.

We'll probably end up in an even more bifurcated world, where the well-off have access to a lot of great products and services that most of humanity is increasingly unable to access.

Have the laws of supply and demand been suspended? Capital is gonna pour into memory fabrication over the next year or two, and there will probably be a glut 2-3 years from now, followed by retrenchment and wails that innovation has come to a halt because demand has stalled.
We're not talking about growing tomatoes in your backyard.
Industrial production is no different, it just runs on different scales.
> Have the laws of supply and demand been suspended?

There is also the law of uncertainty overriding it, e.g. trade wars, tariffs, etc.

No one is going all in with new capacity.

If "performant" devices are not widespread, then telemetry will reveal that the app is performing poorly for most users. If a new feature uses more memory and significantly increases the crash rate, it will be disabled.

Apps are optimized for the install base, not for the engineer's own hardware.
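The gating described above can be sketched as a telemetry-driven feature flag. This is a hypothetical illustration, not any particular vendor's system; the field names, the aggregate shape, and the 50% regression threshold are all assumptions.

```python
# Hypothetical sketch of telemetry-gated rollout: a feature whose cohort
# crashes significantly more often than baseline gets disabled.

def should_enable_feature(telemetry: dict, baseline_crash_rate: float) -> bool:
    """Keep the feature enabled only if its cohort's crash rate stays
    within 1.5x the baseline (an assumed definition of "significantly
    increases the crash rate").

    `telemetry` is an assumed aggregate, e.g.
    {"sessions": 100_000, "crashes": 1_200}
    """
    sessions = telemetry["sessions"]
    crashes = telemetry["crashes"]
    if sessions == 0:
        return True  # no data yet; keep the experiment running
    crash_rate = crashes / sessions
    return crash_rate <= baseline_crash_rate * 1.5

# Baseline 1% crash rate; a cohort at 2.4% exceeds the 1.5% cutoff
print(should_enable_feature({"sessions": 100_000, "crashes": 2_400}, 0.01))
# prints False
```

The point is that the kill switch keys off the install base's aggregate numbers, not off how the app behaves on any one engineer's machine.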

What is the point of telemetry if your IDE launching in under 10s is considered the pinnacle of optimization?

That's like 100B+ instructions on a single core of your average superscalar CPU.
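The 100B+ figure holds up as rough arithmetic. Assuming a ~3 GHz core retiring ~4 instructions per cycle (round-number assumptions about a typical superscalar core, not measurements of any specific CPU):

```python
# Back-of-envelope: instructions retired during a 10 s launch on one core.
clock_hz = 3e9        # assumed ~3 GHz core clock
ipc = 4               # assumed ~4 instructions retired per cycle
launch_seconds = 10   # the 10 s IDE launch from the comment

instructions = clock_hz * ipc * launch_seconds
print(f"{instructions:.2e}")  # prints 1.20e+11, i.e. 120 billion instructions
```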

I can't wait for map loading times to be measured as a percentage of trip time.

If your IDE isn’t launching instantly you have a bad IDE.
Because you don't want to regress any of the substeps of such a loading process and turn it back into 10+ seconds of loading.
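One way to hold that line is a per-substep budget check in CI, so no single startup phase can quietly regress. A minimal sketch, where the phase names and millisecond budgets are hypothetical:

```python
import time

# Hypothetical per-substep startup budget guard: each named startup phase
# must stay within its budget so the total never creeps back to 10+ s.
BUDGETS_MS = {"load_config": 50, "init_plugins": 400, "open_workspace": 300}

def run_with_budget(name: str, step) -> float:
    """Run one startup phase and fail loudly if it exceeds its budget."""
    start = time.perf_counter()
    step()
    elapsed_ms = (time.perf_counter() - start) * 1000
    assert elapsed_ms <= BUDGETS_MS[name], f"{name} regressed: {elapsed_ms:.1f} ms"
    return elapsed_ms

# Example with trivial stand-in steps; real steps would do actual work
for phase in BUDGETS_MS:
    run_with_budget(phase, lambda: None)
print("all startup phases within budget")
```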
Can confirm, I'm currently requesting as much RAM as can fit in the chassis and permission to install an OS not too divorced from what we run in prod.

On the bright side, I'm not responsible for the UI abominations people seem to complain about WRT laptop specs.

How many "great products and services" even need a lot of RAM, assuming that we can live without graphics-intensive games?
Some open source projects use Slack to communicate, which is a real RAM hog. GitHub, especially for viewing large PR discussions, takes a huge amount of memory.

If someone with a low-memory laptop wants to get into coding, modern software-development-related services are incredible memory hogs.

IRC is far superior to Slack when it comes to RAM usage. Projects should just switch to that.
Or even Jabber/XMPP, which supports video calls and inline images and would run on machines an order of magnitude slower.
Image, video, and music editing. Developing, running, and debugging large applications.
The last three sound to me like self-inflicted issues. If applications weren't so large, wouldn't fewer resources be needed?
I'm not optimistic that this would be the outcome. You'll likely just get poorly running software instead. After all, a significant part of the world is already running lower powered devices on terrible connection speeds (such as many parts of Africa).
> a significant part of the world is already running lower powered devices

but you cannot consider this in isolation.

The developed markets have vastly higher spending consumers, which means companies cater to those higher spending customers proportionately more (as profits demand it). The implication is that lower spending markets get less investment and are catered to less; after all, R&D spending is still a limiting factor.

If the entirety of the market is running on lower powered devices, then it would get catered to, because there'd be no (or not enough) customers with high powered devices to profit from.

In general, what you are saying makes sense. But there are specific counterexamples, such as Crysis in 2007 or Cyberpunk 2077 a few years ago.

Both didn’t run great on the “average consumer hardware”.

But I’ll admit this is cherry picking from my side :)

I don't think it's going to happen in this day and age. Some smart people will, but most barely know how to write their own code, let alone write efficient code.
I think the actual outcome is that they will expect you to rent servers to do all your computing on, and your phone and PC will be dumb terminals.
At the same time, AI has made it easier than ever to produce inefficient code, so I expect to rather see an explosion of less efficient software.
It has also made it easier than ever to build native applications that use the OS-provided toolkits and avoid adding a complete web-tech stack to everything.
Why are you celebrating compute becoming more expensive? Do you actually think it will be good?
Good luck with that.
