
I'm reminded of how Carmack talked about the extra efficiencies available when targeting consoles, because you knew exactly what hardware was available.

It's great that the available efficiencies can be shown to be extractable. The real, much harder trick is putting together a sufficiently smart compiler that can enable them on heterogeneous compute setups.


bombcar
The demoscene is also an example of how much you can do if you can be absolutely sure exactly what hardware you're running on.

The problem is that even for things like consoles, it's usually more "cost efficient" to write normal fast-to-write code that isn't maximally effective, let the compiler do its magic, and call it good enough.

Sometimes I dream of what the world would do if we were mystically stuck on exactly the processors we have today, for twenty years.

nostrademons
I've wondered sometimes what software would look like if a crisis took out the ability to build new semiconductors and we had to run all our computing infrastructure on chips salvaged from pregnancy tests, shoplifting tags, cars, old PCs, and other consumer electronics. We'd basically move backwards about 20 years in process technology, and most computers would have speeds roughly equivalent to 90s/00s PCs.

But then, this still wouldn't incentivize building directly to the hardware, because of the need to run on a large variety of different hardware. You're still better off preferring portability over performance, and then making up the difference by cutting scope and ease of development.

wat10000
You might enjoy Dusk OS and its more extreme sibling Collapse OS: https://duskos.org https://collapseos.org
Fabricio20
Funny you say this... this exact thought experiment was going on last month! Laurie Wired [0], a cybersec YouTuber, asked it on Twitter and got some interesting replies too!

[0]: https://www.youtube.com/watch?v=L2OJFqs8bUk

brailsafe
This sounds kind of similar to what I've heard about Cuba's relationship with cars (and probably technology) after the U.S. embargo. Not sure how true it was/is, though.
nyarlathotep_
> I've wondered sometimes what software would look like if a crisis took out the ability to build new semiconductors and we had to run all our computing infrastructure on chips salvaged from pregnancy tests, shoplifting tags, cars, old PCs, and other consumer electronics. We'd basically move backwards about 20 years in process technology, and most computers would have speeds roughly equivalent to 90s/00s PCs.

Don't forget disposable vapes: https://www.hackerneue.com/item?id=45252817

jacquesm
Be careful what you wish for...
Optimizing for the hardware you are on is demonstrably an effort and skill issue. Everyone understands that with enough time and engineers, any piece of software could be optimized better. If only we had large volumes of inexpensive "intelligence" to throw at the problem.

This is one of my back-of-mind hopes for AI. Enlist computers as our allies in making computer software faster. Imagine if you could hand a computer brain your code and ask it to just make the program faster. It becomes a form of RL problem, where the criteria are 1) a functionally equivalent program that 2) is faster.

ryandrake
This is what I was thinking, too. For so long, the default mode of operating a software company has been:

"Developer time is so expensive, we need to throw everything under the bus to make developers fast."

The kinds of things often thrown under the bus: Optimizations, runtime speed, memory footprint, disk image size, security, bug fixing, code cleanliness / lint, and so on. The result is crappy software written fast. Now, imagine some hypothetical AI (that we don't have yet) that makes developer time spent on the project trivial.

Optimistically: There might be time for some of these important software activities.

Pessimistically: Companies will continue to throw these things under the bus and just shit out crappy software even faster.

IgorPartola
My favorite part of this phenomenon is every company that interviews developers on data structures and algorithms, then puts out a calculator app that takes half a gigabyte of storage and nearly as much RAM to run.

I have not had to use Windows in ages, but every time I touch it I am amazed that it takes something like 10-15GB for a bare installation of the latest version, while it does about the same amount of work as XP was able to do in under 1GB. Yes, I am aware assets are a thing, but has usability increased as a result of larger assets?

typpilol
To be fair, Windows has so much backwards compatibility, I'm sure there's a ton of stuff there that's not used by 99.9% of people.

That's a good or a bad thing depending on your perspective

IgorPartola
I am fairly certain that if you install every Debian package available it will still be less than 16GB. Windows 10 is a bare OS at that size.
thenthenthen
The latest iOS update (!) is more than 16GB… a mobile OS…
naasking
It ships with just as many features as Windows 10, which is also in that range, so it's not too surprising.
jonhohle
> functionally equivalent

Who confirms what is functionally equivalent?

electronvolt
You can, with some programming languages, require a proof of this (see Rocq, formerly Coq).
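
As a toy illustration of the idea (sketched in Lean rather than Rocq, with made-up function names): prove once that a tail-recursive sum equals the naive sum on every input, and the "optimized" version is then safe to substitute.

  -- Toy equivalence proof (Lean; same idea applies in Rocq): the
  -- tail-recursive sum is provably equal to the naive sum for every
  -- input list, so swapping one for the other is justified.
  def sumNaive : List Nat → Nat
    | []      => 0
    | x :: xs => x + sumNaive xs

  def sumAcc : Nat → List Nat → Nat
    | acc, []      => acc
    | acc, x :: xs => sumAcc (acc + x) xs

  theorem sumAcc_eq (xs : List Nat) :
      ∀ acc, sumAcc acc xs = acc + sumNaive xs := by
    induction xs with
    | nil => intro acc; simp [sumAcc, sumNaive]
    | cons x xs ih => intro acc; simp [sumAcc, sumNaive, ih, Nat.add_assoc]

  theorem sum_equiv (xs : List Nat) : sumAcc 0 xs = sumNaive xs := by
    simp [sumAcc_eq]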

I think a more interesting case might be showing functional equivalence on some subset of all inputs (because tbh, showing functional equivalence on all inputs often requires "doing certain things the slow way").

An even more interesting case might be "inputs of up to a particular complexity in execution" (which is... very hard to calculate, but likely would mean combining ~code coverage & ~path coverage).

Of course, doing all of that w/o creating security issues (esp. with native code) is an even further out pipe dream.

I'd settle for something much simpler, like "we can automatically vectorize certain loop patterns for particular hardware if we know the hardware we're targeting" from a compiler. That's already hard enough to be basically a pipe dream.

Yeah, restructuring for autovectorization with otherwise equivalent results would be a great example and a good step forward.
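
Something in this shape is the classic candidate (a rough sketch, not from any particular codebase; the function name and flags are just for illustration):

  #include <stddef.h>

  /* A loop in this shape is a textbook autovectorization candidate.
     With a known target (e.g. gcc/clang -O3 -march=native) and the
     `restrict` qualifiers promising the arrays don't overlap, the
     compiler can often emit SIMD code; drop that information and it
     frequently has to stay scalar to be safe. */
  void saxpy(size_t n, float a, const float *restrict x, float *restrict y)
  {
      for (size_t i = 0; i < n; i++) {
          y[i] = a * x[i] + y[i];
      }
  }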
vardump
Notably, C/C++ code, for example, is not necessarily functionally equivalent when it's compiled on different platforms.
It's not even guaranteed to be functionally equivalent when compiled on the same hardware with the same compiler, etc. Undefined behaviour can do what it wants. (And implementation-defined behaviour also has a lot of leeway.)

However, if you stick to only defined behaviour, the results are 'functionally equivalent', assuming your compiler doesn't have a bug.
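
A tiny made-up example of the kind of thing that differs:

  #include <limits.h>
  #include <stdio.h>

  /* Signed integer overflow is undefined behaviour. A compiler may
     assume x + 1 > x always holds and fold this check to 0, or it may
     emit the literal wrapping comparison; both are allowed, so the
     "same" program can behave differently per compiler and flags. */
  static int overflows_if_incremented(int x)
  {
      return x + 1 < x;
  }

  int main(void)
  {
      printf("%d\n", overflows_if_incremented(INT_MAX)); /* may print 0 or 1 */
      return 0;
  }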

switchbak
The Magic does, of course!
It's not just being sure exactly what the hardware is; in demos you have the additional luxury of not being interactive, so you can plan everything out exactly in advance.
saagarjha
This is true of inference too.
potatolicious
Consoles are pretty heterogeneous IRL too, though. You have multiple SKUs (regular and Pro, for example), not to mention most games will also target multiple consoles (PlayStation + Xbox + Switch is a common combo).

So in reality the opportunities to really code against a specific piece of hardware are few and far between...

Heck, then you get into multiple operating modes of the same hardware - the Nintendo Switch has a different perf profile if it's docked vs. not.

SonOfLilit
This used to be less true at the time Carmack said it :>
djmips
The original Switch launched in 2017; that's plenty of time with a stable platform. The multiple operating modes can in practice be approached by coding against undocked (handheld) and then adding bonus quality for docked.
Agentlien
This is exactly what I've been doing when optimizing games for the Switch.
A handful of variants for consoles is not nearly as bad as the almost limitless variety on PC.
ortusdux
>Sometimes I dream of what the world would do if we were mystically stuck on exactly the processors we have today, for twenty years.

Reminds me of the old American cars in Cuba - https://en.wikipedia.org/wiki/Yank_tank

richardw
Cubans benefited from the cars being older, simpler, and robust. Imagine freezing car tech now, with so many electronics, far more parts and built to be replaced relatively quickly!
These older cars broke down all the time. There's a reason old American sitcoms have at least some characters always tinkering with their cars: you needed to do that. Nowadays, cars just work.
Agentlien
> it's usually more "cost efficient" to write normal fast-to-write code that isn't maximally effective, let the compiler do its magic, and call it good enough.

For the last six years my full-time job has largely been optimizing games where most of the team has been working with this mindset. Sometimes someone spends a few days just getting things done, and then others build on top of it. This leads to systems which are not fast enough and take me weeks or even months to optimize.

We even got together at my last job and created a series of lectures on performance and best practices for everyone, including artists, to get ahead of this type of issue. It was apparently much appreciated, especially among the non-technical staff, who said it was valuable and that they'd had no idea.

Damogran6
That's what you got with BeOS... throw out backward compatibility and build to current best practices... its ability to extract performance out of a 133 MHz processor was amazing.
pureagave
Even better was the BeBox running BeOS. That was a cool use of a fast dual-CPU PowerPC platform with great graphics. Amiga vibes. But it turns out that humans need software applications more than they need efficient use of the hardware.
bombcar
It was the story with so many things "back then" - even Itanium was a beast on custom-coded perfect applications.
rahimnathwani

  The problem is that even for things like consoles, it's usually more "cost efficient" to write normal fast-to-write code that isn't maximally effective, let the compiler do its magic, and call it good enough.
This wasn't always the case. I have a friend who used to work on games for the original PlayStation. I remember him telling me that part of his job (or maybe his whole job) was optimizing the machine code output by the C compiler.
fdupress
And don't forget that Sony and Microsoft have compiler teams working on specialised GCC and LLVM backends, and sometimes upstreaming general improvements.
cptskippy
> The problem is that even for things like consoles, it's usually more "cost efficient" to write normal fast-to-write code that isn't maximally effective, let the compiler do its magic, and call it good enough.

Even given all the time and money, there's also a skills gap.

You can use money and time to buy skills.
cptskippy
Unlimited time and money will not make someone like me a John Carmack-level programmer. There are a finite number of individuals operating at his level or above, and having them hyper-optimize code is a poor use of their time.
Oh, I meant more like: if you have enough money, you can employ John Carmack (or similar) for a while.
Dylan16807
For hardware that isn't pre-framebuffer, demos seem to be mostly about hyperoptimizing the code in a portable way, much less optimizing to specific hardware timings and quirks.
LarsDu88
Rust jobs would actually touch more hard tech rather than being concentrated in crypto scams.
pjmlp
Unless it happens to be something like a PS3 or Saturn.
cyanf
Despite sentiment around Mojo on HN being negative because the stack isn't OSS, this is the ultimate goal of Modular.

https://signalsandthreads.com/why-ml-needs-a-new-programming...

I listened to that episode, by chance, last week. It was well worth the time to listen.
