# Work
Currently the FOSS & cloud correspondent for The Register.
https://www.theregister.com/Author/Liam-Proven
Work email: lproven+hn@sitpub.com
# Personal
Comments here (or anywhere else, unless otherwise stated) are personal opinion and not on behalf of any employer.
Based in Douglas, Isle of Man.
Quick info:
https://about.me/liamproven/
Personal blog:
http://lproven.dreamwidth.org/profile
Tech blog:
http://liam-on-linux.dreamwidth.org/profile/
Socials:
https://social.vivaldi.net/@lproven
https://bsky.app/profile/lproven.bsky.social
https://x.com/lproven
https://meet.hn/city/im-Douglas
- > I only use turn signals when there are other cars around that would need the indication.
That is a very bad habit and you should change it.
You are not only signalling to other cars. You are also signalling to other road users: motorbikes, bicycles, pedestrians.
Your signal is more important to the other road users you are less likely to see.
Always ALWAYS indicate. Even if it's 3AM on an empty road 200 miles from the nearest human that you know of. Do it anyway. You are not doing it to other cars. You are doing it to the world in general.
- No, because an LLM cannot summarise. It can only shorten, which is not the same thing.
Citation: https://ea.rna.nl/2024/05/27/when-chatgpt-summarises-it-actu...
- Totally agreed regarding appearance etc.
However, the one thing I'd take issue with:
> As a programmer, I can point out all the many, many flaws with its technical architecture.
I think, since we started out on history here, we must consider the history and its context.
1. Apple does the Lisa: a cheaper Xerox Alto, minus the networking and the programming language. Multitasking, hard disk based, new app paradigm. The Future, but at 1/4 of the price of the original.
It's not cheap enough. It flops, badly.
2. Jobs repurposes the parallel information-appliance project into a cheaper Lisa. Remove the hard disk and the slots and all expansion, seal it up, floppy only, remove the fancy new app format & keep it simple: apps and documents. Smaller screen but square pixels. Keeps most of the Lisa good stuff.
It's still expensive but it's cheap enough. It sells. It gets PageMaker. It changes the course of the industry.
But to get a GUI OS into 128kB of RAM, they had to cut it brutally.
It worked but the result is significantly crippled, and Apple spent the next decade trying to put much of that stuff back in again.
Remarkably enough, they succeeded.
By MacOS 7.6 it had networking, network-transparent symlinks, TCP/IP, a HiColour GUI, usable multitasking, virtual memory, and more. It was actually a bloody good OS.
Yes, it was very unstable, but then, remember so was DOS, so was Windows 3.
The snag is that by then it was 1997, and MS had surpassed both Windows NT 3.x and Windows 95 with NT 4.
NT 4 had no PnP, no power management, and no working 3D except on vastly expensive OpenGL cards, and it lost a lot of NT 3.x's stability because of the frantic, desperate bodge of putting the GDI in the kernel -- but it was good enough, and it made Apple look bad.
Apple was ploughing its own lonely furrow and it made a remarkably good job of it. It was just too slow.
When Jobs came back, he made a lot of good decisions.
Junk most of the models. Junk all the peripherals. Make a few models of computer and nothing else.
Junk Copland, Pink, Taligent, all that.
Meanwhile, like Win9x + NT, 2 parallel streams:
[a] Win9x parallel: salvage anything good that can be stripped out of Copland, bolt it onto MacOS 7.x, call it 8.x and kill off the clones.
[b] NT parallel: for the new project, just FFS get something out the door ASAP: Rhapsody, then Mac OS X Server. All the weird bits of NeXTstep that were to avoid Apple lawsuits (vertical menus, scrollbars on the left, no desktop icons, columnar file browser, etc.): remove them, switch 'em back to the Apple way.
Meantime, work on a snazzy facelift for the end-user version. Make the hardware colourful and see-through, and do that to the OS too.
I think, looking at the timeline and the context, all the moves make sense.
And I used MacOS 6, 7, 8 and 9. All were great. Just such a pleasure to use, and felt great. I didn't care that NT was more solid: that was a boring reliable bit of office equipment and it felt as exciting as a stapler. NT 3.51 was fugly but it worked and that's what mattered.
- OK. A very good response indeed, and I can't really counter any of it.
Well, I mean, I can -- e.g. I loved classic MacOS. But that's a personal judgement call.
I think I've seen Homer's Car in meme format, now you come to mention it.
- Aha! Thank you!
- Exactly. I rather miss the 15.6" Toshiba Satellite Pro A300 I had when I emigrated, a decade back.
It wasn't very portable, no, but around the house, it was great. Good sized full-travel keyboard, numeric keypad, lots of ports, and a nice big clear comfortable eye-friendly screen. Two SATA bays, so I could have the affordable combination (a dozen years ago) of a small fast SSD for the OS and a huge big cheap HDD for the data. Tiny trackpad, but I used a mouse.
There is a 17" classic Thinkpad before they went to nasty thin fashion-follower keyboards, but they only seem to be available in the USA and even given my fondness for old Thinkpads, I am not willing to pay £1000 for a second-hand decade-old one.
- Bluetooth is a PITA.
This is the 21st-century version of an axiom: there's an XKCD about it.
Pairing is a pain, charging is a nuisance, battery life is a constant worry, responsiveness is dodgy... there is nothing good about it. Give me something built-in, cabled, and always-on.
Wireless is for fashion victims.
- > Asimov seems to have been a very modest man...
I never met him -- he hated travel, and I never could afford to go to a US convention -- but from all I've read, no, the absolute opposite was the case.
- Good correction. This is the important point here. And there is a sub-point which is nearly as important:
The 8086 was out there and selling for years. AT&T ported UNIX™ to it, meaning it was the first ever microprocessor to run Unix.
But even so, DR didn't offer an 8086 OS, although it was the dominant OS vendor and people were calling for one. CP/M-86 was horribly, horribly late: it shipped after the IBM PC did, some 3-4 years after the chip it was intended for.
The thing is, that kind of delay is common now, but late-1970s OSes were tiny, simple things.
Basically the story is that there was already an industry-standard OS. Intel shipped a newer, better, more powerful successor chip, which could run the same assembly-language code although it wasn't binary compatible. And the OS vendor sat on its hands, promising the OS was coming.
IBM comes along, wanting to buy it or license it, but DR won't deal with them. It won't agree to IBM's harsh terms. It thinks it can play hardball with Big Blue. It can't.
After waiting for a couple of years, a kid at a small company selling 8086 processor boards just writes a clone of it, the hard way, directly in assembler (where CP/M was written in PL/M), using the existing filesystem of MS Disk BASIC, and puts it out there. MS snaps up a licence and sells it on to IBM. This deal is a success, so MS buys the product.
IBM ships its machine, with the MS OS on it. DR complains, gets added to the deal, and a year or so later it finally ships an 8086 version of its OS, which costs more and flops.
The deal was very hard on Gary Kildall, who was a brilliant man, but while MS exhibited shark-like behaviour, it was a cut-throat market, and DR needed to respond faster.
- This seems strangely parochial to me. It reads a little like an American who knows San Francisco and so knows about trams has tried to imagine what a European city and country is like, and hasn't quite made the pieces fit together.
It has what I guess are American references that are meaningless to me. What is or was The Homer? In what universe are mopeds some sort of unsuccessful trial? Much of Asia has travelled by mopeds for ~75 years now; the Honda C90 is the best-selling motor vehicle of all time, and it's not even close.
As a super-extended metaphor for computing, I don't think the timeline fits together: it seems to have Xerox, Apple, and IBM in the wrong order, though I'd find that hard to nail down. There was overlap, obviously.
It feels to me like the big influences are squeezed in, but not the smaller ones -- possibly because they mostly aren't American and don't show up on American radar. Wirth and Pascal/Modula-2/Oberon, the Lilith and Ceres; Psion; Acorn; other Apple efforts notably the Newton and things it inspired like Palm; Symbolics and InterLisp.
Nice effort. I respect the work that went into it, but it doesn't fix Stephenson's effort -- it over-extends it until it snaps, then tapes the bits together and tries again.
- The firmware in question being Microsoft's ThreadX. This was made FOSS a few years back but that doesn't help with the Pi.
https://www.theregister.com/2023/11/28/microsoft_opens_sourc...
- I played with a friend's ReMarkable 2 tablet a few years ago.
It almost made me weep, it was so primitive and so basic compared to a Newton.
A modern e-ink tablet like that with NewtonOS 2 on it would be a thing of great beauty and elegance.
- I endorse this. Please do take whatever measures are possible to discourage it, even if it won't stop people. It at least sends a message: this is not wanted, this is not helpful, this is not constructive.
- Excellent point.
When I lived in London I helped clients donate a lot of kit to ComputerAid International, and to what's now Computers4Charity.
- It is still slow.
Try Alpine. It's amazing.
Xubuntu 22.04 took nearly 10 GB of disk and half a gig of RAM. I measured it:
https://www.theregister.com/2022/08/18/ubuntu_remixes/
Alpine takes 1.1 GB of disk and under 200 MB of RAM.
https://www.theregister.com/2025/12/05/new_lts_kernel_and_al...
Both running a full, current Xfce desktop, in a VirtualBox VM.
- Yeah it is.
It fits into under 700 MB and runs in well under 100 MB of RAM. The default Ubuntu image is about 6 GB now and takes a gig of RAM.
Have you not tried it? I have:
https://www.theregister.com/2024/02/14/damn_small_linux_retu...
- > There will never be a scenario where you need all this lightweight stuff
I think there are many.
Some examples:
* The fastest code is the code you don't run.
Smaller = faster, and we all want faster. Moore's Law is over, Dennard scaling broke down years ago, and smaller feature sizes are getting absurdly difficult, and therefore expensive, to fab. So if we want our computers to keep getting faster, as we've got used to over the last 40-50 years, then the only way to keep delivering that will be to start ruthlessly optimising, shrinking, and finding more efficient ways to implement what we've got used to.
Smaller systems are better for performance.
* The smaller the code, the less there is to go wrong.
Smaller doesn't just mean faster; it should mean simpler and cleaner too. Less to go wrong. Easier to debug. Wrappers and VMs and bytecodes and runtimes are bad: they make life easier, but they are less efficient and make issues harder to troubleshoot. The KISS principle is part of the Unix philosophy.
So that's performance and troubleshooting. We aren't done.
* The less you run, the smaller the attack surface.
Smaller code and less code means fewer APIs, fewer interfaces, fewer points of failure. Look at djb's decades-long policy of offering rewards to people who find holes in qmail or djbdns. Look at OpenBSD. We all need better, more secure code. Smaller, simpler systems built from fewer layers mean more security, a smaller attack surface, and less to audit.
Higher performance, easier troubleshooting, and better security. That's three reasons.
Practical examples...
The Atom editor spawned an entire class of app: Electron apps, Javascript on Node, bundled with Chromium. Slack, Discord, VSCode: multiple apps used by tens to hundreds of millions of people now. Look at how vast they are. Balena Etcher is a, what, nearly 100 MB download to write an image to USB? Native apps like Rufus do it in a few megabytes. Smaller ones like USBimager do it in hundreds of kilobytes. A dd command does it in under 100 bytes.
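To make that comparison concrete, here's roughly what that dd invocation looks like; the image name and the /dev/sdX target are placeholders, not anything from the articles above:

```
# copy the image straight onto the target device, 4 MB at a time
dd if=image.iso of=/dev/sdX bs=4M status=progress
```

That's under 50 characters to do the same job as a ~100 MB Electron app.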
Now some of the people behind Atom wrote Zed.
It's 10% of the size and 10x the speed, in part because it's a native Rust app.
The COSMIC desktop looks like GNOME, works like GNOME Shell, but it's smaller and faster and more customisable because it's native Rust code.
GNOME Shell is Javascript running on an embedded copy of Mozilla's Javascript runtime.
Just as the dotcoms wanted to disintermediate business, removing middlemen and distributors for faster sales, we could use some disintermediation in our software. Fewer runtimes, and better, smarter compiled languages, so we can trap more errors and have faster and safer compiled native code.
Smaller, simpler, cleaner, fewer layers, fewer abstractions: these are all good things, and they are all desirable.
Dennis Ritchie and Ken Thompson knew this. That's why Research Unix evolved into Plan 9, which puts far more through the filesystem in order to remove whole types of API. Everything's in a container all the time, and the filesystem abstracts the network, the GUI, and more. It has under 10% of the syscalls of Linux, the kernel is about 5 MB of source, and yet much of what Kubernetes does is already in there.
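As a rough illustration of what "the filesystem abstracts the network" means in practice, here is a sketch of the Plan 9 way of doing it from the shell. The paths follow the /net convention from the Plan 9 docs, and "gateway" is a hypothetical machine name, so treat the details as approximate rather than exact:

```
# a TCP connection is just a directory of files under /net
cat /net/tcp/0/status    # state of connection number 0
cat /net/tcp/0/remote    # the address it is talking to

# and because it is all files, you can mount another machine's
# network stack over your own and use *its* connectivity
import gateway /net
```

No sockets API, no ioctls: open, read and write on ordinary-looking files do the whole job.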
Then they went further: they replaced C too, made a simpler, safer language, embedded its runtime right into the kernel, and made binaries CPU-independent, turning the entire network-aware OS into a runtime to compete with the JVM, so it could run as a browser plugin as well as a bare-metal OS.
Now we have ubiquitous virtualisation, so lean into it: separate the domains. If your user-facing OS only runs in a VM, then it doesn't need a filesystem or hardware drivers, because it will never see hardware, only virtualised facilities, so rip all that stuff out. And your container host doesn't need a console, or to manage disks.
This is what we should be doing. This is what we need to do. Hack away at the code complexity. Don't add functionality, remove it. Simplify it. Enforce standards by putting them in the kernel and removing dozens of overlapping implementations. Make codebases that are smaller and readable by humans.
Leave the vast bloated stuff to commercial companies and proprietary software where nobody gets to read it except LLM bots anyway.
- The point of indicating is that it's even more important to the people you didn't notice.