I had to return to Windows as a daily work platform after a long time away (on Macs). I already knew that it had devolved into a grotesquely defective, regressive parade of UI blunders and deleted functionality... but its actual performance is TERRIBLE. I find myself waiting for simple operations that I wouldn't have expected to wait on 20 years ago, even on bog-standard office desktop machines.
A 128GB M4 Max SoC can do pretty wild data science with over half a terabyte/second of memory bandwidth, without the latency of the typical offloading back-and-forth to a discrete accelerator.
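To make the "offloading back-and-forth" point concrete, here's a minimal sketch in Python (assuming numpy, and optionally PyTorch with CUDA, are installed): it times a pure in-memory pass over a large array, then times the extra host-to-device copy a discrete-GPU workflow would need before any compute can start. The array size and numbers are purely illustrative, not a benchmark of any particular machine.

```python
import time
import numpy as np

# Illustrative array: ~8 GB of float64 (shrink N if you have less RAM).
N = 1_000_000_000
x = np.random.rand(N)
gb = x.nbytes / 1e9

# In-memory pass: on a unified-memory SoC, CPU and GPU share this buffer,
# so there is no separate "copy to the accelerator" step before compute.
t0 = time.perf_counter()
s = x.sum()
t1 = time.perf_counter()
print(f"in-memory sum: {gb / (t1 - t0):.1f} GB/s effective read bandwidth")

# Discrete-GPU workflow (hypothetical, needs PyTorch + CUDA): the same data
# must first cross the PCIe bus into device memory, which is the extra
# round-trip latency the comment above is referring to.
try:
    import torch
    if torch.cuda.is_available():
        xt = torch.from_numpy(x)
        t0 = time.perf_counter()
        xt_gpu = xt.to("cuda")      # host -> device copy over PCIe
        torch.cuda.synchronize()    # wait for the copy to actually finish
        t1 = time.perf_counter()
        print(f"host->device copy: {gb / (t1 - t0):.1f} GB/s over PCIe")
except ImportError:
    pass
```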
My point being: over time, performance might go up, but instead of that making my device faster and longer-lasting, developers use the extra performance to cram in more stuff, so in the end I come out only slightly better off, if not worse (as in my case).
An eleven-year-old computer is still useful, which is kind of cool, but it also kind of bothers me that apparently we haven't made enough progress in software to justify buying new hardware.
I think the better way is honestly just to make something competitive, preferably FOSS, and I actually do think we're getting there. Blender, for example, is an extremely decent animation tool nowadays, Krita is a very good digital art program, OpenToonz/Tahoma2D are pretty ok 2D animation programs, Godot is a decent-enough game engine, etc.
Yeah, there are still gaps, and I'm not claiming everything has parity with the proprietary tools and their awful pricing models, but I think we're getting there, and I think that's a more sustainable model than piracy.
Both performance and performance-per-watt continue to improve with each new generation of CPUs.