It's called externalized cost, and it's as real in software as it is IRL.
So the cost is there, it's just not paid (directly) by the developer. But we all end up paying someone else's externalized cost, including said developer, who is paying some other developers' externalized costs.
Yeah. I've been thinking of writing a blog post doing the math on that. If I spend $2000 on a computer, and that gets me a certain amount of RAM and CPU and so on, we can put a dollar figure on that bloat.
Then multiply by the number of people who use the software (e.g. Slack) and we'd get a figure for its externalised cost.
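Rough sketch of that math, with every number below being a placeholder assumption rather than a measurement:

    # Back-of-envelope: externalized cost of one app's bloat.
    # All inputs are illustrative assumptions, not measured values.
    hardware_cost_usd = 2000                   # price of the machine
    ram_gb = 32                                # RAM that money buys
    cost_per_gb = hardware_cost_usd / ram_gb   # crude: attribute the whole price to RAM

    app_bloat_gb = 1.0        # extra RAM the app uses vs. a lean equivalent (assumed)
    users = 20_000_000        # number of people running the app (assumed)

    cost_per_user = app_bloat_gb * cost_per_gb
    total = cost_per_user * users
    print(f"~${cost_per_user:.2f} per user, ~${total / 1e6:.0f}M across all users")

Plug in real numbers for your chat app of choice and the total gets eye-watering fast.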
Consider also the missed market opportunity: my personal devices are a 13-year-old laptop and a 9-year-old phone. If an app isn't compatible or makes them lag, I delete it and download a competing one. I'm not alone, and yes: I have money to spend on your app. I just don't want/need to upgrade my hardware that often.
Kudos for keeping your devices for so long. I also try to make mine last as long as is practical, but so far I haven't managed to keep them going that long. Unfortunately, you're in a minority: most people replace their phone once the apps they're using are no longer compatible with it, so devs don't give this aspect much thought.
The worst offender being Google, who toggled on VP8/VP9 decoding on YouTube despite the vast majority of devices only having H.264 hardware decode.
The aggregate waste in battery wear and watts spent is pretty staggering when you think about it, all so Google could spend a few cents less per 100 streams.
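Even with made-up numbers, the scale is easy to see; a sketch assuming software decode burns a couple of extra watts:

    # Illustrative only: extra energy from software VP9 decode vs. hardware H.264 decode.
    # Every figure below is an assumption picked purely for the estimate.
    extra_watts = 2.0          # additional draw of software decode (assumed)
    hours_per_day = 1.0        # average daily watch time per device (assumed)
    devices = 1_000_000_000    # devices lacking VP9 hardware decode (assumed)

    extra_kwh_per_day = extra_watts * hours_per_day * devices / 1000
    print(f"~{extra_kwh_per_day / 1e6:.1f} GWh of extra energy per day")

And that's before counting the battery cycles it chews through.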
Or they could just send the video uncompressed, and then it would take even fewer hardware resources to decode on the client side. Why, in a sense it would be a lot more like decoding analog television signals at that point. (Not least because few clients would have the network bandwidth to handle more than 360-480p of that ;)
It wouldn't have taken fewer hardware resources, because, would you look at that... we find H.264 hardware decode even in bottom-of-the-barrel mobile CPUs. Pure CPU and even GPU decoding of video codecs is enormously expensive in power-budget terms.
Not to mention the fact that a mobile radio would have to be kept on high power constantly to pull in that 1Gbit/s stream.
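For scale, raw bitrates are easy to estimate; a quick sketch assuming 8-bit 4:2:0 (12 bits per pixel) at 30 fps, no audio:

    # Uncompressed video bitrate: width * height * bits_per_pixel * fps.
    def raw_mbps(width, height, fps=30, bits_per_pixel=12):
        return width * height * bits_per_pixel * fps / 1e6

    for name, (w, h) in {"480p": (854, 480), "1080p": (1920, 1080)}.items():
        print(f"{name}: ~{raw_mbps(w, h):.0f} Mbit/s uncompressed")

That's roughly 150 Mbit/s for 480p and 750 Mbit/s for 1080p30, before you even get to 60 fps or HDR.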
You can be as snarky as you want, but it was a terrible move by Google.
;)
I don't think that's the case. E.g., we replace our computers every few years, not because the new ones can do things our current ones can't, but because the software we use to do the same things keeps getting more resource-hungry.