These restrictions extend beyond the particular device. It should also be illegal for a commercial entity to enforce security schemes that involve remote attestation of the software stack on the client device, such that service providers can refuse to serve clients who fail attestation. Service providers have other means of protecting themselves; taking away users' control of their own devices is a heavy-handed and unnecessarily draconian approach that ultimately benefits only the ad company that happens to make the software stack, since it also benefits from restricting what software users can run. Hypothetically, it might be interested in making it impossible to modify video players to skip ads.
1. Devices should be allowed to display a different logo at boot time depending on whether the software is manufacturer-approved or not. That way, if somebody sells you a used device with flashed firmware that steals all your financial data, you have a way to know.
2. Going from approved to unapproved firmware should result in a full device wipe, Chromebook style. Possibly with a three-day cooldown. Those aren't too much of an obstacle for a true tinkerer who knows what they're doing, but they make it harder to social engineer people into installing a firmware of the attackers' choosing.
3. Users should have the ability to opt themselves into cryptographic protection, either on the original or modified firmware, for anti-theft reasons. Otherwise, devices become extremely attractive to steal.
Not sure how to phrase this legally, but please also add a provision against manufacturers making the "custom firmware" logo hideously ugly on purpose to discourage rooting - as Microsoft did for Surface tablets, for example.
> 3. Users should have the ability to opt themselves into cryptographic protection, either on the original or modified firmware, for anti-theft reasons.
Full agreement here. I very much would like to keep the bootloader locked - just to my own keys, not the OEM's.
I think it's a difference in mindset whether you view custom firmware as a grudging exception for techies (with the understanding that "normal" people should have a device under full control of their respective vendor), or whether you want an open OS ecosystem for everyone.
Another thought on that point: Why, of all things, is manufacturer approval so important? We know manufacturers often don't work for - or even work against - the interests of their end users. Manufacturer approval is not an indicator of security - as evidenced by the OP article.
If anything, we need independent third parties that can vet manufacturer and third party software and can attach their own cryptographic signatures as approval.
I should note Google has such an attestation scheme, and there are reliable defeats for it in most situations given root access. Apps have been able to insist on hardware-backed attestation, which has not been defeated for some time, but that isn't available on older devices. Almost none do so.
If this had a meaningful impact on fraud, more apps would insist on the hardware-backed option, but that's quite rare. Even Google doesn't; I used Google Pay contactless with LineageOS and root this week. I'm currently convinced it's primarily a corporate power grab; non-Google-approved Android won't be a consumer success if it doesn't run your banking app, and the copyright lobby loves anything that helps DRM.
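For what it's worth, the gap between the two tiers shows up in the verdict an app's backend receives from Google's Play Integrity API: a software-level device check that root-cloaking tools routinely pass, and a hardware-backed "strong" check that they don't. Here is a minimal server-side sketch in Python, assuming the verdict token has already been decrypted to JSON; the field and label names follow my reading of the documented verdict format and should be treated as approximate:

    # Sketch: deciding what to accept from an (already decrypted) integrity verdict.
    # Field/label names approximate the documented Play Integrity verdict JSON.

    def device_passes(verdict: dict, require_hardware_backed: bool = False) -> bool:
        labels = (verdict.get("deviceIntegrity", {})
                         .get("deviceRecognitionVerdict", []))
        if require_hardware_backed:
            # The check almost no apps actually insist on.
            return "MEETS_STRONG_INTEGRITY" in labels
        # Software-level check: commonly defeated on rooted devices with cloaking.
        return "MEETS_DEVICE_INTEGRITY" in labels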
Web apps have been running with this security model on PCs for decades, and it has been fine. The whole narrative about remote attestation being necessary to protect users is an evil lie in my opinion, but it is an effective lie which has convinced even knowledgeable IT professionals that taking away device ownership from users is somehow justified.
The bank’s bad processes are not an end device fault.
I'm alright with limiting liability for an unlocked/customized phone (for things that happen from that phone) - but that's a legal/contractual thing. For that to work, it's enough for a judge to understand that the phone was customized at that time - it doesn't require the app to know.
Won't this also forbid virus scanners that quarantine files?
> This pertains to all programmable components on the device, including low-level hardware controllers.
I don't think it's reasonable to expect any manufacturer to uphold a warranty if making unlimited changes to the system is permitted.
There might be a couple messy edge cases if applied at the software level but I think it would work well.
Applied at the hardware level it would be very clear cut. It would simply outlaw technical measures taken to prevent the user from installing an arbitrary OS on the device.
Regarding warranties, what's so difficult about flashing a stock image to a device being serviced? At least in the US wasn't this already settled long ago by Magnuson-Moss? https://en.wikipedia.org/wiki/Magnuson%E2%80%93Moss_Warranty...
Yes, I think that would cover most cases if we take it to its logical conclusion of wiping all device state (hard disk). OTOH, a few points:
1. I would accept the need to wipe the hard disk if I had messed with firmware or the OS, but not if a couple of keys on the keyboard had stopped working. This implies that (for me at least) a meaningful distinction remains between these two "levels" of warranty service. Do you agree?
2. Activities like overclocking or overvolting a CPU have the potential to cause lasting damage that can't be reversed by re-flashing. Under the policy you're suggesting, it would be illegal for manufacturers to offer users the option "You can pull this pin low to overclock outside the supported range, but you will void the warranty by doing so", and too expensive for them to endlessly replace parts damaged by these activities for free under warranty, so that consumer option, rare as it already is, would go away completely.
3. I still think there may be some devices that are impractical to completely re-flash. According to this 2021 Porsche article [0], modern cars contain 70-100 ECUs (microcontrollers), each of which will have its own flash/EEPROM.
[0]: https://medium.com/next-level-german-engineering/porsche-fut...
1. I expect wiping any given component to be entirely up to the manufacturer's discretion. If doing so is not trivial and is legitimately required for the repair to proceed then I'd expect the user to be charged for the additional service.
2. Violating manufacturer specifications and being at fault for damages are sometimes distinct. A manufacturer arbitrarily saying "you must not do X" should not necessarily mean that doing X will void the warranty. It might though. Discretion is obviously required.
3. If your car stops working after you mess with the firmware and you take it in to the dealer I imagine they'd charge you to reflash things since the issue was caused by your own actions. That doesn't mean they should be able to decline to cover entirely unrelated defects.
Also I don't think vehicle firmware would be caught up by the original proposal in the first place, since cars aren't generally intended to run third party software. There's a grey area with infotainment systems that have an app store, depending on whether those are viewed as standalone or as part of the larger vehicle. However, reframing the proposal to revolve around intent would likely leave the firmware on unrelated embedded components in the clear to be locked down, so long as those components don't interfere with the ability to freely use the general purpose computing element.
Personally I'd like vehicle firmware to be covered by similar protections but I recognize that falls outside the scope of a proposal about products intended for use as general purpose computing devices.
I don't like the "intended for general purpose computing" concept so much. For one, it seems to offer lots of easy wiggle room to manufacturers: Just say that your product is not intended for that, but for something marginally more specific. For another, it's not clear to me why general purpose computing ought to enjoy consumer protections that other manufactured devices do not. (One exception I'd grant is for safety reasons: If tinkering with a device could make it cause injury, fine, that device can be in a different class.)
Yes. If I really _want_ to execute malware on my device, I should be allowed to do so by disabling the antivirus or disregarding a warning.
> I don't think it's reasonable to expect any manufacturer to uphold a warranty if making unlimited changes to the system is permitted
It is very reasonable, and already the rule of law in "sane" jurisdictions, that manufacturer and mandated warranties are not touched by unrelated, reversible modifications to both hard- and software.
I agree.
> already the rule of law in "sane" jurisdictions, that manufacturer and mandated warranties are not touched by unrelated, reversible modifications to both hard- and software.
Do you have any examples of such jurisdictions? I think whether this is reasonable turns on how "reversible" is interpreted. If it means "reversible to factory settings", including wiping all built-in storage media, then it seems reasonable to me that manufacturers should support this (possibly modulo some extreme cases like cars that have dozens of CPUs). But I would not be happy with having my hard disk wiped if I sent in my laptop for repairs because a couple of keys stopped working, which tells me that (to me) there remain at least two classes of "problem that should be fixed for free under warranty by the manufacturer".
Words written on toilet paper. The only thing that exists today is "billionaire rights".
But even the DRM that is already there often only uses copyright laws as suggestions. E.g. YouTube's takedown guidelines are defined through their TOS, not through the DMCA.
Watching copyrighted stuff on general purpose computers is a very new phenomenon, and it's still quite atypical IMO.
The crazy thing is that on all the devices I've had AVB is implemented on top of secureboot. Being able to set your own secureboot keys is bog standard on corporate laptops. The entire situation makes absolutely no sense.
Also for the record I think it's a silly attack vector for the average person to worry about. A normal person does not have secret agents attempting to flash malicious images to his phone while he's in the shower.
No, but millions of women have controlling partners or friends who betray their trust, and many people going through U.S. Customs, for example, are being asked to surrender control of their devices so they can be used without their knowledge. There's a well-funded malware industry with a lot of customers now.
Oh that's pretty cool, wasn't aware.
> The crazy thing is that on all the devices I've had AVB is implemented on top of secureboot. Being able to set your own secureboot keys is bog standard on corporate laptops. The entire situation makes absolutely no sense.
Hold on, could you elaborate a bit on this? I thought it was an either/or type deal because they do the same thing.
It's possible this has changed or was never widespread in the first place. I have a very limited (and historic) sample size.
In other words, DRM.
https://en.wikipedia.org/wiki/Trusted_Computing#Criticism
(I knew from the beginning that this was known as the Palladium project, and until recently, a search for "Palladium TCG" would find plenty of information about that history, yet now references to that group and its origins in DRM have seemingly disappeared from Google. Make of that what you will...)
https://www.tcgplayer.com/product/593140/yugioh-quarter-cent...
Bizarre, I did find it on Bing though.
If I want my device to be secure, I want this trust. If I want to sell a copy of my virtual asset to only be used in ways I approve of, I want this trust. You can't have only one of these at a time: either your device can provide this trust or it cannot. That's not the battle in my view. The battle is to implement this appropriately, such that, e.g., if we're representing access control, identity, and ownership, then that representation matches reality. So if I'm said to own a device, the device can and will attest to that and behave accordingly. Instead, I'm always somehow just being loaned these things, only have some specified amount of control over them, and am just a temporary user somehow. That's the issue. And these systems are not reimplementable, so those entitlements don't carry over.
Device security and mediated trust between mutually distrustful entities are separate things.
> If I want to sell a copy of my virtual asset to only be used in ways I approve of, I want this trust.
I don't want you to be able to do that. At least not with general purpose computing devices (ie my phone). Maybe for something like a game console or set top box but that doesn't seem to be what's being discussed here.
> either your device can provide this trust or it cannot
It is entirely possible for device firmware to do nothing more than verify that the bootloader was signed with a particular user configurable key.
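A minimal sketch of that idea in Python, using Ed25519 from the cryptography package as a stand-in for whatever primitive the boot ROM actually uses; the file names and the notion of an "enrolled" owner key are assumptions for illustration, not any vendor's real boot flow:

    # Sketch: verified boot, but against a key the owner enrolled themselves.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey
    from cryptography.exceptions import InvalidSignature

    def bootloader_ok(enrolled_pubkey: bytes, image: bytes, signature: bytes) -> bool:
        key = Ed25519PublicKey.from_public_bytes(enrolled_pubkey)
        try:
            key.verify(signature, image)  # raises if the image wasn't signed with the enrolled key
            return True
        except InvalidSignature:
            return False  # refuse to boot, or fall back to a "custom firmware" warning

Nothing in that check cares whether the key belongs to the manufacturer or to the person who bought the device; that choice is policy, not cryptography.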
Especially in Africa, where privacy and consumer rights are probably less relevant than in the US/EU.
Well, then it's high time the laws of ownership in just about every country in the world were updated.
As it stands, if I buy something then I own it.
That's the point: you can't buy it, only license.
The minute Apple sees a clear path to get away with it, iPhones will essentially become licensed devices.
Then other phone makers will jump through the opening, at some point it becomes the standard, and we'll laugh at the "voting with your wallet" joke again.
> software
We're already fully into licensing for books, as truly the most pragmatic choice. Amazon opened the door, and many other ebook stores have jumped on the bandwagon.
To say it's unlawful is moot. Apple may have jurisdiction in the US but not across the globe, there are plenty of places I can think of to send an iPhone to have it fixed the way I want (and I'd do so the moment that market is established). There's no way Apple can police what people do with their hardware once it's in their hands, it's fanciful to think otherwise.
Open hardware is on the move, eventually considerably cheaper open products will become popular just on price alone. Competition will then be fierce, Apple will have to change its policies if changes to laws don't beat them to it. Remember also the US isn't the whole world, so those changes are likely to be enacted first outside the US. If Apple wants to sell there then it'll have to comply with those laws just as it did with USB-C in Europe.
Also keep in mind Apple, Google, Microsoft etc. have become the richest and fastest growing corporations in human history—they even beat out the previous contenders the Dutch and British East India Companies of the 17th and 18th Centuries.
These corporations became so rich so quickly because of a confluence of circumstances—the new tech paradigm of the personal computer, the wow factor that took the world by storm and a complete lack of regulations worldwide. Without regulations to keep these corporations in check they simply ran amok.
That's now over. Yes, it will be some while before they're brought to heel but they'll never get such a straight run again.
Apple is on top now but let's see where it'll be in 20 years.
Similarly it is pretty messed up when people say stuff like “fire can burn you if you aren’t careful” because so many people rely on fire for food and warmth.
Cooking animal products at home poses a health risk. You should be sure to only ever consume animal products prepared by a duly licensed establishment.
The chauffeur's union would like to take this opportunity to remind you that amateurs operating their own motor vehicles risk serious injury and even death.
The FSD alliance would like to point out that hiring a licensed chauffeur also poses a non-negligible risk. Should you choose to make use of a personal vehicle it is strongly recommended that you select one certified by the FSD alliance. Failure to do so could potentially impact your health insurance premium.
Good tongue in cheek post, but in the US Magnuson-Moss prohibits warranty claim denials merely on the basis of non-OEM parts and service. It also puts the burden on the manufacturer to demonstrate the defect or failure was the direct result of the non-OEM part. Other jurisdictions have similar laws on the books.
Right to repair already exists in certain respects and needs to be expanded (and enforced: tons of those 'will void warranty' stickers are lies, and you have legal rights to poke around).
The problem is getting the companies to change their act, and they probably won't without a class action lawsuit, and I have no idea if there's enough financial incentive there for a law firm to tackle it.
We can get so bogged down with “things that are real” and “exist in this universe” that we completely fail to focus on the vital stuff like “Bigfoot is circumcised” and “Who did it?” and “Why?”
Or do you dispute that you could be hospitalized for salmonella if you botch cooking poultry at home? Or perhaps you feel that there is no straightforward way to inadvertently endanger your life by servicing your vehicle incorrectly?
I genuinely do not understand the last two sentences. Are you pro- or anti- “telling people that salmonella exists”? Is saying “salmonella exists and can be a problem” FUD or what? Do you think salmonella isn’t real?
For starters, in most places, warranty is a legal requirement and the manufacturer isn't allowed to void it for whatever reason they want. If my phone's battery starts getting really hot in normal use, or I start getting dead pixels on my screen or whatever else, the fact I have a custom OS on my phone isn't relevant to the warranty claim any more than having it in a case or putting some stickers on it. Yes, it'll make claiming it more difficult, but that doesn't mean it's void, just that you'll have to fight through a few more tiers of support agents to get it fixed.
More importantly, rooting is only a security risk in the sense that it increases the attack surface for exploits. The same can be said for any other system-level software. Like if you put an Nvidia graphics card in your computer and it loads its kernel driver, malware now has one more place to exploit. Are Nvidia graphics cards a security risk?
We've come an incredibly long way from just dropping /xbin/su and calling it a day. Modern (as in the last 10 years) root solutions have caller checks based on a user-defined whitelist and really modern implementations use kernel-level checks to make sure the app wanting root access is allowed to get it. The only way this can be dangerous is if one of those apps or the root solution itself has a code execution exploit. But again, the same can be said for the plethora of system-level bloatware vendors install these days.
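To make the contrast with the old drop-a-su-binary approach concrete, here is a toy sketch of the allowlist idea in Python. It is not Magisk's or KernelSU's actual code, just the general shape: root is denied by default and only granted to callers the owner has explicitly approved, with the approval prompt and UID bookkeeping invented for illustration:

    # Toy sketch of an allowlist-gated "su" manager (no real root manager's code).
    approved_uids = {10123, 10145}  # app UIDs the owner has already granted root

    def handle_su_request(caller_uid: int, ask_owner) -> bool:
        if caller_uid in approved_uids:
            return True                # previously approved by the owner
        if ask_owner(caller_uid):      # interactive prompt to the owner
            approved_uids.add(caller_uid)
            return True
        return False                   # default deny: unknown callers never get root

The kernel-level checks mentioned above exist to stop a malicious app from impersonating an approved caller, which is roughly the failure mode the old world-accessible su binary had.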
This only makes the statement untrue if you use “can” and “will” interchangeably.
>More importantly, rooting is only a security risk in the sense that it increases the attack surface for exploits.
This is a good point. What even is “attack surface” anyway? Does anybody actually consider it when “evaluating security posture”? If I simply choose not to care about attack surface because I don’t want to, then doesn’t it simply become a factual nonissue? There are no answers to these questions
But if you really want a thorough reset, simply re-lock the bootloader and flash stock firmware from there. Nothing can persist through that without an exploit in the verification chain and if you have that kind of exploit, you don't need the bootloader to be unlocked in the first place.
Also, there are devices out there that let you enroll your own keys, like the Google Pixel series.
Some can, some can't. Even when it can persist, escalating to root after every reboot may be unreliable or noisy (e.g. 70% chance of success, 30% crash) compared to straight persistence as root without verified boot.
> Also, there are devices out there that let you enroll your own keys, like the Google Pixel series.
This still applies to those devices. It's the main reason GrapheneOS (which exclusively runs on Pixels, with the bootloader relocked to a GrapheneOS key) is opposed to building in root access: Verified boot would be "enabled", but effectively bypassed. https://xcancel.com/GrapheneOS/status/1730435135714050560
Literally 0 here, have you really?
Like I literally do not know anyone who is even using Linux to begin with, but people do have "root" on their Windows and macOS systems. I do not see anyone destroying their computers at random.
Also to steal someone’s information you don’t need root access or any administrative access - if you already tricked the user into running your code then you can steal their passwords or whatever, all of that is user-level data.
* Pedantically speaking, you can not even log in as root, any root level access would have to go through sudo (which is indeed enabled for most users).
* But additionally, even as root, Macs by default have System Integrity Protection enabled, which makes most system files non-modifiable. Users still have full control in that they CAN disable System Integrity Protection, but that involves a reboot and some (documented) command line commands, so most users don't bother doing that.
I accept this metric. It means non-rooted devices are unsafe.
I'm career IT support. In the entire age of smartphones, 100% of the malware/crapware I've seen was on non-rooted devices - most of it pushed on users by manufacturers, carriers and OS devs.
To add on, almost all the money that people I know have lost to scams has been lost through non-rooted devices. Sending an OTP or making a bank transfer because "you're under police investigation" is cheerfully easy even without the user knowing what "root" is.
Also see the recent phish covered on Krebs on Security. A malicious email and entering a password on a webpage does not need root access, for better or worse. In fact, a rooted device might block your bank app, actually making money transfer scams tougher, ironically.
Same here. It's manufacturers and software vendors such as Google and Microsoft that we need to most guard against.
Fully agree with your second paragraph. I've only seen viruses on non-rooted devices, and I've never had a virus on any of the many rooted phones I've owned over the years.
Sure there are viruses and they can be troublesome but when you look below the surface much of the hype about locking down one's devices comes from manufacturers and software vendors, Google, MS et al, who benefit financially from not allowing users to control what runs on their phones.
It's not only phones; what Microsoft has done with TPM and Windows 11, and the deliberate obsoleting of millions of perfectly good PCs, forcing users to buy new hardware when it's unwarranted, is simply outrageous.
Microsoft ought to be sued for committing environmental vandalism. …And that's just for starters.
It’s also important to learn how the modern abuse industry works. Since the 2000s, malware has grown into a multi-billion dollar highly professional industry used by governments around the world and the scammers have professionalized as well. You should look at some of the YouTube videos of scammers social engineering people into giving them remote access, approving bank MFA challenges, or talking them into making cryptocurrency purchases - and while we might sneer and say they’re uneducated or careless, most of them are distracted or old, just like most of us will be some day. If there’s a prompt, millions of people will approve it and if it means their device can no longer be trusted that’s a lot of money and e-waste.
I don’t like any of this. I want to have root on every device because I grew up with unfettered PCs (first installed Linux .9 using a disk editor, etc. etc.) but the landscape has changed since then. We can’t pretend otherwise, but we could call for regulation to balance the interests of owners and device manufacturers just as we allow people to customize their cars without giving up the concept of safety or emissions testing.
Computers were utopia 20 years ago as compared to today - especially when it comes to privacy, security and user-control.
Oh, the Matrix is also parasitic, certainly; before it was smoothed over for mass appeal it was I think a story much more obviously inspired by They Live, the central conceit being that the system both runs on and exploits human neural cognitive capacity, ie the brains are the thing being farmed as components of the Machines' own computers, with the rest of the human (including consciousness and experience!) basically tolerated as the best available life support system for the 500 grams or so of brain tissue that's actually worth having. But a cow can live a long and happy life on a farm, be genuinely loved, and still end up as cutlets. Looking at it even from Daisy's end, how unjust can we honestly call that deal?
For you and me, the gunslinger's life has a decided appeal, sure. If that and Buy-n-Large World are the only two options on the table - which so far they have been, though I agree the real answer is to add a better third - can we really say that, for everyone, the Matrix isn't the less bad of the two?
However, all this comes with the caveat that SafetyNet will flay you alive. The cat and mouse game with Magisk and other methods to maintain root undetected feels moot when I've used apps these days that make a fuss merely because developer settings are enabled. To be honest, that seems acceptable to me: I can do what I want with my device, and software vendors such as banks have a say in how I choose to access their more convenient services. I can play nice with them if I want, even using a second phone perhaps, but I have a choice.
I disagree. I don't understand how it's fine that I can access my banking services with my Gentoo machine, with everything compiled from source by myself, but it's somehow a problem when I'm not using either Apple or Google certified OS on my phone.
I'm sure they want to prevent the first scenario, like various streaming cartels already do, but I hope something like EU throws a fit if they do.
Because it's a bank, there's going to be insurance behind the scenes to cover them if something goes wrong, and I assume part of that is ticking off enough points to be confident a transaction is secure, or setting different payment limits based on confidence levels.
Isn’t this just a second device? How can you hold a manufacturer liable if the user was given unsupervised time as root?
PCs had root access by default, so why wasn't it a significant problem for them? Banking is possible on a PC without a banking app.
As Noam Chomsky has said of politics, manufacturers and OS vendors such as Google and Microsoft have been deliberately "manufacturing consent" — a widespread belief in the population of users that benefits the vendors to the disadvantage of many of said users.
These additional restrictions are not there for security despite what we are told.
I've had to cloak the rooted state from an app or two or they'd choose to withhold functionality. That was a couple of phones ago. I've not had trouble with banking, payments, etc since.
I think they're supposed to prevent people from reverse-engineering banking app APIs and writing bots that perform millions of requests per second, trying to brute force their way into people's accounts.
As an extra protection, SafetyNet also makes it harder to distribute apps that repackage your genuine banking app, but with an extra trojan added.
Making it easy to root a phone makes it easy for scammers to ask people to unlock it.
It should not void the warranty if you unlock the phone. But the security concerns are real. Mobile banking apps refuse to run on rooted phones.
I would agree.
> Making it easy to root a phone makes it easy for scammers to ask people to unlock it.
I would also agree, so then: don't make it easy.
> Mobile banking apps refuse to run on rooted phones.
... but they do run on my web browser. On a computer using open-source software without even secure boot enabled. So, it seems to me this is a cop-out by said banks. They shouldn't require client-side absolute trust to run, and evidently they actually, practically, today, do not require that. It's simply a choice they made, presumably out of laziness or greed.
"Can be given control" [by handset manufacturers] is an unfulfilled potential. And it will always be unfulfilled - because otherwise, users could protect themselves from manufacturers' and providers' foistware.
Given their reality, users root.
That doesn't give me any less power than root, but does give those apps less power and limits the potential impact if one gets compromised. I think when most people say the device owner should be able to get root, they mean that the owner, rather than the manufacturer or OS vendor should have the final say in all cases, not that it has to literally work just like root on Unix.
Yes, this kind of approach, coming up with a security design instead of going the easy route where everything is allowed, is harder to do and takes more time, but it leads to better security.
I mean, we all agree that such permissions are not required during everyday operations, but there should be a way for the consumer to have control over the software being used. And I mean all aspects of the software: firmware should be updatable, the OS should be replaceable, and the security concepts within the OS should be customizable by the user as well. I have no problem with hiding such functionality and requiring users to read the documentation to find out how it can be done, but it should still be possible.
Historically, computers have not granted you access to everything. Most home computers used to have ROM cartridges, which could not be modified, at least not by an average user. Also, when using unrestricted operating systems such as MS-DOS, a simple virus could wipe all your hard work.
In our current time, devices are connected to other machines, and the problem of security and privacy has increased dramatically. Unfortunately, we still don't have operating systems that are secure enough to be used by untrained persons. It makes perfect sense to lock down these devices.
I basically see only two ways out:
1. Allow developers exclusive access to development systems, similar to how console development works.
2. Implement a secure operating system.
It will take an extreme amount of effort to do the latter, and it might even be impossible to gradually absorb the mess of interfaces that people and companies expect to work.
So that probably leaves us with the first option. Personally, I would love devices to be locked down more, so that the crazy threats from hackers will be less severe. But I would also love to keep developing software. Having to jump through some hoops is probably unavoidable. The situation could be compared to requiring a driver's license in order to safely drive on the shared infrastructure.
As much as I agree with your sentiment to have freedom, it still seems somewhat overly optimistic to expect this to work in our complex society.
Anything else and you lose freedom, and the whole ethos that enabled the advanced IT landscape of today.
Of course you lose freedom, but that is exactly what is needed, because some people just cannot help themselves from exploiting that freedom.
Unless someone figures out a way where we can safely share computing power and connections to real-life services (e.g. banking, having an identity, communication in general), I think there is no real alternative.
Perhaps having separate internets for various purposes would be an option. One where we can socialize anonymously but not trust each other, and one where it's pretty boring but where you can safely buy goods using your paycheck.
>Unless someone figures out a way where we can safely share computing power and connections to real-life services (e.g. banking, having an identity, communication in general), I think there is no real alternative.
I think the opposite is true. We don't have adequate sandboxing of userspace on most desktop OSes. If your malware has access to the victim's home directory and can phone home, they've been pwned for all intents and purposes. Root access would matter if userspace programs were well sandboxed.
On OSes where this is true like android, you have terrible interoperability of userspace programs and it's impossible to get "real work" done. Not to mention that without root access, you are just relying on the corporation to manage your system for you, which isn't tenable for a democracy.
You don't need all of this trusted computing stuff to have secure, private payments. Chaumian ecash and cryptocurrencies have known this for a while. Just use a digital signature scheme instead of relying on open-source information.
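A minimal sketch of what that looks like in Python with Ed25519 from the cryptography package: the bank registers the customer's public key once, the device signs each payment order, and the bank checks the signature. Nothing about the OS or firmware that produced the signature enters into it. The key handling and message format here are purely illustrative assumptions:

    # Sketch: authorize a payment with a digital signature, not device attestation.
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    # Enrollment: customer generates a keypair, bank stores the public key.
    device_key = Ed25519PrivateKey.generate()
    registered_pubkey = device_key.public_key()

    # On the device (whatever software stack it runs): sign the payment order.
    order = b"pay 25.00 EUR to ACME, ref 2024-0001"
    signature = device_key.sign(order)

    # At the bank: verify against the key registered for this customer.
    try:
        registered_pubkey.verify(signature, order)
        print("payment authorized")
    except InvalidSignature:
        print("payment rejected")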
I totally agree that user space is not as much of a useful concept on a single-user device. Originally, it helped to shield users of the same system from each other. Most of this was based on file system authorization. This hasn't been extended to internet access in a very useful way.
However, even on single-user devices, having root access makes it easier to hide malicious processes. Granted that in modern operating systems it is already totally unclear what most processes are doing, so one can simply hide in plain sight.
I'm still not convinced we can get by without a lot of trusted computing stuff to have secure payments.
Having root access is not in the interest OR benefit of most regular users. Rooting your phone is a footgun for 99% of people who install random apps and will get hacked and have their life savings transferred or ransomed.
For them the article does the right thing. For everyone else, like you or me, we will not care what this article says anyway.
That's why what Samsung does is doubly bad. Not rooting your phone is good hygiene if your phone respects you. But if it comes with malware, then that's a stab in the back.
What about desktop OSes for the last 40/50 years?
Sure they aren’t the foam-padded locked down phone OSes, but isn’t this fear a case of leaving said padded room?
If you talk to regular non-IT-savvy people, many of them don't bother and correctly assume that at some point it will "get a virus" or something. And it is fine for them because almost no one uses a desktop for critical stuff like payments or finance. But the majority do use phones for that. They jumped from cash straight to phones, and now it's a lucrative attack vector.
Edit to reply because throttled by downvotes: yea I'm in your boat, we live in a bubble. It's hard to believe. But now I'm using a payment system that literally has "get app" on its site and no other way to manage money or even sign up. And apps like that can be the only way for many people to get some sort of plastic card to pay cashless
And I see how it happened. Many people have no personal desktop computers. Many payment vendors don't trust desktop computers because an ordinary person's windows machine is a malware breeder.
So many people in the world depend on mobile security (especially underprivileged people). Anyone who wants them all to get fucked for their own libertarian ideal of "hardware ownership" is basically a psychopath to me. Especially considering that he is literally free to root his device and not make it a problem for others.
I'm not saying this is wrong (in fact I assume it is accurate), but relative to my life experience this is crazy to me.
My mother-in-law does not have a laptop or desktop. She barely uses her iPad. If it’s not on the phone, it might as well not exist. My father-in-law has a PC at work and a Mac laptop, but he uses them only for work - his casual internet use is entirely on the phone. My wife uses multiple iPads and her phone, but only uses a desktop at work or when working at home.
Most people I know don’t actually own personal computers other than their phone or tablet.
What? This makes no sense. For something where security matters, using the desktop is the only rational choice. I never, ever, allow any sensitive information through the phone since it is not a trusted device.
Stop parroting the corporate propaganda that put us into this stupid situation in the first place. Having root access on devices you own should be a fundamental right, as otherwise it's not ownership.