
One thing that regulators need to be very careful about is how "security updates" are defined, and exactly what manufacturer obligations for issuing security updates should be. CVEs are a notoriously terrible representation of actual security risks, so a measure like "manufacturer must issue new releases that include any released patches for CVEs with a severity rating greater than 9" would be a clear non-starter.

There are also often practical issues related to security patching embedded devices: for example, a downstream supplier's driver can make it impossible to upgrade a kernel unless/until the supplier provides a fix. Of course, strong regulation here could help to drive bad practices like that out of the industry, but I'm not going to hold my breath on that one. The effect of regulation like this would make it harder for manufacturers who don't have the market power to lean on their suppliers to provide security patches.

Finally, it's important that any regulation that mandates or strongly encourages software updates also mandates that the update system itself be implemented in a secure way. This is my specific area of expertise, and I can tell you that it's very often done very badly. A bad update system is a gigantic, flashing red target for attack. So something like mandating signatures (and sig validation) on software update images would be a good start. Mandating the use of TUF-compliant repositories would be even better.


Thank you for these thoughtful points. Some relevant responses from other threads:

From https://www.hackerneue.com/item?id=37394188 :

I think you're right that it would be difficult for the FCC to precisely define exactly when security updates are required. This is a problem in law generally, one that is usually resolved by imposing a reasonableness standard. Maybe here, a vulnerability needs to be patched if it might reasonably be expected to allow an attacker to take control of a device, or to do so when combined with other known or unknown vulnerabilities. Or maybe a different standard. Then when enforcement/lawsuits come around, the judge/jury/regulator has to evaluate the reasonableness of the manufacturer's actions in light of that standard. We'd love to see commentary on the record as to what the right legal standard might be.

From https://www.hackerneue.com/item?id=37394793 :

Agreed. Building an automatic firmware update system from scratch would be burdensome for many IoT makers, but as it becomes necessary or encouraged, we would expect the market to provide a packaged solution/framework that manufacturers could fold into their products. It would be really helpful to have discussion of this on the record. How generalizable do you think such a solution could be? We are aware of the Uptane project, an OTA firmware update framework being jointly worked on by several car manufacturers, but would love to hear more about the feasibility of a solution for IoT devices generally, or particular classes of IoT devices.

From https://www.hackerneue.com/item?id=37393926 :

[...] companies wanting to put a label on their product would probably want to extract similar guarantees up their supply chain. Especially with a voluntary program like the one the FCC is proposing, good practices won't become the norm across the market overnight. But maybe, at the very least, the segment of product and component makers that take security seriously will begin to grow. I encourage you to share your thoughts in an official comment.

> How generalizable do you think such a solution could be? We are aware of the Uptane project, an OTA firmware update framework being jointly worked on by several car manufacturers, but would love to hear more about the feasibility of a solution for IoT devices generally, or particular classes of IoT devices.

One thing to be aware of: a decent number of connected devices are white label devices or "lightly" tweaked forks of a reference design. The consumer-facing company may have no power to actually update anything. If the originating company only provides proprietary versions of some critical component and can't/won't ship updates, the consumer-facing company can only patch issues with _their_ portion of the final software running on the device.

A _requirement_ that the consumer-facing company be able to update any/all portions of the software stack for $someTimeFrameAfterSale might start to change this but expect a fight from every link in the software-supply-chain on this front.

>we would expect the market to provide a packaged solution/framework that manufacturers could fold into their products.

These kinds of solutions exist, see for instance: https://docs.aws.amazon.com/freertos/latest/userguide/freert...

My concern is that these firmware update platforms will become oligopolies/monopolies because they will control a legal barrier and naturally accumulate the obligations of many manufacturers.

You're the lawyer guy? What statutory authority are you drawing on that you believe allows you, the FCC, to regulate this stuff?

Thanks!

Good question. The Notice of Proposed Rulemaking has a Legal Authority section that discusses this issue: https://www.fcc.gov/document/fcc-proposes-cybersecurity-labe.... I also touch on it here: https://www.hackerneue.com/item?id=37393316
Thanks, but that FCC document clearly says it's about a "voluntary labeling program", and the title of this HN post has the word "regulation" and the text has language like "require" [0]. And the phrase "oppose[...] even voluntary ones", which clearly sounds like someone's proposing non-voluntary stuff.

I read your linked HN comment too, but: "legitimate interest in" [1] a thing and actual "authority" to do a thing are not the same thing.

I feel like I'm being bamboozled here. The fcc.gov "Notice", and this HN post, seem like they're talking about substantially different proposals.

[0] "I’ve advocated for the FCC to require device manufacturers to support their devices with security updates for a reasonable amount of time"

[1] "...we think that the FCC has a legitimate interest in just about any vulnerability on a wireless device"

Nathan's post and the proposed rulemaking are both quite explicit that the proposal under comment is a voluntary labeling scheme. Perhaps the intro could be better written to be clearer, but I don't really understand your complaint. There's no bamboozle.

From above:

"I’ve advocated for the FCC to require device manufacturers to support their devices with security updates for a reasonable amount of time [1]. I can't bring such a proposal to a vote since I’m not the chairman of the agency. But I was able to convince my colleagues to tentatively support something a little more moderate addressing this problem.

The FCC recently issued a Notice of Proposed Rulemaking [2] for a cybersecurity labeling program for connected devices. If they meet certain criteria for the security of their product, manufacturers can put an FCC cybersecurity label on it. I fought hard for one of these criteria to be the disclosure of how long the product will receive security updates. I hope that, besides arming consumers with better information, the commitments on this label (including the support period) will be legally enforceable in contract and tort lawsuits and under other laws. You can see my full statement here [3]."

Thanks! Sorry for any lack of clarity. My initial draft was way over the character limit and I had to cut a lot prior to posting. Thanks for highlighting the relevant language and clearing things up.
Maybe reach out to the FTC over the fraud being perpetrated with this cloud-locked (other peoples' servers) *rental* being sold as a *sale*?

If these companies are selling defective goods and preventing individuals from fixing them themselves (in other words, the selling company holds material control of the device), that's a *rental*.

Properly reclassifying consumer garbage with company-locked electronics as a rental would be the big kick-in-the-pants to end the games nearly every company is playing now. And that includes the cellphone-on-wheels (Tesla), the stunts being played by most other car manufacturers ($$$ for heated seats, etc), Apple holding control over what approved software a general purpose computer can process, and loads more.

I don't think the FCC can require firmware updates other than in radio-based units, to enforce regulatory requirements for specific frequencies (no channels 12/13 on 2.4GHz in the USA, a 10-minute wait on part of 5.8GHz for ground radar). But the FTC could force it by clarifying that cloud-crap is a rental, and not a sale.

To expand on this, since it's not explicit in Marco's comment: The statutory authority is section 302(a) of the Communications Act, which authorizes the FCC to regulate devices that can interfere with radio communication. Their reasoning is that IoT devices fit this category, so regulations on security updates are within scope.

Full quote from the notice of proposed rulemaking: "In particular, section 302(a) of the Communications Act authorizes the FCC “consistent with the public interest, convenience, and necessity, [to] make reasonable regulations (1) governing the interference potential of devices which in their operation are capable of emitting radio frequency energy by radiation, conduction, or other means in sufficient degree to cause harmful interference to radio communications; . . .” While this program would be voluntary, entities that elect to participate would need to do so in accordance with the regulations the Commission adopts in this proceeding, including but not limited to the IoT security standards, compliance requirements, and the labeling program’s operating framework. We tentatively conclude that the standards the Commission proposes to apply when administering the proposed labeling program fall within the scope of “reasonable regulations… governing the interference potential of devices….” We seek comment on this reasoning."

> There are also often practical issues related to security patching embedded devices: for example, a downstream supplier's driver can make it impossible to upgrade a kernel unless/until the supplier provides a fix. Of course, strong regulation here could help to drive bad practices like that out of the industry, but I'm not going to hold my breath on that one. The effect of regulation like this would make it harder for manufacturers who don't have the market power to lean on their suppliers to provide security patches.

This. We were building an IoT product that was effectively stuck on a derivative of Ubuntu 18.04; we couldn't upgrade because the vendor wouldn't rebase on a new LTS for a very long time. As our project was being developed in Python, we were stuck on 3.6, and as it reached EOL, many third-party libraries dropped support and wouldn't even release security fixes; we needed to stay on that particular OS because of hardware support; and moving off the distribution-provided Python packages would have increased the maintenance burden beyond what we were able to handle.

Even if the vendor would continue to provide security updates to the base OS and its packages, any real-world software solution will rely on third party packages, which may choose to drop support.

I would love it if the lawmakers considered this scenario.

This is an honest question to these arguments, but as a consumer (and as an extension the FCC protecting them) why should I care? Would you accept the same arguments from your car manufacturer, "sorry we can't fix your broken brakes, our supplier uses a process that isn't supported by new brake standards so just don't brake"?

I suspect not, so why is this different? Because the car is more expensive?

I would argue that the purpose of regulation is exactly to root out this sort of practice. If it was cheap and effortless to do this we likely wouldn't need regulation.

The issue is that it's currently not a regulatory requirement. So when you go to the chip maker and demand that their chip have drivers in the Linux kernel tree so it will continue to support newer kernel versions, they turn you down. Most of their customers don't care about this and they would have to pay a developer to produce drivers of the quality that would be accepted by the Linux kernel maintainers. Then you're stuck using what you can get.

If you had a rule saying that device makers have to produce security updates, now the device makers will all demand this because they need it to satisfy the regulatory requirement, and not be willing to take no for an answer.

I don't understand your argument; are you agreeing with me that regulation will cause this to happen? So why is that an argument against regulation?
It's an argument for getting the regulation right.

For example, one of the obvious ways around these requirements is you set up Sell To Retailers, LLC which nominally does the final assembly, is responsible for the update requirement and then files for bankruptcy whenever anyone tries to enforce it against them.

The bad way to get around that is to try to hang the requirement on some kind of larger entity, like the retailer. Then every retailer bans every kind of smaller device maker who might not be around to make updates in ten years and you have a rule that unintentionally causes catastrophic market concentration.

The good way is to require that the customer can flash custom firmware to the device and the hardware has sufficient published documentation for a third party to make drivers for it (the easiest way to satisfy which would be to publish open source drivers and firmware).

That way if the manufacturer goes bust, as some of them will even independent of trying to get out of the requirement, someone else can still patch the device. And that someone will be more likely to exist, because communities like DD-WRT will have already produced custom firmware for the device and be there to patch serious vulnerabilities even if the manufacturer is gone.

The same thing happened to my car — they discontinued support for the cellular module it shipped with. I had to bring it in (and I believe pay something) to have the module updated. I did not and now it no longer has the online functionality.

Brakes are not internet-connected, but where the line is between features or functions that might be lost and those that represent the core of the product is an interesting question.

That's the thing though: most IoT devices shouldn't be Internet-connected, and most definitely should not depend on a vendor cloud (or increasingly, the cloud of a different vendor that sold a white-label IoT solution to the "vendor" you bought the device from). It's an unnecessary limitation, a combination of laziness (going over the cloud is easier than figuring out local-first and standardizing on some VPN solution) and abusive business (the cloud on the other side of the world is holding your Internet-connected air conditioner hostage, better play nice).

If brakes are not Internet-connected, that's mostly because they were established before Internet - and given the trends in car manufacturing in general, it's only a matter of time.

(In some sense, we're already there - if you have cloud-connected self-driving, and that self-driving can override your command to apply brakes, then your brakes are de-facto Internet-connected, even if connectivity isn't a hard dependency in all cases just yet.)

Brakes are fundamentally a safety-critical system, and one that is both relatively well isolated from other systems and dead simple in principle (a bike has simple mechanical brakes and a 3-year-old could explain why they work).

The issue with software OTOH, is that a security hole in one trivial component (e.g. resize images to make thumbnails) can often lead to a full system compromise. Even if you don't get full root, you can still use a compromised system to your advantage: steal personal data, use it in a botnet, serve malware, mine proof of waste, etc.

On top of that, adding a dependency is often made very easy by modern package managers, and as the number goes up it gets rather difficult even to vet your direct dependencies, let alone transitive. Installing brakes in a vehicle doesn't automatically pull in a kitchen sink, but in the software world it's widely accepted, almost inevitable. You can spend your time removing the 90% of that library that you don't need, and rewriting the remaining 10%, or you do the "reasonable" thing and just ship.

Under sensible regulation you wouldn't get to blame a third party here. You would have signed a contract with your vendor to give you updates in line with what the regulation demands, and your insurance company would cover your liability if the vendor goes out of business and you have to pay through the nose to replace them or settle a class action lawsuit. Your expenses would go up and those would be passed on to the consumer, but everyone cheering for this regulation is OK with that. Hopefully the marginal cost of insurance and better vendors would be only slightly above the cost of providing this kind of long term support.
> stuck on a derivative of Ubuntu 18.04 [...] as our project was being developed in Python, we were stuck on 3.6

I might be missing something, but why do you need to rely on the OS-provided Python version? Versions newer than 3.6 should run on older Ubuntu releases. You could have installed newer versions using the deadsnakes PPA, for example, onto 18.04 up until earlier this year (since LTS only has a 5-year support window, and deadsnakes only supports active LTS versions).
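(For anyone in a similar bind, the install is roughly the following; the version number is illustrative, and 18.04 has since aged out of the deadsnakes support window.)

```shell
# Install a newer interpreter from the deadsnakes PPA alongside the distro Python.
# The system python3 is left untouched, so distro tooling keeps working.
sudo add-apt-repository ppa:deadsnakes/ppa
sudo apt-get update
sudo apt-get install python3.10 python3.10-venv

# Use the new interpreter explicitly, e.g. in a virtualenv for the application.
python3.10 -m venv ~/app-env
```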

If we had the resources to disentangle the entire Python situation, trust me we would. Unfortunately the web of dependencies for that project was quite intricate, and at one point you just need to swallow the vendor's proprietary libraries that they've built against what they've shipped in the base OS. (L)GPL is good on paper, but the effort to actually make use of the freedom it grants is disproportionate.

(Which is why I'm a firm believer in the suckless philosophy: if the software is too complex to fully understand, source access or even copyleft aren't worth much.)

GPL is burdensome for businesses to comply with, so I would support public funding for drop in replacements for common GPL tools and libraries
> I would love it if the lawmakers considered this scenario.

You're building on quicksand, and you're asking for us to give you leeway when the building collapses.

Either do the work of making all of those security fixes yourself, or pick a better platform to build on top of.

> pick a better platform

Unfortunately there isn't all that much competition in this space. The choice was try building on quicksand, or let the idea die. I'm glad we tried it.

Until there are consequences for building on quicksand, the vendors have no reason to improve their offerings.
I don't understand what you're trying to imply here. That we should be punished for building a prototype? Or that had we shipped it in its current state, the upstream vendor should be punished for stalling on updates?
As the support schedule for python is known ahead of time, this scenario seems pretty well covered by "disclosure of how long the product will receive security updates": just choose the EOL for the relevant python version in the date-picker.
Then fire your shitty vendor or refund your customers.

Nothing will change unless everybody changes.

You can't fire your SoC vendor, especially once the product ships. And they are all a PITA about security updates.
If you buy from a supplier with a contract that stipulates security updates, then you certainly would define the damages that a failure to fix would cause you, wouldn't you?
One of the issues is that the upstream vendor goes out of business. What you really need is to have the source code for the firmware, ideally in the public mainline kernel tree so that new kernel versions continue to work on the hardware.
Certainly true. Source code escrow should be part of any kind of company selling internet connected devices.
> regulation like this would make it harder for manufacturers who don't have the market power to lean on their suppliers to provide security patches.

Thought question (I’m asking, I don’t know the “answer”):

Today, many of these devices are marketed and sold by a company that has little to no involvement in the creation of the firmware or software, besides maybe sending over an image of their logo to be rolled into some turnkey “app.” Would we actually be better off if companies couldn’t really afford to basically dropship some sketchy white-label Chinese product, and instead could only sell a product here if they were confident they (acting alone) would be fully capable of supporting and updating it for a reasonable lifetime? Yes, it would raise the barrier to entry above basically the floor where it is today, but I don’t imagine there is a way to have it both ways.

My two cents is that this would be an excellent comment on the record -- I'd love a discussion at the level of defining security risks to be part of the official federal commentary, because this is going to be a thorny implementation problem.
It would be great for people to post an update like "comment submitted" on threads like this one to make sure it was entered as a comment into the official record.

I'm sure these comments in themselves are helpful to @SimingtonFCC individually, but having them be part of the official record gives the FCC legal grounds to consider them and incorporate them into rules.

Completely agree! The public record in this case is going to be what agencies and industry look to, far more than whatever I might happen to personally believe. I'm going to get as much information from this discussion as I can, but every participant should feel free to comment on the record, or to get their employers, companies, trade associations, ad-hoc working groups, concerned citizens congregating on Discord to complain, etc. to do so as well.
I'm curious about your thoughts on balancing the damage of another Mirai with the damage of another SolarWinds. A regulation where every IoT device must accept a signed OTA update would make update servers an extremely valuable target for supply chain compromises.

On the one hand, without updates, a world of IoT devices will inevitably get infected slowly and permanently (as long as they're physically active).

But on the other hand, with mandatory updates, a world of IoT devices can get infected all at once (in the case of a supply chain attack) and possibly just as permanently (if the attacker's payload can disable or re-route the update system)?

Do you think that prevailing security standards for IoT manufacturers are good enough that this balance falls in favor of a mandatory-update regulation?

I don't know about a mandatory update regulation -- one way or the other, that isn't on the table right now. I would love extensive discussion on the record, however, of the costs and benefits of requiring updates to get the label.
While I acknowledge that CVE scoring of risk can be inconsistent and sometimes wildly wrong, what would you suggest in its place?
That's the problem, there isn't a good objective measure. Some type of "reasonableness" standard is usually invoked in situations like this, but that kinda just takes us back to square one: what's currently considered reasonable in the industry is pretty terrible.
I'm not sure we will ever have a universally accepted objective measure of risk. Risk is, by its nature, somewhat subjective.

Most organisations will use CVEs and the CVSS system as a starting point, but will triage them and produce their own assessment of the actual risk to them and their products given how the software is used.

I don't think a legal reasonableness standard would be the same as "common industry behavior." Regulation would hold companies to a real reasonableness standard, as determined in the text of the regulation or by a court.
Just go by past incidents. Quite often it is not a software vuln that enables the hacker's attack - it is an insecure default config that the user never changes, with the manufacturer supplying the same default user/pw with each device.

also insecure backdoors left by developers for debug purposes (or is it really debug or maybe espionage?)

> also insecure backdoors left by developers for debug purposes (or is it really debug or maybe espionage?)

It should be made clear that any "backdoor" is a criminal offense under the "unauthorized access" provision of the Computer Fraud and Abuse Act, unless the device is covered by an explicit remote maintenance agreement which imposes duties upon the maintainer.

