The Cyber Resilience Act Threatens Open Source

Society and governments are struggling to adapt to a world full of cybersecurity threats. Case in point: the EU Cyber Resilience Act (CRA), a proposal by the European Commission with the noble goal of protecting consumers from cybercrime by baking security in at design time. Even if you don’t live in the EU, today’s global market ensures that if the European Parliament adopts this legislation, it will affect the products you buy and, possibly, the products you create. In a recent podcast, our own [Jonathan Bennett] and [Doc Searls] interview [Mike Milinkovich] from the Eclipse Foundation about the proposal and what they fear would be almost a death blow to open source software development. You can watch the podcast below.

If you want some background, you can read the EU’s now-closed request for comments and the blog post from opensource.org outlining the problems. At the heart of the issue is the requirement for organizations to self-certify their compliance with the act. Since open source software is often maintained by a small, loose-knit group of contributors, it is difficult to see how this will work.

Here’s the concern in a nutshell. Suppose you write up a cool little C++ program for your own use. You aren’t a company, and you didn’t do it for profit. Wanting to share your work, you post your program on GitHub with an open source license. This happens all the time.

Meanwhile, the developer of a large open source program (let’s say the fictitious open source GRID database server) decides to incorporate your code. That’s allowed. In fact, it is even encouraged. That’s how open source works.

The trouble starts when the GRID database suffers a data breach, and the cause turns out to be a vulnerability in your code. Under the proposed law, you could be left holding the bag for a large sum of money, thanks to a generous hobby project that didn’t earn you a cent. The situation is even more complex if your code has multiple contributors. Was it your code that caused the breach, or the other developer’s? Who “owns” the project? Are all contributors liable? Faced with this, most people would probably stop contributing, or adopt a license forbidding use of their code in jurisdictions where laws like this apply.

[Milinkovich] points out that hobbyists will likely be expressly exempted, so the above scenario isn’t probable. But he asserts that hobby programmers do not make most of the open source software that matters (his wording). Important software is often created by paid developers working as part of a foundation or a sponsor organization. The EU mentions “commercial activity,” and the fear is that major software like Apache, Linux, and other important open source projects would fall under this umbrella.

The consensus is that the EU doesn’t want to cripple or kill open source, and there is still time to amend the act to make it more palatable. Similar efforts are underway in other countries as well. We understand the desire to protect consumers and critical systems from cybersecurity vulnerabilities, and [Mike] agrees the act has some good points. But we also know that killing open source software won’t be helpful. We hope revisions to the act, and to similar efforts in other countries, will protect open source code so it can continue to drive innovation.

60 thoughts on “The Cyber Resilience Act Threatens Open Source”

    1. The situation is exactly the same as with any tangible product: the company introducing it into the retail market has to satisfy itself that it complies with the regulations, and put a CE mark on it.

      If somebody making e.g. mains plugs puts a CE mark on them without checking that the plastic he’s buying in doesn’t degrade and crack, then he’s the one who carries the can: not the plastic manufacturer (who might have failed their own certification requirements, but wasn’t the one putting something onto the retail market).

      If somebody e.g. setting up a website that handles credit cards uses an open-source library that turns out to be insecure, then it’s similarly his responsibility for not checking the robustness of the components he was using.

      The upshot is that while open source projects could usefully demonstrate compliance with coding standards etc., they don’t have to. But woe betide them if they claim certification that they don’t actually have.

      Of course, the interesting thing here is the status of GitHub and the rest, who might find themselves regarded as retailers of the projects they host. And that’s particularly the case if they claim to offer “best practice” project management facilities that developers aren’t obliged to use.

      Hence the real upshot might be that the EU ends up forcing all users of GitHub to conform to MS’s development rules, or risk being ejected into the Outer Darkness.

  1. This sounds like the latest iteration of a trend that pops up every decade or so: it used to be ‘requiring software developers to have licenses and maintain certain quality standards’.

    Those who propose such ideas seem to think they’ll make programmers cower in fear and promise to do anything they’re told. They tend to be surprised when the programmers say, “you mean the layers of management and marketing who *don’t* have such a license will no longer be able to yank me around demanding new features and forcing me to ignore bug fixes? Well slice me off a piece of that action!”

      1. Engineers are not cheap. Especially so when the support contract is the rest of my life.
        Is the EU going to be paying my engineer-level salary for the next few decades? Or is this just a thinly veiled attempt at slavery?

    1. For certain jobs, welders have to have certs for their skills. What’s even more – *gasp* – ALL of the critical beads they place on a project are x-ray inspected and approved (or not).

      Similar thing with electricians, natural gas plumbers and other tradesmen. Why don’t we treat code the same way?

      Year after year, the code that runs some of our most critical infrastructure is far too often written by clueless CS majors straight out of university. It’s not uncommon that their work is never formally inspected and approved. Been there, done that, left some SQL injection holes which are still working 10 years on 😯

      Even worse if that work is outsourced to a certain 2nd most populous country in the world, where “certificates” and “university degrees” are often simply bought.

      Maybe it’s time to finally end the hippie mindset of 1970s MIT hackers. Raise software quality standards and hold programmers accountable for their work, like other engineers are. If a welder did a botch job on a crane arm by burning holes and leaving slag inclusions, and an inspector signed off on those welds as valid, they both could end up in jail after the crane failed and killed the operator.

      1. I would suggest that you’re actually proposing ‘programmers’ take on a whole lot more personal risk than engineers. The work of engineers is about building something to last within certain well-defined parameters. The work of programmers is only rarely like that, perhaps in some unconnected embedded scenarios. You wouldn’t expect an engineer’s crane arm to remain standing if the ground beneath the crane became liquid, and you wouldn’t expect it to withstand an onslaught from even a single other engineer.

        1. Nobody is expecting this.
          As with engineering, you specify the circumstances in which the crane is supposed to keep standing. As a programmer, you vouch for the circumstances under which your code will run.

        1. I’m always impressed by hobbyists that build their own armatures!
          All that metal cutting, coil winding, soldering, and bearing selection would move it from hobbyist to masochist level very quickly for me!

        2. You would be surprised.
          In Belgium, PoE-powered network infrastructure is regulated and the installation should be officially inspected.
          While normal electricity works should be inspected before use, you can start using PoE equipment right away, and you have a long time to get the paperwork done.

          Of course nobody will do this for personal use, but legally they should.

          This is thanks to the inept installers that caused risky situations in gas stations with some dangerous PoE camera installations.

      2. If someone left some Meccano outside their house with a “free to take and use” sign, you would not expect them to be held liable if a person took it to support their crane’s weight.

        1. Depends on the laws I guess. In my country, Belgium:

          – If I _lend_ you my ladder and something happens, I’m responsible. What you did or didn’t do is irrelevant.
          – If I _rent_ you my ladder for any non-zero price, I’m no longer responsible when something happens to you.

  2. I’ve been considering switching to releasing my toy projects under a non-commercial license, as it seems the classic MIT/ISC warranty disclaimer isn’t going to indemnify us anymore. I always had trouble finding a reliable template for such a license, and most of the people on Stack Exchange are unhelpful, only responding that I should use the GPL because it’s “better”, even though the GPL doesn’t satisfy my requirement of non-commercial use.

    1. Just set up a chain of dummy corporations: “Linux foundation shell number 0001”, “blah number 0002”, etc. The shell company owns the code and releases it as Open Source. Something breaks, and “shell number 0001” gets sued to oblivion. Somewhere in there the lawyers discover that the shell corporation owned some index cards and a few paper clips, and had just enough cash to pay their annual fee to maintain incorporation.

      The old term for that was “judgement proof.” If you haven’t got it, you can’t lose it. For bonus points, have the dummy corp owned by an entity in the Cayman Islands.

      This is precisely why my school’s student newspaper is owned by (paraphrasing) “Big University Media, Inc.” The total assets are some furniture, a few laptops, maybe some office supplies. Sue ’em for libel if you want, you won’t begin to recover more than a sliver of your legal fees.

      Curious: does anyone know the relationship between Indiana Sports Corporation and Indiana University? Is this another firewall situation?

      1. > This is precisely why my school’s student newspaper is owned by (paraphrasing) “Big University Media, Inc.”

        Interestingly, this was a side effect of my school’s student newspaper being owned by “Big University Student Publishing Foundation”. The real reason, at the time, was to remove the newspaper from the university’s control. Of course, since the foundation rents office space, equipment, and everything else it needs, it has only whatever its current bank balance is. (Donations and other funding are funneled through a trust fund controlled by another, separate foundation.) If it ever got sued, it would just close, offer up the little money it had, and another foundation would be created.

  3. THIS again. Every time someone tries to legislate security, what they’re really after is building in back doors for Big Brother. Open source is pretty much the only defense against this.

    But also, trying to enforce software quality is somewhat akin to enforcing bans on 3D printing weapons.

  4. It makes no sense to certify a work, book or otherwise, especially if it is open source. Even bloggers don’t spell check their headlines.

    Corporations seeking revenue could certify software the way UL/CSA/E do, but we know what happens to a hammer for the military: it costs a lot more. It will basically mean that corporations list all their software as non-compliant, and for a number of years, until AI can certify software, it’s a non-starter.

    We can’t currently certify any software without a test specification that’s bigger and more extensive than a design specification and a functional specification (see MIL-STD-2167), and then some. Sizes are per my experience.

  5. I think it is simpler than that. Fundamentally, open source is nothing more than an idea, principle, or inspiration. It’s very much like a recipe on the internet which you can follow verbatim or alter to suit, but at the end of the day it is your responsibility to perform due diligence, because the variability in use cases means that there is almost no way of passing on the responsibility. The only potential exception would be a scenario where exactly the same conditions (hardware, software, usage, environment, location, etc.) were used, but would this ever be a defensibly reasonable case?

    Another example is where does the liability lie if someone gets injured from a video created by one of these content farms on social media: youtube, content farm, user, parent, another?

    The first sentence of the article really bugs me too. The threats have always been there as has the level of ignorance. To pretend the internet is anything other than a total free-for-all is poor judgement.

    1. More like “EdgeOS” on a PROM with Windows-as-a-service running on Microsoft servers.
      Given how many businesses are already shifting their IT operations to MS Azure, it’s the logical step to take.
      At my employer, only the engineers have “real” PCs – because the development tools we use require high-speed, real-time interfaces to the hardware under development. The rest of the employees are issued $200 Chromebooks to access office365.com via web browser (office365.com actually works with the Chrome browser).

  6. Thanks both to Hackaday and Al Williams for writing about this issue, as well as all the others who are standing up for open source development.

    I deeply hope that Europe (and other large legislative blocs) will realise that open source software allows anyone to see what it does (or commission an expert to do so), and that the need for *mandatory* certification lies with closed source software, where there is no mechanism through which an end user might ever verify the code.

    There is an interesting case in US law (Macpherson v. Buick Motor Company – discussed here https://www.lawteacher.net/free-law-essays/contract-law/strict-product-liability-in-the-auto-industry-contract-law-essay.php ) where – if I understand correctly – the end-user of a tyre could not (non-destructively) assure the quality of the product, and so the court held it reasonable that liability for quality control must sit with the manufacturer. I realise it has zero bearing on European law, but it illustrates the thought I hope European legislators might settle on: producers who publish all their code will not be expected to go through a certification process.

    (I write this as a human coding ignoramus, not shoggoth the large language model that wants to read more people’s code ;)

    1. That is a very sensible distinction, not sure it would quite fly with ‘publish to have no liability at all’, but it does have a large degree of sense to it still. I’d suggest an amendment something along the lines of ‘any product for sale must have at least one independent code review of its source before it can be considered fit for sale’. As that means the big companies actually do have some liability for shipping terrible code, so they probably won’t.

      I’d expect then that most devices would be shipped blank and ‘unlocked’ so you can load your own OS of choice (as those are provided free, they’re not under the scope of this legislation), or shipped as really cut-down, simplified but refined black boxes so the code review is cheap and the likelihood of flaws is reduced. Both of which are good for the consumer, should reduce e-waste, and mean the companies that leech off FOSS might well contribute properly from now on so their products can actually be good.

      1. Sure thing, they’re only interested in having decent jobs done. Definitely nothing to do with more government control under the guise of “safety”.
        And people will swallow it yet again. “Sure, I want to be safe! Who wouldn’t? Please have complete control of my computer through every software I install!”
        Same kind of idiocy as “if you didn’t do anything wrong you don’t have anything to hide”.

  7. Funny that Hackaday overlooks the hardware aspect of CRA.

    It instantly makes all tindie.com items with an MCU illegal unless the makers self-certify. And that’s hoping that none of the items fall into the categories that require third-party certification.

  8. What about the simple fact that it is impossible to be perfectly certain that any non-trivial code is free of flaws in functionality, efficiency, or security?

    Hanse mentions above the enormous volumes of detailed specifications in military projects, but even then, the specification can’t be certified to be free of defects, so an even larger specification for the specification would be needed, in infinite regression.

    Legislation that is impossible to enforce is worse than useless.

      1. Or you could try to enforce some personal responsibility for a change.
        The person taking unvetted hobbyist code and putting it into critical infrastructure is the one that holds all the responsibility for their actions and professional failure.
        The hobbyist they stole code from isn’t the person that caused the problem.

      2. There are standards, and there is “better” code. If you look at the code needed for safety-critical systems, it’s specified, tested, and implemented in excruciating detail. Those systems are usually pretty straightforward, but they’re incredibly expensive as a result. There is just no way that software in general can sustain such an effort. You’d put even the most trivial of programs beyond the reach of the general public.

      3. I don’t think you understand the nature of software. Software is intangible, so it’s very difficult to make it completely free of vulnerabilities. This is not a physical object that you can just pass under an X-ray to detect any flaws.

        1. Except in many cases we can. Maybe not for the most complex of issues, but there are tools available to scan code and the produced system for common issues. It can also be reviewed by more experienced developers and security analysts. Like many others are suggesting, the correct answer is to make those selling or supplying complete software responsible for paying for this analysis. Software isn’t some sort of magic that nobody understands. I’m fed up with having to deal with some of the shoddy software that’s floating around in the education market.
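
          To make this concrete, here is a minimal, hypothetical C sketch of the kind of flaw such scanners look for: untrusted input copied into a fixed-size buffer with no bounds check. The function names are made up for illustration, and tools such as cppcheck or a compiler’s built-in analyzer will typically flag the unbounded strcpy and point toward a bounded alternative.

          ```c
          #include <stdio.h>
          #include <string.h>

          /* Hypothetical example: attacker-controlled input copied into a
           * fixed-size stack buffer with no length check -- the classic
           * overflow that code scanners are built to flag. */
          void greet_user(const char *name)
          {
              char buf[16];
              strcpy(buf, name);               /* flagged: possible buffer overflow */
              printf("Hello, %s\n", buf);
          }

          /* The safer variant a reviewer or scanner would suggest instead. */
          void greet_user_safe(const char *name)
          {
              char buf[16];
              snprintf(buf, sizeof buf, "%s", name);  /* output is always truncated and terminated */
              printf("Hello, %s\n", buf);
          }

          int main(int argc, char **argv)
          {
              greet_user_safe(argc > 1 ? argv[1] : "world");
              return 0;
          }
          ```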

  9. The answer is bold and open defiance. Wrongful laws don’t exist if a wide enough proportion of the population simply ignores them. The answer might also be a heavily secured, pseudonymous “github”-style website for open source projects. You can’t have money stolen from you by corrupt courts on account of a bug discovered in your code if your coding page cannot be traced back to your physical-world person.

      1. Mine was supposedly inspected, but I found the inspection overlooked properly grounding the entire system. The house was wired by licensed professionals. The first house we owned was wired by the previous owner, who installed AC systems for a living. He used wires for the dryer two sizes smaller than code requires, and used the bare ground wire as the neutral just so he didn’t have to run another wire. And in the addition he wired, he chopped off all the outlet ground wires so there wasn’t enough left to attach to the outlets.

        I’m not an electrician, I read a little book summarizing the (US) National Electrical Code for homeowners. And yet “inspection” didn’t catch it.

  10. OH HORRORS!!… expecting a programmer not to spurt out a pile of garbage…. oh the stress.
    40 years in the business has shown that most programmers…. aren’t programmers.

    1. My guy, you didn’t come out of the womb writing flawless code, and if you did, you’d be too rich to spend time in comment sections.

      If I post something to GitHub that somehow winds up in a big project (hypothetical, since some of us know our limitations), only to find out it blew up in someone’s face, that’s not my issue. If you bolt mystery parts together, don’t be upset if you wind up with a car that looks like the sort of thing Johnny Cash sang about.

  11. It is the use of code by a third party that creates the potential liability, not its creation. Therefore, if you use FOSS in your project and that code turns out to be broken, it is your problem: you failed to audit the code you decided to use in that particular context. It was your wilful act that actually resulted in a problem, not the original coder’s. All of the normal licenses are still applicable; at most, a simple disclaimer may need to be added.

    1. Only if code can be inspected by the end user or a third party of repute.
      Unfortunately, making all code inspectable allows bad actors to audit and perforate your installation.

      1. This has been repeatedly proven wrong. While having the source code allows for code auditing, source code is not a precondition for running an attack.

        Code auditing by honest people who then patch a vulnerability is much more common than the scenario you describe, while software compiled from non-public code can still be exploited by bad actors.

        The approach you describe is called “security by obscurity” (https://en.wikipedia.org/wiki/Security_through_obscurity: “In January 2020, NPR reported that Democratic party officials in Iowa declined to share information regarding the security of its caucus app, to “make sure we are not relaying information that could be used against us.” Cybersecurity experts replied that ‘to withhold the technical details of its app doesn’t do much to protect the system.'”)

        1. I am familiar with obscurity; that’s not what I meant. But thank you for the info.
          I was thinking that someone could run code inspection tools, AI, or fancy search tools on sources, looking for exploits (buffer bugs, etc.) a lot faster today than in the past.
          I used to use NuMega BoundsChecker back in the day, and that runtime tool (not even a source tool) found library issues.
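
          For readers who haven’t met this class of tool: a modern, freely available analogue to BoundsChecker is AddressSanitizer, which instruments the program at build time and reports out-of-bounds accesses when the code actually runs. Here is a minimal, hypothetical sketch of the kind of off-by-one it catches, built with something like gcc -g -fsanitize=address overrun.c (the file name is just an example).

          ```c
          #include <stdlib.h>

          /* Hypothetical off-by-one: the loop writes one element past the end
           * of the heap allocation.  Built with -fsanitize=address, the run
           * aborts at the faulty write with a heap-buffer-overflow report. */
          int main(void)
          {
              int *values = malloc(8 * sizeof *values);
              if (values == NULL)
                  return 1;

              for (int i = 0; i <= 8; i++)   /* bug: should be i < 8 */
                  values[i] = i;

              free(values);
              return 0;
          }
          ```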
