Could You Be Sued for Bugs in Your Application?

By Craig Buckler

An article which recently appeared on TechRepublic will strike fear into the hearts of developers and software manufacturers: should developers be sued for security holes?

The question was posed by University of Cambridge security researcher Dr Richard Clayton. Software security losses cost billions per year and he wants vendors to accept responsibility for damage resulting from avoidable flaws in their applications. He argues that companies should not be able to rely on End-User License Agreements which waive liability.

While no legislation has been passed, committees in the UK and Europe have been considering such a requirement for several years. Clayton wants applications to be assessed to consider whether the developer has been negligent. He argues that the threat of court action would provide an incentive to minimize security holes:

If you went down to the corner of your street and started selling hamburgers to passers-by they can sue you [in the case of food poisoning].

It’s not going to be easy. There’s going to be a lot of moaning from everybody inside [the industry] and we’re going to have to do it on a global basis and over many years.

Understandably, the software industry has fought back with several points:

  • No one purposely makes insecure software, but the complexity of code can introduce unforeseen errors.
  • When a home is burgled, the victim doesn’t usually ask the maker of the door or window to compensate them.
  • Legislation would stifle innovation and manufacturers would prevent application interoperability to guard against undesirable results.
  • Who would be liable for open source software?

Litigious Lapses

Clayton’s primary concern is security holes, but what does that mean? Bugs. It doesn’t matter whether they are caused by the coder’s inexperience, lack of testing or unforeseen circumstances owing to a combination of factors.

However the legislation is worded, if someone can sue for security issues, they can sue for any bug. Did an application crash before you saved 20 hours of data entry? Did an email or Twitter message reach an unintended recipient? Did Angry Birds cause distress by failing to update your high score?

Burgers vs Browsers

Let’s use Clayton’s burger analogy. Preparing a burger involves sourcing good-quality (OK — acceptable quality) meat and throwing away any that is past its best. You won’t have problems if the ingredients are kept cool until required, then cooked at a high enough temperature for long enough.

I don’t want to berate the fast food industry, but there are only a dozen or so variables and you deal with two or three at a time. Nearly all are common sense: if the meat smells bad or looks green, it isn’t fit for human consumption. A burger costs a couple of dollars but, eat a bad one, and it could kill you.

Let’s compare it to a web browser. Conservatively, a browsing application could have 10,000 variables. There’s no linear path, and each variable could be used at a different time in a different way depending on the situation. The browser is running on an operating system which could have a million lines of code and another 100,000 variables. It could also be interacting with other software and running on a processor with its own instruction sets. It’s complex.

However, a browser is completely free at the point of use. It may be the worst application ever written. You may lose time, money and hair. But no one will die. There are risks, but are they more than outweighed by the commercial benefits?

Terminal Software

It is possible to limit programming flaws. Consider avionics software: a bug which causes a plane to fall out of the sky will lead to deaths. Failure is unacceptable.

Aircraft software development is rigorous, fully documented, optimized for safety, thoroughly tested, reviewed by other teams and governed by legislation. It takes considerable time, effort and focus. Airbus won’t demand a cool new feature mid-way through coding. Boeing won’t rearrange interface controls one week before deployment.

The software is incredibly complex, but it’s one large application running on a closed system. The development cost is astronomical — yet failures still occur. They’re rare, but it’s impossible to test an infinite variety of situations in a finite period.

Assessing Developer Negligence

There’s only one way to learn programming: do it. Learning from your mistakes is a fundamental part of that process. You never stop learning. And you still make mistakes. I cringe when I examine code I wrote last week … applications written ten years ago scare the hell out of me.

While education is a start, it takes time, patience, and real-world problem solving to become a great developer. How could you gain that experience if you weren’t being paid? If you’re being paid, it stands to reason someone is using your software.

Anyone who thinks applications can be flaw-free has never written a program. Even if your code is perfect, the framework you’re using won’t be. Nor is the compiler/interpreter. What about the database, web server, operating system or internal processor instruction set?

But let’s assume lawyers found a way to legally assess developer negligence. Who in their right mind would want to become a programmer? Fewer people would enter the profession and daily rates would increase. Those developers prepared to accept the risk would have to adhere to avionic-like standards and pay hefty insurance premiums. Software costs would rise exponentially and become an expensive luxury for the privileged few.

Clayton’s proposal may be well-meaning but it doesn’t consider the consequences. His suggested legislation would kill the software industry. Ironically, that would solve all security flaws — perhaps that would make him happy?

  • The old saying “Everyone makes mistakes” comes to mind. I think being able to sue for bugs just gives sue-happy people another way to get money they don’t deserve. Depending on how many lines of code are in an application, it’s safe to say that there is almost always going to be a bug somewhere.

  • Mr. Buckler, you make some very valid points.

    Allow me to add another degenerative effect of such legislation: fewer and fewer free-to-use websites.

    To defray the cost of those things you have mentioned, site operators would NEED to start charging ALL members. No longer would there be premium membership plans because the cost to develop the entire site would be a premium.

    Under such legislation, even the best and the brightest developers would feel pressured (by fear) to hire outside security specialists to validate a website. These security people would charge a premium knowing the value of their signatures.

    Are Dr. Clayton’s concerns altruistic? Alternatively, could this legislation, in fact, be a way to open a market for security specialists by using the same fear tactics used by car alarm salespersons to push their product?

    Maybe this is a matter for Sir Tim Berners-Lee to tackle, because such legislation would surely have a huge effect on the Open Web by placing the web in the hands of those with pockets deep enough to build it.

    Goodbye startups and fresh innovation.

  • pippo

    Academics talking about software are like Catholic priests talking about marriage…

    Since bugs are unavoidable, for mission-critical software people take out insurance and the cost is invoiced to the customer.

    If someone gets upset because his highest score at Angry Birds does not get recorded… well, that is his business, and he needs mental therapy, not free money.

  • Pedro H.

    @Brian Temecula: You got that right. But you can argue that the underlying hardware (RAM, processor issues) caused the malfunction, and the base operating system can also be the cause of bugs. Another thing I usually do is include six months of free maintenance in the project, for repairing any residual issues that may arise after the project is deployed, but only errors within the scope of the project, not building things that were not in the original budget. One cannot sue you if they did not tell you exactly what they want, and we have all been down that road.

    I would never give up making software because of fear of being sued.
    As long as both the developer and the software:
    a) Meet the client’s requirements, and he is happy with the software.
    b) Meet all requirements for an optimal server deployment.
    c) Inform the client that, due to unpredictable issues like server/software upgrades or even hosting provider policies, the software may fail, but there is always a chance to reach us for debugging.
    d) Others that might fit here, and so on…

    I don’t think there is any reason for us developers to be liable for bugs. After all, I can’t sue the parents and teachers who raise and educate people who kill other people… If I could, I would make a whole lot of money, and then yes, I would quit programming xD


  • gofry

    I think it’s a great idea, and it should be implemented into law together with a law that says you can sue your security teacher/researcher because he did not teach you security well enough, therefore it’s his fault!

  • First, let lawmakers and lawyers prove that the law is perfect.

    I think that someone just got wind of another source of money. But what about freedom of agreements? Do software companies force anybody to use their products? No. Are we all obliged to obey the law irrespective of whether it has “bugs” or not? Yes.

  • Milo Tsukroff

    Interesting argument: “When a home is burgled, the victim doesn’t usually ask the maker of the door or window to compensate them.” I have argued for years that if Microsoft’s C++ compiler were made to add input length checking, then all those buffer overrun security holes would be instantly eliminated. But that’s never been done. (Yes, I’ve mentioned this directly to Microsoft employees.) It’s like a door manufacturer not even bothering to make a place to put a lock on a door, forcing homeowners to go out and buy and install their own locks. Would you purchase a door that doesn’t have a place for a lock? Would you purchase software that has been compiled without buffer overrun protection?? If you own a Windows computer …. you just did!

  • Ignatius Teo

    Nice in theory. There’s a practical reason why liability waivers exist. Perhaps the good Dr should get some practical exposure to real-world software engineering, project, and contract management first.

  • Jason

    “…applications written ten years ago scare the hell out of me”

    I know how you feel :). Though for me it’s more like 10 months ago. Technology keeps changing!

  • Maybe a better idea would be to allow suing companies / developers that have been informed of critical bugs or security flaws, but do nothing to fix them.

  • When you buy a car, you buy the risk of an accident too. This doesn’t mean you will crash, but you might. The same goes for everything.

  • Software patenting made lawyers rich and stifled innovation because there is no reasonable way for a developer to ensure that their idea or a portion of it does not infringe on someone else’s patent.

    This law would do the same, as there is no reasonable way for a developer to guard against all security holes in all circumstances. As with software patents, trolls will follow the money. Once a company starts to scale, it will become a target. Hackers will be employed to pick holes in web services, and the trolls will follow up on the results.

    The only people able to afford legal protection will be the large players. Small firms and individuals will be squeezed further from the market.