Could You Be Sued for Bugs in Your Application?


Key Takeaways

  • Developers could face legal consequences for security holes in their software, argues Dr Richard Clayton, a security researcher at the University of Cambridge. He believes software vendors should be held accountable for damages resulting from avoidable flaws in their products, rather than relying on End-User License Agreements to waive liability.
  • Such legislation could extend beyond security issues, allowing individuals to sue for any software bug, including application crashes that cause data loss or miscommunication caused by software errors. The software industry counters that the complexity of code can introduce unforeseen errors, and that legislation would stifle innovation and application interoperability.
  • Critics argue the proposal ignores the consequences for the software industry: the risk of legal action could deter people from becoming programmers, increase development costs, and make software an expensive luxury. Moreover, even robust testing processes cannot completely eliminate software bugs, given the complexity of software systems.
An article that recently appeared on TechRepublic will strike fear into the heart of all developers and software manufacturers: Should developers be sued for security holes? The question was posed by University of Cambridge security researcher Dr Richard Clayton. Software security losses cost billions per year, and he wants vendors to accept responsibility for damage resulting from avoidable flaws in their applications. He argues that companies should not be able to rely on End-User License Agreements that waive liability. While no legislation has been passed, committees in the UK and Europe have been considering the requirement for several years. Clayton wants applications to be assessed to determine whether the developer has been negligent. He argues that the threat of court action would provide an incentive to minimize security holes:
If you went down to the corner of your street and started selling hamburgers to passers-by they can sue you [in the case of food poisoning]. It’s not going to be easy. There’s going to be a lot of moaning from everybody inside [the industry] and we’re going to have to do it on a global basis and over many years.
Understandably, the software industry has fought back with several points:
  • No one purposely makes insecure software, but the complexity of code can introduce unforeseen errors.
  • When a home is burgled, the victim doesn’t usually ask the maker of the door or window to compensate them.
  • Legislation would stifle innovation and manufacturers would prevent application interoperability to guard against undesirable results.
  • Who would be liable for open source software?

Litigious Lapses

Clayton’s primary concern is security holes, but what does that mean? Bugs. It doesn’t matter whether they are caused by the coder’s inexperience, lack of testing or unforeseen circumstances owing to a combination of factors. However the legislation is worded, if someone can sue for security issues, they can sue for any bug. Did an application crash before you saved 20 hours of data entry? Did an email or Twitter message reach an unintended recipient? Did Angry Birds cause distress by failing to update your high score?

Burgers vs Browsers

Let’s use Clayton’s burger analogy. Preparing a burger involves sourcing good-quality (OK — acceptable-quality) meat and throwing away any that’s past its best. You won’t have problems if the ingredients are kept cool until required, then cooked at a high enough temperature for long enough. I don’t want to berate the fast food industry, but there are a dozen variables and you only deal with two or three at a time. Nearly all are common sense: if the meat smells bad or looks green, it isn’t fit for human consumption. A burger costs a couple of dollars but, eat a bad one, and it could kill you.

Now compare that to a web browser. Conservatively, a browsing application could have 10,000 variables. There’s no linear path, and each variable could be used at a different time in a different way depending on the situation. The browser runs on an operating system that could have a million lines of code and another 100,000 variables. It could also be interacting with other software, and it runs on a processor with its own instruction sets. It’s complex.

However, a browser is completely free at the point of use. It may be the worst application ever written. You may lose time, money and hair. But no one will die. There are risks, but are they not more than outweighed by the commercial benefits?

Terminal Software

It is possible to limit programming flaws. Consider avionics software: a bug that causes a plane to fall out of the sky leads to death, so failure is unacceptable. Aircraft software development is rigid, fully documented, optimized for safety, thoroughly tested, reviewed by other teams and governed by legislation. It takes considerable time, effort and focus. Airbus won’t demand a cool new feature mid-way through coding. Boeing won’t rearrange interface controls one week before deployment. The software is incredibly complex, but it’s one large application running on a closed system. The development cost is astronomical — yet failures still occur. They’re rare, but it’s impossible to test an infinite variety of situations in a finite period.

Assessing Developer Negligence

There’s only one way to learn programming: do it. Learning from your mistakes is a fundamental part of that process. You never stop learning, and you still make mistakes. I cringe when I examine code I wrote last week … applications written ten years ago scare the hell out of me. While education is a start, it takes time, patience and real-world problem solving to become a great developer. How could you gain that experience if you weren’t being paid? And if you’re being paid, it stands to reason someone is using your software.

Anyone who thinks applications can be flaw-free has never written a program. Even if your code is perfect, the framework you’re using won’t be. Nor will the compiler or interpreter. What about the database, web server, operating system or internal processor instruction set?

But let’s assume lawyers found a way to legally assess developer negligence. Who in their right mind would want to become a programmer? Fewer people would enter the profession and daily rates would increase. Those developers prepared to accept the risk would have to adhere to avionics-like standards and pay hefty insurance premiums. Software costs would rise exponentially, and software would become an expensive luxury for the privileged few.

Clayton’s proposal may be well-meaning, but it doesn’t consider the consequences. His suggested legislation would kill the software industry. Ironically, that would solve all security flaws — perhaps that would make him happy?

What are the legal implications of software bugs for developers?

Software bugs can have serious legal implications for developers. If a bug in an application causes damage or loss to a user, the developer could potentially be held liable. This could result in lawsuits, financial penalties, and damage to the developer’s reputation. However, the extent of the developer’s liability often depends on the specific circumstances, including the nature of the bug, the damage caused, and the terms of any contracts or agreements in place.

Can developers be sued for application bugs?

Yes, developers can be sued for application bugs, especially if these bugs result in significant damage or loss. However, whether a lawsuit is successful often depends on various factors, including the nature of the bug, the extent of the damage, and the terms of any contracts or agreements in place.

How can developers protect themselves from legal action due to software bugs?

Developers can protect themselves from legal action by implementing robust testing and quality assurance processes to identify and fix bugs before software is released. They can also use contracts and agreements to limit their liability, although this may not always be effective. Professional liability insurance can also provide protection against legal claims.

What is the role of contracts in software bug liability?

Contracts can play a crucial role in determining liability for software bugs. A well-drafted contract can limit a developer’s liability and set out the responsibilities of both parties. However, the effectiveness of a contract in protecting a developer from legal action often depends on the specific terms and conditions, and the laws of the jurisdiction in which the contract is enforced.

What is the difference between a software bug and a software defect?

A software bug is a specific type of software defect. While a defect refers to any aspect of the software that does not work as intended, a bug refers to a defect that is the result of a coding error. However, in practice, the terms are often used interchangeably.

How can software bugs be prevented?

While it is impossible to completely eliminate software bugs, they can be minimized through robust testing and quality assurance processes. This includes unit testing, integration testing, system testing, and user acceptance testing. Code reviews and pair programming can also help to identify and fix bugs early in the development process.
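The practices above are easiest to see at the unit level. Below is a minimal sketch in Python using the standard `unittest` module; the `parse_price` function and its expected behaviour are hypothetical, invented purely to illustrate how a unit test pins down intended behaviour before release:

```python
import unittest

def parse_price(text):
    """Convert a price string such as "$4.99" to cents.
    Hypothetical function, deliberately simplified for illustration."""
    cleaned = text.strip().lstrip("$")
    dollars, _, cents = cleaned.partition(".")
    return int(dollars) * 100 + int(cents or 0)

class ParsePriceTest(unittest.TestCase):
    """Each test captures one behaviour, so a regression fails loudly."""

    def test_whole_dollars(self):
        self.assertEqual(parse_price("$5"), 500)

    def test_dollars_and_cents(self):
        self.assertEqual(parse_price("$4.99"), 499)

    def test_surrounding_whitespace(self):
        self.assertEqual(parse_price(" $1.50 "), 150)

if __name__ == "__main__":
    unittest.main(exit=False)
```

Tests like these won't prove the absence of bugs (this simplified parser still mishandles "$1.5", returning 105 rather than 150), but they catch regressions cheaply and early, which is exactly the point made above.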

What is the impact of software bugs on businesses?

Software bugs can have a significant impact on businesses. They can cause systems to crash, result in data loss, and lead to security vulnerabilities. This can result in financial loss, damage to reputation, and potential legal action. Therefore, it is crucial for businesses to invest in robust testing and quality assurance processes.

How are software bugs classified?

Software bugs are typically classified based on their severity and impact. This includes critical bugs that cause systems to crash, major bugs that significantly impact functionality, minor bugs that have a limited impact on functionality, and cosmetic bugs that do not impact functionality but may affect the user experience.
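As a rough sketch of how such a classification might be represented in code (the names and ordering here are illustrative, not an industry standard), a severity scale also yields a natural triage order:

```python
from enum import Enum

class Severity(Enum):
    """Illustrative severity scale; real bug trackers define their own."""
    CRITICAL = 1  # crashes the system or corrupts data
    MAJOR = 2     # significantly impairs functionality
    MINOR = 3     # limited impact on functionality
    COSMETIC = 4  # affects only appearance or user experience

def triage(bugs):
    """Sort bug reports so the most severe are handled first."""
    return sorted(bugs, key=lambda bug: bug["severity"].value)

# Hypothetical backlog of bug reports
backlog = [
    {"id": 101, "severity": Severity.COSMETIC},
    {"id": 102, "severity": Severity.CRITICAL},
    {"id": 103, "severity": Severity.MINOR},
]
print([bug["id"] for bug in triage(backlog)])  # → [102, 103, 101]
```

Encoding severity as ordered values means prioritisation falls out of a simple sort, rather than ad hoc judgement calls at triage time.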

What is the role of software testing in identifying and fixing bugs?

Software testing plays a crucial role in identifying and fixing bugs. It involves checking the software to ensure it works as intended and identifying any defects. Once bugs are identified, they can be fixed and retested to ensure the fix is effective.

What are some examples of software bugs that have led to legal action?

There have been several high-profile cases of software flaws leading to legal consequences. For example, in 2015 a remotely exploitable vulnerability in the Jeep Cherokee’s connected entertainment system led Fiat Chrysler to recall 1.4 million vehicles. In another case, bugs in the software of the Therac-25 radiation therapy machine caused massive radiation overdoses that killed patients and led to lawsuits against the manufacturer.

Craig Buckler

Craig is a freelance UK web consultant who built his first page for IE2.0 in 1995. Since that time he's been advocating standards, accessibility, and best-practice HTML5 techniques. He's created enterprise specifications, websites and online applications for companies and organisations including the UK Parliament, the European Parliament, the Department of Energy & Climate Change, Microsoft, and more. He's written more than 1,000 articles for SitePoint and you can find him @craigbuckler.
