The Ethics and User Experience Behind Programming Cars
Cars have come a long way from the days of the Ford Model T. While it used to be enough for a car to have Bluetooth and phone integration, software now controls the entire driving experience, from entertainment systems to engine operations and safety systems. Technology has led to innovations such as intelligent cruise control, parallel parking assist and even automatic overtaking, yet none of these are as impressive as the self-driving features Google's prototypes and some Tesla vehicles sport.
Programming cars used to be limited to engineers with years of experience in the automotive industry. Today, with advanced programming interfaces, coding for cars is much more accessible. It's still not as easy as coding a website, but at least you no longer need to master embedded systems programming to modify a car. Unfortunately, that's a whole different topic beyond the scope of this article. Instead we're going to focus on a few non-technical challenges to programming cars.
Self-driving cars are supposed to be safer than traditional cars, yet the recent batch of Google cars is proving to be too cautious. From cyclists sending the cars haywire to vehicles being rear-ended at traffic lights, programming cars to drive the way humans expect remains a major hurdle for these machines.
One of the biggest challenges to self-driving cars can't be solved by technology. Nothing in life is foolproof, and as self-driving cars gain popularity, one of the most pressing issues is how the car should handle worst-case scenarios. The MIT Technology Review uses the example of a self-driving car that loses control and heads toward a crowd of people. The car could plow into the crowd or swerve into a wall. It's easy to say it's best to sacrifice the driver to save more lives, yet you probably wouldn't have that attitude if you were behind the wheel.
People aren't going to buy cars that turn them into martyrs, yet people don't want to be killed by out-of-control vehicles either. The decision is tough enough in a traditional car, but in a self-driving car it's the programmers who determine who lives and dies. Right now that question remains unanswered, but it illustrates just how much code can impact lives.
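To see how moral judgment ends up baked into code, consider a deliberately oversimplified sketch. Every name, number and rule here is a hypothetical assumption for illustration, not any manufacturer's actual logic:

```python
# Hypothetical sketch only: the outcomes, casualty estimates and
# selection rule below are invented for illustration. No real
# autonomous-driving system is this simple.
from dataclasses import dataclass


@dataclass
class Outcome:
    label: str
    expected_casualties: int
    occupant_at_risk: bool


def choose_maneuver(outcomes):
    """Pick the outcome with the fewest expected casualties.

    Even this one-line policy encodes a moral stance: it weighs the
    occupant's life exactly the same as everyone else's. A different
    sort key would encode a different ethics, and a programmer has
    to choose one either way.
    """
    return min(outcomes, key=lambda o: o.expected_casualties)


crash_options = [
    Outcome("continue into crowd", expected_casualties=3, occupant_at_risk=False),
    Outcome("swerve into wall", expected_casualties=1, occupant_at_risk=True),
]
decision = choose_maneuver(crash_options)
print(decision.label)  # -> swerve into wall
```

The uncomfortable point is that the "ethics" lives in one `min()` call. Swap the key for one that always protects the occupant and you've built the car people would actually buy, at everyone else's expense.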
Intentionally Hiding Defects
Car manufacturers can now cover up gaps in safety simply by adding code to their systems. Volkswagen recently came under fire because its diesel engines detected emissions-test conditions and switched to a cleaner operating mode only while being tested. Although emissions aren't directly a safety issue, if you can program a car to fool emissions tests, what's stopping a programmer from hiding defects in engine sensors? Drive-by-wire means virtually all the mechanics within modern cars are controlled by code: from the throttle to the brakes, everything runs through digital signals.
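The troubling part is how little code such a cover-up takes. The sketch below is a hypothetical illustration of the pattern, with invented signal names and thresholds; it is not Volkswagen's actual implementation, though real defeat devices reportedly used similar heuristics such as steering input and wheel speed:

```python
# Hypothetical illustration of a "defeat device" pattern.
# Signal names and thresholds are invented for this sketch.

def emissions_test_detected(steering_angle_deg: float, wheel_speed_kph: float) -> bool:
    """On a dynamometer the wheels spin while the steering wheel barely moves,
    which is rare in normal driving."""
    return wheel_speed_kph > 20 and abs(steering_angle_deg) < 1.0


def select_engine_map(steering_angle_deg: float, wheel_speed_kph: float) -> str:
    # The entire deception is one branch: the car behaves one way when
    # it believes it is being watched, and another way on the road.
    if emissions_test_detected(steering_angle_deg, wheel_speed_kph):
        return "low-emissions map"
    return "performance map"
```

A single `if` statement is enough to make a car lie to its regulators, which is exactly why the software in drive-by-wire systems deserves the same scrutiny as the hardware.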
Whether the car is self-driving or traditional, you can't cut corners when writing code for vehicles. In a world where a sliding floor mat can cause cars to accelerate uncontrollably, faulty code won't necessarily be any easier to correct. That's the biggest concern here: physical defects in cars cause enough trouble, and fixing faulty code could require far more effort than installing a hook to hold a floor mat in place.
Technology Is Ahead of Its Time
Autonomous cars might be safer than human drivers, yet the recent launch of Tesla's Autopilot feature illustrates what happens when drivers are given powerful features without the right training. For example, automatic steering isn't useful if it nearly carries you off a highway offramp. This isn't a case of programming gone wrong; it's a case of drivers overwhelmed by new technology. Just because you can code something doesn't mean it's worth doing.
When it comes to traditional apps, the worst that new features can do is confuse users into switching to something else. When you're programming a car, overwhelming users can have fatal results. Even the user experience of in-car entertainment systems plays a huge role in distracted driving. In automotive technology it's best to add only essential features, rather than shipping half-baked code (or novelty features) to make your car sexier.
The Silver Lining
Of course, there are plenty of positives to cars becoming more advanced. Self-driving cars could virtually eliminate car accidents, while advanced in-car sensors make safety systems more robust. Technology isn't a bad thing; it's simply a new field that hasn't been fully explored. Until the space matures, it's best to keep your development cycles moving at a modest pace, especially with driver-assist technologies. The Google cars show that self-driving vehicles are practical; as the Tesla Autopilot launch shows, it's really a matter of training the driver.
What are your thoughts?