Thursday, March 28 | Royal Holloway's official student publication, est. 1986

Running On Autopilot

Driverless cars are taking the world by storm. Self-driving vehicles take different forms, with the most famous likely being those Uber previously trialled in cities like Pittsburgh, Pennsylvania and Tempe, Arizona. Self-driving cars are being developed by companies all over the world, including Tesla, Uber, GM, Google’s Waymo and more. But with more and more accidents involving driverless cars, and even a recent death, can these technologically advanced vehicles really be trusted on the roads?

According to the World Health Organization, over 1.25 million people die from “road traffic crashes” each year, while an estimated 20–50 million more are injured non-fatally. A 2008 study conducted by the National Highway Traffic Safety Administration in the US estimated that at least 94% of traffic accidents are caused by human error. So, the aim of driverless cars is to solve this widespread problem by removing the unpredictable element of people from driving. The idea is that, without the fickle actions of someone turning the wheel suddenly or braking in the middle of a motorway without warning, roads will be much safer and accidents will occur less frequently.

It is an admirable goal from the technology and car giants of the world. But there have been many incidents involving self-driving cars in recent months, with deaths connected to both Uber’s and Tesla’s self-driving cars.

Uber made headlines recently when one of their self-driving cars struck a pedestrian in Tempe, Arizona, where Uber was testing their driverless cars. The car in question carried a “safety driver”: someone hired to sit behind the wheel and watch for any unexpected issues that might force her to take control of the car herself. The video released of the interior and exterior view showed that the “safety driver” was not looking at the road at the time of the incident, but equally, the pedestrian seemed to come out of nowhere. It is unclear who was truly at fault and whether a human driver could have prevented the death. But it does raise questions about the safety of driverless cars, whose technology may be fallible in detecting hazards appearing on the roads at any given time.

Meanwhile, Tesla, headed by the lauded Elon Musk, is being investigated over the recent crash of a Model X that was running in autopilot mode. The crash killed the owner of the car, Walter Huang, after the vehicle hit a barrier on a Silicon Valley freeway and subsequently caught fire. The National Transportation Safety Board (NTSB) is also investigating Tesla over two other crashes that occurred in 2017. A preliminary review by the Minami Tamaki law firm suggests that the autopilot’s navigational system did not function correctly and failed to detect the necessary hazards.

Now, driverless cars are also hitting the UK, with prototypes being tested in Milton Keynes by the UK Autodrive consortium. The government has invested at least £250m into researching driverless cars, hoping to bring them into the mainstream in the UK.

Clearly, while driverless cars appear to be revolutionary, the errors in their technology are definite cause for concern. And they shouldn’t be trusted on the roads until we are sure that incidents like the ones in California and Arizona won’t happen again.