
Self-Driving Tesla and Uber Fatalities… Is This a Sign?

With Tesla facing a lawsuit over a fatal crash last month and Uber back in the frame for all the wrong reasons, are we looking down the barrel of what’s to come when self-driving vehicles are commonplace on our roads?

Should these accidents be cause for widespread concern, or can consumers and manufacturers alike keep faith in a technology we could soon be so reliant on?

The last few months have witnessed two horrific accidents seemingly attributable to flawed autonomous technology. In one instance, Tesla are denying any responsibility despite an ongoing investigation by the NTSB (America’s National Transportation Safety Board). In the other, after a pedestrian was killed by a self-driving Uber vehicle, the technology giant voluntarily suspended all of their driverless test programmes.

Tesla Autopilot California Crash

On March 23rd, a new Model X crashed into a concrete roadside barrier, fatally injuring the driver. The data Tesla has so far been able to access confirms that the Autopilot function was enabled at the time of the accident.
That could, and possibly should, have been the deciding factor as far as responsibility goes. However, Tesla then released a statement which, while offering condolences for the family’s loss, essentially claimed that the victim, whose hands were not detected on the wheel in the six seconds before impact, “wasn’t paying attention to the road, despite the car providing multiple warnings to do so”.

Where is the line between consumer and manufacturer responsibility?

When quizzed, Tesla have rejected any proposals to supply further in-depth data relating to the accident, including how many times the vehicle actually issued alerts to the driver prior to impact. Interestingly, the company also refused to comment when asked how many alerts should occur before manual control re-engages.
Elon Musk’s company has since been removed from the investigation over a row with the NTSB about the amount of information released in the aftermath of the accident.

Uber Self-Driving Crash

While there have been previous accidents involving cars using self-driving technology, the incident concerning Uber’s driverless vehicle is thought to be the first in which a pedestrian was killed. On March 18th, an autonomous Uber failed to slow down and avoid a collision whilst the self-driving feature was enabled.

The vehicle involved was a Volvo. Given the vast sums invested in their driverless programme, it’s more than likely Uber will resume testing in an effort to roll out the technology en masse. However, a former Uber employee has controversially claimed that the technology had safety weaknesses, some of which were avoidable. If this concerned employee turns out to be a whistleblower whose testimony stands up to cross-examination, could we be looking at the first charge of corporate manslaughter relating to a driverless technology company?
Following the fatality, the company released a statement, “Our cars remain grounded, and we’re assisting local, state and federal authorities in any way we can.”
By suspending their driverless programme in Arizona (where the accident took place), Uber have arguably acknowledged the potential for a degree of responsibility. The US National Highway Traffic Safety Administration and the National Transportation Safety Board have begun a thorough investigation, with which Uber claims to be co-operating. However, the company has so far declined to give its account of the accident.

Is a special licence needed for automated tech in vehicles?

Some parties have raised the interesting prospect of introducing a licence for users of this particular technology, rather like the distinction between manual and automatic gearbox licences. This could be an option for cutting risks even further: the accompanying training would make drivers aware of the technology’s limitations and capabilities, both in normal road conditions and in hazardous circumstances, equipping them with vital knowledge before they operate vehicles with driverless technology installed.

Aftermath…

In what is probably the most remarkable action taken by a company after these two incidents, Toyota have also suspended their driverless vehicle tests, stating they were concerned about the “emotional effect” the recent incidents would have on their test drivers. With a launch of their automated driving tech originally forecast for 2020, Toyota made it clear that this is a short pause and that they will continue testing in other countries.

Nvidia, whose technology Uber have been using for development, have also suspended testing, just as they announced the creation of a “cloud-based system” for testing driverless vehicles. Late last week, Elon Musk declared that Tesla’s Autopilot system will “never be perfect” at preventing accidents, and that it was never designed to replace humans behind the wheel of a car.

In conclusion, it seems that Autopilot and self-driving tech still have some big challenges to overcome. Will overcoming them fall to drivers or to the tech companies? Only time will tell. But such accidents will only fuel the growing debate on ethics and driving standards when operating new technology.