In the past few years, automakers and technology firms have been racing to develop a safe autonomous vehicle. The companies involved include well-known automakers such as Mercedes, BMW, and Tesla, as well as tech firms like Google. The concept grew from the logic that computers should be able to operate vehicles more safely than humans, who frequently commit errors or drive unsafely. That premise has come under scrutiny after a deadly automobile accident involving a self-driving car.

The accident occurred on May 7 in Williston, Florida, and involved a Tesla Model S electric sedan. The driver of the Tesla was killed while the car was in self-driving mode. The National Highway Traffic Safety Administration said in a statement about the incident that a tractor-trailer made a left turn in front of the vehicle and the car failed to apply the brakes. This is the first known incident of a fatal crash in which the vehicle was driving itself by means of computer software.

The driver was identified by the Florida Highway Patrol as Joshua Brown, 40, of Canton, Ohio. Brown was a Navy veteran who owned a technology consulting firm. Tesla said in a statement on Thursday that Brown was a man who “spent his life focused on innovation and the promise of technology and who believed strongly in Tesla’s mission.” Brown had previously posted several videos of himself using the autonomous Tesla vehicle; in one, he applauded the technology for successfully preventing an accident involving his car.
The release of this story has been detrimental to Tesla’s efforts to expand its product line from pricey electric vehicles to more conventional models. It is still unclear whether the car, the driver, or both were to blame for the lethal accident. In a news release, the company said, “Neither autopilot nor the driver noticed the white side of the tractor-trailer against a brightly lit sky, so the brake was not applied.” Many critics of self-driving cars have cited this as evidence that computers cannot make the “split-second, life-or-death decisions” that humans often must.

Companies have been testing self-driving vehicles on private courses as well as on public roads. However, the technology does not appear to have been tested and developed enough for the government to sign off on autonomous cars. The National Highway Traffic Safety Administration has recently been working on new regulations concerning the testing of self-driving cars on public roads, which are anticipated to be released sometime this month.