In May, the first known fatality in the U.S. involving a self-driving car occurred in Florida. The driver of a Tesla Model S, who was using the vehicle’s automated driving system at the time, died after a collision with a truck. The government is now investigating the design and performance of Tesla’s automated driving system.
The crash happened southwest of Gainesville, when a tractor-trailer made a left turn in front of the Tesla at a divided-highway intersection that had no traffic light. The Tesla driver died of the injuries he sustained in the collision.
According to Tesla, neither the driver nor the automated driving system noticed the white side of the trailer, which was perpendicular to the vehicle, against the brightly lit sky, and neither applied the brakes. Because the trailer rode high and was positioned across the road, the car collided with the bottom of the trailer.
Tesla says this is the first known death in over 130 million miles driven with its automated driving system engaged. The National Highway Traffic Safety Administration is investigating whether the system worked as expected. Tesla says that before its driving system can be used, drivers have to acknowledge that the system is an “assist feature” only and requires the driver to keep both hands on the wheel at all times. Drivers also have to maintain control of and responsibility for the vehicle while it’s in operation, and must be prepared to take over driving at any time.
Tesla’s automated driving system, called Autopilot, frequently checks that the driver’s hands are on the wheel. If no hands are detected, it gives visual and audio alerts and slows the car until the driver responds. According to Tesla, although Autopilot is not perfect, when it is used together with driver oversight it reduces the driver’s workload and improves safety.
This death comes at a time when the National Highway Traffic Safety Administration is taking steps to ease the way onto the nation’s roads for self-driving cars. Although no technology will ever prevent accidents entirely, self-driving cars are expected to have a huge positive impact on safety, since they will eliminate human error, which is responsible for almost 95 percent of accidents.
Currently, the law is not prepared for what happens when a self-driving car is involved in an accident. Obviously, if a driver makes an error and causes an accident, the individuals who are harmed can sue. However, if an accident occurs because of a mistake made by a self-driving car, it’s not clear who is responsible. If an individual is required to monitor what the self-driving system does, as was the case in this accident, and fails to respond correctly, the individual may be found at fault. In some cases, the driver and the computer could share responsibility.
The law in the area of self-driving technology will be forced to evolve over the next decade or so. Regardless of what happens in this area, if a person is injured in an automobile accident through the fault of another driver or a computer, that individual deserves compensation. If you have been injured in an accident in the San Francisco Bay area, call the San Francisco automobile accident attorneys at Liberty Law at 415-896-1000 or 510-645-1000. We can help. You may be entitled to compensation for your damaged vehicle, lost wages, pain and suffering, and more. Call us today to learn more about your legal rights or to schedule your free consultation.