Tesla Model S – An Insurer’s Nightmare?
We’ve all heard the buzz and witnessed the excitement over Tesla’s Model S, a technological marvel whose “autopilot”-type feature has left many confused about what it will and won’t do for the driver. The term “autopilot” does not appropriately apply, since this function in the Tesla helps the driver of the vehicle rather than taking over for the driver of the vehicle.
To be accurate, the technology is about assisting the driver, not replacing the driver. The “autopilot” function, when engaged, helps keep the car centered in the lane and keeps pace with the car ahead, but pulls back when you attempt a lane change while a car is in the adjacent lane. Drivers are still required to stay alert, watch the road, and keep their hands on the wheel.
First Fatal Crash
In May of this year, Joshua Brown, a Tesla Model S owner, became the first fatality in over 130 million miles driven on Autopilot when his Tesla Model S failed to brake appropriately and slammed into the side of a tractor-trailer. Tesla immediately notified the National Highway Traffic Safety Administration (NHTSA), which promptly opened an investigation.
After its preliminary investigation, Tesla is claiming “pilot error,” suggesting Mr. Brown may have been distracted, but further investigation reveals conflicting accounts between the carmaker and the official police report. The semi driver told police that he heard a Harry Potter movie playing on the car’s audio player when he approached the vehicle immediately after the crash. The police report, however, makes no mention of his statement.
The Black Box
Tesla Motors goes to great lengths to record data in its vehicles, using event data recorders (EDRs) to document vehicle speed, seat belt usage, pedal position, and whether the driver’s hands were on the steering wheel immediately before a crash. This data can be used in court if a liability action is brought against the carmaker after a crash. To date, no claims are pending against Tesla, but insurers certainly remain at risk.
In fact, when responding to a separate crash event in Pennsylvania, Tesla claimed that data from the vehicle’s EDR indicated the crash could have been avoided had the Autopilot system been engaged. That assertion certainly flies in the face of claims that the Autopilot system can cause a crash.
Historical Pilot Errors
Seasoned agents may recall the oft-told story of Mrs. Merv Grazinski, a tale that is widely regarded as an urban legend. Mrs. Grazinski, an Oklahoma City resident, purchased a 32-foot Winnebago motor home that she intended to use for attending OU football games. On her first trip with her new Winnebago, Mrs. Grazinski was driving on the highway when she engaged the cruise control at 70 miles per hour, left the driver’s seat to make a sandwich, and promptly crashed the vehicle.
Mrs. Grazinski sued the manufacturer for failing to state in the vehicle manual that it would be dangerous to leave the driver’s seat while the cruise control was engaged. As the story goes, the court found in her favor, awarded Mrs. Grazinski $1.75 million, and Winnebago immediately added the cruise control warning to its manual and gave her a new motor home.
The Jury’s Still Out
From an agent’s perspective, we know “self-driving” cars are coming. How insurers will respond remains to be seen. The biggest event will be when a company like Amazon or FedEx decides to deploy thousands of these technological wonders into the marketplace. Those “in the know” say the event is already on the drawing board, and we are closer than ever before to having 4-wheel and 18-wheel drones on the highways.
Whether we consider passenger vehicles on autopilot or driverless semi-trucks, the jury is still out on whether large commercial insurers will embrace or reject the idea of a 40,000-pound piece of technology screaming down our highways. When you think about it, aren’t most vehicle crashes the result of driver error? Remove the driver, eliminate the crash?