By: Eric Chaffin
According to recent media reports, Apple’s self-driving car program, “Project Titan,” experienced its first reported accident in August 2018. No one was hurt in the crash, but both vehicles suffered moderate damage. Though Apple has largely kept its program secret, it is required by law to report any crashes occurring on public roads in California.
Apple’s Autonomous Fleet Suffers First Crash
According to Phys.org, on August 24th in Sunnyvale, California, Apple’s self-driving Lexus RX450h SUV was traveling at less than one mile per hour. The weather was clear, and road conditions were normal.
The car was preparing to merge onto Lawrence Expressway, “waiting for a safe gap” according to the report, when it was rear-ended by a human-operated Nissan Leaf traveling at about 15 miles per hour. There are no details as to whether the Apple vehicle was going too slowly for the conditions.
Biz Journals reports that Apple first put its self-driving cars on the road in late 2017. As of September 2018, Apple was testing 70 self-driving cars with 139 safety drivers in the state, making it the third-largest company testing autonomous fleets there, behind General Motors Co. and Waymo (owned by Alphabet Inc.). GM now has 175 cars and 467 drivers in its self-driving fleet.
So far it’s unclear what Apple’s end goal is: whether it plans to build luxury cars with cutting-edge features or to create an operating system that it licenses to other car companies.
Self-Driving Cars Don’t Mimic Human Behavior
Self-driving cars being rear-ended may be a trend, according to a BBC report. Waymo has reportedly run into problems because of what might be called “over-cautious” driving: its vans have been rear-ended in scenarios where human drivers would normally continue through, because the vans stopped dead.
In October 2017, Consumer Affairs reported that the technology powering autonomous vehicles does not yet mimic human driving behavior, which makes their actions difficult for other drivers to predict. Though GM and other companies are quick to blame human drivers for “failing to maintain an adequate stopping distance,” Dr. Phil Koopman, a software engineer and Carnegie Mellon professor, said it is not that simple.
In another report by The Information, more than 20 people familiar with Waymo’s self-driving cars in Phoenix said the cars were prone to sudden stops, jerky movements, and driving so overly cautious that it became hazardous. One woman noted that she nearly ran into a Waymo van because it stopped suddenly in the middle of an intersection.
Some more serious accidents have been reported. In March 2018, an Uber car operating in autonomous mode hit and killed a woman in Tempe, Arizona. The woman was allegedly walking outside the crosswalk when the accident occurred.