Researchers at the University of Michigan’s new M-City are beginning to test driverless cars in a city setting. Cityscapes, it turns out, are among the most difficult environments to program an autonomous vehicle to handle. Errors in that programming could lead to a rash of lemon law injury cases in which drivers, passengers, and pedestrians suffer the consequences of an autonomous vehicle’s poor decisions.
Engineers and developers from Ford, GM, and Nissan are eager to begin testing their driverless cars on the University of Michigan’s new cityscape simulation, called M-City. The 32-acre track simulates urban and suburban driving conditions, complete with intersections, traffic signs, and pedestrians, and allows researchers to develop the technology needed to make driverless cars safe. Engineer Dillon Funkhouser of the University of Michigan Transportation Research Institute told NPR’s Jason Margolis:
“It’s the kind of thing you would drive on in a car and not even think twice about it, but if you’re an automated vehicle, something like that looks very scary.”
The upside for the driverless car industry is huge. Once fully developed, autonomous vehicles could provide mobility and independence to the disabled and the elderly who might not otherwise be able to operate a vehicle safely. That is particularly true in places like Michigan, where public transportation is often inadequate to meet non-drivers’ needs. To meet those needs, manufacturers must take the time to properly develop autonomous driving programs and make sure they put safety over profits.
Safety concerns include not releasing this technology into the marketplace prematurely. If stop signs and pedestrians don’t seem scary to you, consider what happened earlier this year when spectators gathered to watch a Volvo self-parking car demonstration.
(Video Source: Dimelo Dominicano)
The car, with passengers inside, reversed, stopped, and then sped forward into a crowd of onlookers, injuring two. Here’s what Volvo spokesperson Johan Larsson had to say when Fusion asked about the incident:
“It seems they are trying to demonstrate pedestrian detection and auto-braking. . . . Unfortunately, there were some issues in the way the test was conducted.”
According to Larsson, this particular Volvo XC60 came with the “City Safety System,” a standard feature that automatically brakes to prevent collisions in stop-and-go traffic. The owner had not paid extra for the “pedestrian detection functionality,” which could have triggered the automatic brakes and stopped the car from hitting the onlookers. Larsson also noted that even the pedestrian detection functionality can be overridden if the driver accelerates toward the pedestrians.
Once driverless cars hit the streets, these kinds of incidents could become more common. Sometimes, as with the Volvo driver, motorists won’t know what their semi-autonomous cars will or won’t do in a given situation. Other times, flaws in a driverless car’s programming and detection systems will put people in and near the vehicle at risk.
As drivers do less of the work and the vehicle does more, liability could shift from at-fault drivers to auto manufacturers. Developers like Ford, GM, and Volvo have a duty to make certain the cars they put on the road are safe. “Bugs” in autonomous driving systems can be dangerous, even fatal, and when something goes wrong these manufacturers could be on the hook for hundreds of thousands of dollars in personal injury damages.
Dani K. Liblang has been handling lemon law cases for over 30 years. She knows what to do when a car is defective and how to help clients recover when that defect causes injuries or wrongful death. If you or someone you know has suffered a serious accident because of an unsafe vehicle, contact The Liblang Law Firm, P.C., for a free consultation.