Texas motorists who have followed the development of autonomous cars may be aware that in March, a self-driving Uber car killed a pedestrian in Arizona. According to a professor at Arizona State University whose research includes examining how computers control systems such as autonomous cars, the safety problem with self-driving vehicles is that they are being taught to drive like humans. This means they make the same mistakes a human driver would.
He cites the fatal accident as an example. Video footage from that case shows the pedestrian stepping into an area with no lighting and no crosswalk. The vehicle proceeded the way a human-driven vehicle would: it assumed there were no obstacles ahead even though it could not confirm this visually. According to the professor, the car should instead travel at a speed that would allow it to stop the moment something appeared from outside its visual range. In other words, if it cannot detect whether an obstacle is ahead, it should proceed as though one were.
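The professor's rule can be sketched as a simple calculation: the car's speed should never exceed what basic stopping physics (v² = 2·a·d) allows for the distance it has actually confirmed to be clear. The function name and the assumed braking deceleration of 6 m/s² below are illustrative choices, not details from the article.

```python
import math

def max_safe_speed(sensing_distance_m: float, max_decel_mps2: float = 6.0) -> float:
    """Return the top speed (m/s) from which the car can still stop
    within the road it has confirmed clear, using v^2 = 2 * a * d.
    The 6 m/s^2 deceleration is an assumed, illustrative value."""
    return math.sqrt(2 * max_decel_mps2 * sensing_distance_m)

# If darkness limits the sensors to 20 m of confirmed-clear road,
# the car should slow to roughly 15.5 m/s (about 56 km/h):
speed = max_safe_speed(20.0)
```

Under this policy, a shorter confirmed-clear distance forces a lower speed, which is exactly the behavior the professor describes: treat unseen road as if an obstacle might be there.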
The professor is also working on developing autonomous systems for cars. He says his team is working on how to make it possible for a car to brake within a millisecond of detecting an obstacle.
It is predicted that self-driving cars will make roads considerably safer. However, accidents may happen in both autonomous and nonautonomous vehicles when a piece of equipment fails. For example, a part in a steering system, brake or airbag could malfunction and cause an injury. If this happens, the company that manufactured the product might be liable. A person injured in such an accident might want to contact an attorney to discuss how to document the incident and seek compensation through a products liability lawsuit.