While self-driving cars are designed to help reduce roadway accidents in Illinois and across the rest of America, some autonomous technology may carry an inherent safety flaw: the programming, written by humans, directs the vehicles to drive as humans would.
One notable computer science professor asserts that this attempt to be "human" is the crux of the problem with autonomous vehicles. Because humans make mistakes, the programming often copies unsafe driving behaviors.
He states that the autonomous vehicle industry is attempting to ensure safety while providing a human-like driving experience. In his view, treating human driving as the standard is the very reason safety is at risk.
According to the professor, autonomous vehicles should only be allowed to travel at speeds that let them come to a complete stop within their range of vision. That way, if an obstruction suddenly appears, the vehicle can stop before a collision occurs.
Humans and autonomous vehicles are held to very different standards when it comes to driving. Humans are known to be fallible, and when they cause an accident, it is viewed as regrettable but expected. Self-driving vehicles, however, are expected to make no mistakes, and when autonomous technology is responsible for an accident, it reflects poorly on the entire industry. The professor says this double standard is necessary to encourage safer programming.
A personal injury attorney may litigate to obtain financial compensation for clients who have been injured in accidents caused by autonomous vehicles. Damages may be sought from multiple parties, including the vehicle manufacturer.