Are Self-Driving Cars Dangerous?

Despite the promise of driverless cars, the technology is not perfect. Driverless cars are not prepared to stop for jaywalking pedestrians, and their computer vision systems are “deeply flawed.” In addition, lithium-ion batteries are highly combustible, which is a problem for electric self-driving cars. As a result, these vehicles will likely require human intervention in order to stay safe.

Driverless cars aren’t prepared to stop for jaywalking pedestrians

Uber and other companies developing driverless cars have expressed concerns that the technology could burden pedestrians, according to reports cited by an experienced self-driving car accident lawyer. At the same time, driverless cars pose a novelty problem: people may not behave as naturally around them as they do around human-driven cars. Waymo, for example, has encountered pedestrians who purposefully jaywalk in front of its vehicles. Incidents like these will only become more common as autonomous cars are deployed more widely.

Until driverless cars are fully developed, a number of safety issues will likely remain. Some people assume these vehicles will play nice with pedestrians and other cars, but that assumption may be false. Some tech companies plan to program their vehicles to stop for pedestrians who “misbehave.”

Computer vision systems are “deeply flawed”

The underlying problem with computer vision systems lies in how they integrate sensor data to form a world model. This is a major hurdle for driverless cars, which rely on computer vision to identify objects around them. Unfortunately, these systems remain “deeply flawed” in many respects. While it is possible to train computer vision systems to be more accurate, the current standard of accuracy rests on flawed research.
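To make the integration problem concrete, here is a minimal, purely illustrative sketch of how a perception stack might merge camera and lidar detections into a single world model. The class names, matching rule, and distance threshold are assumptions for illustration only, not any vendor’s actual pipeline.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    """A single object hypothesis from one sensor (illustrative only)."""
    label: str         # e.g. "pedestrian", "vehicle"
    x: float           # distance ahead of the car, metres
    y: float           # lateral offset, metres
    confidence: float  # 0.0 - 1.0

def fuse(camera: list[Detection], lidar: list[Detection],
         max_gap: float = 1.0) -> list[Detection]:
    """Naively merge detections that lie within `max_gap` metres of each other.

    Real systems use calibrated projection, tracking, and probabilistic
    association; this sketch only shows how a flawed input from one sensor
    can propagate into the fused world model.
    """
    fused = []
    unmatched_lidar = list(lidar)
    for cam in camera:
        match = next((l for l in unmatched_lidar
                      if abs(l.x - cam.x) < max_gap and abs(l.y - cam.y) < max_gap),
                     None)
        if match:
            unmatched_lidar.remove(match)
            # Agreeing sensors raise confidence; the camera's label wins.
            fused.append(Detection(cam.label,
                                   (cam.x + match.x) / 2,
                                   (cam.y + match.y) / 2,
                                   max(cam.confidence, match.confidence)))
        else:
            # Camera-only detection: could be a real object or a vision error.
            fused.append(cam)
    fused.extend(unmatched_lidar)  # lidar-only detections survive as-is
    return fused
```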

Because these systems rely on probabilistic reasoning and black-box algorithms, it is impossible to prove that they perform better than humans in the same circumstances, and it is difficult for engineers to judge whether an algorithm does better or worse than a human driver in a given situation. Moreover, because the outputs are probabilistic, the systems are not exhaustively tested for errors of omission, commission, or misidentification. So even if an autonomous system does outperform a human under similar circumstances, it is still not a good idea to deploy it without a human behind the wheel.
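One way to see why probabilistic outputs complicate these safety arguments: a single detector confidence threshold trades misses (omissions) against false alarms (commissions), with misidentifications sitting in between. The scores and labels below are invented purely to illustrate the trade-off.

```python
# Hypothetical detector outputs (invented for illustration).
# Each tuple is (true_label, predicted_label, confidence).
predictions = [
    ("pedestrian", "pedestrian", 0.92),
    ("pedestrian", "pedestrian", 0.41),  # low-confidence but real pedestrian
    ("nothing",    "pedestrian", 0.55),  # phantom object
    ("pedestrian", "cyclist",    0.87),  # right object, wrong class
]

def tally(threshold: float) -> dict:
    """Count omissions, commissions, and misidentifications at one threshold."""
    counts = {"omission": 0, "commission": 0, "misidentification": 0, "correct": 0}
    for truth, pred, conf in predictions:
        if conf < threshold:
            # Detection discarded: a real object becomes a miss (omission).
            if truth != "nothing":
                counts["omission"] += 1
        elif truth == "nothing":
            counts["commission"] += 1          # false alarm
        elif truth != pred:
            counts["misidentification"] += 1   # misidentified object
        else:
            counts["correct"] += 1
    return counts

for t in (0.3, 0.5, 0.9):
    print(t, tally(t))
# Raising the threshold removes the phantom object but also drops the
# low-confidence pedestrian: no threshold drives every error count to zero.
```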

Lithium-ion batteries are highly combustible

Fires in electric vehicles have been linked to the lithium-ion batteries that power their electric motors. While lithium-ion batteries have generated high-profile fires in other applications, battery fires in electric vehicles such as Tesla’s remain rare and have so far caused little in the way of serious accidents or damage.

Moreover, lithium-ion batteries can ignite spontaneously, causing the car to burst into flames. These fires are difficult to extinguish and spread quickly. As electric vehicles become more widespread, such fires will likely become more common. To help manage this risk, manufacturers should rigorously test their lithium-ion batteries and install fire-fighting equipment.

They need human intervention to stay safe

“The truth is, self-driving cars still need human intervention to stay safe,” says mobility expert Jim McPherson. Waymo, Alphabet’s self-driving car division, reported that in 2018 its test drivers intervened 114 times, or about once every 11,017 miles. That’s far from a safe rate.
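As a quick sanity check on those figures, using only the numbers quoted above, 114 interventions at roughly one per 11,017 miles implies on the order of 1.25 million test miles for the year:

```python
# Derived solely from the figures cited in this article.
interventions = 114
miles_per_intervention = 11_017

implied_total_miles = interventions * miles_per_intervention
print(f"Implied total test miles in 2018: {implied_total_miles:,}")
# Implied total test miles in 2018: 1,255,938
```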

The report was released before any other company began testing self-driving cars, and it showed wildly varying results since on-road testing began in September 2014. Although Google conducted most of this testing itself, the report cautioned that the tests were mostly run in ideal weather conditions, so the data could easily be misinterpreted.