Rethinking Automated Technology After Tesla Car Accidents
Is automated technology in automobiles actually making us safer on the roads in and around Chicago? While some forms of automated technology, such as automatic brakes or certain crash-avoidance systems, might help to limit traffic collisions caused by human error, recent Tesla car accidents suggest that fully automated cars might not yet be completely safe for everyday use. Indeed, according to a recent article in Reuters, federal regulators have voiced new concerns about the dangers of self-driving cars in the wake of another Tesla accident.
Additional Tesla Accident Overseas Raises Concerns About Self-Driving Cars in America
When you are headed to work from your home in Aurora or hop into your automobile to take a quick trip to the supermarket, would you be safer if you were in a self-driving car? According to the article, another Tesla self-driving car has crashed. This time, the accident occurred in Beijing, China, but it is serving as a strong reminder of the relatively recent fatal crash in the U.S. that resulted when a man’s Tesla collided with a semi-truck.
In case you have not heard about the accident in Florida, here is a brief recap: a man was driving a Tesla Model S with the car’s “Autopilot” engaged, according to a report from CNBC. While these vehicles are not “fully autonomous,” they are intended to “provide assistance while driving” and to prevent accidents caused by human error, including those resulting from distracted driving. Investigators believe that Tesla’s Autopilot failed to recognize the white side of the semi-truck’s trailer, possibly treating it as an overhead object such as a highway sign, and thus did not brake before the car crashed into the big rig. The Tesla driver sustained fatal injuries in the collision.
Now that another collision has occurred in Beijing, analysts are contending that “Tesla Motors needs to better clarify how its ‘autopilot’ feature works,” or else more accidents may occur, according to the CNBC report. In the Beijing crash, a driver engaged the Autopilot system and the car then struck a parked vehicle.
Although the vehicle sensed that the driver’s hands were not on the wheel, the “autosteer” assist feature did not kick into action. Tesla representatives responded to reports of the crash, emphasizing that the autosteer assist requires drivers to keep their hands on the wheel at all times. Yet drivers who have purchased these vehicles feel differently. As the accident victim explained, “the impression they give everyone,” meaning Tesla, “is that this is self-driving, this isn’t assisted driving.”
Is Autonomous Technology Really Making Us Safer?
As the Reuters article and the CNBC report underscore, Tesla insists that vehicles equipped with its Autopilot system crash less often than vehicles without any automated technology. Yet the automaker has come under fire for crash-avoidance technology that does not always function as drivers expect, and for failing to make clear to drivers how to properly use these features. One analyst argued that Tesla’s advertising has been misleading and that the automaker needs to change its approach before more accidents happen.