
Words Matter When It Comes to Safety of Autonomous Vehicles

The phrase “self-driving car” may conjure the ability to check email, read, get some work done, or even enjoy a morning coffee and bagel, all while safely riding to work. It’s a dream come true to reclaim some of that time every day, especially for workers with long commutes. Unfortunately, that terminology is quite misleading – and it’s drawing increased attention.

Companies like Waymo have realized that labeling these vehicles as self-driving is essentially false advertising that can create a false sense of security and can potentially end in a car accident. By giving the consumer the impression that the vehicle doesn’t require any assistance or attention to be paid while in operation, these companies are opening themselves up to excessive liability. From a safety standpoint, it’s simply not true that you can climb in, sit back, and become a distracted driver.

Drivers must pay attention no matter the advertising language used

Advertising language is meant to be powerful and inspire action, but in this case it risks worsening the distracted driving problem. Waymo has acknowledged that while the company is currently developing a fully automated vehicle, it’s not quite there yet – and neither is anyone else. The company has noted that when any car maker uses the term “self-driving,” ride-share customers and car enthusiasts alike will assume the vehicle is safe for a completely hands-off trip to the store or even across the country.

Tesla has been using the term “full self-driving” to describe its autonomous commercial trucks, which has drawn concern even from the company’s backers. The Owner-Operator Independent Drivers Association took issue with the National Highway Traffic Safety Administration (NHTSA) for not exercising any oversight over the specific language used by Tesla. The point the association attempted to drive home was that the terminology creates a danger not only to cars but also to other truck drivers. If everyone on the road assumes these trucks run on full autopilot, drivers will let down their guard and more collisions could follow.

The federal government needs to step into the autonomous driving arena

Because self-driving technology is still so new, the federal government has not taken the much-needed steps to regulate its use, or even to officially classify the different levels of automation. According to the Insurance Institute for Highway Safety (IIHS), states have primarily created very basic laws covering:

  • The type of driving automation permitted on public roads
  • Whether the operator requires licensing
  • Whether the operator must be inside the vehicle
  • The requirement to hold liability insurance

That means safety is left solely in the hands of those who stand to profit from the new technology – the manufacturers – a potentially dangerous situation.

How Georgia measures up when it comes to autonomous vehicle safety

According to IIHS, “Georgia does not require a licensed operator for a ‘fully autonomous vehicle’ when the ‘automated driving system’ is engaged.” In other words, no driver’s license is required.

The state also permits semi-trucks to travel through the state without a human onboard. Under the terminology used by Tesla, this means the company is perfectly within the confines of the law to operate on the state’s roadways, and that unlicensed drivers could be behind the wheel. In addition, when it comes to liability insurance, owners of self-driving vehicles in Georgia are only required to carry the basic minimum limits.

If you or a loved one has been injured by an autonomous vehicle, Harris Lowry Manton LLP can help. Schedule your free case evaluation today with one of our attorneys by calling our Atlanta office at 404-998-8847, contacting our Savannah office at 912-417-3774, or filling out our contact form.
