Are Driver-Assistance Cars Safe?


On June 15, federal regulators released the first-ever report tallying car crashes involving driver-assistance technology.

The National Highway Traffic Safety Administration (NHTSA) broke the numbers down by company: 392 crashes involving vehicles with driver-assistance systems, plus another 130 involving self-driving cars, over a period of roughly 10 months. But instead of answering the question of whether the technology is safe, the report was mostly confusing.

Are cars with advanced driver-assistance system (ADAS) technology safer than cars without it? Or do they make our roads more dangerous? Even though vehicles using this technology in various forms are already driving on our roads, the report does not provide an answer.

The report itself makes no such claims; NHTSA says the numbers are simply a tool to help the agency detect defects as it considers regulations. Still, NHTSA appears to be responding to criticism in some quarters that it has not been assertive enough in regulating driver-assistance technology.

Tesla, with its Autopilot system, has dominated this space for years. Tesla crashes and deaths have been a hot topic ever since the first Autopilot fatality in 2016. To date, investigators have examined 35 crashes, including 9 that caused the deaths of 14 people. Of those investigations, only three concluded that Autopilot was not at fault.

NHTSA took a broader step in June 2021 with an order requiring manufacturers and operators to report crashes involving ADAS technology. This year's report details the responses to that order.

Five Levels Of Driving Assistance Technology

NHTSA is primarily interested in a driver-assistance category called SAE Level 2, one of five levels defined by the Society of Automotive Engineers. This category includes Tesla's Autopilot.

  • A Level 1 system includes a single feature, such as adaptive cruise control, that helps the driver maintain a safe distance behind the vehicle ahead.
  • A Level 2 system can control acceleration, braking, and steering at the same time, but the driver must be ready to grab the wheel and intervene if the system does not respond properly.
  • A Level 3 system can control the vehicle on its own but requires the driver to take over when prompted. In May, Mercedes-Benz became the world's first automaker to sell Level 3 cars, after Germany gave it a green light to do so there. Mercedes-Benz says it is working with regulators in California and Nevada and hopes to sell Level 3 cars in the United States by the end of this year.
  • Level 4 and 5 cars do not require a human operator at all. Self-driving taxis are considered Level 4 vehicles, and on June 2 California regulators gave Cruise (a company owned by General Motors) the go-ahead to operate driverless taxis in parts of San Francisco late at night. Competitor Waymo already offers limited self-driving taxi service in San Francisco and several other locations, though with backup drivers on board.

Why Are Automakers Promoting Them?

It is not obvious why the auto industry has pursued driver-assistance technology for so many years. Skeptics say there is no good reason for it, but the industry and many American politicians point to improved safety as the goal. It is also important to remember that simple market demand is a big part of the answer: consumers want these systems and make purchase decisions based on their availability.

After all, many predict a future in which all vehicles are self-driving. It is estimated that the majority of the 6 million car accidents that occur each year in this country are due to human error.

Leaving the work to the machine would make all of us safer. Or so the argument goes.

But first, a fleet of completely unmanned vehicles would have to be demonstrably safe. Can they see what human drivers see? Can they make split-second judgments, the way drivers learn to from experience? Will they slow down when a deer emerges from a roadside forest, or conclude that a child may follow when a ball bounces into the road? And what about technical bugs?


At The Intersection Of Humans and Machines

Until that day, we need to understand how the interaction between human drivers and these automated systems works. That is why vehicles equipped with Level 2 systems are receiving so much attention.

Many of the headlines that followed NHTSA's report questioned automakers' claim that the new technology improves vehicle safety. However, some argue that 392 recorded crashes is a reassuringly small number against a total of nearly 6 million crashes per year.

The problem with the report is that it offers no basis for comparison. NHTSA identified Tesla as the worst offender, accounting for two-thirds of the SAE Level 2 crashes. But Tesla also obviously has far more of these vehicles on the road than other automakers — about 830,000 of them — and the report does not say how many comparable cars from other companies are in service.

The reporting requirements are also uneven. Tesla's vehicles report crashes automatically through onboard telematics; other automakers rely on unverified customer claims.

All Eyes on Tesla

Tesla took the biggest hit in the report, which may be unfair given its larger fleet and more complete reporting. But NHTSA already had reasons to investigate Tesla: a series of accidents in which Autopilot-enabled Teslas crashed into police cars, fire engines, and other emergency vehicles. Those collisions injured 17 people and killed one.

Meanwhile, other testing has found troubling flaws. Consumer Reports engineers discovered that Autopilot's optional automatic lane-change function behaved dangerously, and that the system could be "fooled" into operating with no one in the driver's seat.

One of the biggest concerns about driver-assistance technology is that these systems may make highways more dangerous by lulling drivers into inattention. Last year, an MIT study concluded that when Tesla's Autopilot was turned on, drivers paid much less attention to the road and road conditions.

Safety experts warn that such drivers are not ready to act if the system malfunctions or a situation requiring attention arises.

Tesla’s Response

Despite naming the system Autopilot, Tesla tells drivers plainly that it is not a complete autopilot. “Autopilot is a hands-on driver assistance system intended for use only by fully attentive drivers,” the company tells prospective buyers. “It does not turn a Tesla into a self-driving car, nor does it make it autonomous.”

Still, Tesla's marketing, which includes the phrase "full self-driving," has caught the attention of lawmakers who believe it promises prospective buyers rather more than the system delivers. Last August, Senators Richard Blumenthal of Connecticut and Edward Markey of Massachusetts, both Democrats, asked the Federal Trade Commission to investigate Tesla for deceptive marketing and unfair trade practices. On June 9, FTC Chair Lina Khan told Reuters that the problems raised in that letter are "on our radar."

At this point it may be worth remembering that in 2016 the FTC secured $9.8 billion for buyers Volkswagen had misled with unjustified claims about the environmental performance of its diesel vehicles.

The Road Ahead

When it comes to driver-assistance technology, there is a long way to go before we know how safe these systems really are.

There will certainly be more cases like the one now pending in Los Angeles, in which a Tesla driver ran a red light while his car was on Autopilot and killed two people in a Honda. The driver, facing manslaughter charges, blames Tesla and Autopilot. The trial is coming soon, and Tesla will surely point to the disclaimer it gives every buyer: Autopilot requires a fully attentive driver.

So what can we take away from this mess? Maybe this: driver-assistance technology may provide enhanced safety, but you are still the driver. And the driver bears a serious responsibility.
