14.2 C
Vancouver
Sunday, May 28, 2023

TESLA Model S involved in fatal crash.

A TESLA MODEL S driver using the car's semi-autonomous Autopilot feature died when the car hit an 18-wheeler, the first known fatality involving technology that remains in beta testing.

The collision occurred May 7 when the big-rig made a left turn in front of the Model S at an intersection on a divided highway in Williston, Florida. "Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied" and the car drove under the trailer, the automaker said today.

The National Highway Traffic Safety Administration sent a Special Crash Investigations Team to examine the vehicle and the crash scene. Experts from the agency's Office of Defects Investigation plan to examine the design and performance of the Autopilot system. The agency said in a statement that opening an investigation "should not be construed as a finding that" the agency "believes there is either a presence or absence of a defect in the subject vehicles."

Tesla said drivers have racked up some 130 million miles using Autopilot, which uses radar, cameras, GPS, and ultrasonic sensors to keep a car centred in its lane and maintain a safe distance from other vehicles.

The Silicon Valley automaker points out that its Autopilot is disabled by default, and drivers can activate it only after acknowledging that the technology is still in beta testing. Drivers are instructed to keep their hands on the steering wheel at all times and be ready to assume complete control at any moment.

"We do this to ensure that every time the feature is used, it is used as safely as possible," Tesla says. And the automaker argues that even though it's not perfect, "the data is unequivocal that Autopilot reduces driver workload and results in a statistically significant improvement in safety."

Still, something like this was all but inevitable. Tesla activated the feature via an over-the-air software update on October 15, and within days people were posting videos of themselves doing all kinds of stupid things, including sitting in the back seat and even sleeping. It wasn't long before three people drove a Model S cross-country in less than 58 hours, using Autopilot to barrel along at up to 90 mph.

The crash raises the long-anticipated liability question of autonomous driving: Who is at fault when someone dies? Tesla may describe the technology as a safety system designed to augment a driver's vigilance, but many drivers appear to consider it an autonomous system capable of taking over entirely.

Although there aren't any laws against cars driving themselves, regulations governing autonomous operation remain far from clear. "Companies can get away with a lot that's in a legal gray area, as long as bad things don't happen," said Bryant Walker Smith, an expert on the technology at the University of South Carolina School of Law. But regulators step in when something goes awry.

The fatality also underscores why most automakers are moving far more cautiously in rolling out semi-autonomous systems. Cadillac, for example, announced in January that it is delaying the debut of its Supercruise feature. "Technical development will only proceed to production when it is well and truly ready," company spokesman David Caldwell said at the time. "We won't release it just to hit a date, nor will we 'beta test' with customers." His comment was a blatant dig at Tesla Motors and, in hindsight, prescient about what could go wrong.

And they want large trucks to be fully autonomous, except with a driver kept aboard for liability. If it can go wrong, it will.