Passed-out Tesla driver stopped by CHP sparks social media debate over Autopilot
There’s been a lively social media debate over technology-assisted driving since the Highway Patrol arrested a Tesla driver who officers said was passed out drunk while the electric car cruised at 70 mph on Autopilot down Highway 101 last month in Redwood City.
Some say technologies like Tesla’s Autopilot — which detects other cars and objects to help drivers avoid crashes but still requires their attention — encourage dangerous behavior like driving drunk or texting instead of watching the road.
But others see something miraculous: a motorist apparently passes out behind the wheel while speeding down a major highway and — with help from the CHP — the car comes safely to a stop without anybody getting hurt.
“The guy would be dead, and others maybe too if it wasn’t for AP,” Elisabeth Soechting posted on Twitter this month, using the shorthand for Tesla’s Autopilot technology.
Will not stop ppl from drunk driving but certainly saves lives. As we saw in this incident – the guy would be dead, and others maybe too if it wasn’t for AP.
— Elisabeth Soechting (@NuovaRealta) December 3, 2018
The Nov. 30 incident wasn’t a first: The California Highway Patrol made a similar arrest Jan. 19 on the Bay Bridge. In both cases, the CHP said Tesla’s Autopilot feature in the premium electric car that retails for upwards of $74,500 appeared to have been engaged, and nobody was hurt.
When u pass out behind the wheel on the Bay Bridge with more than 2x legal alcohol BAC limit and are found by a CHP Motor. Driver explained Tesla had been set on autopilot. He was arrested and charged with suspicion of DUI. Car towed (no it didn’t drive itself to the tow yard). pic.twitter.com/4NSRlOBRBL
— CHP San Francisco (@CHPSanFrancisco) January 19, 2018
The Palo Alto carmaker hasn’t officially commented on the cases, though CEO Elon Musk said Dec. 2 on Twitter that he is “looking into what happened here.” The company hasn’t disputed that Autopilot was engaged.
Autopilot isn’t the same technology as that used by fully automated self-driving vehicles, which are still in development. The Autopilot technology is rated Level 2 by the Society of Automotive Engineers on a scale in which Level 5 is a fully autonomous self-driving car that requires no human driver.
Other premium automakers like Mercedes-Benz and BMW also offer Level 2 features, but Tesla’s Model S is perhaps the best known. Tesla’s Autopilot includes a cruise control that maintains the car’s speed in relation to surrounding traffic and features that help the driver steer within a clearly marked lane and safely change lanes.
While the latest CHP arrests bolster the case for Autopilot’s effectiveness, the technology isn’t foolproof, and it has suffered bruising press from a couple of fatal crashes.
But Tesla has defended the technology’s safety record. The company, which introduced Autopilot in 2015, said in a statement Oct. 4 that the technology has demonstrated reduced accident rates. Tesla drivers using Autopilot registered one accident every 3.34 million miles, while federal data show drivers overall average one crash every 492,000 miles.
On its website, Tesla stresses that “Autopilot is intended for use only with a fully attentive driver who has their hands on the wheel and is prepared to take over at any time.”
Drivers must first agree to “keep your hands on the steering wheel at all times” before engaging Autopilot, and if the system senses “insufficient torque” from the motorist’s hands, it sends “an escalating series of audible and visual alerts.”
“This is designed to prevent driver misuse,” the company says, “and is among the strongest driver-misuse safeguards of any kind on the road today.”
But clever drivers have posted workarounds on the internet, like stuffing a large orange into the steering wheel.
Bet there was an orange involved in the crime scene.https://t.co/4ZTKbeg9DW
— ;▄︻̷̿┻̿═━一 (@Earth1Citizen) December 3, 2018
The California Highway Patrol, needless to say, wasn’t amused.
“The driver is ultimately responsible for the safe operation of the vehicle,” said CHP Officer Art Montiel. “And they should not try these ‘hacks’ or any other ways to override the ‘driver assist’ feature since this is not how this feature was designed to be used.”
In the most recent CHP arrest, officers said they spotted a gray Tesla Model S traveling south on 101 near Whipple Avenue at 3:37 a.m. and noticed the driver “appeared to be asleep at the wheel.” The car was going about 70 mph.
After finding the driver “unresponsive” to lights and siren, the officers positioned their patrol car in front of the Tesla and began slowing “in hopes that the ‘driver assist’ feature had been activated and therefore the Tesla would slow to a stop as the patrol vehicle came to a stop,” which it did.
Officers woke the driver, and after a field sobriety test, arrested Alexander J. Samek, 45, of Los Altos, on drunken driving charges. He has not commented publicly. A court date is set for Jan. 4.
In his Twitter post after the incident, Musk said that while he is looking into the matter, “default Autopilot behavior, if there’s no driver input, is to slow gradually to a stop and turn on hazard lights.” Tesla, he said, “then contacts the owner.”
In the social media debate that followed, most seemed to share Soechting’s sense that the technology was a lifesaver, though several offered suggestions for updates that might frustrate drunks trying to use the system to get home. Pete Clay posted a suggestion that the car play music or vibrate seats to jolt the driver awake.
Could maybe use interior camera to help identify if driver’s fall asleep & then auto phone them, play music, play message on touchscreen or use music or emergency tone to assist person to wake up. Vibrating seats. Tiny pins shooting driver in back, cold or hot climate control…
— Pete Clay (@Pete_Clay) December 3, 2018
Others, however, suggested the technology encourages people to think they could just check out behind the wheel.
“Don’t enable stupid people,” Shay Smith posted.
This technology is way to confusing for stupid people. You know they think they don't have to do anything. This is dangerous and irresponsible , if you can't drive then you can't drive. Don't enable stupid people.
— shay smith (@scot1222) December 4, 2018
But fans of the technology like Johnna Crider say it could spare lives not only in cases where drivers ignore laws against drunken driving or texting, but also in medical emergencies.
“Hopefully not only will this encourage people to NOT drink and drive,” Crider posted on Twitter, “but maybe could help people driving who have issues such as epilepsy or medical conditions that would render them unconscious.”
Awesome thanks for replying 😊 Hopefully not only will this encourage people to NOT drink and drive but maybe could help ppl driving who have issues such as epilepsy or med conditions that would render them unconscious. Just throwing out ideas. Thanks again for listening. U rock
— J◉нɴɴα💎 (@JohnnaCrider1) December 3, 2018