Tesla has warned its customers that its full self-driving software “may do the wrong thing at the wrong time” following a number of incidents in which drivers have crashed their cars.
The newest version of the software, which Elon Musk had previously said was scheduled for release in 2018, had been delayed by technological challenges.
Tesla has been at pains to stress to its drivers that they must remain attentive and behind the wheel even when using its autopilot function – a few steps below the full self-driving software – which doesn’t fully control the car but offers the driver some assistance.
Earlier this year, a motorist in the San Francisco Bay Area was arrested for reckless driving after being spotted in the back seat of his Tesla as it travelled down a freeway.
Police received a number of calls describing a person seated in the back seat of a Tesla Model 3, with no one in the driver’s seat, as it travelled on Interstate 80 across the San Francisco-Oakland Bay Bridge.
Tesla vehicles can be “easily tricked” into driving in autopilot mode with no one at the wheel, according to testers from a major US consumer organisation, just days after a Tesla crashed in Texas, killing the two men in the car.
Officials say neither of the men was in the driver’s seat at the time of the crash, although Elon Musk claimed that the car was not in autopilot mode.
The autopilot setting on a Tesla enables the car to “steer, accelerate and brake automatically within its lane”.
It falls short of the full self-driving capability, which, once complete, will allow the car “to conduct short and long distance trips with no action required by the person in the driver’s seat”.
Musk said the newest update to the full self-driving capability “addresses most known issues” but added “there will be unknown issues, so please be paranoid”.
There are considered to be six levels of autonomous driving, starting at zero. Tesla’s current autopilot feature is what is known as Level 2 – partial automation.
When driving in autopilot, a Tesla is able to steer as well as control acceleration – but a human must remain behind the steering wheel, ready to take control at any time.
There have been a number of car crashes involving drivers who were using Tesla’s autopilot feature while not paying attention to the road – some of which are being investigated by the National Transportation Safety Board in the US.
In one incident, an Apple engineer died when his Tesla Model X, on autopilot, hit a concrete barrier – one he had previously told his wife the autopilot feature tended to veer him towards.
By the time cars reach Level 5 automation, they would have no need for steering wheels or pedals to control acceleration or braking.
The idea is that these autonomous vehicles would be capable of doing anything an intelligent and experienced human driver could do – including going off-road.