Tesla owners can now choose untested 'full self-driving' software
26th September 2021

Tesla owners can now choose ‘full self-driving’ software at the press of a button for the first time: Thousands could soon hit the road with the unregulated features

  • Tesla has rolled out a long-awaited software update that allows customers to request access to its controversial Full Self-Driving Beta (FSD Beta) program
  • Drivers who get a high safety score from Tesla will receive access to the software
  • FSD Beta gives drivers early access to new features that aren’t debugged yet, including ‘autosteer on city streets’ 
  • It enables vehicles to navigate complex urban environments automatically, without the driver turning the steering wheel by hand 
  • National Transportation Safety Board Chair Jennifer Homendy has voiced concern over the company’s plans for self-driving cars

Owners of Tesla vehicles are now able to activate ‘Full Self-Driving (FSD)’ software following its release early on Saturday, much to the alarm of regulators, who say it is both unregulated and largely untested.

Chief Executive Elon Musk said Tesla drivers would be able to request a ‘beta’ version of its software starting Friday, but only those rated as ‘good drivers’ by Tesla’s insurance calculator would be able to use the system.

Owners will need to agree to have their driving monitored, and only when their driving is deemed ‘good’ over a seven-day period will ‘beta access be granted.’

But the software arrives while federal vehicle safety authorities are investigating the carmaker for possible safety defects following a series of crashes into parked emergency vehicles.


A disclaimer states ‘the currently enabled features require active driver supervision and do not make the vehicle autonomous.’ There is also a message that appears on the upgraded screen, warning drivers that ‘it may do the wrong thing at the worst time’

One Tesla driver posted some images of the software upgrade onto social media

Elon Musk has said the firm is starting full self-driving slowly and cautiously ‘because the world is a complex and messy place’

Tesla sparked controversy by letting some 2,000 people test the unfinished technology on public roads since October, but Musk claims there have been no accidents involving the beta users.

‘FSD beta system at times can seem so good that vigilance isn’t necessary, but it is. Also, any beta user who isn’t super careful will get booted,’ Musk tweeted.

The beta offers features allowing vehicles to navigate and change lanes on city streets and enabling left and right turns.

Tesla has said the FSD Beta even warns drivers that it ‘may do the wrong thing at the worst time, so you must always keep your hands on the wheel.’

In several tweets Musk has made lofty predictions about delivering fully self-driving cars

Early beta tests of the FSD system showed it struggling with roundabouts and left turns. It would also suddenly veer towards pedestrians in the street and cross double-yellow lines in the center of the road, directly into the path of oncoming traffic.

This weekend’s software release is available to those who purchased the $10,000 software upgrade, as well as those with a Tesla subscription costing about $100 to $200 per month – although drivers will still need to pass the safety monitoring.

Drivers will be scored on a scale of 0 to 100 and assessed on five factors: forward collision warnings per 1,000 miles, instances of hard braking, aggressive turning, unsafe following and forced disengagements of the Autopilot system.

Tesla will then use a formula to calculate their score with most drivers likely to score above 80.  


A Tesla video demonstrates how Autopilot features work

‘These are combined to estimate the likelihood that your driving could result in a future collision,’ Tesla explained. 

It’s not clear what score would need to be achieved in order to access FSD.
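Tesla’s actual formula is not spelled out in this article, but the kind of weighted combination it describes – several risk factors folded into a 0-to-100 score – could be sketched as follows. The factor weights below are invented for illustration only:

```python
# Hypothetical illustration of a 0-100 driver safety score built from the
# five factors the article lists. The weights are made up; Tesla's real
# formula is not given here.

def safety_score(fcw_per_1k_miles, hard_braking_pct, aggressive_turning_pct,
                 unsafe_following_pct, forced_disengagements_pct):
    # Each input measures a risky behaviour; higher values mean riskier driving.
    penalties = (
        1.5 * fcw_per_1k_miles +
        8.0 * hard_braking_pct +
        6.0 * aggressive_turning_pct +
        10.0 * unsafe_following_pct +
        20.0 * forced_disengagements_pct
    )
    # Clamp to the 0-100 scale the article describes.
    return max(0.0, min(100.0, 100.0 - penalties))

print(safety_score(0.5, 1.0, 0.5, 1.0, 0.0))  # → 78.25
```

A driver with no risky events would score 100 under this sketch, consistent with Tesla’s claim that most drivers would land above 80.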

Worryingly, Musk has shared his own concerns over the self-driving software, noting ‘we need to make Full Self-Driving work in order for it to be a compelling value proposition.’

Investigators are still looking at FSD’s predecessor, known as Autopilot, which steers vehicles from highway on-ramps to off-ramps. The software can also park cars.

Last month, the National Highway Traffic Safety Administration opened an investigation into about a dozen crashes involving parked emergency vehicles while Autopilot was engaged.


Although the company has not specifically commented on the investigation, Tesla has repeatedly argued Autopilot is safer than cars being driven manually. 

The move to rapidly roll out the feature is drawing criticism from regulators and industry peers who say the company is taking a hasty approach to an issue that requires careful study and an emphasis on safety.

‘I do think that their product is misleading and overall leads to further misuse and abuse,’ National Transportation Safety Board Chair Jennifer Homendy told the Washington Post.

‘I’d just ask [Musk] to prioritize safety as much as he prioritizes innovation and new technologies … safety is just as important, if not more important, than the development of the technology itself.

‘Tesla has not responded to any of our requests [regarding safety and previous crashes]. From our standpoint they’ve ignored us — they haven’t responded to us and if those are not addressed and you’re making additional upgrades, that’s a problem,’ Homendy said.

National Transportation Safety Board Chair Jennifer Homendy, pictured, has voiced concern over the company’s plans for self-driving cars

‘It is incumbent on a federal regulator to take action and ensure public safety,’ Homendy said. ‘I am happy that they’ve asked for crash information from all manufacturers and they’re taking an initial step with Tesla on asking for crash information on emergency vehicles. But they need to do more.’ 

Tesla’s cars ‘aren’t actually fully self-driving,’ added industry group the Chamber of Progress. 

How does Tesla’s Autopilot work? 

Autopilot uses cameras, ultrasonic sensors and radar to see and sense the environment around the car. 

The sensor and camera suite provides drivers with an awareness of their surroundings that a driver alone would not otherwise have. 

A powerful onboard computer processes these inputs in a matter of milliseconds to help make driving, the company says, ‘safer and less stressful.’

Autopilot is a hands-on driver assistance system that is intended to be used only with a fully attentive driver. 

It does not turn a Tesla into a self-driving car nor does it make a car autonomous.

Before enabling Autopilot, drivers must agree to ‘keep your hands on the steering wheel at all times’ and to always ‘maintain control and responsibility for your car.’

Once engaged, if insufficient torque is applied to the steering wheel, Autopilot will deliver an escalating series of visual and audio warnings, reminding drivers to place their hands on the wheel.

If drivers repeatedly ignore the warnings, they are locked out from using Autopilot during that trip.
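The warn-then-lock-out behaviour described above amounts to a small state machine. A hypothetical sketch (not Tesla’s code – class and stage names are invented for illustration):

```python
# Hypothetical sketch of escalating hands-on-wheel warnings: a visual
# warning, then an audio warning, then an Autopilot lockout that lasts
# for the rest of the trip.

class HandsOnMonitor:
    STAGES = ["visual_warning", "audio_warning", "lockout"]

    def __init__(self):
        self.stage = 0
        self.locked_out = False

    def update(self, torque_detected: bool) -> str:
        """Process one monitoring interval and return the resulting action."""
        if self.locked_out:
            return "lockout"           # lockout persists for the whole trip
        if torque_detected:
            self.stage = 0             # hands on the wheel reset the escalation
            return "ok"
        action = self.STAGES[min(self.stage, len(self.STAGES) - 1)]
        if action == "lockout":
            self.locked_out = True
        self.stage += 1
        return action

m = HandsOnMonitor()
print([m.update(False) for _ in range(4)])
# → ['visual_warning', 'audio_warning', 'lockout', 'lockout']
```

Applying torque at any point before lockout resets the escalation, mirroring how a driver who responds to the warnings keeps Autopilot available.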

Any of Autopilot’s features can be overridden at any time by steering or applying the brakes.

Autopilot does not function well in poor visibility.

‘The underlying issue here is that in case after case, Tesla’s drivers take their eyes off the road because they believe they are in a self-driving car. They aren’t.’  

Scrutiny from US safety regulators, who opened an investigation into its driver assistance system, follows 11 accidents feared to have been caused by the system’s trouble spotting parked emergency vehicles.

The National Highway Traffic Safety Administration (NHTSA) said the investigation covers 765,000 vehicles, nearly everything Tesla has sold domestically since 2014. 

Of the 11 crashes that have been identified over the past three years, 17 people were injured and one was killed.

That deadly accident happened on Interstate 70 in Cloverdale, Indiana, in December 2019, and saw passenger Jenna Monet, 23, killed after the Tesla driven by her husband Derrick slammed into the back of a parked fire engine.

Two US senators also called on the Federal Trade Commission to investigate Tesla, saying it misled consumers and endangered the public by marketing its driving automation systems as fully self-driving. 

The 11 crashes have occurred when Teslas on Autopilot or Traffic Aware Cruise Control hit vehicles at scenes where first responders have used flashing lights, flares, an illuminated arrow board or cones warning of hazards.

The crashes into emergency vehicles cited by NHTSA began on January 22, 2018 in Culver City, California, near Los Angeles.

That incident saw a Tesla using Autopilot strike a firetruck parked partially in the travel lanes with its lights flashing. Crews were handling another crash at the time.

Since then, NHTSA said there were crashes in Laguna Beach, California; Norwalk, Connecticut; Cloverdale, Indiana; West Bridgewater, Massachusetts; Cochise County, Arizona; Charlotte, North Carolina; Montgomery County, Texas; Lansing, Michigan; and Miami, Florida.

HISTORY OF FIRST RESPONDER CRASHES CAUSED BY TESLA AUTOPILOT

January 22, 2018 in Culver City: A Tesla Model S using the car’s Autopilot system hit the back of a fire truck parked at an accident scene around 8:30am on Interstate 405 in Culver City. The Tesla, which was going 65mph, suffered ‘significant damage’ and the firetruck was taken out of service for body work.

May 30, 2018 in Laguna Beach: Authorities said a Tesla sedan in Autopilot mode crashed into a parked police cruiser in Laguna Beach. Laguna Beach Police Sgt. Jim Cota says the officer was not in the cruiser during the crash. He said the Tesla driver suffered minor injuries.

The police SUV ended up with its two passenger-side wheels on a sidewalk.

December 7, 2019 in Norwalk, CT: A 2018 Tesla Model 3 on Interstate 95 in Norwalk, Connecticut using the Autopilot driver assistance system rear-ended a parked police car. 

December 29, 2019 in Cloverdale, IN: A 2019 Tesla on Interstate 70 in Cloverdale, Indiana hit the back of a parked firetruck. 

The Tesla driver, Derrick Monet, and his wife, Jenna Monet, both suffered serious injuries and were transported to the hospital for immediate medical care. Jenna ultimately succumbed to her injuries and was pronounced dead at Terre Haute Regional Hospital.

June 30, 2020 in West Bridgewater, MA: A Weston, Massachusetts man driving a Tesla hit a Massachusetts State Police cruiser that was stopped in the left lane of Route 24 in West Bridgewater. A trooper who was on the scene reported that the driver, Nicholas Ciarlone, faced a negligent driving charge and was arraigned in September 2020.

July 15, 2020 in Cochise County, AZ: A Tesla Model S hit an Arizona Department of Public Safety patrol car, resulting in the patrol car rear-ending an ambulance that was on the scene of an earlier car accident. No one was seriously injured, but the Tesla driver was taken to the hospital for injuries.

August 26, 2020 in Charlotte, NC: A Tesla driver watching a movie crashed into a Nash County Sheriff’s Office deputy vehicle in Charlotte, North Carolina on US 64 west.

The driver, Devainder Goli, of Raleigh, was accused of violating the move-over law and watching television while operating a vehicle. 

February 27, 2021 in Montgomery County, TX: The driver of a Tesla rear-ended a police cruiser during a traffic stop in Montgomery County, Texas. Five deputy constables were injured during the accident, which happened around 1:15am on Eastex Freeway near East River Road.

The Tesla driver was not injured, but was taken into custody on a DWI charge. 

March 17, 2021 in Lansing, MI: A Tesla on Autopilot crashed into a Michigan State Police car. Troopers from the Lansing Post had been investigating a crash involving a car and a deer on I-96 near Waverly Road in Eaton County at around 1:12am.

While the troopers were investigating that crash, a Tesla driving on Autopilot struck the patrol car, which had its emergency lights on.

Neither the driver of the Tesla – a 22-year-old man from Lansing – nor the troopers were injured at the scene. Police issued the unidentified man a citation for failure to move over and driving while license suspended.  

May 15, 2021 in Arlington, WA: A Tesla driving in Arlington, Washington hit a police vehicle that resulted in ‘significant damage’ to the police car.

There were no injuries reported from the incident. 

May 19, 2021 in Miami, FL: Three people were hospitalized after a Tesla hit a parked Miami-Dade County Department of Transportation Road Ranger truck that was blocking the left lane of I-95 to help clear debris from an earlier crash.

The driver of the Tesla was transported to a nearby hospital with severe, albeit non-life-threatening, injuries.
