Since Tesla launched its Full Self-Driving (FSD) feature in beta in 2020, the owner's manual has been clear: despite the name, cars using the feature cannot drive themselves.
Tesla's driver-assistance system is designed to handle many road situations – stopping at lights, changing lanes, steering, braking, and turning. But “Full Self-Driving (Supervised) requires you to pay attention to the road and be ready to take over at all times,” the manual states. “Failure to follow these instructions could cause damage, serious injury, or death.”
Now, however, new in-car messages urge drivers who drift between lanes or seem drowsy to turn on FSD – conflicting guidance that experts say could encourage drivers to use the feature in an unsafe way. “Lane drift detected. Let FSD assist so you can stay focused,” reads the first message, which was included in a software update and spotted earlier this month by a hacker who tracks Tesla's software development.
“Drowsiness detected. Stay focused with FSD,” reads the other message. Drivers have since posted online that they have seen similar messages on their in-car screens. Tesla did not respond to a request for comment on the messages, and WIRED was unable to independently verify them appearing on a Tesla's in-car screen.
The problem, researchers say, is that moments of driver inattention are exactly when safety-focused driver-assistance features should demand the driver's focus on the road – not suggest they rely on a system still in development to compensate for distraction or fatigue. In the worst case, such a prompt could lead to a crash.
“These messages put drivers in a really difficult situation,” says Alexandra Mueller, a senior research scientist at the Insurance Institute for Highway Safety. She believes “Tesla is essentially giving a series of conflicting instructions.”
A large body of research studies how humans interact with computer systems built to help them accomplish tasks. In general, it finds the same thing: people are terrible passive supervisors of systems that are very good most of the time, but not perfect. Humans need something to keep them engaged.
In aviation research, this is called the “out-of-the-loop performance problem,” in which pilots who depend on highly automated systems can fail to adequately monitor for malfunctions because of complacency that sets in after extended periods of operation. This lack of active engagement, also known as vigilance decrement, can erode a pilot's ability to understand and take back control of a malfunctioning automated system.
“When you suspect the driver is becoming drowsy, to remove even more of their physical engagement – that seems extremely counterintuitive,” Mueller says.
“As humans, when we get tired or fatigued, taking away more of the things we need to do can backfire,” says Charlie Klauer, a research scientist and engineer who studies drivers and driving at the Virginia Tech Transportation Institute. “It's tricky.”
Over the years, Tesla has made changes to its technology to make it harder for inattentive drivers to use FSD. In 2021, the automaker began using in-car driver-monitoring cameras to determine whether drivers were paying sufficient attention while using FSD; a series of alerts warns drivers if they do not look at the road. Tesla also uses a “strike system” that can lock drivers out of the driver-assistance feature for a week if they repeatedly fail to respond to its prompts.