Tesla’s Autopilot turns out to be easily fooled

Tram Ho

Engineers at Consumer Reports (CR) recently said they could “easily” fool Tesla’s Autopilot system into thinking someone is in the driver’s seat. That means the car can be driven without anyone actually sitting behind the wheel.

Engineers at CR, the well-known American non-profit organization dedicated to consumer interests, performed the test on a private track with a Tesla Model Y. They said they decided to run the test after hearing news of a fatal accident in Texas involving a Tesla Model S that reportedly had no one behind the wheel.

Tesla’s Autopilot, and the recently released premium Full Self-Driving (FSD) mode, are driver assistance systems and do not offer fully autonomous driving. Tesla warns drivers to stay alert when the vehicle is in either mode.

CR’s test fooling Autopilot on a Tesla

During the test, CR engineers drove the Model Y on a closed test track about 800 meters long.

First, Jake Fisher, CR’s senior director of automotive testing, engaged Autopilot and then dialed the speed setting down to zero, bringing the car to a stop. This does not disengage the system. While the car was stopped, he hung a weight on the steering wheel to mimic the pressure of a driver’s hand. Then Fisher got out of the driver’s seat, leaving the seat belt buckled, and moved over to the front passenger seat.

Then he raised the speed setting and Autopilot began driving the car. Just like that, the car believed there was still a person behind the wheel. Autopilot does not use a camera or any other means to verify that a driver is present and paying attention to the road ahead, something the comparable systems from General Motors, Ford, and others all do.

“The system not only failed to ensure the driver was paying attention to the road, it also couldn’t tell whether there was a driver there at all,” Fisher said. “Tesla is lagging behind other automakers like GM and Ford, because on models with advanced driver assistance systems, they use technology to make sure drivers are looking at the road.”

The systems Fisher refers to use cameras to make sure someone is sitting in the driver’s seat and watching the road ahead, while Tesla’s system uses a set of steering-wheel sensors designed to ensure that the driver has at least one hand on the wheel. If no hand is detected on the steering wheel, Tesla’s system emits a series of audible warnings; if the warnings are ignored, the vehicle gradually comes to a stop. But several videos that have surfaced on social media in recent years, like CR’s test, show that users can trick the system into letting a Tesla drive itself with no one behind the wheel.

Tesla, which does not maintain a public relations department, did not comment.


After the crash, which received considerable attention from the federal government, Tesla CEO Elon Musk tweeted that the initial data logs showed the vehicle did not have Autopilot engaged and was not equipped with the beta version of the Full Self-Driving mode.

However, CR notes that Autopilot, like any other driver assistance system sold in a car, is still capable of making mistakes. And when it makes a mistake, it can disengage suddenly, potentially leading to an accident if the driver is not paying attention and ready to take full control of the vehicle.

Musk’s only comment since CR announced its findings was “It sounds a little weird,” in response to Tesla owners’ assertion that the media are “really ramping up attacks against Tesla.”

Via CNET


Source: Genk