
In April of this year, a Tesla Model S struck and killed a motorcyclist near Seattle. US police have now confirmed that the car was in Full Self-Driving (FSD) mode at the time of the accident. This is at least the second fatal crash involving FSD, the system Musk has heavily promoted.
Investigators reportedly confirmed this after downloading data from the vehicle's event data recorder. Police said in a statement that the driver was arrested after admitting that he had "lost focus while driving in FSD mode and was distracted by his phone, believing the machine would drive for him".
Tesla has repeatedly emphasized that its FSD software requires active driver supervision and does not make the vehicle fully autonomous. Tesla CEO Musk said last week that he expects FSD to be able to operate unsupervised by the end of this year. For years he has promised a fleet of autonomous robotaxis, but the launch date has been repeatedly postponed.
A spokesperson for the Washington State Highway Patrol stated that the case is still under investigation and no charges have been filed so far.
The road ahead is difficult
For many years, Musk has been committed to achieving autonomous driving. He previously said he would be shocked if Tesla could not achieve fully autonomous driving next year. However, the technology faces growing regulatory and legal scrutiny.
Tesla offers two partially automated driving systems: FSD and Autopilot. FSD can handle many driving tasks across a range of road conditions, including city streets; Autopilot keeps the vehicle in its lane and avoids obstacles ahead. Neither system achieves full autonomy, and the driver must be ready to take over control at any time.
The US National Highway Traffic Safety Administration (NHTSA) began investigating Autopilot in August 2021, after identifying more than a dozen incidents in which Tesla cars collided with stationary vehicles. NHTSA went on to review hundreds of Autopilot-related crashes. In December 2023, Tesla was forced to recall nearly all of its vehicles on US roads to add safeguards to the software.
Experts say Tesla's reliance on cameras and artificial intelligence has limitations. Guidehouse Insights analyst Sam Abuelsamid said Tesla's camera-only system "may have many issues", such as inaccurately measuring distances to objects.
"Collecting and curating data on every real-world element, such as motorcycles and bicycles under all possible weather, lighting, road, and traffic conditions, is an extremely challenging task," said Raj Rajkumar, a professor of electrical and computer engineering at Carnegie Mellon University.