
The first fatal autonomous driving accident: doubts abound, and who is at risk?

Author: Jiemian News | Date: 2018/04/02

Image source: Visual China

This article first appeared in "Finance Magazine" (ID: i-caijing)

"Finance" special correspondent Liu Hongjun Posted from Silicon Valley reporter Wang Binbin Liu Su/Wen Shizhiliang/Editor

"Avoiding running red lights and hitting pedestrians were originally the most proud aspects of autonomous vehicles, because machines can react faster than people, but Uber has done both of these things," one netizen commented. After the video of Uber's self-driving car hitting a person was released, its true technical level immediately became the focus of public opinion in the technology circle.

At 10 pm Eastern Time on March 18, 49-year-old Elaine Herzberg was hit and killed by a self-driving car that Uber was testing, while she was pushing her bicycle across the road.

Unlike previous failures of Tesla's Autopilot mode, Uber's test car was in fully driverless mode during this accident, which is considered the world's first fatal accident caused by a fully self-driving car.


Because the victim was not walking in a crosswalk, Tempe, Arizona police chief Sylvia Moir said in an initial public statement that the fault might lie with the pedestrian rather than with Uber. But the investigation video released later showed that the Uber vehicle gave no sign of braking before the impact, and technical experts almost unanimously believe the accident was caused by a failure of Uber's self-driving technology.

Affected by the accident, Uber has completely stopped testing self-driving vehicles in the United States and Canada. It previously had test bases in Arizona, Pittsburgh, Toronto and other regions.

Toyota Motor (TYO:7203/NYSE:TM) announced after the accident that it was suspending autonomous driving testing in the United States, citing concerns about the psychological impact on its safety drivers. Nvidia founder Jensen Huang said that because the cause of the accident was still unclear and the safety of test drivers could not be guaranteed, Nvidia's driverless road testing had also been suspended.

"Safety third," an Otto employee once put an orange sticker with this word on the San Francisco headquarters. The company was later acquired by Uber, founder Anthony Levandowski (Anthony Levandowski) Become the leader in self-driving cars at Uber. This self-deprecating joke inside Uber reveals its disdain for safety.

Since 2017, self-driving companies have been spending heavily to increase the number of test vehicles. But with the collision triggering legal and regulatory disputes, public resistance and a series of other problems, the failure of Uber's self-driving system has become the target of criticism.

From blaming the pedestrian to questioning the self-driving technology itself, this traffic accident is reigniting an industry-wide controversy over whether self-driving cars were brought to market too early.

Doubts abound: the autonomous driving industry collectively questions Uber

Uber looks like a newcomer that, after years of development, has still not ensured basic vehicle safety.

Police reports show that the self-driving car made no move to slow down after the pedestrian appeared. The safety driver, who did not intervene manually, said the victim suddenly appeared in front of the car and that the first sign of the accident was the sound of the collision. The biggest technical mystery is that the driverless vehicle showed no sign of braking before the impact.

The driving log of the Uber vehicle involved in the accident has not yet been made public, and the video of the accident released by the police has become a key basis for questioning the company's self-driving technology. At present, it seems that there may be problems in aspects such as perception, decision-making and control.

Questions about the perception layer are the most concentrated. Sam Abuelsamid, a senior autonomous driving researcher at Navigant, told the Caijing reporter that if the sensors had not failed, radar and lidar could have identified the pedestrian 50 to 100 meters ahead in the dark. At a speed of 38 miles per hour, the car could have come to a complete, safe stop 25 to 30 meters before reaching the pedestrian.
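As a rough sanity check on those figures, the standard stopping-distance formula can be applied. The sketch below uses generic assumptions for deceleration and reaction delay rather than Uber's actual parameters, but it shows that the quoted detection range leaves ample margin.

```python
# Rough stopping-distance check for the figures quoted above (a sketch only;
# the deceleration and reaction delay are assumptions, not Uber's parameters).
MPH_TO_MS = 0.44704

speed_ms = 38 * MPH_TO_MS                       # reported test speed, ~17 m/s
decel = 6.0                                     # assumed hard braking on dry asphalt, m/s^2
reaction_s = 0.5                                # assumed delay before the brakes engage, s

braking_distance = speed_ms ** 2 / (2 * decel)  # kinematics: d = v^2 / (2a), ~24 m
reaction_distance = speed_ms * reaction_s       # ~8.5 m travelled before braking starts

print(f"braking distance:  {braking_distance:.1f} m")
print(f"total stop within: {braking_distance + reaction_distance:.1f} m")
# Both figures sit comfortably inside the 50-100 m detection range cited for radar and lidar.
```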

Brad Templeton, a consultant to the team that designed Google's self-driving cars, heard an unsubstantiated claim from an Uber insider that the vehicle's lidar had been turned off in order to test operation using only cameras and radar.

Such operations are not uncommon in autonomous driving testing. Even with the lidar turned off, Abuelsamid believes at least one or two of the remaining sensors should have easily identified a pedestrian crossing the road. A higher-quality video he obtained showed that, at the location of the crash, visibility that night was much better than the video provided by Uber suggested, and the pedestrian could have been identified with the camera alone. "The police's earliest description was wrong. It was based on a very poor quality video. The victim, Ms. Herzberg, only appeared in the video for 1.5 seconds," Abuelsamid told the Caijing reporter. If all three kinds of sensors failed at the same time, he said, it would suggest Uber's system was not ready to be tested on public roads.

Whether pedestrians can be identified at night depends not only on what kind of sensors are used, but also on algorithms.

"With appropriate sensors and good algorithms, pedestrians crossing the road can be detected even in a night environment." Calvin Miao, a senior software engineer at Baidu Silicon Valley, told a reporter from Caijing.

Beyond the sensors, the second possibility is that Uber's perception and prediction algorithms are not good enough. If the software does not know how to interpret the data, then even when the sensors detect a pedestrian, the valid information from the lidar and camera may be treated as noise and filtered out, so the test system "does not see" the pedestrian and the data loses its meaning.
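The sketch below illustrates this failure mode in the abstract: a perception filter tuned too aggressively silently drops a real pedestrian before the planner ever sees it. The class names, threshold and structure are hypothetical and are not based on Uber's software.

```python
# Hypothetical illustration (not Uber's code): an over-aggressive confidence
# filter can discard a real pedestrian detection as "noise".
from dataclasses import dataclass

@dataclass
class Detection:
    label: str          # classifier output, e.g. "pedestrian", "vehicle", "unknown"
    confidence: float   # 0.0 - 1.0
    distance_m: float   # range from lidar/radar fusion

CONFIDENCE_THRESHOLD = 0.8   # assumed cutoff; set too high, real objects are filtered out

def keep_for_planning(detections):
    """Pass on only the detections the planner is allowed to react to."""
    return [d for d in detections if d.confidence >= CONFIDENCE_THRESHOLD]

# A pedestrian pushing a bicycle at night may be classified with low confidence
# or as an ambiguous class, and thus never reach the decision-making layer.
frame = [
    Detection("unknown", 0.55, 42.0),   # the pedestrian, poorly classified
    Detection("vehicle", 0.97, 80.0),
]
print(keep_for_planning(frame))         # only the vehicle survives the filter
```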

If the sensors and sensor fusion are working normally, the third possibility is that the decision-making algorithm detects the pedestrian but the vehicle does not know what to do.

The last possibility is that the path to the control layer is interrupted or the underlying controller fails. Wu Gansha, co-founder and CEO of Uisee Technology, told the Caijing reporter: "Even if perception detects the pedestrian and the decision-making algorithm issues an instruction to turn or brake, if that process breaks down or there is a problem in the underlying logic, the vehicle still cannot avoid the collision."

The simulation system of Mobileye, an advanced driver-assistance company, shows that even using the low-quality video released by the police, the person and the bicycle are detected in the very first frame in which the pedestrian appears. In other words, with higher-quality video, or with what the naked eye could actually see that night, the pedestrian would have been detected even earlier.

Even at night, when human drivers are fatigued, machines can detect pedestrians faster than the human eye. This was supposed to be an advantage of driverless technology, yet here it ended in a death. Mobileye CEO Amnon Shashua criticized Uber for behaving like a newcomer that, after years of development, has failed to ensure basic vehicle safety.

Man-made disaster? The collapse of the second safety gate

In its competition with Google, Uber was racing against time

The safety of autonomous driving technology should be viewed from two perspectives, one is the safety of the technology itself, and the other is the effectiveness of safety management. When the technology is immature, test drivers are needed to ensure safety.

The failure of the safety driver is also seen as an important cause of Uber's accident.

The United States has relatively loose requirements for test drivers. By contrast, the self-driving road test rules announced by Beijing and Shanghai require test drivers to have more than three years of safe driving experience, no record of drunk driving, drug-impaired driving or other violations, and more than 50 hours of experience operating autonomous driving systems, of which more than 40 hours must be on the specific project being applied for. While the autonomous driving system is running, the test driver must supervise its operation in real time and immediately take over the vehicle when the system fails or issues a warning; if taking over is impossible, the emergency braking function must be activated at once.

At the corporate level, when signing a contract with a safety driver, the company will clearly require him to maintain a good state of mind while working, such as paying full attention to the road conditions and being ready to take over at any time.

Most self-driving test vehicles carry two staff members: a safety driver responsible for watching road conditions and taking over the vehicle in an emergency, and an engineer in the passenger seat who monitors the vehicle's operating status at the data level.

There was only one safety driver in Uber's test car at the time of the incident, which is not common industry practice. Only when a given set of autonomous driving functions is fully developed is the test engineer allowed to be absent from the passenger seat. The fatal result suggests that Uber's system may not have been mature enough for a single driver to be on the road.

In the video released by the police, the driver was not paying full attention to the road; he was looking down, seemingly at a mobile phone, in the seconds before the impact. The driver was a two-time felon who had been sentenced to five years in prison for a robbery in 2001. Under ride-hailing regulations, Uber is not allowed to hire felons as drivers, and the company has been heavily fined for doing so. The felony record has no necessary connection to the accident, but the safety driver clearly played no role in preventing it.

Zachary Moore, a senior forensic engineer who has studied car-accident analysis for more than ten years, believes that a normal human driver who saw the pedestrian on a dry road could have stopped completely 2.5 meters before reaching her. But safety drivers in driverless systems are more prone to distraction and may not react as quickly as ordinary drivers.

Another point that puzzles people in the industry is that Uber was still testing at 10 p.m. An executive at a leading domestic driverless company told the Caijing reporter that although the United States places no restrictions on testing time, if his company wants to test night scenarios, it does so no later than 8 p.m. Beyond the hour itself, it is also necessary to verify how long the safety driver had been in the test vehicle. "Generally you don't need to use your hands to drive in the car. If the driver sits there working until 10 at night, it becomes a fatigue test, and the chance of distraction is high." This executive had heard that, in its competition with Google, Uber was racing against time.

Twists and turns: a radical Uber and the probability of accidents

Safety first and smooth advancement, or technology development first?

"The recent fatal incident involving a self-driving car reminds us that no matter how automotive technology develops, safety is an issue that can never be ignored," Li Shufu, chairman of Geely Holding Group, said on March 28 at the 2018 International Symposium on Smart Cars.

In developing autonomous driving technology, traditional car companies clearly pay more attention to safety and reliability; in other words, they are more conservative. While acquiring the startup Cruise Automation for US$1 billion to develop fully autonomous driving, General Motors (NYSE: GM)'s own R&D team launched Super Cruise, an L2 driver-assistance technology. Its use is limited to mapped highways with clear on- and off-ramps, and a driver attention system monitors the driver's behavior, requiring the driver to stay focused and reminding the driver to take over the vehicle when needed.

Toyota, which is recognized by the industry as lagging behind in self-driving technology, has suspended testing in the United States on the grounds that the accident may have a negative impact on the psychology of safety drivers. But Uber, which was also lagging behind in the past, is a more radical risk-taker.

In October 2016, an Uber test car drove the wrong way down a one-way street in Pittsburgh. In the winter of the same year, Uber began testing in its home city of San Francisco without applying for a road test permit from the California Department of Motor Vehicles; on the first day of testing, a test car ran a red light and the video went viral, after which California banned Uber's tests. In February this year, Uber was involved in a traffic accident in Pittsburgh; in March, after moving to Arizona, one of its cars collided with a local human-driven vehicle, fortunately with no casualties. It was not until this fatal collision that the pattern began to attract public attention.

According to disengagement data filed with the California Department of Motor Vehicles, Waymo's driverless cars required a takeover once every 5,600 miles in California. Uber has been testing in California for a shorter period and has not yet disclosed its takeover numbers. According to an internal document recently obtained by The New York Times and corroborated by two people familiar with the business, Uber was struggling to reach a target of one takeover every 13 miles in Arizona. Exactly a year earlier, the technology site Recode reported that Uber's self-driving cars required a takeover roughly every 0.8 miles, which is about the level of a self-driving program that has only just started.
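Restated as interventions per fixed distance, the gap between those figures becomes plain; the short calculation below simply rearranges the numbers quoted above and adds nothing new.

```python
# Miles per takeover as quoted above, restated as takeovers per 100 test miles.
miles_per_takeover = {
    "Waymo, California (DMV report)": 5600,
    "Uber, Arizona (per The New York Times)": 13,
    "Uber, a year earlier (per Recode)": 0.8,
}

for source, miles in miles_per_takeover.items():
    print(f"{source}: ~{100 / miles:.2f} takeovers per 100 miles")
# Output: ~0.02, ~7.69 and ~125.00 takeovers per 100 miles respectively.
```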

After the fatal Tesla crash, Levandowski reportedly said inside the company: "The first self-driving collision was not caused by us, and I am very annoyed." The words proved prophetic in the worst way. Uber's former self-driving chief is known for his radicalism and disregard for rules: he repeatedly pushed the team to deploy vehicles at scale, to find shortcuts, and to put cars on the road without applying for test licenses.

When Uber's core ride-hailing business was just starting out, it relied on similarly radical strategies to expand rapidly, breaking through layer after layer of regulation to become the Uber of today. Craving quick success and ignoring safety, this "toxic culture" is the underlying reason for the twists and turns of Uber's self-driving business.

In August 2016, Uber acquired Otto, the company Levandowski founded, for US$680 million, a landmark event in the start of Uber's self-driving effort. Two months later, Uber's self-driving cars began road testing. Levandowski had previously been a self-driving engineer at Google, and Waymo accused the new company he founded of misappropriating its autonomous driving technology.

Waymo sued in February 2017, and Uber spent a year in litigation during a golden period of autonomous driving development. From then on, Uber's self-driving business suffered setback after setback. In May of that year, its spiritual leader Levandowski was forced out because of the lawsuit, and for a time the self-driving department saw a wave of resignations. Amid sexual harassment scandals, the immigration-ban backlash and boardroom infighting, founder Travis Kalanick, a strong supporter of self-driving, was also forced to leave the company.

In February this year, the lawsuit between Uber and Waymo was finally settled, giving Uber's self-driving program, stagnant for a year, a chance to breathe. Then in March this traffic accident occurred and set off a storm of public opinion.

Who is responsible for driverless accidents

Accidents have already occurred before the rules, and legislation in various countries should be accelerated.

"Has Uber done a scenario simulation like the one that occurred during the research and development process? Has it done such a scenario test on a structured road?" According to Huang Wenhua, general manager of SAIC Motor North America, it is necessary to conduct standardized tests in a closed site. It is extremely important to proceed step by step from simulation, to closed, then to semi-closed, and finally to public open roads.

The question is how to prevent companies from skipping steps at the expense of human safety in the pursuit of technological leadership and commercial interests. When a new industry is born, it is inevitably accompanied by irregularities of all kinds; when the potential harm starts to grow, guiding regulation becomes particularly important.

Arizona, in the southwestern United States, has a dry climate with little rain and bright sunshine. It is regarded as the most suitable place for autonomous driving testing and has almost become a paradise for major autonomous driving companies.

In the United States, few states regulate the autonomous driving industry strictly. Only California requires special permits and the submission of annual test reports, and even California's policy is vague about what standards must be met for driverless testing, which means the reports are of little significance for accident prevention.

Abuelsamid believes that basic performance requirements must be established for sensing systems: before vehicles are tested on public roads, standards organizations such as SAE International should define requirements for sensors that can reliably detect pedestrians, cyclists and other vehicles in sunlight, at twilight and at night.

The federal government sets safety standards, but in practice each state is responsible for vehicle and driver registration. California drivers currently must pass a vision test to obtain a driver's license, ensuring they can see road signs. Abuelsamid said that before self-driving cars are allowed on the road, they should likewise undergo a "virtual vision test"; if a car cannot see pedestrians, or does not know how to handle them, it should not appear in road tests.

In China, the road test regulations in Beijing and Shanghai require companies to take a "driving test" in closed venues and pass evaluations by third-party agencies and experts before receiving a license for public roads. To a certain extent, this prevents overly immature systems from being tested on public roads.

The definition of accident liability will also affect the development process of autonomous driving. U.S. law relies on case law. According to U.S. law, if a pedestrian crosses the road and is not at a crosswalk, the liability for the accident lies primarily with the pedestrian. However, if the victim's family sues Uber for imperfect driverless technology, it may affect Uber's local testing and future laws and regulations.

At present, China's relevant laws and regulations are still in the brewing stage.

Accidents have arrived before the rules, and legislation in various countries should be accelerated. Zheng Ge, a professor at Shanghai Jiao Tong University's KoGuan School of Law, told the Caijing reporter that legislation can reassure people: while laws cannot eliminate accidents, they can make people feel safe.

The public, and especially the legal profession, has many doubts about how responsibility should be divided. Levels L0 to L2 are considered assisted driving, where the driver is in charge; L4 and L5 are machine-led driverless stages where responsibility is relatively clear. L3 is an awkward transitional stage in which both the driver and the machine are involved, making responsibility hard to separate. During testing, safety drivers are employed by the company and can be treated as part of it; once consumers begin using the technology, dividing responsibility after an accident becomes even harder. At the L3 stage, it is very likely that car companies will ask consumers to sign a liability exemption agreement, and it certainly will not be buried in a product or service manual: it will be communicated very clearly and may even require the consumer's signature.

But most people are risk averse. The founder of an autonomous driving company whose products have already begun to ship expressed his concern: "If consumers have to sign a liability exemption agreement, they may doubt the safety of the product and not dare to buy it."

Industry reflects and moves forward cautiously

Uber has poured cold water on the industry, but the prospect of driverless driving is still promising

After the first fatal autonomous vehicle case, what makes the industry even more nervous is renewed public resistance to the technology. An Uber driver told the Caijing reporter that she had received an invitation from Uber to test its self-driving cars, but she is afraid of them.

A Pew Research Center survey shows that, given the choice, half of Americans would not be willing to ride in a self-driving car and express concerns about the technology. This sentiment is spreading. "The video is disturbing and alarming. The March 18 incident is undoubtedly a major safety failure," Arizona Governor Doug Ducey said publicly. He had enthusiastically welcomed Uber when it was pushed out by the California government, intending to make driverless technology a major driver of economic growth in Arizona. A week after the accident, he issued an executive order suspending Uber's testing.

However, public concerns have not turned into panic due to the Uber incident, and investors are still optimistic about the autonomous driving industry. Xiong Weiming, a partner at Huachuang Capital, told a reporter from Caijing that everyone has been more sensitive recently, but "this wave of negative sentiment will not last long, and big funds continue to enter this industry."

"We hope that we may learn lessons from this accident and observe whether we need to change our test cars." Huang Renxun emphasized in the GTC post-meeting interview. Although the cause of the accident has not yet been identified, he insists that autonomous driving is the future and believes that before 2020, all electric vehicles on the road will have some autonomous driving function.

NVIDIA is working with 370 autonomous driving companies, including Uber, around its newly released Drive Constellation platform. At the event, an engineer demonstrated how to test self-driving cars in a virtual-reality environment, so that testers no longer have to risk their lives while training the software.

Since the major autonomous driving companies opened road tests and tested the waters for commercialization, the industry is collectively instilling a concept in the public: autonomous vehicles will not have problems such as drunk driving, emotion, distraction, etc., and are safer than human drivers.

"No matter how advanced autonomous driving technology is, there will always be some physical and objective factors that may make some accidents unavoidable." Chairman of the Board of Directors of Daimler AG (PINK: DDAIF) and Global President of Mercedes-Benz Car Group Dieter Zetsche said in interviews with Caijing and other media that this accident will not change their original intention of continuing to develop autonomous driving - to responsibly deliver a mature autonomous driving system to customers.

Robin Li, chairman and CEO of Baidu (NASDAQ: BIDU), also said publicly at this year's China Development Forum that autonomous driving will be a safer way to travel, and that fully autonomous vehicles will truly operate on open roads within three to five years.

Many industry experts interviewed by Caijing reporters said that to achieve fully automated driverless driving, there must be complete and comprehensive redundant backups; Waymo and General Motors, for example, have designed around real scenarios. "We must stay in awe when doing autonomous driving, and we must stick to redundant multi-sensor fusion solutions. With today's computer vision technology, anyone who boasts about how powerful deep learning is and claims to do autonomous driving cheaply with cameras alone is simply talking nonsense. Before trying to please the carmakers, first examine your own conscience," Han Xu, CEO of JingChi, wrote on WeChat Moments.

Many practitioners therefore believe that autonomous vehicles in closed environments are the earliest application scenario that can be put into practice: by increasing the controllability and predictability of the environment, they reduce the need for expensive sensor configurations and redundancy. "Low-speed autonomous driving in closed environments will be realized first. At those speeds, even if the vehicle hits someone, it will not kill them," said Qi Lei, former head of SAIC's Silicon Valley investment arm.

Wu Gansha said that the impact of Uber's fatal case on the industry is above all a wake-up call, pushing companies to be more careful and to put safety first when designing, developing and testing products. After all, "an accident can be fatal to a company's performance and reputation."

Author: Caijing Magazine
