02/24 2026

"The public accepts human errors but expects robots to be nearly perfect"
Compiled by | Yang Yuke Edited by | Li Guozheng Produced by | Bangning Studio (gbngzs)
On January 23, 2026, a Waymo self-driving taxi collided with a child near an elementary school in Santa Monica, California, causing minor injuries. The incident occurred during the morning rush hour as students were arriving at school. The child suddenly ran into the street from behind a parked SUV and collided with the self-driving vehicle, which was moving straight.
The U.S. National Highway Traffic Safety Administration (NHTSA) has launched a special investigation. According to a preliminary assessment, the self-driving taxi, operating without a human safety supervisor, had reduced its speed from 17 miles per hour (27.3 kilometers per hour) to below 6 miles per hour (9.6 kilometers per hour) before the collision.
The previous week, on January 17, a Zoox self-driving taxi in San Francisco struck the door of a parked car after the driver suddenly opened it into the self-driving vehicle's lane. Zoox stated, "The self-driving taxi had detected the open door and attempted to avoid it, but contact was unavoidable."
Zoox said the driver who was struck refused medical treatment; local media later reported that the driver complained of hand pain.
As of February 10, 2026, the California Department of Motor Vehicles (DMV) database recorded 15 collision reports involving self-driving taxis: 12 for Waymo and 3 for Zoox. In 2025, nearly 140 accident reports were filed, involving companies testing self-driving vehicles such as Nuro and Tensor.

▍01 Trust in Self-Driving Technology Collapses Quickly
In response to the child collision incident in Santa Monica, California, Waymo explained in a blog post, "Our technology immediately detected him (the child) as he emerged from behind the parked vehicle." The company further stated that its self-driving taxi braked sharply before impact, reducing its speed from 17 miles per hour to 6 miles per hour.
Waymo claimed that, under similar circumstances, even a fully attentive human driver would likely have hit the child at approximately 14 miles per hour, citing this as evidence of the safety of its autonomous system.
As Waymo, Tesla, and other companies compete for early market share, self-driving taxi operators are preparing to expand across the United States. However, analysts say that despite strong safety claims, recent injury accidents and traffic violations are testing public acceptance.
There are precedents. In 2018, Uber shut down its self-driving program after a fatal accident during testing in Arizona. Five years later, in 2023, Cruise also suspended its self-driving program after a pedestrian was dragged by one of its vehicles in San Francisco. This illustrates how quickly public trust in self-driving vehicles can erode.
William Riggs, a professor at the University of San Francisco who specializes in autonomous vehicles, said, "When something goes wrong, people don't view it as a statistical probability issue but as a moral and emotional one."
Riggs noted that while more than 40,000 people die in traffic accidents annually in the United States, the public will hold self-driving cars to a higher standard. A key test is whether edge cases (rare scenarios that challenge autonomous systems) become less frequent and severe over time.

After the Waymo incident, the Teamsters union in California called on state regulators to suspend Waymo's operating permit, citing safety concerns and fears that artificial intelligence will replace skilled human workers.
The union said in a statement, "Robot taxis threaten workers' jobs and now terrorize our children."
Philip Koopman, a professor at Carnegie Mellon University and an autonomous vehicle safety expert, believes that such incidents will sway public opinion one at a time, even when the facts remain in dispute.
Koopman wrote in a January 31 Substack blog post, "Claims of statistical safety are insufficient to gain public acceptance." Instead, each accident is viewed in isolation rather than judged based on safety data accumulated over millions of miles.
In the January crashes involving Waymo and Zoox, Koopman said self-driving taxi operators could do more to anticipate high-risk situations and slow down before collisions become inevitable.
The industry must also prepare the public for the possibility of fatalities.
At a TechCrunch event in October last year, when asked whether society would accept deaths caused by robots, Waymo co-CEO Tekedra Mawakana said, "I think society will."
"The challenge we face is ensuring that society holds companies to a sufficiently high safety standard. Therefore, businesses should be transparent about their safety records."
However, some critics accuse Waymo of insufficient transparency.
Missy Cummings, a former NHTSA senior safety advisor who teaches at George Mason University, called on Waymo to release video footage from its self-driving taxi involved in the child injury incident. Cummings said in a LinkedIn post, "They are not credible and should immediately release the full video."
However, Waymo stated that it would not publicly share the video but would provide it to regulators as part of the reporting process.
According to an October Marist poll, Americans remain skeptical of self-driving cars. The national survey found that 56% were unlikely or very unlikely to try a self-driving taxi, although most of Generation Z and millennials said they would.

Bryant Walker Smith, a law professor at the University of South Carolina who specializes in autonomous vehicles, suggested that self-driving taxi operators need to be more humble in recognizing areas for improvement rather than reflexively defending their safety records.
"If a company says, 'This is where the real difficulty lies. This is what we don't know. This is what we're working on,' I would trust them more. We should stop asking whether people trust the technology and start asking whether the companies behind the technology are trustworthy."
At a February 4 Senate hearing on federal self-driving car regulation, Smith called for stronger industry oversight, better data reporting to regulators, and an end to confidentiality agreements that prevent the public from learning about crashes.
Waymo also faces another investigation related to school bus safety. In January, the National Transportation Safety Board (NTSB) launched an investigation after Waymo taxis repeatedly passed stopped school buses in Austin school zones during school hours, despite requests from the Austin school district to suspend operations.

▍02 Self-Driving Technology Gets No Forgiveness
There are concerns that, as self-driving taxi operators expand, such incidents could multiply.
Waymo launched commercial operations in California in 2023. While self-driving taxis are relatively new outside California, this is rapidly changing.
Waymo operates in six U.S. metropolitan areas: Phoenix, the San Francisco Bay Area, Los Angeles, Austin, Atlanta, and Miami. The Alphabet subsidiary announced in early February that it had raised $16 billion in new funding to drive its U.S. expansion.
Waymo said that in 2026 it will expand to Dallas, Houston, Detroit, Nashville, Baltimore, Pittsburgh, Washington, D.C., and several other cities. The company is also testing in New York City, London, and Tokyo.
Mauricio Pena, Waymo's chief safety officer, told a U.S. Senate hearing on February 4, "We provide over 400,000 rides per week and had delivered more than 20 million rides by the end of 2025. Our mission is to build the world's most trustworthy driver."
Tesla CEO Elon Musk said during a January earnings call that while Waymo leads in the self-driving taxi race, Tesla plans to launch its own service in several U.S. cities this year. Musk has said Tesla will eventually surpass Waymo because it manufactures its own self-driving taxis.
In June 2025, Tesla launched a self-driving taxi service with a safety driver in Austin. According to Tesla's plans, it will begin phasing out human safety supervisors early this year. Tesla's service in the Bay Area uses human drivers.
Zoox opened its self-driving taxi service to the public in Las Vegas for the first time in September 2025, followed by an early pilot program for passengers in popular San Francisco neighborhoods in November. Zoox said it plans to expand the service over time.
Uber CEO Dara Khosrowshahi said in early February that Uber is partnering with self-driving vehicle suppliers to offer self-driving taxi services in up to 15 U.S. cities by the end of the year. One partner, Nuro, is testing a Lucid-based robotaxi and plans to launch later this year.
Self-driving taxi operators say safety is a top priority as they scale up.
Waymo states on its website that its self-driving vehicles cause 90% fewer accidents resulting in serious injury or death and 81% fewer injury accidents overall compared to average human drivers traveling the same distances in the cities where they operate.
While Tesla and Zoox report self-driving taxi accidents to state and federal regulators as required, they do not publish detailed safety scorecards like Waymo does on its website.
Craig Melrose, managing partner at HTEC, a global mobility and advanced technology services provider, said self-driving cars may face a double standard because people accept human errors but expect robots to be nearly perfect.
He said that over time, people will need to adjust their expectations as the technology learns to handle irrational and unpredictable human behavior on the roads.
Melrose said, "I think expectations must be established so that people can reach a consensus and train themselves to what is achievable, just as human drivers have done over decades of daily practice to meet today's expectations."
(This article partially synthesizes reports from Automotive News and includes images sourced from the internet.)
