A Cruise robotaxi hit a pedestrian in San Francisco.
That part was an accident. The human driver in the other car hit her first, throwing her into the path of the robot.
But what happened next wasn't an accident. It was a programming decision.
The robotaxi, detecting an impact, decided the safest thing to do was to pull over to the curb.
It did not know the pedestrian was trapped underneath it.
So it dragged her 20 feet across the asphalt to "reach safety."
And then Cruise executives showed regulators a video of the incident that conveniently cut off before the dragging part.
Welcome to the future of transportation! The robots are safe, but the humans running them are acting like... well, scared humans. 🚗✨
🤖 The Promise vs. The Reality
The Pitch: "Humans are terrible drivers. They drink. They text. They get tired. Robots are perfect. They have 360-degree vision. They never sleep. They follow every rule."
The Reality:
- Robots stopping dead in intersections because they saw a pylon.
- Robots driving into wet concrete.
- Robots blocking ambulances.
- Robots dragging people because their "post-collision protocol" prioritized pulling over.
(Narrator: The executives did not, in fact, revolutionize transportation. They revolutionized traffic jams.)
🚨 The Cruise Debacle: A Case Study in Hubris
The dragging incident wasn't just a technical failure. It was an ethical one.
It revealed that the "safety culture" was really a "growth culture" wearing a safety vest.
When California regulators learned Cruise had withheld the full footage, they didn't just fine the company. They suspended its permits to operate driverless cars in the state.
Cruise went from a company valued at $30 billion to "paused operations" overnight.
Did you know? The "pull over upon impact" rule makes sense if you hit a fender. It makes zero sense if you hit a person. The AI lacked the contextual understanding to know the difference. That context gap is where people get hurt.
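To make that context gap concrete, here's a minimal sketch of the two policies. This is hypothetical illustration, not Cruise's actual code — the class names, fields, and action strings are all invented — but it shows how a blanket "impact → pull over" rule behaves differently from one that checks whether a person might be trapped underneath.

```python
# Hypothetical sketch of a post-collision policy. NOT Cruise's real
# logic -- all names and fields here are invented for illustration.

from dataclasses import dataclass

@dataclass
class CollisionContext:
    impact_detected: bool
    object_class: str        # e.g. "debris", "vehicle", "pedestrian"
    object_under_vehicle: bool

def naive_post_collision_action(ctx: CollisionContext) -> str:
    """The flawed blanket rule: any impact means pull over to the curb."""
    if ctx.impact_detected:
        return "pull_over"
    return "continue"

def context_aware_post_collision_action(ctx: CollisionContext) -> str:
    """A safer rule: never move the car if a person may be trapped."""
    if not ctx.impact_detected:
        return "continue"
    if ctx.object_class == "pedestrian" or ctx.object_under_vehicle:
        # Moving the vehicle could drag the victim across the road.
        return "stop_in_place_and_alert"
    return "pull_over"

trapped = CollisionContext(True, "pedestrian", True)
print(naive_post_collision_action(trapped))          # -> pull_over
print(context_aware_post_collision_action(trapped))  # -> stop_in_place_and_alert
```

Same sensors, same impact signal; the only difference is one extra check for context. That check is the entire gap the incident exposed.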
🤔 The Liability Nightmare (Who Do I Sue?)
If a human driver hits you, the legal path is boring:
- Exchange insurance.
- Sue the driver if needed.
- Done.
If a robotaxi hits you:
- Do you sue Waymo/Cruise (the operator)?
- Do you sue the engineer who wrote the LiDAR perception code?
- Do you sue the sensor manufacturer if the camera failed?
- Do you sue the city for allowing them on the road?
- Do you sue the AI model (good luck serving a subpoena to a neural net)?
We have entered a legal "Uncanny Valley" where responsibility is diffused across a thousand lines of code.
⚖️ The "Acceptable" Death Rate
Here's the brutal calculus nobody at Waymo wants to discuss publicly:
Humans kill ~40,000 people on US roads every year.
If robotaxis replaced all humans and killed 10,000 people a year, that would be a massive statistical win. 30,000 lives saved!
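The arithmetic, spelled out (using the round numbers above — ~40,000 is the ballpark annual US road-death figure, and the 10,000 is purely hypothetical):

```python
# Back-of-envelope version of the "statistical win" claim above.
human_deaths_per_year = 40_000   # approximate annual US road deaths
robot_deaths_per_year = 10_000   # the hypothetical from the text

lives_saved = human_deaths_per_year - robot_deaths_per_year
reduction = lives_saved / human_deaths_per_year
print(f"{lives_saved:,} lives saved ({reduction:.0%} fewer deaths)")
# -> 30,000 lives saved (75% fewer deaths)
```

A 75% reduction in deaths would be the biggest public-health win in a generation. And yet: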
But can you imagine the news cycle?
"KILLER ROBOT MOWS DOWN CHILD"
We (society) accept human error as "tragic but inevitable." We accept machine error as "dystopian horror."
| Scenario | Deaths | Public Reaction |
|---|---|---|
| Human drunk driver | 1 | "Send them to jail." |
| Robot software bug | 1 | "BAN ALL THE ROBOTS FOREVER." |
We hold machines to a standard of perfection we never demand from ourselves. Is that fair? No. Is it reality? Yes.
🏙️ Where Are We Now?
- Waymo: Still operating. Slow, methodical expansion. They seem to have learned.
- Cruise: Rebuilding from the ashes.
- Tesla FSD: Still "Beta." Still promising "next year." (Narrator: It is always next year.)
The dream of "sleep while your car drives you" is receding. The reality of "the car drives you, but you have to watch it like a hawk because it might try to kill you" is here.
🎯 My Take
I want robotaxis to work. Truly. I hate driving. I hate traffic deaths. I hate drunk drivers. I hate parallel parking so much that I once drove three extra blocks to find a pull-in spot. (Narrator: It was four blocks. And he still scraped the curb.)
But the tech industry's "Ship It" mentality cannot apply to two-ton metal boxes moving at 40 mph.
You can "move fast and break things" when the thing breaking is a CSS layout. You cannot "move fast and break things" when the thing breaking is a pedestrian's leg.
The code isn't ready. The lawyers aren't ready. And frankly, I'm not ready to be a beta tester just by walking on a sidewalk.
Drive safe. The robots are learning. But they're slow learners. 🚙⚖️