It kind of makes sense. Why program or train on such a rare occurrence? Just send it off to a human to interpret and be done with it. If that's the case, then Tesla is closer to Waymo than previously thought. Maybe even ahead.
I don't think traffic light outages (e.g. flashing yellow) or police directing traffic at intersections are that rare, but regardless these cars do need to handle them in a safe and legal manner, which means either recognizing police gestures reliably or phoning home.
We know that Waymos phone home when needed, but I'm not sure how Tesla handles these situations. I'm also not sure how you conclude anything about Tesla based on the current "safety monitor" humans in the cars - that's just a temporary measure until they get approval to go autonomous.
I seem to remember as a kid that cops would be directing traffic often if a signal was out or malfunctioning. I haven't seen that in years. The only time I see anyone directing traffic is around accidents, construction zones, or special events.
I can conclude based on using FSD every single day. I've hit issues just like this, as well as police directing. And it's completely fine.
Googling for this, apparently Tesla does try to recognize police gestures, and is getting better at it.
I wonder who gets the ticket when a driverless car breaks the law and gets stopped by police? If it's a taxi service (maybe without a passenger in the car), then maybe it'd be the service, but that's a bit different than issuing a traffic ticket to a driver (where there are points as well as a fine).
What if it's a privately owned car - would the ticket go to the car owner, or to the company that built the car?!
Waymo said they normally handle traffic light outages as 4-way stops, but sometimes call home for help - perhaps if they detect someone in the intersection directing traffic?
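That described behavior amounts to a simple priority policy. A toy sketch (all names here are hypothetical, not Waymo's actual system):

```python
from enum import Enum, auto

class Action(Enum):
    PROCEED_NORMALLY = auto()
    TREAT_AS_FOUR_WAY_STOP = auto()
    REQUEST_REMOTE_ASSISTANCE = auto()

def intersection_policy(light_working: bool, human_directing: bool) -> Action:
    """Hypothetical fallback policy for a dark or malfunctioning signal.

    Mirrors the behavior described above: a detected human director
    escalates to a remote operator; otherwise a dead light falls back
    to four-way-stop rules."""
    if human_directing:
        # Gesture interpretation is unreliable, so phone home.
        return Action.REQUEST_REMOTE_ASSISTANCE
    if not light_working:
        # No director present: treat the intersection as an all-way stop.
        return Action.TREAT_AS_FOUR_WAY_STOP
    return Action.PROCEED_NORMALLY
```

Note the escalation check comes first: a human directing traffic overrides the default outage handling.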
Makes you wonder in general how these cars are designed to handle police directing traffic.