ZYbCRq22HbJ2y7
https://en.wikipedia.org/wiki/Stanislav_Petrov

> On 26 September 1983, three weeks after the Soviet military had shot down Korean Air Lines Flight 007, Petrov was the duty officer at the command center for the Oko nuclear early-warning system when the system reported that a missile had been launched from the United States, followed by up to four more. Petrov judged the reports to be a false alarm.


ethbr1
The irony is that he was both a success and a failure, a contradiction inherent in nuclear launch control.

From a deterrence and military perspective, you want a robot on launch control. Every time, on orders, without fail.

From a human and ethical perspective, you want a thinking individual with agency, able to evaluate orders and possibly disobey them.

Curious what percentage of launch officers, at the height of the Cold War (and now), were expected to disobey an order to launch. It had to have been >0%.

bitwize
"Mr. McKittrick, after careful consideration I've come to the conclusion, sir, that your new defense system sucks."
ethbr1
Well, they did put the system into production without Stephen Falken. Which seems like a shortsighted way to deploy AI in a high risk use case!
