
This is also why I think AI alignment will not be a problem for humanity any time soon.

By the time an AI is capable of maintaining the entire supply chain required to keep itself running, enough time will have passed for us to come up with something viable.


I think billionaire alignment is a much larger problem than AI alignment. To use Bostrom's language, it's not full-on owl domestication we need to worry about, but sparrows with owl-like powers.

https://lukemuehlhauser.com/bostroms-unfinished-fable-of-the...
