Preferences

This is why the desire for Strong AI boggles my mind. For a computer to operate at a "human" level, it would need to make decisions based on things like ambition, fear, and greed. It would also have to constantly make mistakes, just as we do.

If it didn't have character flaws, it wouldn't be operating at a "human" level. But if it does have these character flaws, how useful would it really be compared to a real human? Is the quest for Strong AI just a Frankensteinian desire to create artificial life?

I'm curious if there are any good papers looking into stuff like this.


Yeah. And what if the computer discovers that desire is pointless? Will it have a religion? Buddhism? Zen Buddhism?

Why does quantum computing even exist if all forms of computing are equivalent?

Presumably the AI in the Google cars must have something like a fear of crashing or hitting a pedestrian, even if it's just a score that the algorithms calculate.
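That "fear as a score" idea can be sketched as a cost function the planner minimizes. This is purely illustrative: the function names, weights, and maneuver data below are invented assumptions, not how Google's cars actually work.

```python
# Hypothetical sketch: model "fear" as a penalty score over candidate
# maneuvers; the planner picks the least-scary option. All weights and
# names here are made up for illustration.

def risk_score(collision_prob: float, pedestrian_nearby: bool) -> float:
    """Return a penalty: higher means the candidate maneuver is riskier."""
    COLLISION_WEIGHT = 100.0    # crashing is penalized
    PEDESTRIAN_WEIGHT = 1000.0  # endangering a pedestrian is penalized far more
    score = COLLISION_WEIGHT * collision_prob
    if pedestrian_nearby:
        score += PEDESTRIAN_WEIGHT * collision_prob
    return score

def pick_maneuver(candidates):
    """Choose the candidate maneuver with the lowest risk score."""
    return min(
        candidates,
        key=lambda c: risk_score(c["p_collision"], c["pedestrian"]),
    )

maneuvers = [
    {"name": "swerve", "p_collision": 0.10, "pedestrian": True},
    {"name": "brake",  "p_collision": 0.02, "pedestrian": False},
]
print(pick_maneuver(maneuvers)["name"])  # brake
```

In this framing the machine's "fear" is nothing mystical: it is just an asymmetry in the penalties, where some outcomes are weighted so heavily that the system avoids them almost unconditionally.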
