Preferences

I'm trying to figure out an interpretation of this that actually makes sense. Is he suggesting people don't even consider their prior experiences when choosing restaurants now? That the subjective experience of "free will" requires a high threshold of ignorance?

Maybe I'm overestimating other people, but in those specific contexts I almost never take the number of stars or a few people's reviews as the only relevant things to consider. (Or maybe he's flattering himself by perceiving a kind of non-agency in "the masses" that doesn't exist.)


I think the point is that when AI takes over control of the world, it does not have to be through a technological singularity, with Roko's Basilisk and all the drama. It could just as well be through generations of people progressively yielding their free will to what he calls "apps". In this scenario, our choices, over the course of a century or so, become ultimately non-existent.
> In this scenario, our choices, over the course of a century or so, become ultimately non-existent.

You need to explain how choosing to let apps make certain types of decisions removes all choices whatsoever. As it stands, you make it sound like a carpenter who uses a nail gun instead of a hammer has stopped driving nails.

Suppose the carpenter pulls out an autonomous house-building robot and tells it to build him a house. Is he still driving nails?

But to your main point, while we may be offloading only trivial decisions to apps today, the better they become at making these decisions, the more natural it will be to trust them for more significant ones. As the original article mentions, it's not much of a stretch to imagine an app that looks at your demographics and preferences and tells you who to vote for. And from there, why not apps that choose where to live, what career to pursue, or who to marry? Some day, it may even seem foolish not to defer to apps for important decisions. After all, how can one fallible, emotional person ever hope to make a better decision than a datacenter full of machines that can coolly consider all of the parameters and potential outcomes?

At that point, floating through a blissfully optimized life, one might say that yes, the apps are deciding everything for me, but they're doing so only in accordance with my preferences and values. I'm still in charge; I'm still exercising free will. But in the absence of making decisions oneself, where exactly did those preferences and values come from?
