Preferences

One related way to think about this is that, if (non-utilitarian) people are ever able to change their own utility functions, other people's utilitarianism gives them an incentive to do so, because changing their utility function can make them more likely to get what they want.

For example, if you can learn to feel sadder about something, other people who want to minimize your sadness will acquire an incentive to help you avoid that thing, even at some cost to themselves.

On many moral intuitions, you take the world as you find it and then act on it in some way, without other people strategizing about, or being incentivized by, your moral reasoning. But when other people can do those things, you can get very strange outcomes.
