One related way to think about this: if (non-utilitarian) people are ever able to change their own utility functions, other people's utilitarianism gives them an incentive to do so, because a strategically altered utility function can make them more likely to get what they want.
For example, if you can learn to feel sadder about something, other people who want to minimize your sadness will acquire an incentive to help you avoid that thing, even at some cost to themselves.
Many moral intuitions assume that you find the world as it is and then act on it in some way, without other people strategizing about, or being incentivized by, your moral reasoning. But when other people can do those things, you can get very weird outcomes.