> I wish I knew what you were so I could say "one of the many terrible things about __" about you.
I'm a software engineer, so I beat you to it.
> I always thought intentionally applying an emotional distance was a strategy to help us see what's really happening, since allowing emotions to creep in causes us to reach conclusions we want (motivated reasoning) instead of conclusions that reflect reality. I find it a valuable way to think.
And the problem is taking that too far, and doing it too much. It's a tactic "to help us see what's really happening," but it's wrong to stop there and forget things like values, interests, and morality.
> And people tend to vastly overvalue our "humanity" anyway.
WTF, man.
> I'm guessing the ones that displaced horses didn't give much of a fuck about what happened to horses.
Who cares what "the ones that displaced horses" thought? You're the horse in that scenario, and the horse cares. Another obnoxious software engineer problem is taking the wrong, often self-negating, perspective.
Yes, the robber who killed you to steal your stuff probably didn't mind that you died. So I guess everything's good, then? No.
> Anyway, I think you have an unhealthy emotional attachment to your emotions.
Emotions aren't bad, they're healthy. But a rejection of them is probably a core screwed-up belief that leads to "aloof galaxy-brain, passively observing humanity from afar" syndrome.
There's probably a parallel to the kind of obliviousness that gets you the behavior in the Torment Nexus meme ("Tech Company: 'At long last, we have created the Torment Nexus from the classic sci-fi novel Don't Create The Torment Nexus.'"), i.e. "Software Engineer: 'At long last, I've purged myself of emotion and become perfectly logical like Lt. Cmdr. Data from the classic sci-fi novel Logical Robot Data Wants to Be Human and Feel Emotions.'"