
> I suspect you've never done therapy yourself. Most people who have worked with a professional therapist understand intuitively why the only helpful feedback from an LLM to someone who needs professional help is: get professional help. AIs are really good at getting something to about 80%.

I'm shocked that GPT-5 or Gemini can code so well, yet if I paste a 30-line (heated) chat conversation between my wife and me, it misunderstands what about 5% of those lines actually mean -- spectacularly so.

It's interesting to ask it to analyze the conversation through various psychotherapeutic frameworks; I'm not well versed in those, and its conclusions are interesting starting points, but it only gets it right about 30% of the time.

All LLMs that I tested are TERRIBLE for actual therapy, because I can make them change their minds in 1-2 lines by adding some extra "facts". I can make them say anything.

LLMs completely lose the plot. They might be good for someone who needs self-validation and a feeling someone is listening, but for actual skill building, they're complete shit as therapists.

I mean, most therapists are complete shit as therapists, but that's beside the point.


Not surprising, given that there's (hopefully, considering the privacy implications) much more training data available for successful coding than for successful therapy/counseling.
> if I paste a 30-line (heated) chat conversation between my wife and me

i can't imagine how violated i would feel if i found out my partner was sending our private conversations to a nonprivate LLM chatbot. it's not a friend with a sense of care; it's a text box whose contents are ingested by a corporation with a vested interest in worsening communication between humans. scary stuff.

My partner is ok with it *
that's good! sorry for implying you were doing something like that without their knowledge; i was just thinking On Line about how i'd feel.
