You can make a photorealistic drawing too, one that can pass as a real photograph (and with enough skill it will be better than whatever generative models can produce). The difference here is the lower barrier to producing the images.
But I think that even a photorealistic drawing, if made and distributed to others, should be considered a form of harassment.
I agree, but critically, I think it's the harassment that matters, not the drawing.
If someone draws something in the privacy of their own home (with or without the assistance of an AI), and the police find it in an unrelated search, they should have nothing to prosecute. Drawing is one way humans work through their thoughts and emotions; the act of putting pencil to paper (including digital pencil to digital paper) should never be illegal, lest we effectively create a thought crime.
So the AI is going to take what you give it and combine that with what others have given it, based on a bunch of similarity weights. If the AI is any good at its job, it will produce an image remarkably correlated with the undisclosed original, since that is exactly what AI is doing every time it's asked to do anything at all: "I have an incomplete thought/image in my mind; please produce a written or visual completion of it, based on a model built from several billion best guesses about real-world data." And if the victim of the harassment has had the misfortune (or lack of foresight) to have their actual nude images somehow slip into the AI's training data, the AI will likely prioritize surfacing them in its output.
Either way, the AI will now give you a body that is roughly correlated with what your head looks like.
You can develop fake images of someone from pretty much purely public data. In the past, you would have needed to take an actual picture of the person, and if you had taken a picture, there is generally a way to handle that scenario under copyright law.
If someone shows up in court and says it's not a picture of the person suing, just a generated image of someone who looks rather similar (people looking similar to one another being a rather common occurrence), and that the image is unlikely to cause imminent lawless action, it would be really interesting to watch the court try to pick apart what's going on.
I'm not trying to say it's ok, but just like the person you replied to, I truly find these things comparable.
I don't see the argument here. What you describe is sexual harassment and should be illegal.
If person A draws a naked picture and shares it with person B, and it looks like person C, then person C has been sexually harassed?
What if you share it with the entire class? In the school building itself? Then it becomes impossible for the target not to be aware.
The moment someone teases the target with it, it doesn't matter how many times it's been shared.
If the person is recognizable, then in many cases it would violate existing nonconsensual/revenge porn laws, which very often are not concerned with provenance but with whether the image is explicit, a recognizable depiction, and nonconsensual.
(Depending on where it is shared, it might, as others note, be sexual harassment as well.)