That’s not comparable, though, since the deepfake is going to be photo-realistic, and the image wasn’t just made but shared. As the article notes, making these sorts of images of real people is highly likely to be made illegal in its own right, and the behaviour could already be illegal under other laws.

> That’s not comparable though since the deepfake is going to be photo-realistic and the image wasn’t just made but shared

You can make a photorealistic drawing too, one that can pass as a real photograph (and with enough skill it will be better than whatever generative models can produce). The difference here is the lower barrier to making the images.

But I think that even a photorealistic drawing, if made and distributed to others, should be considered a form of harassment.

> But I think that even a photorealistic drawing, if made and distributed to others, should be considered a form of harassment.

I agree, but critically, I think it's the harassment that matters, not the drawing.

If someone draws something in the privacy of their own home (maybe or maybe not with the assistance of an AI), and the police find it in an unrelated search, they should have nothing to prosecute. Drawing is one way humans work through their thoughts and emotions; the act of putting pencil to paper—including digital pencil to digital paper—should never be illegal, lest we effectively create thought crime.

Yes, when I was in college I took an art class where we had to cut a magazine picture in half, glue it to a piece of paper, then draw the other half. Mine was good enough that unless you got up close, you couldn’t tell which was the photo and which was my drawing. If someone had taken a picture of it, you wouldn’t have been able to tell at all. Photorealistic drawings are very possible.
I agree there’s no difference between making a fake photo-realistic image by hand or by AI. A crudely rendered drawing could be a form of harassment as well, though probably in different circumstances.

Photo-realistic only in a sense: the body will have no relation to the actual body of the person in question.
I think you may be underselling what's going on here, because the entire point of AI-generated artwork is that it is based on some chain of relationships to the source material; there is no actual creativity involved.

So the AI is going to take what you give it and combine that with what others have given it, based on a bunch of similarity weights. If the AI is any good at its job, it will produce an image remarkably correlated with the undisclosed original, since that is exactly what AI does every time it's asked to do anything at all: "I have an incomplete thought/image in my mind; please produce a written or visual completion of it based on a model built on several billion best guesses from real-world data." And if the victim of the harassment has had the misfortune (or lack of foresight) to have actual nude images somehow slip into the AI's training data, the AI is likely to prioritize surfacing them in its output.

I do not think we have to seriously entertain the thought that a random 15 year old will have nudes in the first place, much less part of a training dataset.

Either way, the AI will now give you a body that is roughly correlated with what your head looks like.

That's going to make an interesting supreme court case when it ends up there.

You can develop fake images of someone from pretty much purely public data. In the past, you would have needed to take an actual picture of the person, and if you had taken a picture of the person, there's generally a way to handle this scenario under copyright law.

If someone shows up in court and says it's not a picture of the person suing, and just a generated image of someone who looks rather similar (people looking similar to each other being a rather common occurrence), and that the image is unlikely to cause imminent lawless action, it would be really interesting to watch the court try to pick apart what's going on.

So if a teenager is skilled enough at drawing on their iPad to make something look photorealistic, and they draw a naked picture of a classmate and share it, should that be illegal too?

I'm not trying to say it's ok, but just like the person you replied to, I truly find these things comparable.

> they draw a naked picture of a classmate and share it, should that be illegal too?

I don’t see the argument here. What you describe is sexual harassment and should be illegal.

I guess I've never looked into everything that can constitute sexual harassment.

If person A draws a naked picture and shares it with person B, and it looks like person C, then person C has been sexually harassed?

I think it’s a matter of degree. If you share it behind closed doors, with a few friends, it wouldn’t turn any heads (mostly because the target is unaware).

If you share it with the entire class? In the school building itself? It becomes impossible for the target to not be aware.

What if you share it with one person, and that person shares it with one person, and so on, until the whole class sees it? Has no crime been committed?
Has anyone been victimized at that point? The issue is not with the sharing, but with how people act afterwards.

The moment someone teases the other it doesn’t matter how many times it’s been shared.

> So if a teenager is skilled enough at drawing on their iPad to make something look photorealistic, and they draw a naked picture of a classmate and share it, should that be illegal too?

If the person is recognizable, then in many cases it would violate existing nonconsensual/revenge porn laws, which very often are not concerned with provenance but with whether the image is explicit, a recognizable depiction, and nonconsensual.

(Depending on where it is shared it might, as others note, be sexual harassment as well.)

Yeah if they can draw something that photorealistic then sure. Poorly rendered drawings could also constitute harassment depending on what they were and how they were used.
Sounds like harassment.
