People have always been pervy and gross in private, and society has always provided outlets for them to be so while preserving their privacy and dignity.
However, using a wannabe AI social media platform to engage with this stuff (and said platform encouraging you to do so) crosses several uncomfortable lines for me.
I mean any sort of involvement, whether making content or being used as a face model. This is semi-public: the content you make is associated with you, and so is the content made about you.
That is something of a valuable service. People might see that option and think “well, why not? What’s the harm?” This way they can learn what the harm is, without having to be the person being harmed.
I agree that this method can provide insight. It's also one of the areas where journalistic ethics are most carefully weighed.
A journalist has done good work if they report on their ability to smuggle a replica bomb onto a plane. It's a bit hazier if they smuggle a real bomb on board because it puts people at risk. They shouldn't blow up a plane to show how easy it is.
I didn't offer any judgement in my comment as to the ethics of this particular reporter; I was just noting that this is the style of reporting they do.
The claims that she purported to represent Meta in public statements would, if true, count as unethical journalism. I don't know the accuracy of those claims, so at this stage I remain undecided but wary.
All evidence points to the author trying to turn this into a moral panic, especially with the emphasis on "real women" being used to generate these (despite the feature meaning they have consented to it).
Yes. And we also have documented cases where generative AI, in the hands of people with serious psychological issues, is significantly accelerating their loss of control over those issues, with very negative outcomes.
The fact that the AI industry is apparently littered with incredibly immature guys who perceive themselves to be Randian superheroes does not reassure me that this tool is going to be any better.
I propose a subtractive solution: ban this category of generative video products entirely, or consider criminalising any use that reproduces the appearance of a known person without their explicit contractual consent.
(I really don't care that this suggestion typically provokes what amounts to libertarian screeching)
> Will you be a little outraged though when they use real women without their consent?
It's a complicated issue that I've considered many times before. If we deem deepfake pornography unethical because it creates images and videos that look like real people, what does that mean for "lookalike" pornography, featuring actors made up to look like (or who just naturally resemble) famous people?
For example:
Let's say Person A has a friend, Person B, who looks like Person C. Person B consents to Person A using artificial intelligence to generate pornographic images of them, which in turn look like pornographic images of Person C. Should Person A need consent from Person C?
Is it really that complicated? I'm not sure the slippery-slope argument about a potential future of too-generic porn faces is all that convincing against the very real and ongoing mass distribution of a nascent, poorly regulated (or even understood) medium that features torrential quantities of, let's not mince words, sexual exploitation.
Engaging with sexual exploitation includes its consumption, and it absolutely includes its creation. I don't have any problems with content, generated or otherwise, that includes informed consent from all models involved. But that's not what's being described in this article. They are specifically talking about photorealistic imagery generated using the likeness of individuals who did not knowingly consent to such depictions.
>Will you be a little outraged though when they use real women without their consent?
Probably not. AI slop doesn't really "go viral" except when it's super ridiculous, like shrimp Jesus. Most AI slop porn is likely only ever seen by tens of people. If someone generates porn with my face on it and I never even know, how does this harm me? Why should I care?
This seems like you are not fully thinking this through, whether intentionally or not. Giving you the benefit of the doubt, I'll answer your question about why you should care with more questions: what if it's someone you know? What if they want to sell the generated content? What if it was your political enemies? What if it was your boss? What if it was someone who was stalking you or making you feel unsafe?
>This seems like you are not fully thinking this through, whether intentionally or not. Giving you the benefit of the doubt
I am not; that's why I'm leaving it to the replies to convince me otherwise.
>what if it's someone you know?
Why not start with: what if it's me? If I don't know about it, implicitly, I don't care. But if I do know and it bothers me, AI companies have already been given free rein over copyright law, so I don't really have any recourse here. Can I sue Sam Altman? No, because if that were possible, someone like Studio Ghibli would have done it and made a billion dollars by now.
>what if they want to then sell the generated content?
If it's generated by AI, then the courts have already stated it can't be copyrighted, so it can't meaningfully be sold. The first person you sell it to can redistribute it for free, and there's no way you can stop them.
>what if it was your political enemies?
I'm not political, which upsets a lot of people. Just tell people "I'm not voting, it doesn't matter" and watch them lose their minds.
>what if it was your boss?
I'm not really sure why I should be fighting my boss's battles.
>what if it was someone who was stalking you or made you feel unsafe?
People can stalk me without AI. I'm not sure how AI changes this.
Have you been in a high school in the past two years? Classmates generating AI porn of each other and sharing it among their peers is rampant.
There are dozens of scenarios where this will completely ruin your life.
One example is someone generating porn of you with underage children and sending it to everyone you know -> life ruined.
You can already easily do this with open source models.
> Most AI slop porn is likely only ever seen by tens of people. If someone generates porn with my face on it and I never even know, how does this harm me? Why should I care?
You would presumably care if one of those "tens of people" was a family member or peer.
Maybe not caring is enlightened of you, but it shouldn't stretch your imagination to consider why others would.
>You would presumably care if one of those "tens of people" was a family member or peer.
This is an interesting angle. Maybe someone might try to blackmail me with AI porn of me cheating on my spouse? This could also be a great use case for divorce attorneys. But it would be easy enough to do that without AI. There's the old joke:
>Once a year I send out 1000 Valentine's Day cards signed "Guess Who? XOXO".
>It's a cheap and easy marketing plan for my divorce attorney law group.
… and this is probably where the article should’ve ended. Or in fact, where the author should’ve realized there didn’t need to be an article at all.
People are weird and gross. They do weird things that we would often prefer they didn't. Sora provided a tool to avoid that weirdness. Use it.