• GrymEdm@lemmy.world · 8 months ago

    It isn’t too much to ask. According to Dr. K of HealthyGamerGG (a Harvard psychiatrist and instructor), research shows that the release of non-consensual porn leaves the unwilling subjects suicidal in over half of cases. Non-consensual porn includes deepfakes, revenge porn, etc. It’s seriously harmful, and there are other effects like depression, shame, PTSD, anxiety, and so on. There is functionally unlimited porn out there that was made with consent, and if someone doesn’t want to be publicly sexually explicit, that’s their choice.

    I’m not against AI porn in general (I consider it the modern version of dirty drawings/cartoons), but when it comes to specific likenesses, as with deepfakes, there’s clear proof of harm, and that’s enough for me to oppose it. I don’t believe there’s some inherent right to see specific people naked against their will.

    • fidodo@lemmy.world · 8 months ago

      I think it would be too big of a privacy overreach to try to ban it outright. What people do on their own computers is their own business, and there’s no way to enforce a full ban without being incredibly intrusive. But as soon as it gets distributed in any way, I think it should be prosecuted as heavily as real non-consensual porn that was taken against someone’s will.

    • HakFoo@lemmy.sdf.org · 8 months ago

      I wonder if part of the emotional risk is due to the general social stigma attached to porn. It becomes something that has to be explained and justified.

      If done to grand excess, deepfakes could crash the market on that, so to speak. Yeah, everyone saw your face on an AI-generated video. They also saw Ruth Bader Ginsburg, their Aunt Matilda, and for good measure, Barry Bonds, and that was just a typical Thursday.

      The shock value is burnt through, and “I got deepfaked” carries a social stigma on the level of “I got in a shouting match with a cashier” or “I stumbled into work an hour late recently.”

      • fidodo@lemmy.world · 8 months ago

        My main concern is for kids and teenagers. They’ll bully people for no damn reason at all, and AI porn gives bullies a way to inflict even more fucked-up psychological abuse, which could be made much worse if victims have no recourse to fight back.