• HakFoo@lemmy.sdf.org
    7 months ago

    I wonder if part of the emotional risk is due to the general social stigma attached to porn. It becomes something that has to be explained and justified.

    If done to grand excess, deepfakes could crash the market on that, so to speak. Yeah, everyone saw your face on an AI-generated video. They also saw Ruth Bader Ginsburg, their Aunt Matilda, and for good measure, Barry Bonds, and that was just a typical Thursday.

    The shock value is burnt through, and “I got deepfaked” ends with a social stigma on the level of “I got in a shouting match with a cashier” or “I stumbled into work an hour late recently.”

    • fidodo@lemmy.world
      7 months ago

      My main concern is for kids and teenagers. They’ll bully people for no damn reason at all, and AI porn gives bullies a tool for even more fucked-up psychological abuse, which could be made much worse if victims have no recourse to fight back.