shared via https://feddit.de/post/2805371

Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.

      • lloram239@feddit.de · 1 year ago

        I don’t think just giving up and allowing porn deep fakes and stuff of people is really an acceptable answer here.

        It’s the only sensible answer. Anything else would require an extreme violation of everybody’s privacy and the implementation of total surveillance. See France’s recent attempt at giving police full access to people’s phones; that’s the kind of stuff you end up with when going down that route.

        This AI is out there today, can be run on every half-decent gaming PC, and can generate new images in about 30 seconds. And it will only get better going forward. Images are as malleable as text now; you can accept that, or keep tilting at windmills.

        but sites absolutely can manage deepfakes

        Of course they can, and most already do. But on the whole, that really doesn’t have much of an effect, anybody can make their own sites and you don’t even have to go deep down into the dark web for that. It’s the first link on Google when you search for it.

      • davehtaylor@beehaw.org · 1 year ago

        The way people just throw up their hands at every new problematic issue with AI is not a good way of dealing with them; we don’t have to tolerate this stuff, and now is the best time to deal with it and start creating rules and protections.

        Exactly. In another thread on here recently someone said something that basically boiled down to “your protest against AI isn’t going to stop it. There’s too much corporate power behind it. So you might as well embrace it” and I just cannot get my head around that mentality.

        Also, you can absolutely see the models who were used as references in some of the images generated by apps these days. Like that popular one right now that everyone is using to make idealized images of themselves. A few of my family and friends used it recently, and you could clearly see in some of the pics the A-list celebs who were used as pose references, like Gal Gadot, Scarlett Johansson, etc. It’s creepy as hell.

          • davehtaylor@beehaw.org · 1 year ago

            I never said it was. But like the person I was replying to said: we need to take a good hard look at what the hell these tools are doing and allowing and decide as a society if we’re going to tolerate it.

            The real issue here is what things like deepfakes can do. It’s already starting, and it’s going to continue accelerating, generating mis- and disinformation: for private citizens, celebs, and politicians. While you might say “it’s creepy, but there’s nothing we can do about people deepfaking Nancy Pelosi’s face onto their spank material”, it’s extremely problematic when someone decides to make a video where Joe Biden admits to running a CP ring, or some right-wing chud makes a video of Trump appearing to say something they all want to hear, and it leads to a civil war. Those are the real stakes here. How we react to what’s happening with regular folk and celebs is just the canary in the coal mine.

    • Lionir [he/him]@beehaw.org · 1 year ago

      Everybody gets horny, idiot.

      Please don’t call people idiots needlessly.

      Does it matter if someone jerks off to JaLo in the Fappening or some random AI generated BS?

      The issue is that this technology can be used to create pornographic material of anyone that has some level of realism without their consent. For creators and the average person, this is incredibly harmful. I don’t want porn of myself to be made and neither do a lot of creators online.

      Not only are these images an affront to the dignity of people but it can also be incredibly harmful for someone to see porn of themselves they did not make with someone else’s body.

      This is a matter of human decency and consent. It is not negotiable.

      As mentioned by @ram@lemmy.ca, this can also be used for other harmful things like CSAM which is genuinely terrifying.

      • TheFriendlyArtificer@beehaw.org · 1 year ago

        I have to disagree (but won’t downvote!)

        AI porn is creepy. In multiple ways!

        But it’s also a natural evolution of what we’ve been doing as a species since before we were a species.

        Does imagining a different partner while having sex or masturbating count? I would imagine most people would say, “no”.

        How about if somebody draws a crude stick figure of somebody they met on the street? Unless you’re Randall Munroe, this is probably harmless too.

        Now a highly skilled portrait artist paints a near replica of somebody he knows, but has never seen in the nude. They never mention their friend by name, but the output is lifelike and unmistakably them.

        Maybe a digital artist finds a few social media pictures of a person and decides to test-drive Krita and manipulates them into appearing nude.

        Or, and this happened to me quite recently, you find your porn doppelganger. My spouse found mine and it ruined her alone time. And they really did look just like me! Taking that a step further, is it illegal to find somebody’s doppelganger and to dress them up so that they look more like their double?

        Like you, I don’t want people like this in my life. But it feels like this is one of those slippery slopes that turns out to be an actual slippery slope.

        You can’t make it illegal without some serious downstream effects.

        If you did, the servers would just get hosted in an Eastern European country that is happy to lulwat at American warrants.

        I don’t have any answers, just more devil’s-advocate-esque questions. If there was a way to make it illegal without any collateral damage, I’d be proudly behind you leading the charge. I just can’t imagine a situation where it wouldn’t get abused, à la the DMCA.

        • Lionir [he/him]@beehaw.org · 1 year ago

          Does imagining a different partner while having sex or masturbating count? I would imagine most people would say, “no”.

          You can’t share that, though, so while I still think it is immoral, it is also kind of impossible to know about.

          Now a highly skilled portrait artist paints a near replica of somebody he knows, but has never seen in the nude. They never mention their friend by name, but the output is lifelike and unmistakably them.

          Maybe a digital artist finds a few social media pictures of a person and decided to test drive Krita and manipulates them into appearing nude.

          Those would be immoral and reprehensible. The law already protects against such cases on the basis of using someone’s likeness.

          It’s harmful because it shares images of someone doing things they would never do. It’s not caricature, it’s simply a fabrication. It doesn’t provide criticism - it is simply erotic.

          Taking that a step further, is it illegal to find somebody’s doppelganger and to dress them up so that they look more like their double?

          If the goal is to look like you, I would imagine it is possible to defend by law. Otherwise, it is simply coincidence. There’s no intent there.

          I don’t think it is a stretch or slippery slope. Just as a picture is captured by a camera, a drawing is captured by a person or a machine.

          Both should be the same and it is often already the case in many jurisdictions around the world when it comes to CSAM.

          • Rekorse@kbin.social · 1 year ago

            All of your arguments assume profit is the motive. Are you saying as long as no profit is made that it would be okay to do all of these things? (Ex. Self use only)

            • Lionir [he/him]@beehaw.org · 1 year ago

              No. I think that it would still be bad if it were self-use because it is ultimately doing something that someone doesn’t consent to.

              If you were to use this on yourself or someone consenting, I see no issues there - be kinky all you want.

              Consent is the core foundation for me.

              The reason why imagining someone is different is that it is often less intentional - thoughts are not actions.

              Drawing someone to be similar to someone you know is very intentional. Even worse, there is a high chance that if you are drawing someone you know naked, you never asked for their consent because you know you wouldn’t get it.

            • Rekorse@kbin.social · 1 year ago

              That person just can’t grapple with any nuance, as they are afraid to let the sentence “AI child porn is less bad” come out of their mouths.

            • Lionir [he/him]@beehaw.org · 1 year ago

              I don’t like grading evil for this very reason, so I will refrain from doing so - thank you for catching me doing that.

              That said, AI CSAM could enable other forms of abuse through blackmail. I can also see very harmful things happening to a child or teenager because people may share this material in a targeted way.

              I think both are inhumane and disgusting.

                • Lionir [he/him]@beehaw.org · 1 year ago

                  I mean, maybe calling it evil is part of the problem?

                  I call it evil because it is intentional and premeditated.

                  There are degrees in everything. Punching somebody is less bad than killing somebody.

                  Trying to put everything on degrees is bound to show ignorance and imply that certain things are more acceptable than others.

                  I don’t want to hurt people with my ignorance and I do not want to tell someone that what they experienced is less bad than something else. They are bad and we’ll leave it at that.

                  Btw it’s totally humane because we invented the shit.

                  I am working with this definition : “Characterized by kindness, mercy, or compassion”. There is a difference between human-made and humane.

    • Jordan Lund@lemmy.one · 1 year ago

      You say that NOW, but if people start using your images to generate revenge porn or, you know, really anything you didn’t consent to, that’s a huge problem.

      Both for the people whose images were used to train the model and for the people whose images are generated using the models.

      Non-consent is non-consent.

      This is how you get the feds involved.

      • ram@lemmy.ca · 1 year ago

        Let’s not forget that these AI aren’t limited by age. Like fuck am I gonna be out here defending tech that would turn my kid into CSAM. Fucking disgusting.

        • PelicanPersuader@beehaw.org · 1 year ago

          Worse, people making AI CSAM will wind up causing police to waste resources investigating abuse that didn’t happen, meaning those resources won’t be used to save real children in actual danger.

        • MaggiWuerze@feddit.de · 1 year ago

          On the other hand, this could be used to create material that did not need new suffering. So it might reduce the need for actual children to be abused for the production of it.

          • ram@lemmy.ca · 1 year ago

            Ya, no, those people need psychological help. Not to feed the beast. This is nonsense.

            • MaggiWuerze@feddit.de · 1 year ago

              Sure they do, but if they have to consume, would you rather a real child had to suffer for that, or just an AI-generated one?

              • ram@lemmy.ca · 1 year ago

                Neither. I would have mental health supports that are accessible to them.

                • tweeks@feddit.nl · 1 year ago

                  Of course we want neither, but it comes across as if you’re dismissing a possible direction toward a solution, in favor of the one that is definitely worse (real-life suffering), out of a purely emotional knee-jerk reaction.

                  Mental health support is available and real CSAM is still being generated. I’d suggest we look into both options; advancing ways therapists can help and perhaps at least have an open discussion about these sensitive solutions that might feel counter-intuitive at first.

            • ichbinjasokreativ@beehaw.org · 1 year ago

              It’s (rightfully) currently illegal, but that doesn’t stop people. Keep it illegal, increase punishment drastically, make AI-created material a grey area.

              • Rekorse@kbin.social · 1 year ago

                It’s already the worst crime around and people still do it. Maybe it’s not the punishment we need to focus on.

              • ram@lemmy.ca · 1 year ago (edited)

                I’m not sure increasing punishment is actually an effective manner of combating this. The social implications of being a child predator are likely to have a more deterrent effect than the penal system imo (I don’t have data to back that).

                I, personally, am an advocate for making treatment for pedophiles freely, easily, and safely accessible. I’d much rather help people be productive, non-violent members of society than lock them up, if given a choice.

          • tweeks@feddit.nl · 1 year ago

            That’s a fair point. And I believe AI should be able to combine legal material to create illegal material. Although this still feels wrong, if it excludes suffering in base material and reduces future (child) suffering, I’d say we should do research on it at least. Even if it’s controversial, we need to look at the rationale behind it.

      • Evergreen5970@beehaw.org · 1 year ago

        As someone who personally wouldn’t care at all if someone made AI porn of me and masturbated to it, I am incredibly uncomfortable with the idea that someone who doesn’t like me may have the option to generate AI porn of me having sex with a child. Now there’s fake “proof” I’m a pedophile, and I get my life ruined for sex I never had, for violation of consent I never actually committed. Even if I’m vindicated in court, I might still be convicted in the court of public opinion. And people could post faked porn of me and send it to companies to try to say “Evergreen5970 is promiscuous, don’t hire them.” Not all of us have the luxury of being able to pick and choose between companies depending on whether they match our values, some of us have to take what they can get and sometimes that would include companies that would judge you for taking nude photos of yourself. It would feel especially bad given I’m a virgin by choice who has never taken nudes let alone sent them. Punished for something I didn’t do.

        Not everyone is going to restrict their use to their private wank sessions, to making a real image of the stuff they probably already envision in their imagination. Some will do their best to make its results public with the full intention of using it to do harm.

        And once faking abuse with AI porn becomes well-known, it might discredit actual photographic/video proof of CSAM happening. Humans already struggle to tell whether an image was captured by a camera or generated by AI, and AI doesn’t detect AI-generated images with perfect accuracy either. So the question becomes “how can we trust any image anymore?” Not to mention the ability to generate new CSAM with AI. Some more mainstream AI models might try to tweak algorithms to prevent people from generating any porn involving minors, but there’ll probably always be some floating around with those guardrails turned off.

        I’m also very wary of dismissing other peoples’ discomfort just because I don’t share it. I’m still worried for people who would care about someone making AI porn of them even if it was just to masturbate with and kept private.

  • Melmi@lemmy.blahaj.zone · 1 year ago

    I worry that the cat is out of the bag on this. The tech for this stuff is out there, and you can run it on your home computer, so barring some sort of massive governmental overreach I don’t see a way to stop it.

    They can’t even stop piracy and there’s the full weight of the US copyright industry behind it. How are they going to stop this tech?

    • Rekorse@kbin.social · 1 year ago

      The point isn’t that it’s too late, it’s that the hype is overblown. Go to the website this article mentions and follow the instructions (explore, turn on NSFW, type name and nude) and you will VERY QUICKLY realize the technology is just shit.

      You can see some resemblances and sometimes one close one, but like another poster said they just look like the same shitty fan fiction we’ve had since Photoshop came about.

      Also, this could end porn blackmail, since no one can tell if it’s real or not. People will start judging the person who is supplying the material rather than the person in it.

      • Melmi@lemmy.blahaj.zone · 1 year ago

        If it ends porn blackmail, it also ends photographic evidence. I think that’s significantly worse.

        And sure the tech is bad if you just type a name directly into a model, but if you take the time to refine it it gets pretty good, and it’s only going to get better over time. It’s time to start thinking about a future where this tech exists.

  • stown@sedd.it · 1 year ago (edited)

    This doesn’t even feel like an article - more like one long advertisement. The second paragraph of the article launches into a review of the “Erect Horse Penis - Concept LoRA”.

      • LogicalDrivel@sopuli.xyz · 1 year ago

        Yeah, I picked up quite a few tips for generating good AI porn in this article. I’m still not sure if this was satire or not; it basically gives you step-by-step instructions for doing this.

        • kniescherz@feddit.de · 1 year ago

          I think it’s great to explain the concept and even some details of the process. This removes the mysticism from the topic, and someone with little knowledge of it may understand that it’s a pretty easy process nowadays and can give up the mental image of hackerman in a black hoodie.

  • SteleTrovilo@beehaw.org · 1 year ago

    The tech isn’t there yet. There are so often distracting flaws around the hands/feet. The AI doesn’t really know what a human is; it’s just endlessly recombining existing material.

    • rhabarba@feddit.de (OP) · 1 year ago

      As much as I loathe having to reveal this to you, the shapeliness of the hands should be semi-negligible to most people who would love to have an image created from the statement “I want to see Billie Eilish’s boobs”.

      • CraigeryTheKid@beehaw.org · 1 year ago

        Agree that was a strange take. Can you usually tell it’s AI/fake? Yes. Is it still achieving the goal of the creator/user? Yes.

          • SteleTrovilo@beehaw.org · 1 year ago

            I’m not into feet specifically, but when I ask for “Veronica Mars in a string bikini” I don’t want to get “Veronica Mars with unattached toes.” It’s distracting AF.

            Doesn’t happen with real models, or even human-made hentai.

            • lloram239@feddit.de · 1 year ago (edited)

              Doesn’t happen with real models

              That actually happens quite a lot. Hollywood movie posters are especially full of it, as they rarely do specific photo shoots for the poster and instead just copy-and-paste together whatever random images they can find. So you end up with stuff like the 300 poster, where the sword doesn’t attach to the handle. Magazine covers adding an extra hand or leg isn’t all that uncommon either.

              And that’s the expensive stuff, once you go into low-budget productions like self-published book covers, it’s all just crude stock image copy&paste at best.

            • pbjamm@beehaw.org · 1 year ago

              Rob Liefeld could barely draw hands/feet and still managed a successful career as a comic book artist.

              • Rekorse@kbin.social · 1 year ago

                So about the same amount of work as Photoshopping a celeb’s head onto a naked body.

                Got it. I’m terrified of all the poorly made fanfic. Even perfectly made fanfic will never have the effect of a real photo or the real thing.

      • Rekorse@kbin.social · 1 year ago

        Just because the AI produces a model with Billie Eilish’s face and a naked body does not mean you’ve seen her nude for real.

        It’s the exact same as drawing a naked lady and then drawing Billie Eilish’s face on it.

        If that really gets you off and really violates her autonomy in some way, I’d be interested to hear how. It’s not currently illegal to draw real people in fictional scenarios, is it?

        • The Doctor@beehaw.org · 1 year ago

          Implicit in this statement is that people who’re inclined to generate a visual simulacrum of a real person for fantasy purposes actually care if it’s real. By definition, it’s not. If “real” was an issue to them, they probably wouldn’t bother with it.

    • Catsrules@lemmy.ml · 1 year ago

      Key word is yet.

      Yeah, some body parts are a little weird today, but what about tomorrow, next week, next month, next year?

      I really haven’t given this much attention, but the last time I did, maybe 6-8 months ago, most of the photos had hands that were the stuff of nightmares. Looking at them again today, at least from a quick 10 minutes of browsing, they have improved significantly. Yeah, they are still far from perfect, but a handful are very good, most are passable, and a few are still nightmare fuel.

    • Mutoid@beehaw.org · 1 year ago

      From a cursory scan, I did appreciate their deep dive and investigative approach in the article. I’ll be looking for their articles in the future.

  • Queen HawlSera@lemm.ee · 1 year ago

    Furry transformation and ass expansion, can you do that for me HAL69000?

    Actually, make him turn into a donkey girl before expanding dat butt, so we can make ass jokes.