A very NSFW website called Pornpen.ai is churning out an endless stream of graphic, AI-generated porn. We have mixed feelings.

    • drekly@lemmy.world · 1 year ago

      CivitAI is a pretty perverted site at the best of times, but there’s a disturbing number of age-adjustment plugins for making images of children on the same site that hosts plugins for generating sex acts. It’s clear some people are definitely combining the two.

      • oats@110010.win · 1 year ago

        Some models also skew toward children for some reason, so you have to put mature/adult in the positive prompt and child in the negative prompt.
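
        For anyone unfamiliar with positive/negative prompting, here’s a minimal sketch using the Hugging Face diffusers library. The model ID and prompt strings are illustrative assumptions, not anything specific from this thread:

        ```python
        # Minimal sketch of positive/negative prompting with diffusers.
        # Model ID and prompt strings are illustrative assumptions.
        import torch
        from diffusers import StableDiffusionPipeline

        pipe = StableDiffusionPipeline.from_pretrained(
            "runwayml/stable-diffusion-v1-5",  # assumed checkpoint
            torch_dtype=torch.float16,
        ).to("cuda")

        image = pipe(
            prompt="photo of an adult person, mature",         # positive prompt
            negative_prompt="child, minor, young, childlike",  # negative prompt
            num_inference_steps=30,
        ).images[0]
        image.save("out.png")
        ```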

    • inspxtr@lemmy.world · 1 year ago

      I remember reading that this may already be happening to some extent, e.g. people sharing tips on creating it on the deep web, maybe through prompt engineering, fine-tuning, or pretraining.

      I don’t know how those models are made, but I do wonder whether the ones that need retraining/fine-tuning on real CSAM could be classified as breaking the law.

    • Knusper@feddit.de · 1 year ago

      Well, to develop such a service, you need training data, i.e. lots of real child pornography in your possession.

      Legality for your viewers will also differ massively around the world, so your target audience may not be very big.

      And you probably need investors, who likely have less risky projects to invest in.

      Well, and then there’s also the fact that some people just don’t want to work on disgusting, legally grey-area stuff.

      • Womble@lemmy.world · 1 year ago

        yup, just like the AI needed lots of pictures of astronauts on horses to make pictures of those…

        • JonEFive@midwest.social · 1 year ago

          Exactly. Some of these engines are perfectly capable of combining different concepts. In your example, it knows basically what a horse looks like and what a human riding on horseback looks like. It also knows that an astronaut looks very much like a human in a space suit, and it can put the two together.

          Saying nothing of the morality, in this case I suspect that an AI could be trained using pictures of clothed children, perhaps combined with nude images of adults who are very slim or otherwise have a youthful appearance.

          While I think it’s repugnant in concept, I also think that for those seeking this material, I’d much rather it be AI-generated than an actual exploited child. Realistically though, I doubt this would have any notable impact on the prevalence of CSAM, and it might even make it more accessible.

          Furthermore, if generative AI gets good enough, it could become difficult to determine whether an image is real or AI-generated. That would make it harder for police to find the child and the offender and remove them from that situation. So now we need an AI to help analyze and separate the two.

          Yeah… I don’t like living in 2023 and things are only getting worse. I’ve put way more thought into this than I ever wanted to.

          • Ryantific_theory@lemmy.world · 1 year ago

            Aren’t AI-generated images pretty easy to detect with noise analysis? I know there’s no effective detection for AI-generated text, and I’m not saying there won’t be projects to train AI to generate perfectly realistic images, but it’ll be a while before they get fingers right, let alone invisible pixel artifacts (rough sketch of the idea below).

            As a counterpoint, won’t the prevalence of AI-generated CSAM collapse organized abuse groups, since they rely on funding from pedos? If genuine abuse material is swamped by AI-generated imagery, that would effectively collapse an entire dark-web market. It wouldn’t end abuse, but it would at least undercut the financial motive, which is progress.

            That’s pretty good for 2023.
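
            A rough sketch of the noise-analysis idea, assuming numpy and Pillow; the file name is hypothetical, and real detectors are trained classifiers rather than a single FFT check:

            ```python
            # Classical frequency-domain forensics: upsampling layers in some
            # generators leave periodic, grid-like peaks in the Fourier spectrum.
            # Illustrative sketch only; real detectors are trained classifiers.
            import numpy as np
            from PIL import Image

            def log_spectrum(path: str) -> np.ndarray:
                """Log-magnitude 2D FFT of a grayscale image."""
                img = np.asarray(Image.open(path).convert("L"), dtype=np.float32)
                return np.log1p(np.abs(np.fft.fftshift(np.fft.fft2(img))))

            spec = log_spectrum("suspect.png")  # hypothetical input file
            # Strong, regularly spaced peaks away from the center can hint at
            # generator artifacts; a trained classifier would score this instead.
            print(spec.max(), spec.mean())
            ```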

      • d13@programming.dev · 1 year ago

        Unfortunately, no. You just need training data on children in general and training data with legal porn, and these tools can combine the two.

        It’s already being done, which is disgusting but not surprising.

        People have worried about this for a long time. I remember a subplot of a sci-fi series that got into this. (I think it was The Lost Fleet, 15 years ago).

      • Jesus_666@feddit.de · 1 year ago

        Typically, the laws get amended so that anything that looks like CSAM is now CSAM. Expect porn generators tuned for minor characters to get outlawed very quickly.

    • mrnotoriousman@kbin.social · 1 year ago

      There was an article the other day about underage girls in France having AI-generated nudes spread around, based on photos of girls as young as 12. There’s definitely harm there.

    • 👁️👄👁️@lemm.ee · 1 year ago

      You’d also have to convince them that it’s not real. It’ll probably end up prompting new laws, tbh. Then there are weird cases like Japan, where lolis are legal but uncensored genitals aren’t, even drawn ones.