Archive.org link

Some key excerpts:

A pseudonymous coder has created and released an open source “tar pit” to indefinitely trap AI training web crawlers in an infinite series of randomly generated pages, wasting their time and computing power. The program, called Nepenthes after the genus of carnivorous pitcher plants which trap and consume their prey, can be deployed by website owners to protect their own content from being scraped, or can be deployed “offensively” as a honeypot trap to waste AI companies’ resources.

“The typical web crawler doesn’t appear to have a lot of logic. It downloads a URL, and if it sees links to other URLs, it downloads those too. Nepenthes generates random links that always point back to itself - the crawler downloads those new links. Nepenthes happily just returns more and more lists of links pointing back to itself,” Aaron B, the creator of Nepenthes, told 404 Media.
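
That loop is simple enough to sketch. Here is a toy version of the idea in Python - an illustration only, not Nepenthes’ actual implementation, which among other things drip-feeds its responses slowly to waste more of the crawler’s time:

```python
# Toy tarpit: every page is just links to more pages on the same
# server, so a naive link-following crawler never runs out of URLs.
import random
import string
import time
from http.server import BaseHTTPRequestHandler, HTTPServer

def slug(n=12):
    # A random lowercase path segment; every one resolves right back here.
    return "".join(random.choices(string.ascii_lowercase, k=n))

class Tarpit(BaseHTTPRequestHandler):
    def do_GET(self):
        time.sleep(2)  # drip-feed: make each fetch cost the crawler time
        links = "".join(f'<a href="/{slug()}">{slug()}</a><br>' for _ in range(10))
        body = f"<html><body>{links}</body></html>".encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    HTTPServer(("", 8080), Tarpit).serve_forever()
```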

Since they made and deployed a proof-of-concept, Aaron B said their pages have been hit millions of times by internet-scraping bots. On a Hacker News thread, someone claiming to be an AI company CEO said a tarpit like this is easy to avoid; Aaron B told 404 Media “If that’s true, I’ve several million lines of access log that say even Google Almighty didn’t graduate” to avoiding the trap.

  • Lvxferre@mander.xyz · 15 hours ago

    This looks interesting. I’d probably combine it with model poisoning - giving each page longer chunks of text containing bullshit claims and slightly broken grammar, so that if the data is used to train a model, the result gets worse.
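
    For illustration, a toy sketch of that poisoning idea - the `babble()` helper and its vocabulary are made up here, and seeding from the URL just keeps each page’s junk stable across visits:

    ```python
    # Serve deterministic, slightly-broken babble per URL, so each
    # tarpit page carries unique junk text for a scraper to ingest.
    import hashlib
    import random

    WORDS = ("the moon is basalt cheese because seven ferrets "
             "quantum parliament yesterday frequently of towards").split()

    def babble(path: str, sentences: int = 20) -> str:
        # Seed from the URL so the same page always serves the same junk.
        rng = random.Random(hashlib.sha256(path.encode()).digest())
        out = []
        for _ in range(sentences):
            words = [rng.choice(WORDS) for _ in range(rng.randint(5, 14))]
            # Occasionally-broken punctuation: the "slightly broken grammar".
            out.append(" ".join(words).capitalize() + rng.choice([".", ".", " ,."]))
        return " ".join(out)

    print(babble("/some/random/page"))
    ```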

  • Onno (VK6FLAB)@lemmy.radio · edited · 22 hours ago

    This will not work. It sounds great, it sounds plausible, even realistic at some level, but this will not work.

    Here’s why.

    The bot operator has more money than you do. If the efficiency of one bot decreases on one website, they’ll throw another bot at it, rinse and repeat, until your website stops responding because it’s been ground to dust.

    Meta’s bots are good at doing this, hitting your site with thousands of requests a second, over and over again.

    Meta is not alone in this, but in my experience it’s the most destructive.

    Source: One of my clients runs a retail website and I’ve been dealing with this.

    At the moment the “best” solution - “least worst” is probably more accurate - is to block them as if they’re malicious traffic, which is essentially what they are.
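
    The crudest form of that block is a user-agent match. A minimal WSGI sketch - the UA list here is illustrative and goes stale, and determined scrapers spoof it anyway, so blocking by IP at the firewall or CDN is the stronger move:

    ```python
    # Crude bot blocking as WSGI middleware: 403 known AI-crawler user
    # agents before the app runs. UA strings here are examples only.
    BLOCKED_UA_SUBSTRINGS = ("GPTBot", "CCBot", "Bytespider", "meta-externalagent")

    class BotBlocker:
        def __init__(self, app):
            self.app = app

        def __call__(self, environ, start_response):
            ua = environ.get("HTTP_USER_AGENT", "").lower()
            if any(bot.lower() in ua for bot in BLOCKED_UA_SUBSTRINGS):
                start_response("403 Forbidden", [("Content-Type", "text/plain")])
                return [b"Forbidden\n"]
            return self.app(environ, start_response)
    ```

    Wrap whatever WSGI app you run with `app = BotBlocker(app)`.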

    • G0ldenSp00nA · 17 hours ago

      I think part of the point of this software, at least going by the description on the website, is easy and reliable detection of LLM bots to block. You can run it and it will generate statistics about the bots that get caught in it, so you can easily block big lists of them.
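
      And since no human ever follows links into the tarpit, mining the access log for tarpit hits gives you a blocklist almost for free. A rough sketch - the log path, log format, and the `/nepenthes/` prefix are all assumptions:

      ```python
      # Count client IPs that requested anything under the tarpit
      # prefix in a common-log-format access log.
      import re
      from collections import Counter

      LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (\S+)')
      TARPIT_PREFIX = "/nepenthes/"   # wherever the tarpit is mounted

      hits = Counter()
      with open("access.log") as f:
          for line in f:
              m = LOG_LINE.match(line)
              if m and m.group(2).startswith(TARPIT_PREFIX):
                  hits[m.group(1)] += 1

      # Print the IPs worth blocking, busiest first.
      for ip, count in hits.most_common(20):
          print(f"{ip}\t{count}")
      ```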

  • UrLogicFails@beehaw.org (OP) · 1 day ago

    As I have gotten more and more exhausted with modern social media, I have contemplated building my own simple website (think Neocities), but I have been hesitant to put much of myself out there, as I do not want to feed the AI slop machines.

    Nepenthes is encouraging, since I could see it protecting text in a way that “glazing” (i.e. Nightshade) cannot.

    Hopefully AI companies cannot sidestep this as easily as robots.txt…

    • Onno (VK6FLAB)@lemmy.radio · 24 hours ago

      If you build a static website, you can host it on AWS S3 and it will likely cost you cents to run. S3 on its own will handle a lot of traffic and you’re unlikely to hit that threshold, but if you do, you can add AWS CloudFront and have a site that can handle more traffic than you can imagine.
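
      A minimal boto3 sketch of that setup - the bucket name and file are placeholders, and public-access settings, regions other than us-east-1, and the CloudFront layer are left out here but matter in practice:

      ```python
      # Create an S3 bucket, turn on static website hosting, upload a page.
      import boto3

      s3 = boto3.client("s3")
      bucket = "my-static-site-example"  # placeholder; must be globally unique

      # Outside us-east-1 you also need a CreateBucketConfiguration.
      s3.create_bucket(Bucket=bucket)
      s3.put_bucket_website(
          Bucket=bucket,
          WebsiteConfiguration={
              "IndexDocument": {"Suffix": "index.html"},
              "ErrorDocument": {"Key": "error.html"},
          },
      )
      s3.upload_file("index.html", bucket, "index.html",
                     ExtraArgs={"ContentType": "text/html"})
      ```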

      • UrLogicFails@beehaw.org (OP) · 22 hours ago

        I don’t have a full picture yet, but I would probably upload art, projects, and WIPs. I was also considering adding writing, but I’m not sure there would be enough material to justify a section for it.

        I definitely don’t have as much to share as others, but I would like to see the Internet become a less centralized space.

        An author I follow on Bluesky has been talking about Neocities a lot, and it seems pretty silly and fun. You probably need to be signed in to view it, but you can see her talk about the process here.

        • arglebargle@lemm.ee · 18 hours ago

          I still run websites. Some are simple and static, others are forums, and yet others are blogs.

          I like to have small communities around simple interests and they get enough traffic to stay interesting.

          The blogs are for recording things I do and want to remember and share, like setting up postgres to do interesting things.

          None of them have ads or generate revenue. I just miss the old internet and like to see these things exist.

          At the same time, I don’t really care if AI scrapes it. It’s out there to be looked at. Of course, one of the sites is a complete farce - illogical, fake-product nonsense. I enjoy that one getting scraped the most.