I just started using this myself, seems pretty great so far!

Clearly it doesn’t stop all AI crawlers, but it does stop a significant chunk of them.

  • zutto@lemmy.fedi.zutto.fi (OP) · 2 days ago

    You have a point here.

    But when you consider the current world’s web traffic, this isn’t actually the case today. For example, the GNOME project, which was forced to start using this on their GitLab, found that 97% of their traffic could not complete this PoW calculation.

    I.e., serving their GitLab now requires only a fraction of the computational cost, which saves a lot of resources, coal, and, most importantly, the time of hundreds of real humans.

    (Source for numbers)
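    For anyone curious what that PoW calculation actually involves: it’s a hashcash-style puzzle, where the client brute-forces a nonce until a hash meets a difficulty target, while the server only hashes once to verify. A minimal sketch in Go of that general technique (the names and difficulty here are illustrative assumptions, not Anubis’s actual protocol):

    ```go
    package main

    import (
    	"crypto/sha256"
    	"encoding/hex"
    	"fmt"
    	"strconv"
    	"strings"
    )

    // solve brute-forces a nonce until SHA-256(challenge + nonce) starts
    // with `difficulty` hex zeroes. The client pays this cost per challenge;
    // the server only has to hash once to check the answer.
    func solve(challenge string, difficulty int) (int, string) {
    	target := strings.Repeat("0", difficulty)
    	for nonce := 0; ; nonce++ {
    		sum := sha256.Sum256([]byte(challenge + strconv.Itoa(nonce)))
    		hash := hex.EncodeToString(sum[:])
    		if strings.HasPrefix(hash, target) {
    			return nonce, hash
    		}
    	}
    }

    func main() {
    	nonce, hash := solve("example-challenge", 4)
    	fmt.Printf("nonce=%d hash=%s\n", nonce, hash)
    }
    ```

    The cost is negligible for one human visitor, but it adds up fast for a crawler hammering thousands of pages, which is the whole point.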

    Hopefully in the future we can move back to proper netiquette and just a plain old robots.txt file!
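    For reference, that plain old robots.txt approach is just a politely worded request (GPTBot and CCBot below are real crawler user agents, but which bots you list is up to you, and compliance is entirely voluntary, which is the whole problem):

    ```
    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /
    ```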