• Strawberry@lemmy.blahaj.zone · 4 hours ago

      The bots scrape costly endpoints like the entire edit history of every page on a wiki. You can’t realistically cache every possible generated page at once.
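      A back-of-the-envelope sketch (made-up numbers, just to show the blow-up) of why those endpoints can’t all be pre-rendered:

      ```python
      # Hypothetical figures for a mid-sized wiki; only the order of magnitude matters.
      pages = 500_000
      avg_revisions = 40

      # Every revision has its own permalink/history view, and a diff page
      # exists for any pair of revisions of the same article.
      revision_views = pages * avg_revisions
      diff_views = pages * avg_revisions * (avg_revisions - 1) // 2

      print(f"{revision_views:,} revision views, {diff_views:,} possible diffs")
      # 20,000,000 revision views, 390,000,000 possible diffs -- far too many to pre-generate.
      ```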

    • nutomic@lemmy.ml · 12 hours ago

      A cache is limited in size and usually only holds the most recently viewed pages. But these bots go through every single page on the website, even old ones that users never view. Since they only send one request per page, caching doesn’t really help.
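      A rough sketch of that effect, assuming an LRU-style cache and hypothetical page names and sizes: readers who revisit a handful of popular pages hit the cache almost every time, while a crawler that touches each page exactly once never gets a hit and keeps evicting the readers’ pages as it goes.

      ```python
      from collections import OrderedDict

      class LRUCache:
          """Tiny LRU page cache: evicts the least recently used entry when full."""
          def __init__(self, capacity):
              self.capacity = capacity
              self.pages = OrderedDict()
              self.hits = 0
              self.misses = 0

          def get(self, url, render):
              if url in self.pages:
                  self.pages.move_to_end(url)         # refresh: recently used
                  self.hits += 1
              else:
                  self.misses += 1                    # miss: render and store the page
                  self.pages[url] = render(url)
                  if len(self.pages) > self.capacity:
                      self.pages.popitem(last=False)  # evict the oldest entry
              return self.pages[url]

      def render(url):
          return f"<html>{url}</html>"                # stand-in for an expensive page build

      cache = LRUCache(capacity=1_000)                # cache far smaller than the site

      # Readers revisit a small set of popular pages: almost every request is a hit.
      for i in range(10_000):
          cache.get(f"/wiki/Popular_{i % 100}", render)
      print("readers:", cache.hits, "hits,", cache.misses, "misses")   # 9900 hits, 100 misses

      # A scraper walks 100,000 distinct pages once each: every request is a miss.
      cache.hits = cache.misses = 0
      for i in range(100_000):
          cache.get(f"/wiki/Page_{i}", render)
      print("scraper:", cache.hits, "hits,", cache.misses, "misses")   # 0 hits, 100,000 misses
      ```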

    • LiveLM@lemmy.zip · 1 day ago

      I’m sure that if it was that simple people would be doing it already…