• The bots scrape costly endpoints, like the entire edit history of every page on a wiki. You can’t always just cache every possible generated page at once.

    • @jagged_circle@feddit.nl · 3 days ago

      Of course you can. This is why people use CDNs.

      Put the entire site behind a CDN with a 24-hour cache for unauthenticated users.
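
      A minimal sketch of the origin-side half of that setup, assuming a small Flask app and a session cookie named wiki_session (both are illustrative assumptions, not anything from this thread): anonymous responses are marked cacheable for 24 hours so a CDN in front can serve them from cache, while logged-in responses are kept private.

      ```python
      from flask import Flask, request, make_response

      app = Flask(__name__)

      SESSION_COOKIE = "wiki_session"  # assumed name of the login cookie
      ANON_TTL = 24 * 60 * 60          # 24 hours, as suggested above


      @app.after_request
      def set_cache_headers(response):
          if request.cookies.get(SESSION_COOKIE):
              # Authenticated: don't let the CDN or browser reuse this response.
              response.headers["Cache-Control"] = "private, no-store"
          else:
              # Unauthenticated: let the CDN cache it for a day.
              response.headers["Cache-Control"] = f"public, max-age={ANON_TTL}"
              # Key cached copies on the Cookie header so a request carrying a
              # session cookie won't be served the anonymous copy.
              response.headers["Vary"] = "Cookie"
          return response


      @app.route("/wiki/<page>/history")
      def history(page):
          # Stand-in for an expensive endpoint like a full edit history.
          return make_response(f"edit history of {page}")
      ```

      The CDN would still need its own rule to bypass cache when the session cookie is present; the headers above only tell it what it is allowed to cache.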