• MonkderVierte@lemmy.ml · 8 months ago

    Btw, how about limiting clicks per second/minute as a defence against distributed scraping? A user who clicks more than 3 links per second is not a person, and neither is one who does 50 in a minute. And if they then get blocked and switch to the next IP, the bandwidth they can occupy is still limited.
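
    A minimal sketch of what that could look like, assuming a single server keeping per-IP request timestamps in memory; the 3-per-second and 50-per-minute thresholds are the ones from this comment, while all names and structure are illustrative:

    ```python
    import time
    from collections import defaultdict, deque

    PER_SECOND = 3   # more than this within 1 s -> not a person (per the comment)
    PER_MINUTE = 50  # more than this within 60 s -> block

    hits: dict[str, deque] = defaultdict(deque)  # ip -> recent request timestamps

    def allow(ip: str) -> bool:
        now = time.monotonic()
        q = hits[ip]
        q.append(now)
        # Forget anything older than the 60-second window.
        while q and now - q[0] > 60:
            q.popleft()
        in_last_second = sum(1 for t in q if now - t <= 1.0)
        return in_last_second <= PER_SECOND and len(q) <= PER_MINUTE
    ```

    In practice this kind of limit usually lives at the reverse proxy (e.g. nginx's limit_req) rather than in application code, but the counting idea is the same.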

    • letsgo@lemm.ee · 8 months ago

      I click links frequently and I’m not a web crawler. Example: get search results, open several likely-looking possibilities (it only takes a few seconds), then look through each one for a reasonable understanding of the subject that isn’t limited to one person’s bias and/or mistakes. It’s not just search results; I do this on Lemmy too, and when I’m shopping.

      • MonkderVierte@lemmy.ml · 8 months ago

        Ok, same, make it 5 or 10. Since I use Tree Style Tabs and Auto Tab Discard, I do get a temporary block in some webshops if I load (not just open) too many tabs in too short a time. Probably a CDN thing.

          • Opisek@lemmy.world · 8 months ago

          Would you mind explaining your workflow with these tree style tabs? I am having a hard time picturing how they are used in practice and what benefits they bring.

      • MonkderVierte@lemmy.ml · 8 months ago

        Ah, one request, then the next IP doing one, and so on, rotating? I mean, they don’t have unlimited addresses. Is there no way to group them together into an observable group and set quotas on that? I mean, for the purpose of defending against AI DDoS, not just for hurting them.
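
        If “group” means something like bucketing addresses by the block they come from, a rough sketch could look like the following; the /24 and /48 prefixes and the quota number are assumptions for illustration, not what any particular CDN actually does:

        ```python
        import ipaddress
        from collections import Counter

        SUBNET_QUOTA = 100  # requests per window per group; made-up number

        counts: Counter = Counter()

        def group_key(ip: str) -> str:
            # Collapse an address into its surrounding block (/24 for IPv4,
            # /48 for IPv6) so rotating IPs inside one allocation share a quota.
            addr = ipaddress.ip_address(ip)
            prefix = 24 if addr.version == 4 else 48
            return str(ipaddress.ip_network(f"{ip}/{prefix}", strict=False))

        def allow(ip: str) -> bool:
            key = group_key(ip)
            counts[key] += 1
            return counts[key] <= SUBNET_QUOTA
        ```

        Grouping by ASN (whoever owns the block) would catch wider rotation, but that needs an external lookup such as a MaxMind ASN database.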