• 0 Posts
  • 4 Comments
Joined 2 years ago
Cake day: April 30th, 2024


  • I am skeptical about the level of protection Anubis really provides.

    In the end it’s an automated test, meaning any machine could solve it.

    Most “attackers” won’t bother solving it because they don’t really care, but if they wanted to, they could. It’s a sort of protection by obscurity.

    The more Anubis is used, the more we see attacks that actually come equipped with a way to solve the challenges. That’s when Anubis raises the difficulty and the battle begins: how far can Anubis raise the challenge while normal users can still browse, versus how much cost the attacker is willing to eat.

    Given that these attackers tend to have high budgets, I’m not that certain about its actual ability to stop a targeted DDoS.

    As for crawling for big data, I think it does nothing here. Companies willing to scrape massive amounts of data, for AI training or other purposes, have massive budgets, and the electricity cost of solving the JavaScript challenges is nothing in comparison. They also don’t need to deny the service, so they could spread the scrape out over time to keep the challenge difficulty low, reducing the cost even more.

    Once again, I believe the positive results we currently see in practice come only from the fact that most scrapers and DDoS attackers attack blindly and don’t equip themselves for Anubis. Protection by obscurity. But I don’t think a well-equipped attacker would have much trouble getting past it, especially for scraping or other kinds of bot attacks that can tolerate being slowed down.
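The point that “any machine could easily solve it” can be made concrete. Anubis-style challenges are SHA-256 proof-of-work puzzles, and a headless script can grind through one just like a browser. Below is a minimal sketch of such a solver; the exact challenge format (string concatenation, leading zero hex digits) is a simplified assumption for illustration, not Anubis’s actual wire protocol.

```python
import hashlib
import itertools

def solve_challenge(challenge: str, difficulty: int) -> int:
    """Find a nonce so that sha256(challenge + nonce) starts with
    `difficulty` zero hex digits -- the kind of proof-of-work a
    headless client can grind through as easily as a browser can."""
    target = "0" * difficulty
    for nonce in itertools.count():
        digest = hashlib.sha256(f"{challenge}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce

# Each extra zero digit multiplies the expected work by 16, but at
# difficulty 4 this is ~65k hashes -- well under a second on one core.
nonce = solve_challenge("example-challenge", 4)
digest = hashlib.sha256(f"example-challenge{nonce}".encode()).hexdigest()
print(nonce, digest[:8])
```

The only lever the defender has is the difficulty parameter, and raising it hurts legitimate browsers on slow phones before it hurts a data center.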



  • Both have different purposes.

    The Anubis challenge can be solved easily and cheaply by any JavaScript engine. It only becomes expensive at a massive number of requests.

    If, for instance, you wanted to register a few thousand email accounts on a forum, Anubis is not going to stop anyone.

    In fact, I’m skeptical about it really having an impact, since even when the challenge goes up in difficulty, it’s not that expensive compared with all the other costs involved in these kinds of attacks or massive scrapes.

    My suspicion is that most websites using Anubis see a positive impact because most crawlers and probers don’t take Anubis into account, so they don’t even bring a way to solve the challenge and go straight into the “rejected by Anubis” bucket. But I suppose any targeted attack would pass easily, either by attacking slowly so the difficulty doesn’t rise much, or by just eating the cost. Imagine an AI company that uses nuclear plants to power its training: the cost of solving a few million JavaScript challenges is nothing in comparison.

    As DDoS mitigation it helps, but once again it’s just a matter of the attacker eating the cost. And the attack will still deny some service: as the challenge difficulty goes up, new legitimate users also have to solve harder challenges.
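To put “eating the cost” in numbers, here is a back-of-envelope estimate for a scraper that simply solves every challenge. Every figure below (difficulty, hash rate, power draw, electricity price, page count) is an illustrative assumption, not a measurement of Anubis or of any real deployment.

```python
# Back-of-envelope cost for a scraper that just eats the challenge.
# All numbers below are illustrative assumptions, not measurements.
pages           = 10_000_000         # pages to scrape
difficulty      = 5                  # leading zero hex digits required
hashes_per_page = 16 ** difficulty   # expected SHA-256 attempts per challenge
hash_rate       = 5_000_000          # hashes/sec on one modest CPU core
watts           = 15                 # power draw of that core
usd_per_kwh     = 0.15               # electricity price

cpu_seconds = pages * hashes_per_page / hash_rate
energy_kwh  = cpu_seconds * watts / 3600 / 1000
cost_usd    = energy_kwh * usd_per_kwh

print(f"{cpu_seconds / 3600:.0f} CPU-hours, {energy_kwh:.1f} kWh, ${cost_usd:.2f}")
```

Under these assumptions the whole ten-million-page scrape costs on the order of a dollar or two of electricity, which is the point of the comment: for an operation with an AI-training budget, the challenge is noise.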