
How to protect API endpoints from unauthorized usage by bots?

What if the API endpoints are meant to be used by any visitor who lands on a CSR frontend, without prior registration?

So far, my only idea is to move from pure CSR React to something with at least partial SSR, in Node.js, Django, or any other backend framework. That way I would be able to restrict usage of some API endpoints to a specific allowed server IP.
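For context, a minimal sketch of that IP restriction as Django middleware; the allowed IP and the protected prefix are made-up placeholders, and REMOTE_ADDR is only trustworthy if no proxy rewrites it:

    # ip_allowlist.py -- hypothetical middleware restricting some
    # endpoints to requests coming from an allowed server IP.
    from django.http import HttpResponseForbidden

    ALLOWED_IPS = {"10.0.0.5"}            # assumed: the SSR server's IP
    PROTECTED_PREFIXES = ("/api/demo/",)  # assumed: endpoints to protect

    class IPAllowlistMiddleware:
        def __init__(self, get_response):
            self.get_response = get_response

        def __call__(self, request):
            if request.path.startswith(PROTECTED_PREFIXES):
                # REMOTE_ADDR holds the direct peer's IP address.
                client_ip = request.META.get("REMOTE_ADDR")
                if client_ip not in ALLOWED_IPS:
                    return HttpResponseForbidden("Server-to-server only")
            return self.get_response(request)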

Next.js also allows mixing both approaches dynamically.

As an alternative, I have a vague idea of inventing some scheme with temporarily issued tokens... but so far I can break every scheme I come up with really easily.
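For illustration, most temporary-token schemes boil down to signing a timestamp. A bare-bones standard-library sketch (the secret and lifetime are assumptions), which shares exactly the weakness above: a bot can fetch a token the same way the real client does:

    # Hypothetical short-lived token: HMAC-signed timestamp, stdlib only.
    import base64, hashlib, hmac, time

    SECRET = b"change-me"  # assumed server-side secret
    MAX_AGE = 300          # assumed token lifetime in seconds

    def issue_token() -> str:
        ts = str(int(time.time())).encode()
        sig = hmac.new(SECRET, ts, hashlib.sha256).digest()
        return base64.urlsafe_b64encode(ts + b"." + sig).decode()

    def verify_token(token: str) -> bool:
        try:
            ts, sig = base64.urlsafe_b64decode(token).split(b".", 1)
            issued = int(ts)
        except (ValueError, TypeError):
            return False
        expected = hmac.new(SECRET, ts, hashlib.sha256).digest()
        fresh = int(time.time()) - issued <= MAX_AGE
        return hmac.compare_digest(sig, expected) and fresh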

Any options? If SSR is my only choice, what would you recommend as the best setup, given that Django is already chosen and the front-end framework is not fully decided yet?

My craziest idea is to put a CSR frontend framework literally inside my Django backend and do the initial SSR from it. The only thing missing is... my lack of React skills, but perhaps I have enough time to get the hang of it.

At least an SSRed frontend can be protected with captcha-based measures.
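Whatever the rendering approach, the captcha response still has to be verified server-side, e.g. against reCAPTCHA's documented siteverify endpoint. A sketch, with the secret key as a placeholder:

    # Sketch: server-side verification of a reCAPTCHA response token.
    import requests

    RECAPTCHA_SECRET = "your-secret-key"  # placeholder

    def captcha_passed(client_token: str) -> bool:
        resp = requests.post(
            "https://www.google.com/recaptcha/api/siteverify",
            data={"secret": RECAPTCHA_SECRET, "response": client_token},
            timeout=5,
        )
        return resp.json().get("success", False)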

Comments
  • 0
    As an alternative: adding Node.js to React, or choosing Next.js.
  • 1
    You can't really, just as you can't prevent scraping of your website.
    Either require authentication (where account creation validates that the user is not a bot) or use a risk-based approach (e.g. invisible reCAPTCHA; be aware of false positives if users have tracking protection enabled).

    Still, nothing can prevent an attacker who manually creates an account from calling your API.
    The (real) solution is to make sure it has no consequences if a bot calls your API (which you must do anyway).
    As you said, the last resort is to make it callable only from the server side. A friendly reminder: it will still be possible to gather information from your server, as the same data might very well be extracted from the server-side rendered HTML (and in case data is sent to your application: bots can fill out forms as well).
  • 0
    Maybe you can describe what your API does and why bots using it is a problem?
  • 0
    @sbiewald the point is, with server-side rendering I'll have only one entry point for users, which I will be able to defend with captchas. The API will be restricted to accept requests only from a specific allowed IP.

    I will then be expecting only real users.

    With a public API for a CSR-only frontend, I can't protect against bots. How could I, if our CSR client is technically a bot too?
  • 0
    @sbiewald mm, yeah, sure.
    I have several APIs (three) to defend.

    An auth API, and two APIs (which perform information-search operations) that are meant to be used under different sets of rules for users with a free subscription and for users with a paid subscription.

    JWTs in httpOnly cookies solve the problem of authenticating users for paid features (roughly as in the sketch at the end of this comment).

    The problem is with the features offered as demo access to all unregistered users. How do I ensure they are used by real humans only and prevent them from being consumed by bots?

    So far I see SSR as the only choice.
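    For reference, a minimal sketch of the JWT-in-httpOnly-cookie part with Django and the PyJWT package; the secret and claims are placeholders, not our real setup:

        # Sketch: issuing a JWT in an httpOnly cookie from a Django view.
        import jwt
        from django.http import JsonResponse

        JWT_SECRET = "change-me"  # assumed server-side secret

        def login_view(request):
            # Assumes the user has already been authenticated.
            token = jwt.encode({"sub": str(request.user.pk), "paid": True},
                               JWT_SECRET, algorithm="HS256")
            response = JsonResponse({"ok": True})
            response.set_cookie(
                "access_token", token,
                httponly=True,     # not readable from JavaScript
                secure=True,       # sent over HTTPS only
                samesite="Strict",
            )
            return response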
  • 2
    Add a robots.txt and I'm sure they'll honor it 🤭
  • 1
    @ScriptCoded haha, nice joke.

    It will not prevent people from manually sniffing the site, finding the API endpoints, and hooking their bots up to them.
  • 0
    Seriously though. Unless your frontend contains sensitive data (which it really shouldn't) you shouldn't have any problems having it publicly available. As long as your API endpoints are protected you should be fine. I get that that's what your original question is about, but I don't think you should have to worry about SSR.
  • 0
    @sbiewald the problem with demo access is that it creates load and consumes resources that aren't free for us. The less it is abused, the better.

    If it gets massively used by bots... it will burn through our money quicker.

    In addition, I could perhaps pull the money-hurting features out of unregistered access, but that would work against the business side of things.
  • 0
    @ScriptCoded ...the problem is with protecting the free demo features for unregistered users.

    If they are reachable from a CSR client, the only way I see to reduce abuse is adding throttling limits on the number of requests from one IP (roughly the configuration sketched below).

    Actually, that could be an option... different behaviour before and after the throttling limit is hit.
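    A sketch of what that per-IP throttling could look like with Django REST framework's built-in AnonRateThrottle; the rate is a made-up number:

        # settings.py -- applies to DRF views only; plain Django views
        # are not throttled by this configuration.
        REST_FRAMEWORK = {
            "DEFAULT_THROTTLE_CLASSES": [
                "rest_framework.throttling.AnonRateThrottle",
            ],
            "DEFAULT_THROTTLE_RATES": {
                "anon": "20/hour",  # unauthenticated (IP-keyed) requests
            },
        }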
  • 0
    @sbiewald bots can fill HTML forms... yeah.
    Yesterday I finished reading a book about web scraping.
    I now know quite a few ways to break my own site.
    I just wish to lessen the damage somehow.
  • 3
    Just add a secret header. Most bots are doing general spamming and are not specifically targeting you.
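    For illustration, the check could be as small as a view decorator like this; the header name and value are arbitrary placeholders:

        # Sketch of the "secret header" idea as a Django view decorator.
        from functools import wraps
        from django.http import HttpResponseForbidden

        def require_app_header(view):
            @wraps(view)
            def wrapper(request, *args, **kwargs):
                # Anyone reading the frontend bundle can find this value,
                # but it filters out generic, untargeted bots.
                if request.headers.get("X-App-Token") != "not-a-secret":
                    return HttpResponseForbidden()
                return view(request, *args, **kwargs)
            return wrapper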
  • 2
    In your session (maybe in the JWT), store whether the user has completed the captcha. If they haven't, fail all API requests and require a captcha; after that, update the JWT and allow API access.
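    A sketch of that gate in a Django view with PyJWT; the claim name and cookie name are made up:

        # Sketch: demo endpoint that requires a captcha-backed JWT claim.
        import jwt
        from django.http import JsonResponse

        JWT_SECRET = "change-me"  # assumed server-side secret

        def demo_search(request):
            token = request.COOKIES.get("access_token", "")
            try:
                claims = jwt.decode(token, JWT_SECRET, algorithms=["HS256"])
            except jwt.InvalidTokenError:
                claims = {}
            if not claims.get("captcha_ok"):
                # Client must solve a captcha; on success the server
                # re-issues the JWT with captcha_ok=True.
                return JsonResponse({"error": "captcha required"}, status=403)
            return JsonResponse({"results": []})  # placeholder payload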
  • 2
    Rate limits + protected routes + input validation, and you should be fine.
  • 0
    @sbiewald's idea sounds like the best approach imo.