2
azuredivay
163d

How do y'all approach media endpoints?
Especially publicly accessible, user-uploaded media.

Rn I encrypt the path to the media file and expose that; the server decrypts it (getting back a relative file path), then fetches the file via File.Read and returns it as-is.

I put a cache header on it and it works fine. Roughly like the sketch below.
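A simplified sketch of the endpoint (not the real code; `PathCrypto`, the route, and the paths are made-up stand-ins):

```csharp
using System.IO;
using Microsoft.AspNetCore.Mvc;

[ApiController]
public class MediaController : ControllerBase
{
    private const string MediaRoot = "/var/app/media"; // made-up root folder

    [HttpGet("media/{token}")]
    public IActionResult GetMedia(string token)
    {
        // Deterministic decrypt: the token is the encrypted relative path,
        // so the same path always produces the same URL (browser-cacheable).
        string relativePath = PathCrypto.Decrypt(token); // hypothetical helper

        string fullPath = Path.Combine(MediaRoot, relativePath);
        if (!System.IO.File.Exists(fullPath))
            return NotFound();

        // The cache header mentioned above.
        Response.Headers["Cache-Control"] = "public, max-age=86400";

        // Read the file and return it as-is.
        byte[] bytes = System.IO.File.ReadAllBytes(fullPath);
        return File(bytes, "image/jpeg");
    }
}
```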

But something in the back of my mind makes me feel it isn't right.

Like, normal endpoints and file-read endpoints shouldn't live in the same backend, where they can affect each other.

But it's just a fun pet project, so I'm not paying for a 2nd bare-metal server to act as a CDN/media server -.-

Worst case I keep using it as-is, but I'd appreciate hearing other approaches.

Comments
  • 2
    Are the links permanent? If so, you may want a filter that decrypts them.

    You could also skip the encryption and instead make the links random tokens stored in a DB (see the sketch at the end of this comment).

    If they're not permanent, you could make the links temporary/time-bound.

    If the links are part of a webpage, you could also send the whole image inline as base64 (a data URI) instead of a link.

    Depends on your use case really
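    For the random-link option, a minimal sketch; `MediaLinks` is a made-up stand-in for whatever DB table you use:

    ```csharp
    using System;
    using System.Security.Cryptography;

    public static class MediaTokens
    {
        // Create an opaque, URL-safe token for a file and persist the mapping.
        // The token is pure randomness, not derived from the path, so nothing
        // about the folder structure leaks and links can't be enumerated.
        public static string CreateToken(string relativePath)
        {
            byte[] bytes = RandomNumberGenerator.GetBytes(16);
            string token = Convert.ToBase64String(bytes)
                .TrimEnd('=').Replace('+', '-').Replace('/', '_');

            MediaLinks.Save(token, relativePath); // hypothetical DB call
            return token;
        }

        // Look the path back up when the link is requested.
        public static string? Resolve(string token) =>
            MediaLinks.Find(token); // hypothetical DB call
    }
    ```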
  • 1
    @gymmerDeveloper the encrypt/decrypt part is OK;
    it's a controller-level static function used within one controller, and its role ends there.

    It's deterministic, so same path = same ciphertext, hence cacheable at the browser level.
    But that isn't what I'm worried about; I'm worried about the actual file read-and-response.

    Coz say a comment thread can have 20-30 comments, potentially all with images,

    all by unique users, so their user/profile images too.

    i.e. for this one page load, the server will read and serve 50-60 images?
    That seems... idk, like I'm doing it wrong.

    This assumes the cache is disabled at the browser level btw, so it's the worst-case scenario.
  • 2
    @azuredivay

    What's the point of encrypting the path? I get hiding the structure if they're regular files on a server, but if you're using any kind of cloud storage, that seems... unnecessary.

    If the issue is unauthorized access, then save whatever metadata you need and implement auth in your resource server, via JWT or any other scheme.

    If the issue is caching, you can't really rely on browsers, so just put a CDN in front of your resource server (or a local caching proxy; sketch below).
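    If a hosted CDN is off the table, even a caching reverse proxy on the same box gets you most of the way. A hypothetical nginx sketch (ports, paths, and sizes are made up):

    ```nginx
    # Inside the http {} block: cached files on disk, key metadata in shared memory.
    proxy_cache_path /var/cache/nginx/media levels=1:2 keys_zone=media:10m
                     max_size=5g inactive=7d use_temp_path=off;

    server {
        listen 80;

        location /media/ {
            proxy_pass http://127.0.0.1:5000;  # the app/resource server
            proxy_cache media;
            proxy_cache_valid 200 7d;          # repeats served by nginx, not the app
            add_header X-Cache-Status $upstream_cache_status;
        }

        location / {
            proxy_pass http://127.0.0.1:5000;  # everything else, uncached
        }
    }
    ```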
  • 1
    Serving static files without permission levels is a trivial task. The only thing you need to do past this is implement rate limiting (example below).
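    For instance, with ASP.NET Core's built-in rate limiter (.NET 7+) you can throttle just the media endpoint per client IP; the numbers here are made up:

    ```csharp
    using System;
    using System.Threading.RateLimiting;
    using Microsoft.AspNetCore.RateLimiting;

    var builder = WebApplication.CreateBuilder(args);

    builder.Services.AddRateLimiter(options =>
    {
        options.RejectionStatusCode = StatusCodes.Status429TooManyRequests;

        // At most 100 media requests per 10 seconds per client IP.
        options.AddPolicy("media", httpContext =>
            RateLimitPartition.GetFixedWindowLimiter(
                httpContext.Connection.RemoteIpAddress?.ToString() ?? "unknown",
                _ => new FixedWindowRateLimiterOptions
                {
                    PermitLimit = 100,
                    Window = TimeSpan.FromSeconds(10)
                }));
    });

    var app = builder.Build();
    app.UseRateLimiter();

    // Only the media route is throttled; normal API endpoints are untouched.
    app.MapGet("/media/{token}", (string token) => Results.Ok())
       .RequireRateLimiting("media");

    app.Run();
    ```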
  • 0
    @CoreFusionX ofc I'm not using any cloud storage (ew 🤢),
    it's a bare-metal server in a datacentre.

    A specific folder structure holds the user-uploaded images, which are always publicly available (think images in comments), so no JWT/auth is needed; they're meant to be publicly visible.

    But I also don't want a /user/commentID/image.jpg kind of path, where people can write scripts to scrape/download images (I have a 30 TB/month bandwidth restriction).

    Anyway, the path isn't the problem; the fact that I'm sending potentially 50-60 images for one page request is.

    That might hog disk I/O etc., and slowing down other non-media API endpoints that need DB reads/writes is bad.

    Feels like there's a better method and I'm not seeing it.
  • 0
    @AlgoRythm hmm, figuring out how much concurrent data/image access causes other things to slow down,

    and rate-limiting at 5% below that threshold, seems doable.
    For non-logged-in users in the beginning.
  • 0
    @azuredivay

    Well, if you are unwilling to use a CDN (which fits your use case exactly)...

    Guess you could use an explicit in-memory cache, but if images can be big, that can eat your RAM (there's a capped-cache sketch at the end of this comment).

    Could also embed the images as blobs in the posts themselves, so you only have to read the database and that's it. (That also solves the whole path thing.)

    Besides that, there's not much you can do. The way you avoid disk overuse is by distributing across many disks.

    As for preventing abuse without auth, aside from rate limiting, just use some kind of server challenge, like CSRF tokens do, in addition to CORS, and you're golden.
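    A minimal sketch of that in-memory cache with a hard size cap, using .NET's MemoryCache (the cap and expiry are made-up numbers):

    ```csharp
    using System;
    using System.IO;
    using Microsoft.Extensions.Caching.Memory;

    // With SizeLimit set, every entry must declare a size, and the cache
    // evicts entries instead of growing without bound.
    var cache = new MemoryCache(new MemoryCacheOptions
    {
        SizeLimit = 512L * 1024 * 1024 // treat "size" as bytes: ~512 MB cap
    });

    byte[] GetImage(string fullPath) =>
        cache.GetOrCreate(fullPath, entry =>
        {
            byte[] bytes = File.ReadAllBytes(fullPath); // only on cache miss
            entry.SetSize(bytes.Length);                // counts against SizeLimit
            entry.SetSlidingExpiration(TimeSpan.FromMinutes(30));
            return bytes;
        })!;
    ```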
  • 1
    If you are willing to go nuclear, there's also another option...

    Reverse-proxy your web server into *two* actual httpd/vhosts: one for static files, the other for CGI.

    Hard-cap them via config or system tools so that the CGI side always gets its share (rough sketch below).
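    A hypothetical nginx version of that split (ports and paths are made up; the hard caps themselves live at the OS/service level):

    ```nginx
    upstream app   { server 127.0.0.1:5000; }  # dynamic/CGI side
    upstream media { server 127.0.0.1:5001; }  # static-file side, capped separately

    server {
        listen 80;

        # Media requests go to the capped static server...
        location /media/ {
            proxy_pass http://media;
        }

        # ...everything else to the app, so a media stampede can't starve it.
        location / {
            proxy_pass http://app;
        }
    }
    ```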
  • 0
    @CoreFusionX oh hohhh :o that's actually the best idea.
    Make another backend whose only role is to serve media for the main backend, and since it'll be a Linux service, I can limit its resource usage (never done it before, but I'm guessing it's possible; sketch at the end of this comment).

    Keep this media service/backend closed off from the public, and voila.
    If and when the media service starts hitting its resource limits, add @AlgoRythm's rate limiting on the publicly exposed service, and that covers all the issues.
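    For the resource limits, systemd can apparently do this per service with systemd.resource-control directives; a made-up example unit:

    ```ini
    [Unit]
    Description=Internal media file server

    [Service]
    # Hypothetical binary path.
    ExecStart=/var/app/media-service
    # At most ~30% of one CPU.
    CPUQuota=30%
    # Hard RAM ceiling for the service.
    MemoryMax=512M
    # Lower disk I/O priority than the default of 100 (cgroup v2).
    IOWeight=50

    [Install]
    WantedBy=multi-user.target
    ```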