Comments
What graphics card do you have? I wanted to try running my own offline GPT as well.
Wait?? They are deploying it in Termux? Damn haha
Well, you could buy a lot of "decommissioned" enterprise servers from specialized auction houses (no, not eBay) and assemble your own little datacenter for running stuff that needs a lot of resources!
@max19931
HPs, old machines, but I wanted to make sure they would run on low power consumption. These run on dual 480 W PSUs. Dells usually have 750 W PSUs, which is overkill, certainly with small form factor SATA or SSD disks.
But this is just a small setup; I'm building a larger one in my basement: 2*48U racks with proper UPSes, power conduits, batteries, dedicated network hardware, and storage units full of high-capacity disks, with direct fiber connections to the compute nodes. Building on OpenStack software (rough code sketch below).
Building on dreams also goes step by step. Don't rush it. You have a lifetime to live, after all.
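A minimal sketch of how compute on such an OpenStack cloud could be driven from Python with the openstacksdk client; the cloud name "homelab" and the image/flavor/network names are placeholders, not anything from this build:

import openstack

# Credentials come from a clouds.yaml entry; "homelab" is a placeholder name.
conn = openstack.connect(cloud="homelab")

# Look up a boot image, a flavor, and a network (all placeholder names).
image = conn.compute.find_image("ubuntu-22.04")
flavor = conn.compute.find_flavor("m1.medium")
network = conn.network.find_network("internal")

# Boot a VM on the compute nodes and wait until it reports ACTIVE.
server = conn.compute.create_server(
    name="llm-node",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.status)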
Ah, and for anyone in Europe looking for affordable, decent refurbished hardware, I'd recommend https://troostwijkauctions.com
@NeatNerdPrime Wow, they sell the LWE 130 Electric Pallet Truck that I wanted for years.
Related Rants
I had my own GPT running on a 16-core server with 16 GB RAM. Performance: one word per two minutes 😂 I tried several projects but this one works the best: https://github.com/ggerganov/...
Model: https://huggingface.co/Sosaka/... (rough sketch of running it below)
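Assuming the truncated GitHub link is llama.cpp and the Hugging Face link is a 4-bit quantized model file, a minimal sketch of running it CPU-only via the llama-cpp-python bindings (the model filename and the prompt are placeholders):

from llama_cpp import Llama

# Load a 4-bit quantized model file (placeholder path); n_threads matches
# the 16 CPU cores mentioned in the rant.
llm = Llama(model_path="./model-q4.bin", n_ctx=512, n_threads=16)

# Generate a short completion; on CPU-only hardware with little RAM this
# can be extremely slow, hence the "one word per two minutes".
output = llm("Q: Why run a GPT offline? A:", max_tokens=32, stop=["Q:"])
print(output["choices"][0]["text"])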
random
gpt
diy
performance