14

I had my own GPT running on a 16-core server with 16 GB RAM. Performance: one word per two minutes 😂 I tried several projects and this one works the best: https://github.com/ggerganov/...
Model: https://huggingface.co/Sosaka/...
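
A rough sketch of what that kind of CPU-only setup can look like, assuming the llama-cpp-python bindings for ggerganov/llama.cpp; the model path, thread count, and prompt below are placeholders, not the exact values from the post, and recent llama.cpp builds expect GGUF model files rather than the older GGML .bin:

```python
# Minimal CPU-only local inference sketch using llama-cpp-python (pip install llama-cpp-python).
# All paths/parameters are illustrative assumptions.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/alpaca-7b-q4.gguf",  # placeholder: point at your downloaded weights
    n_threads=16,                             # roughly match the server's core count
)

result = llm(
    "Q: Why is CPU-only inference so slow? A:",
    max_tokens=64,
    stop=["Q:"],
)
print(result["choices"][0]["text"])
```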

Comments
  • 4
What graphics card do you have? I wanted to try running my own offline GPT as well.

Wait?? They are deploying it in Termux? Damn haha
  • 2
    @joewilliams007 won't work. Wrong processor
  • 3
@joewilliams007 no GPU
  • 3
@joewilliams007 it seems it runs fine on a Mac M1.
  • 1
@retoor on the GitHub there's a vid of it running on a Pixel phone :0
  • 3
Well, you could buy a lot of "decommissioned" enterprise servers from specialized auction houses (no, not eBay) and assemble your own little datacenter for running stuff that needs a lot of resources!
  • 2
    @NeatNerdPrime that'd be a dream.

16 GB of RAM seems seriously low though.
  • 2
    Slowly but surely....
  • 2
    @max19931
HPs, old machines, but I wanted to make sure they would run at low power consumption. These run on 2×480 W PSUs. Dells usually have 750 W PSUs, which, certainly with small-form-factor SATA or SSD disks, is overkill.

But this is just a small setup. I'm building a larger one in my basement: 2×48U racks with proper UPSes, power conduits, batteries, dedicated network hardware, and storage units full of high-capacity disks, with direct fiber connections to the compute nodes. Building on OpenStack software.

Building on dreams also goes step by step. Don't rush it. You have a lifetime to live, after all.
  • 1
Ah, and for anyone in Europe looking for affordable, decent refurbished hardware, I'd recommend https://troostwijkauctions.com
  • 1
@NeatNerdPrime Wow, they sell the LWE 130 Electric Pallet Truck that I've wanted for years.