
Tried to run ollama with a small model on a laptop that's about 10 years old and use it inside VS Code. Weak CPU, not to mention the GPU. I didn't really expect it to work, and as expected it didn't, but I was still a bit disappointed. The machine was crying for help.
Are there any laptops powerful enough?

Comments
  • 4
    Maybe... A laptop that isn't 10 years old..?
  • 1
I want to try it on a Raspberry Pi and see how it goes
  • 3
I'm running 7B models on a 3-year-old laptop with an RTX 3050 Ti (4 GB) at good speeds, even though the models don't fit entirely in VRAM.

    I can do a 13B at barely usable speeds too.

I can also run the 7B models fine on the laptop's Ryzen 5; they're usable on CPU alone.

Also, a Mac M2 is perfectly capable of running 13B models very fast.

You should be able to run 3B models on even older hardware. Models like Starcoder for a local copilot can run on almost anything.
  • 2
    @cafecortado I did run a small model on a Radxa RockPi 4 SE, similar to Pi 4 with some advantages.

I ran a 7B model on it; excruciatingly slow, and it started to overheat easily, but it did run.

Even 3B models were a bit too slow for practical use, but they do run.