
I'm finetuning Llama 3.1 8B locally on a 32 GB RTX 5090. It took weeks to learn enough to know where to start, hours to get my environment sane, and it's going to take 5.5 hours to finish training.
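
For context on why this fits on a single card at all: full fine-tuning of an 8B model with Adam in mixed precision needs roughly 16 bytes per parameter for weights, gradients, and optimizer state, i.e. on the order of 128 GB, far beyond 32 GB. The usual workaround on consumer hardware is a parameter-efficient method like QLoRA, which quantizes the frozen base model to 4-bit and trains only small adapter matrices. I don't know which method was used here, but the sketch below shows the common Hugging Face transformers/peft setup; the hyperparameters (r, alpha, target modules) are illustrative assumptions, not the author's settings.

```python
# A minimal QLoRA setup sketch, assuming the Hugging Face stack
# (transformers, peft, bitsandbytes). Hyperparameters are illustrative.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

model_id = "meta-llama/Llama-3.1-8B"  # gated repo; requires HF access approval

# 4-bit NF4 quantization keeps the frozen 8B base model within a 32 GB card
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)

# LoRA adapters: only small low-rank matrices on the attention
# projections are trained; the quantized base weights stay frozen.
lora_config = LoraConfig(
    r=16,                       # adapter rank (assumed)
    lora_alpha=32,              # scaling factor (assumed)
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of total params

# From here, a trainer (e.g. trl's SFTTrainer) would run the actual fine-tune.
```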

I started out trying to build my own model from scratch, but my lack of both data and acres of server farms led me down the finetuning path.

I honestly have no idea whether any of it will be worth the time or the CONSIDERABLE financial investment I made building my computer, but I refuse to stay ignorant.
