If you’re a g*mer or do any work with AI (preferably open-source), pretty much anything relatively recent with a dedicated graphics card will do, and buying used drives the price down while still getting you solid hardware.
I don’t work with AI and I’m pretty much just a casual console gamer
A ThinkPad is probably your best option then; I’ve heard a lot of good things over the years
who tf is doing local “work with AI” lol
On a laptop, no less
even the huge models that you can’t run locally are kinda mid and hallucinate a lot, why would you run some tiny local thing unless you’re generating banned material
I use a local LLM to rewrite the tone of emails for my job so I’m not dripping with hatred and seething contempt
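In case it helps anyone, a minimal sketch of that kind of setup, assuming an OpenAI-compatible local server (Ollama, llama.cpp server, etc.) running on its default port; the URL, model name, and prompt are just placeholders, swap in whatever you actually run:

```python
# Minimal sketch: rewrite an email's tone via a local OpenAI-compatible server.
# Assumes something like Ollama is listening on localhost:11434; model name is a placeholder.
import requests

SYSTEM_PROMPT = (
    "Rewrite the following email so it sounds calm, friendly, and professional. "
    "Keep the meaning and all specifics; only change the tone."
)

def polish_email(draft: str, model: str = "llama3.1:8b") -> str:
    resp = requests.post(
        "http://localhost:11434/v1/chat/completions",  # Ollama's default port (assumption)
        json={
            "model": model,
            "messages": [
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": draft},
            ],
            "temperature": 0.3,
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(polish_email("Per my last email, which you clearly did not read..."))
```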
this, so much this
you ever seen Sorry To Bother You?
I have an LLM give my emails White Voice and it’s helped a lot as someone who can’t stand being “professional”
Please share some links, I need this!
hahaha yes, this is exactly what I’m thinking when I do it
I use a PyTorch accelerator on a laptop.
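For anyone wondering what that means in practice, “accelerator” here is just whatever GPU the laptop has; a minimal sketch of how PyTorch picks one up (standard torch device names, nothing specific to any one machine):

```python
# Minimal sketch: pick whatever accelerator the laptop actually has
# (NVIDIA CUDA GPU, Apple Silicon MPS, or plain CPU) before running a model.
import torch

def pick_device() -> torch.device:
    if torch.cuda.is_available():          # NVIDIA laptop GPU
        return torch.device("cuda")
    if torch.backends.mps.is_available():  # Apple Silicon GPU
        return torch.device("mps")
    return torch.device("cpu")             # fallback

device = pick_device()
x = torch.randn(4, 4, device=device)       # tiny smoke test on the chosen device
print(device, x.sum().item())
```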