
DHH coming out with guns blazing per usual.
Thus, I find it spurious to hear developers evaluate their next computer on the prospect of how well it's capable of running local models. Because they all suck! Whether one sucks a little less than the other doesn't really matter. And as soon as you discover this, you'll be back to using the rented models for the vast majority of the work you're doing.
Not particularly helpful tbh. They suck how? Even with considerable local resources they are too slow? Suck in that their output is poor quality compared to the big players?
The first I can attest to, though I haven't had the chance to try with anything larger than 32GB of local memory, which is probably not adequate.
If the latter, that may be true but his solution ignores why someone would want to run a local LLM in the first place, doesn’t it?
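For anyone who wants to put a number on "too slow," here's a minimal sketch for measuring local generation speed, assuming an Ollama server is running on its default port and a model has already been pulled (the model name here is just an example, not a recommendation):

```python
# Rough tokens/sec check against a local Ollama server.
# Assumes Ollama is installed, running on the default port, and that the
# example model below has been pulled with `ollama pull`.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "qwen2.5-coder:32b",   # example model; swap for whatever fits in local memory
        "prompt": "Write a Ruby method that slugifies a string.",
        "stream": False,
    },
    timeout=600,
)
data = resp.json()

# Ollama reports eval_count (generated tokens) and eval_duration (nanoseconds),
# so tokens/sec is eval_count / eval_duration * 1e9.
tps = data["eval_count"] / data["eval_duration"] * 1e9
print(f"{tps:.1f} tokens/sec")
print(data["response"][:500])
```

Comparing that number (and the output quality) against a hosted model is a cheap way to decide whether local hardware is worth the spend before buying a bigger machine.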
reply
I've avoided trying to run local LLMs for this reason. Seems like a huge upfront cost and a huge headache for subpar outcomes.
reply
151 sats \ 1 reply \ @kepford 6h
Yeah... he has a point. He usually does. That's why he pisses so many people off. There are plenty of jerks with opinions.
reply
21 sats \ 0 replies \ @optimism 5h
There will be some in the group he craps on who will be making great progress using open weight LLMs. Luckily, since people don't care what Scam Altman says, they probably won't be discouraged by DHH's lack of imagination either.
reply
100 sats \ 1 reply \ @optimism 6h
Since when do rails devs need LLMs?
reply
Didn’t you know? LLMs can solve issues you didn’t know you had!
reply