
DHH coming out with guns blazing per usual.
Thus, I find it spurious to hear developers evaluate their next computer on the prospect of how well it's capable of running local models. Because they all suck! Whether one sucks a little less than the other doesn't really matter. And as soon as you discover this, you'll be back to using the rented models for the vast majority of the work you're doing.
172 sats \ 1 reply \ @kepford 25 Nov
Yeah... he has a point. He usually does. That's why he pisses so many people off. There are plenty of jerks with opinions.
reply
There will be some in the group he craps on who will be making great progress using open-weight LLMs. Luckily, since people don't care what Scam Altman says, they probably won't be discouraged by DHH's lack of imagination either.
reply
100 sats \ 1 reply \ @optimism 25 Nov
Since when do Rails devs need LLMs?
reply
Didn’t you know? LLMs can solve issues you didn’t know you had!
reply
Not particularly helpful tbh. They suck how? Even with considerable local resources they are too slow? Suck in that their output is poor quality compared to the big players?
The first I can attest to, although I have not had the chance to try with anything larger than 32GB locally, which is probably not adequate.
If the latter, that may be true but his solution ignores why someone would want to run a local LLM in the first place, doesn’t it?
reply
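On the "too slow" question: here is a minimal sketch of how you could measure that yourself with llama-cpp-python. The model file, prompt, and settings below are placeholders for illustration, not anything from DHH's post or this thread.

```python
# Rough sketch of timing local generation, assuming llama-cpp-python is
# installed and you have some quantized GGUF file (path is a placeholder).
import time
from llama_cpp import Llama

llm = Llama(
    model_path="models/qwen2.5-coder-14b-q4_k_m.gguf",  # placeholder path
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload as many layers as fit to the GPU / Metal
    verbose=False,
)

prompt = "Write a Ruby method that slugifies a blog post title."

start = time.perf_counter()
out = llm(prompt, max_tokens=256)
elapsed = time.perf_counter() - start

completion_tokens = out["usage"]["completion_tokens"]
print(out["choices"][0]["text"])
print(f"{completion_tokens} tokens in {elapsed:.1f}s "
      f"({completion_tokens / elapsed:.1f} tok/s)")
```

The tok/s figure is the "too slow" part; whether the generated text holds up against the big hosted models is the quality part, and no script settles that one.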
I've avoided trying to run local LLMs for this reason. Seems like a huge upfront cost and a huge headache for subpar outcomes.
reply
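For what it's worth, the upfront cost of just trying one is smaller than it sounds. A rough sketch, assuming Ollama is installed and serving locally and a small quantized model has already been pulled (the model name here is only an example, not a recommendation):

```python
# Rough sketch: calling a local model through Ollama's HTTP API
# (https://ollama.com). Assumes `ollama serve` is running and the model
# was pulled beforehand with e.g. `ollama pull qwen2.5-coder:7b`.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "qwen2.5-coder:7b",  # example model name
        "prompt": "Explain what a Rails concern is in two sentences.",
        "stream": False,              # one JSON response instead of a token stream
    },
    timeout=300,
)
resp.raise_for_status()
print(resp.json()["response"])
```

The trial itself is a model download and a local HTTP call; whether the output is good enough for real work is the part people actually disagree on.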
21 sats \ 0 replies \ @Ge 26 Nov
It's funny how, no matter the level, we always want the best of the best lol. Pretty fiat-minded individuals, I guess?
reply
I sense a sequel coming out:
"Mommy why is there a server cluster in our house"
reply