50 sats \ 0 replies \ @krispy_donkey 2h \ on: Local LLMs are how nerds now justify a big computer they don't need AI
Not particularly helpful tbh. They suck how? Even with considerable local resources they are too slow? Suck in that their output is poor quality compared to the big players?
The first I can attest to, although I haven't had the chance to try with anything greater than 32GB locally, which is probably not adequate.
If the latter, that may be true, but his solution ignores why someone would want to run a local LLM in the first place, doesn't it?