31 sats \ 1 reply \ @0xIlmari 4h \ on: Just fucking code. devs
Because there's an absolute bazinga of algorithms that are a waste of time to memorize, because in your career you'll maybe run into needing 3.
For those 3 times there are LLMs which serve two functions:
- glorified search engine which might even understand your predicament (you're NEVER the first person to encounter a problem) - "I have this data structure and need to do X. Is there an algorithm for that already?"
- coding intern - "Okay, write me an A* solver for this graph."
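For reference, the kind of thing the "coding intern" would be asked to produce: a minimal A* sketch over a plain adjacency-list graph. The graph shape, node names, and the heuristic are made up for illustration; a zero heuristic is admissible and degrades A* to Dijkstra.

```python
import heapq

def a_star(graph, start, goal, h):
    """A* search. graph is {node: [(neighbor, cost), ...]},
    h(node) is an admissible estimate of remaining cost to goal."""
    # Frontier entries: (f = g + h, g, node, path so far)
    frontier = [(h(start), 0, start, [start])]
    best_g = {start: 0}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g
        for neighbor, cost in graph.get(node, []):
            new_g = g + cost
            if new_g < best_g.get(neighbor, float("inf")):
                best_g[neighbor] = new_g
                heapq.heappush(
                    frontier,
                    (new_g + h(neighbor), new_g, neighbor, path + [neighbor]),
                )
    return None, float("inf")  # goal unreachable

# Toy graph: A->B->C->D is cheaper (3) than A->C->D (5).
graph = {"A": [("B", 1), ("C", 4)], "B": [("C", 1), ("D", 5)], "C": [("D", 1)]}
path, cost = a_star(graph, "A", "D", lambda n: 0)
```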
I'm paid six figures to be the supervisor:
- sanity check the intern by checking the code against a Wikipedia article
- better yet, write fucking unit tests around what the AI spit out
- integrate into the broader solution
- know what tasks can be relegated to AI and which can't
Those tasks, LLMs are NOWHERE CLOSE to being able to do.
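The "write unit tests around it" step can be as plain as pinning down a hand-checked answer, a trivial case, and a failure case. A minimal sketch, assuming the intern handed back a shortest-path routine (the `shortest_path_cost` function and its graph here are stand-ins, not anyone's real output; in practice you'd import the model's code instead of defining it inline):

```python
import heapq
import unittest

def shortest_path_cost(graph, start, goal):
    """Stand-in for AI-generated code under review: naive Dijkstra
    over {node: [(neighbor, cost), ...]}."""
    dist = {start: 0}
    pq = [(0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for neighbor, cost in graph.get(node, []):
            nd = d + cost
            if nd < dist.get(neighbor, float("inf")):
                dist[neighbor] = nd
                heapq.heappush(pq, (nd, neighbor))
    return float("inf")  # unreachable

class TestShortestPath(unittest.TestCase):
    GRAPH = {"A": [("B", 1), ("C", 4)], "B": [("C", 1)], "C": []}

    def test_known_answer(self):
        # Hand-checked against the graph: A->B->C costs 2, not 4.
        self.assertEqual(shortest_path_cost(self.GRAPH, "A", "C"), 2)

    def test_trivial_case(self):
        self.assertEqual(shortest_path_cost(self.GRAPH, "A", "A"), 0)

    def test_unreachable(self):
        self.assertEqual(shortest_path_cost(self.GRAPH, "C", "A"), float("inf"))
```

The point isn't test-suite elegance; it's that the known-answer case is verified by hand (or against the Wikipedia article), so the supervisor, not the model, is the ground truth.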
This is interesting. Why do you think that a well-tuned/instructed AI is worse at (for example) writing unit tests than humans, though? When I write tests, I use structural analysis and perhaps some intuition, which is arguably the most human of cognitive skills. If the premise that intuition is actually pattern recognition is true at all, then maybe it's just a matter of developing the right model?
To be clear, I'm not saying that I'd particularly like that outcome (one of the most pleasant interactions throughout my career has been with people who found bugs in my code, so I'd consider this a real loss socially), but I do think that this is actually a reasonable outcome, and probably soon, unless there is magic going on in *intuition* that we don't understand and can't emulate (yet). But then, it's still only a matter of time until we do discover it?