Oh I see. Specs may not be enough to describe an interesting system fully, but humans close the gap by “decompressing” a spec into an interesting system. LLMs appear to be capable of decompressing prompts in some way too.

I think what this means for LLMs is that any prompt, like a spec given to a human, is not enough to determine or predict the output if the output is complex enough.

Then again, the Clojure guy is talking about humans writing specs. Maybe if you prompt LLMs to write specs, they can create complete specs of interesting systems.

I can't wait to see how this turns out.