300 sats \ 1 reply \ @Scoresby OP 30 Jun \ parent \ on: Claude runs a vending machine: decides to stock tungsten cubes, loses $250 AI
I'm sure my understanding of LLMs is a little shallow, but I mostly think of them as very complex prediction machines for forecasting the next word in a given sequence.
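That "predict the next word" idea can be sketched with a toy bigram counter. This is emphatically not how a real LLM works internally (they use learned probability distributions over tokens, not raw counts), but it shows the basic shape of picking the statistically most likely continuation:

```python
from collections import Counter, defaultdict

# Toy illustration: a bigram "model" that predicts the next word
# by picking the most frequent follower seen in its training text.
corpus = "the cat sat on the mat the cat ran on the grass".split()

next_counts = defaultdict(Counter)
for word, nxt in zip(corpus, corpus[1:]):
    next_counts[word][nxt] += 1

def predict_next(word):
    # Return the most likely next word given the current word.
    return next_counts[word].most_common(1)[0][0]

print(predict_next("the"))  # "cat" follows "the" most often here
```

A model like this has no notion of truth or memory, only of what usually comes next, which is the point of the analogy.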
Such an understanding would help explain the way LLMs seem to change their minds about what is happening or has happened. While this is pretty disturbing to humans, because it feels deeply duplicitous, it may simply be the most likely next word for the LLM.
Once the April Fool's context became more important to it, it made complete "sense" to the LLM to use the April Fool's excuse; in its "mind," that became the most likely next words, even to the point of acting like it had always been part of the plan.
If it's all just guessing the next word based on all the words on the internet, an LLM can sound very much like a human but might not have any sense of reality, of past versus present, or of duplicity.