
I absolutely loathe the LLM pattern of:
  • "It isn't [thing]. It's [other thing]."
  • "It doesn't [this], it does [other this]."
It's an effective way to convey meaning, because contrast is how we see and smell and think, which is why I suspect it pops up in these machines that are desperate to make meaning, but it's a bit like eating a gallon of ice cream all at once.
The only reason you see this a lot in LLMs is that their training data included a lot of it. LLMs are just big statistical, this-word-comes-after-that-word language generators. To distinguish yourself from an LLM, you just have to do things that are not very statistically likely.
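For what it's worth, the "this word comes after that word" idea has a classic toy form: a bigram model that counts which word follows which and picks the most frequent follower. This is a deliberately crude sketch (the corpus and function names are made up for illustration), not how a modern LLM actually works, but it shows the statistical intuition:

```python
from collections import Counter, defaultdict

# Toy corpus: count which word follows which word.
corpus = "it is not magic it is statistics it is not new".split()

bigrams = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams[prev][nxt] += 1

def most_likely_next(word):
    """Return the most frequent follower of `word` in the corpus, or None."""
    followers = bigrams[word]
    return followers.most_common(1)[0][0] if followers else None

print(most_likely_next("it"))  # "is" — follows "it" every time in this corpus
print(most_likely_next("is"))  # "not" — 2 of the 3 times
```

A real LLM conditions on far more than the previous word, which is part of why, as the reply below notes, "purely statistical" undersells it.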
reply
40 sats \ 1 reply \ @k00b 14 Jun
I think that's true to some extent, and how they're explained to work at a very high level, but it's not purely a statistical thing. If it were, we would've had LLMs a long time ago.
reply
Perhaps it comes from the part of the training where humans are involved. Maybe the trainers were frustrated English Lit majors. That could account for several deficiencies in the LLMs. :)
reply
Forgot one:
  • "It's [this], not [other this]"
reply