
Fun piece, although, as he notes midway through, OpenAI's models were trained on his works (via a publisher's deal) while the other models weren't, and those had more issues.
Also, enjoyed the bit of snark inherent in this sentence:

> These are great, philosophical questions, which will probably come to the fore when and if Grand Theft Auto 6 comes out next May.
This seems to be about the right level of academic self-absorption that universities expect.
0 sats \ 0 replies \ @freetx 7h
Semi-related idea.
I think the entirety of the educational system has been ripe for a complete overhaul.
Think about fundamental elementary or high-school level subjects like: Math, Chemistry, Reading, etc.
How many Algebra 1 teachers does the world really need? Wouldn't it suffice to have, say, the 10 best record their classes on YouTube and have all kids take those classes? Isn't it an all-around waste to have so many Algebra 1 teachers, especially when many are lackluster and thus not teaching as well as the best teachers?
I think LLMs are going to increase these pressures, because they take a static format like a YouTube video and turn it into a dynamic, interactive relationship.
It seems like we should be fine-tuning models intentionally to be teachers: not just to have knowledge, but to imbue them with the characteristics that great teachers have, by training them on their teaching styles.
Imagine a series of LLMs that could teach K-8 courses, all trained on the best teachers in the world.