"AI is to content what search engines are to browsers. Money machines."
fun prompt ;) Let's examine.
search engines index content, make it discoverable, and people pay for what they find and like. The monetization models open to creators are subscriptions or advertising. Advertisers pay to place ads on your content because search engines present it to the user and can coordinate which ads show up where to maximize revenue for both the creator and the advertiser. This was the bulk of FB and Google's revenue for the last 15 years, and it's still the predominant model for monetizing content. So yea, search engines are money machines for browsers.
How would AI play this role for content? People's willingness to pay for LLM outputs will depend entirely on the quality of the inputs to those LLMs, i.e. the content they're trained on. For example, many LLMs out today relied on the Pile, a large open dataset of public text, to form the "backbone" of their text prediction capabilities. Clearly, content isn't killed by LLMs, as many have suggested. Quite the opposite: we'll want more, better content if we ever want powerful LLMs for both specific and general use. And for that, we'll have to pay someone.
idk what this economic model would look like, but I think LLMs will create new interfaces outside of the browser, and those interfaces will need new content markets and syndication models to power them. A very different content discovery and monetization world from the one we had in 2000-2020.
599 sats \ 4 replies \ @kr OP 22 Jan
"How would AI play this role for content?"
i was brainstorming with @k00b on this exact topic this morning.
my view is that creators are becoming the largest pool of unorganized labor in the world, and resemble workers in late 19th century America.
their “bosses” are all monopolies (Google, Facebook, etc…), and creators earn very little from them with no negotiating leverage. Most creators rely on building supplementary businesses out-of-band (selling merch, tickets, affiliate deals, sponsorships, etc…).
it seems possible that AI content collection practices (whether they’re deemed legal or not) could be the topic that stimulates creators to organize their labor and negotiate with AI platforms for direct payments.
this lets creators focus on doing what they do best (creating… not selling t-shirts), and gives AI companies access to unique data.
reply
I directionally agree, but I think you could argue value-for-value solves your concern and enables creators to disintermediate the platforms and go direct. The problem for creators is that the platforms are the kingmakers, and they take huge percentages. That doesn't change if creators organize within the current paradigm.
reply
i think one possible outcome is that creators form some sort of union and only allow their content to be used for AI training if the AI companies pay them directly. v4v is unorganized labor; i think it’s incompatible with unionized work.
creators have the benefit of having nothing to lose right now: they aren’t getting paid much at all as is… and they’re getting nothing for the content that is used to train AI models.
the big assumption i’m making is that they’ll be able to effectively organize and stand up for some kind of organized labor agreement… TBD.
reply
I'm not sure looking at creators as "labor" is the right framing. For one, there's no labor theory of value in art. A song that takes an hour to write might make millions or might be heard by five people. The economically interesting part about "intellectual property" is that you make it once and it pays dividends forever (or at least until copyright runs out). It might be better characterized as an asset, like a car or land.
imo it makes more sense to arm individual creators with their own licensing rights and build a way for them to issue contracts and negotiate their own licensing terms, instead of offloading that to a third party that then distributes via platforms. anyway, thanks for thinking about this, it's an important topic!
reply
good points
reply