@jirijakes 4 Jul 2023 \ on: Privacy on nostr and the war on AI scrapers
I believe Nostr already has a way to deal with that: paid relays. If scraping raises the cost of running a relay too much, more relays will start requiring a few sats from their users. And they can require far more than a few sats from the scrapers.
This. They should be charging in the first place anyway.
The main problem is the lack of support for auth-based reads. We see writes being locked down behind payment on paid relays, but the scraping problem is a read problem.
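For illustration, a minimal sketch of what auth-gated reads could look like, assuming NIP-42 style AUTH and the Node `ws` package. The `verifyAuthEvent` helper here only checks the event's structure; a real relay would also verify the Schnorr signature (e.g. with nostr-tools) and probably check the pubkey against a paid-members list.

```typescript
// Hypothetical sketch: a relay that answers REQ (reads) only after NIP-42 AUTH.
import { WebSocketServer, WebSocket } from "ws";
import { randomBytes } from "crypto";

interface Session {
  challenge: string;
  authedPubkey?: string; // set once the client completes AUTH
}

// Minimal structural check of the NIP-42 auth event (kind 22242 with a
// matching challenge tag). A real relay must also verify the signature.
function verifyAuthEvent(ev: any, challenge: string): string | null {
  if (ev?.kind !== 22242) return null;
  const tag = (ev.tags ?? []).find((t: string[]) => t[0] === "challenge");
  if (!tag || tag[1] !== challenge) return null;
  return typeof ev.pubkey === "string" ? ev.pubkey : null;
}

const sessions = new Map<WebSocket, Session>();
const wss = new WebSocketServer({ port: 7777 });

wss.on("connection", (ws) => {
  const challenge = randomBytes(16).toString("hex");
  sessions.set(ws, { challenge });
  ws.send(JSON.stringify(["AUTH", challenge])); // ask the client to authenticate

  ws.on("message", (raw) => {
    const session = sessions.get(ws)!;
    const [type, ...rest] = JSON.parse(raw.toString());

    if (type === "AUTH") {
      const pubkey = verifyAuthEvent(rest[0], session.challenge);
      if (pubkey) session.authedPubkey = pubkey;
      return;
    }

    if (type === "REQ" && !session.authedPubkey) {
      // Reads are locked down until the session has authenticated.
      ws.send(JSON.stringify(["CLOSED", rest[0], "auth-required: AUTH first"]));
      return;
    }

    // ...serve EVENT / REQ for authenticated (and possibly paying) sessions...
  });

  ws.on("close", () => sessions.delete(ws));
});
```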
It's a problem that can only be solved in the relay systems themselves, with session accounting. And it needs to be designed for low latency as well, to minimise propagation delay.
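As a rough sketch of what that session accounting could look like (the prepaid-balance model and names like `PRICE_PER_EVENT_MSAT` are assumptions, not from any NIP), the check can stay in memory so it adds negligible latency on the read path:

```typescript
// Hypothetical per-session read accounting: each authenticated pubkey has a
// prepaid millisat balance, debited per event served.
type Pubkey = string;

const PRICE_PER_EVENT_MSAT = 10;            // assumed price per event served
const balances = new Map<Pubkey, number>(); // balances topped up out of band

// Called once per event sent to a subscriber. A plain in-memory map lookup,
// so the accounting barely slows down event propagation.
function debitRead(pubkey: Pubkey): boolean {
  const balance = balances.get(pubkey) ?? 0;
  if (balance < PRICE_PER_EVENT_MSAT) return false; // stop serving: out of sats
  balances.set(pubkey, balance - PRICE_PER_EVENT_MSAT);
  return true;
}

// A scraper pulling millions of events burns through its balance quickly,
// while an ordinary client's reads cost it next to nothing.
balances.set("pubkey-of-ordinary-client", 5_000);
console.log(debitRead("pubkey-of-ordinary-client")); // true, 10 msat debited
console.log(debitRead("pubkey-of-unknown-scraper")); // false, no balance
```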