That's pretty cool, thanks for sharing.
Currently the industry is flooded with funding, and data is way more valuable than money. That's why there are so many free services: lmsys (my favorite), Bing, the ChatGPT website/app, TalkAI, Google's Gemini website, and plenty of social media apps.
If you don't want to give away your data, you can just run Mistral or Llama locally.
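For anyone curious, here's a minimal sketch of local inference with llama-cpp-python, assuming you've already downloaded a GGUF build of Mistral 7B (the file name below is just a placeholder):

```python
# Minimal local-inference sketch using llama-cpp-python (pip install llama-cpp-python).
# The model path is a placeholder: point it at whichever GGUF file you downloaded.
from llama_cpp import Llama

llm = Llama(
    model_path="./mistral-7b-instruct.Q4_K_M.gguf",  # placeholder file name
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload all layers to the GPU if you have one
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Why does local inference keep my data private?"}],
    max_tokens=256,
)
print(out["choices"][0]["message"]["content"])
```

Nothing leaves your machine; the trade-off is you need the VRAM/RAM to hold the quantized weights.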
You know what would be pretty cool? A service where I can offer my computer/GPU for inference and earn a few sats for it.
Is anyone building that?
reply
Reminds me of GPUtopia (sell your GPU to AI models for sats). I think they had a crazy number of sellers but few buyers.
reply
gputopia (GitHub readme)
barrier to entry?
Just now looking into running Alby for business. If their setup with Voltage is as easy as they say, it might be worth running through the steps to unlock some convenience.
reply
Of course there are. Vast.ai and Runpod.ai are a couple of examples.
reply
Good question. I think I've heard of some shitcoin projects being built around this model, but I haven't looked into it further.
reply