Check this https://docs.owlrun.me/#provider-payouts
These are indicative projections though; they depend on market supply/demand dynamics, and market rates are also subject to change.
Great questions,
- The system is designed to distribute inference requests across the network, so there is no such thing as an "untrusted node" in this context: there is no trust system in place, so all nodes should be considered "untrusted" by definition. Use them for general inference, but never to handle API secrets and the like, which is good practice with any other "cloud provider" anyway.
Nodes are just processing random inference from random sources, so inference queries are potentially visible from the node runner's point of view, just as they already are on all the main cloud inference platforms today. The advantage here is that there is no direct connection between the identity of the inference buyer and the node runner, as there would be on a centralized platform.
- There is already a Karma system for nodes contributing to the free tier; this could be a great feature request addition, thanks!
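Since node runners can potentially see raw queries, the "never send API secrets" advice above can be sketched as a client-side filter. This is not part of the owlrun client; the function name and the regex patterns are illustrative examples, assuming a few common credential formats.

```python
import re

# Hypothetical client-side check (not part of the owlrun client): since any
# node in the network may see the prompt text, scan outgoing prompts for
# secret-looking material and refuse to send them if anything matches.
SECRET_PATTERNS = [
    re.compile(r"sk-[A-Za-z0-9]{20,}"),                 # OpenAI-style API keys
    re.compile(r"AKIA[0-9A-Z]{16}"),                    # AWS access key IDs
    re.compile(r"-----BEGIN [A-Z ]*PRIVATE KEY-----"),  # PEM private keys
]

def looks_safe(prompt: str) -> bool:
    """Return False if the prompt appears to contain credentials."""
    return not any(p.search(prompt) for p in SECRET_PATTERNS)

print(looks_safe("Summarize this article about owls."))  # True
print(looks_safe("-----BEGIN RSA PRIVATE KEY-----"))     # False
```

A check like this is best-effort only (regexes miss plenty of secret formats), so the safer habit is simply to treat prompts sent to any shared inference backend, decentralized or not, as public.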
I'll do some research to see if there is a way to embed encryption at some level of the inference stack ...
Both your questions are very good and thoughtful, thanks for contributing! I'm new on Stacker News; let me see if I can find a way to send some sats your way : )
Not unknown, the node client is open source (MIT license); the code is here:
https://github.com/fabgoodvibes/owlrun
Just point your favorite AI agent to the repo and ask it to do a security assessment.
Having said that, please consider that the software is still in Beta, although it has been pretty stable in recent weeks and tested by a dozen people on different platforms. Ideally run it in a VM or on some alternative hardware first, not your primary machine.
The node client source code is live on GitHub; not sure where the "coming soon" is, maybe a cached version of the website?
https://github.com/fabgoodvibes/owlrun
Inference verification is a good question. For now this is beta and it's not happening yet, but there are plans in the pipeline.
Good point on training/tuning, that may be a natural path to evolve into ...
Thanks for your thoughtful considerations!