
You can already use LLMs that are 100% on-device and private on desktop, so I have no doubt that part won't be an issue on Android either. Phones just aren't computationally there yet, and the whole infrastructure around it doesn't exist yet.
So I personally don't worry about the "actually private" part - more about the other parts
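For a sense of what "100% on device" already looks like on desktop, here's a minimal sketch assuming llama-cpp-python and a GGUF model you've already downloaded (the model path is just a placeholder) - nothing leaves the machine:

```python
# Minimal local-inference sketch: runs entirely offline once the model file is on disk.
# Assumes: pip install llama-cpp-python, and a GGUF model already downloaded
# (the path below is hypothetical).
from llama_cpp import Llama

llm = Llama(model_path="./models/some-7b-model.gguf", n_ctx=2048)

out = llm(
    "Q: What does on-device inference mean? A:",
    max_tokens=128,
    stop=["Q:"],
)
print(out["choices"][0]["text"])
```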
That makes sense, and yeah I’m with you on the infrastructure side being the real bottleneck. Even if the models could technically run on-device, without the right OS-level support and clean integrations, it’s just clunky. I guess my hesitation is more about how companies might market it as private while still quietly shipping data off-device. But once it’s truly local and usable, I’m all in.