0 sats \ 0 replies \ @nout 24 May 2022 \ parent \ on: Apple's data auction privacy ad is only scary because it's true bitcoin
It really depends on which data and which applications from that company you're talking about. In general I don't believe either of those two companies actually resells data further (at least not after the Facebook Cambridge Analytica scandal). There are also many techniques in place that prevent de-anonymization.
For example, for on-device machine learning both companies use "federated learning" (or a "federated model"): a "small" model does some training directly on the device, and only model updates are sent back to the servers to improve the large central model. (A "model" here is the data structure in machine learning that holds the learned information.)
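A minimal sketch of the federated-averaging idea behind this (all names here are hypothetical, and the "model" is reduced to a single number; real systems are vastly more involved):

```python
def local_update(global_weight, local_data):
    """One on-device training step: nudge the model toward the local data mean.
    Only this update leaves the device, never the raw data."""
    local_mean = sum(local_data) / len(local_data)
    return local_mean - global_weight

def server_aggregate(global_weight, updates):
    """Federated averaging: apply the mean of all device updates."""
    return global_weight + sum(updates) / len(updates)

# Three devices, each holding private data that never leaves the device.
devices = [[1.0, 2.0], [3.0, 5.0], [4.0, 6.0]]
w = 0.0
for _ in range(5):
    updates = [local_update(w, d) for d in devices]
    w = server_aggregate(w, updates)

print(w)  # converges toward the overall mean, 3.5
```

The server only ever sees aggregated updates, which is the property the whole scheme is built around.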
The trick is that before anything is sent to the server, the update is masked with a completely random value generated on the device, so from any single device the server receives what looks like random garbage. But when many devices send their masked updates, the randomness cancels out across all of them: the server can do statistical analysis on the aggregate and use it to update the "large" model.
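A toy sketch of that averaging-out effect, using additive zero-mean noise (as in local differential privacy; this is my simplification, not either company's exact mechanism):

```python
import random

def noisy_report(true_value, noise_scale=100.0):
    """Each device masks its value with zero-mean random noise before sending.
    Any single report is dominated by the noise and reveals almost nothing."""
    return true_value + random.gauss(0.0, noise_scale)

random.seed(0)
true_values = [1.0] * 100_000            # suppose every device holds 1.0
reports = [noisy_report(v) for v in true_values]

one_report = reports[0]                  # useless on its own
estimate = sum(reports) / len(reports)   # noise averages out in aggregate

print(one_report)   # some large-ish random number
print(estimate)     # close to the true value 1.0
```

The larger the noise relative to the signal, the less a single report leaks, and the more devices you need before the aggregate becomes useful.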
This type of data is not possible to de-anonymize, but again, as I said above, this applies to the newer ways these companies do machine learning. There are likely some outdated approaches still in place that would technically allow de-anonymization.