
This interesting paper was referenced in the latest Spiral newsletter and it really got me thinking. It's a little dense, so I'm doing my best to summarize here:
If you use tor to obscure your activity on the internet, you are already leaking the fact that you use tor. You might assume that your anonymity set is all the people who use tor, or more narrowly, all the people who used tor at the same time as you. One common way to measure this kind of anonymity is in bits of entropy.
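The "bits of entropy" framing can be sketched as Shannon entropy over the attacker's probability distribution on candidate senders. A minimal illustration (my own, not from the paper — the numbers are made up):

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy, in bits, of a distribution over suspects."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Uniform case: you blend in with 1,000,000 simultaneous tor users.
uniform = [1 / 1_000_000] * 1_000_000
print(entropy_bits(uniform))  # ~19.93 bits, i.e. log2(1,000,000)

# Skewed case: a malicious entry node has let the attacker narrow
# the candidates to 4 users, one of them twice as likely as the rest.
skewed = [0.4, 0.2, 0.2, 0.2]
print(entropy_bits(skewed))   # ~1.92 bits
```

The point the paper pushes back on: this number describes the attacker's uncertainty, not the effort it costs them to collapse it.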
However, the author of this paper argues that this framing misunderstands the situation. Not all tor entry nodes are benevolent actors. What if you connect to a malicious entry node? And what if its operator also controls the exit node you end up using? In that case it is trivial to track your activities.
Another example: if you used a specific coordinator to do coinjoins, you might think your anonymity set is all coinjoins arranged by that coordinator. However, if the legal names of the people who run the coordinator are known, and they happen to keep logs of the coordinator at their place of business or residence, it might not be that difficult for an attacker to raid them and seize those logs — in which case your anonymity set is significantly smaller than you thought.
The author of the paper argues that we need better ways to talk about anonymity — ways that include the amount of effort an attacker must exert to reduce your anonymity. In the case of tor, an attacker must exert quite a lot of effort if you only ever use trusted entry nodes run by a group of your friends. But if you randomly connect to entry nodes, the attacker may need to exert much less effort.
[The adversary] strives to reduce his uncertainty about linking senders and recipients, but the measure of his ability to succeed is not the size of those sets or probability distributions on them. Rather it is the resources it will take him to place himself in a position to learn those linking relations.
[The system of measuring anonymity that focuses on the size of the set of known senders or receivers or utxos in a coinjoin--called the entropist approach] will yield system designs that might be useful in controlled settings like elections, where we can control or predict the number and nonidentity of participants and where anonymity within expected-size sets is useful. But the entropist approach is not appropriate for general communication on large diversely shared networks like the internet.
Anonymity is really hard. Reminds me yet again how impressive it is that we still don't know who Satoshi is.
Is your anonymity set what you think it is?
Almost certainly not :(
17 sats \ 0 replies \ @Akg10s3 5h
I like to take care of my safety and that of my SATS! This was an interesting read! Thanks for sharing ⚡
34 sats \ 1 reply \ @0xbitcoiner 7h

Privacy Is Boring, Until You Lose It

Privacy doesn’t sell. It doesn’t trend. It doesn’t give you dopamine hits or push notifications. Compared to sleek new features or viral posts, it feels outdated, like locking your door in a neighborhood where no one’s been robbed. Until you are.
Then it gets real.
[...]
You are absolutely right: we often think we are safe until our carelessness catches up with us.
Reminds me yet again how impressive it is that we still don't know who Satoshi is.
truuuuu-lyyyy
22 sats \ 0 replies \ @seashell 7h
feels like most people confuse plausible deniability with real anonymity. tor, coinjoins, whatever, it’s all just delaying correlation unless you actually control the edges. if your adversary has time, money, and legal reach, your “anon set” is probably just a list they’re waiting to cross reference later.
Great post. Most people confuse "obscurity" with true anonymity. If your threat model includes nation-states or well-resourced actors, your "anonymity set" can collapse fast. Trusting entry nodes or central coordinators is a gamble—entropy doesn't equal safety.