It's the trust weights. Trust is hard to earn. It takes consistently zapping stuff other stackers like, and zapping it early (which is why we reward it). Most folks just don't zap (quite a few only do it when attempting to game the algo, hence the need for trust at all), so they never earn trust.
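For the curious, here's a rough sketch of the idea in TypeScript. This is not the actual SN code; the names, shapes, and the "confirmed by others" rule are made up to illustrate the concept of trust-weighted zaps:

```typescript
// Illustrative sketch only, not the real SN implementation.
// Idea: a stacker's trust grows when their early zaps are later "confirmed"
// by other stackers zapping the same item, and ranking weights each zap
// by the zapper's trust.

type Zap = { userId: string; itemId: string; timeMs: number };

// Hypothetical helper: did someone else zap this item after `zap`?
function confirmedByOthers(zap: Zap, allZaps: Zap[]): boolean {
  return allZaps.some(
    (z) => z.itemId === zap.itemId && z.userId !== zap.userId && z.timeMs > zap.timeMs
  );
}

// Trust as the share of a user's zaps that other stackers later agreed with.
function trustScore(userId: string, allZaps: Zap[]): number {
  const mine = allZaps.filter((z) => z.userId === userId);
  if (mine.length === 0) return 0; // never zapping means never earning trust
  const confirmed = mine.filter((z) => confirmedByOthers(z, allZaps)).length;
  return confirmed / mine.length; // 0..1
}

// Ranking counts each zap in proportion to the zapper's trust,
// so zaps from untrusted (or gaming) accounts barely move an item.
function itemScore(itemId: string, allZaps: Zap[]): number {
  return allZaps
    .filter((z) => z.itemId === itemId)
    .reduce((sum, z) => sum + trustScore(z.userId, allZaps), 0);
}
```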
The only problem with this (and it's a big problem imo) is that trust is opaque right now. I've explained how the algo works and how trust is earned, but most people don't get it.
I'm going to be spending the next few weeks of my life giving us all personalized feeds and part of that will be figuring out how to make this understandable and transparent.
deleted by author
@k00b and I discussed creating blog posts about SN internals, for example to explain in more depth how the trust algorithm works (including examples and such). I think this would help with questions like these :)
deleted by author
deleted by author
I believe that truly malicious people go to great lengths to achieve their goals, so not having a blog post would not stop them since, as you mentioned, the code is open source. Explaining how the algorithm works could just create more "opportunistically malicious" people, but there shouldn't be something so obviously wrong in our algorithm that a lot of people immediately think: "wait, I can game this."
However, making code more accessible makes it easier for honest people to chime in and give feedback. So we should optimize for that imo.
That's at least what I think, I don't speak for @k00b.
That's right.
I'd also add that the fastest way we'll improve the algorithm's resistance to gaming is by helping people "play the game," then observing and correcting it.
The alternative, a secret weak algorithm we hope people don't discover vulnerabilities in, is super unsatisfying to me.
100 sats \ 1 reply \ @ek 16 Aug 2023
> The alternative, a secret weak algorithm we hope people don't discover vulnerabilities in, is super unsatisfying to me.
Yeah, but somehow people still believe security by obscurity is a valid strategy. I think it's just intuition, but maybe this is a prime example where intuition is wrong.
I would rather build everything in the open and get exploited on day 1 and then fix it than build secretly and then hope no one is going to find a single exploit.
Going from proprietary software to OSS is a whole different topic though. I think it's hard to be confident enough in your code to release it if you've been building it out of the open for long enough. But maybe that's another sign that security by obscurity doesn't work "at scale".
deleted by author
deleted by author
1000%