320 sats \ 6 replies \ @Jon_Hodl 5 Dec 2022 \ on: Building a new Web Search Engine. Just for you, Stackers! bitcoin
What I really want from a search engine is the ability to omit certain websites from the results and prioritize others.
For example, I never want to see results from mainstream media outlets, and I want certain other sites ranked first.
If you can build that, I will use it.
Thanks for your input @Jon_Hodl! For that, I'd have to let you create an account and store your personal settings there. That would mean I'd be tracking your search history. Your account would be anonymous of course (LN-based), but still. Would that be OK with you?
reply
I'm not following why there is a need to track anyone's search history.
For blocking a URL, all there would need to be is a row of boxes where I paste URLs, and the search engine never shows me those sites.
For prioritized sites, another row of boxes where I enter the sites I want shown at the top if they have content related to my search queries.
When I search, the engine searches the prioritized sites first, and if any of the prohibited sites have content related to my query, they aren't shown to me.
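In code, I imagine something like this minimal sketch (TypeScript; the Result shape and all names are just for illustration, not anyone's actual API):

```typescript
interface Result {
  url: string;
  title: string;
}

// Hypothetical re-ranking: drop blacklisted hosts, then float whitelisted
// hosts to the top of what's left. A real engine would fold this into scoring.
function applyLists(
  results: Result[],
  whitelist: Set<string>,
  blacklist: Set<string>,
): Result[] {
  const host = (r: Result) => new URL(r.url).hostname;
  const kept = results.filter((r) => !blacklist.has(host(r)));
  const preferred = kept.filter((r) => whitelist.has(host(r)));
  const rest = kept.filter((r) => !whitelist.has(host(r)));
  return [...preferred, ...rest];
}

// Usage: applyLists(hits, new Set(["example.org"]), new Set(["cnn.com"]))
```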
Does that make sense?
reply
This makes total sense!
I understand the idea of a whitelist and a blacklist of sites. You'd have to create an account on my server, I'd store your lists there, and for every query you make I'd check those lists to filter the results for you. You get better results, but I get the full list of your searches, all linked to your account.
I wonder if the perceived loss of privacy here is actually negligible compared to the benefits this would bring.
reply
Is there a way to keep the whitelist/blacklist client side?
reply
Need to experiment here, not sure atm. Keeping the list client side is simple, but filtering on the client means much more data has to be served to the client, and that might be an issue. OTOH, if I set a higher price for this feature, I might get properly compensated. Thank you for pushing on this!
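To illustrate the bandwidth concern: with client-side filtering I'd have to over-fetch so that enough results survive after the blocked sites are dropped. A minimal TypeScript sketch (the endpoint, its params, and the 3x factor are all made up):

```typescript
// Hypothetical client-side filtering: the server can't pre-filter for us,
// so we ask for extra results and discard blocked hosts locally.
async function clientSearch(
  query: string,
  want: number,
  blacklist: Set<string>,
): Promise<{ url: string; title: string }[]> {
  const res = await fetch(
    `/api/search?q=${encodeURIComponent(query)}&limit=${want * 3}`,
  );
  const hits: { url: string; title: string }[] = await res.json();
  return hits
    .filter((h) => !blacklist.has(new URL(h.url).hostname))
    .slice(0, want);
}
```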
reply
You could store templates (lists of blacklisted sites) on the server, which the user can download onto his client, or simply let the user add his own entries. You can store the list on the client, inside local storage. You can do the filtering client side too: just match the result set from the server against the URLs in local storage and throw out any matches. No need for the server to know anything.
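Roughly like this, as a minimal TypeScript sketch (the storage key and the result shape are made up):

```typescript
// Hypothetical localStorage-backed blacklist: the list never leaves the browser.
const KEY = "blockedSites"; // made-up storage key

function loadBlacklist(): Set<string> {
  return new Set(JSON.parse(localStorage.getItem(KEY) ?? "[]"));
}

function blockSite(hostname: string): void {
  const list = loadBlacklist();
  list.add(hostname);
  localStorage.setItem(KEY, JSON.stringify([...list]));
}

// Match the results the server sent against the local list, throw out matches.
function filterResults<T extends { url: string }>(results: T[]): T[] {
  const blocked = loadBlacklist();
  return results.filter((r) => !blocked.has(new URL(r.url).hostname));
}
```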
I think this is a niche feature though
reply