Actually an awesome and fast search engine (depending on which instance you use), with no trashy AI or ad results. Also great for privacy. If you don’t know which instance to use, go to https://searx.space/ and choose an instance closest to you.
I’d use it if its public instances didn’t get rate-limited so often
Man, I wish I had the same experience.
The couple of times I tried it out, the search results were barely accurate.
Try kagi. It’s paid at $5/mo., but you get 300 searches to try it out.
btw
Aren’t all search queries available to whoever hosts an instance? In my eyes this is much worse for privacy and a much bigger risk unless you really know who is behind your chosen instance. I would trust a company a bit more with safeguarding this information so it doesn’t leak to some random guy.
I’ve always gotten the impression it was mostly intended to be self hosted. I’ve self hosted it for something like a couple years now, runs like a clock. It still strips out tracking and advertising, even if you don’t get the crowd anonymity of a public instance.
Self hosting doesn’t make sense as a privacy feature because then it’s still just you making requests to google/other SE
It’s not useless: it removes a lot of the tracking cookies and sponsored links loaded with telemetry. Theoretically you can also get the benefits of anonymity if you proxy through Tor or a VPN, which I originally tried to do, but it turns out Google at least blocks requests from Tor and from at least the VPN endpoint I have (and probably most of them). Google or whatever upstream SE can still track you by IP when you self host, but its tracking is going to be much less without the extra telemetry cookies and tracking code it gets when you use Google results directly.
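To make the "stripping tracking" idea concrete, here is a toy sketch (not SearXNG’s actual code) of removing common tracking query parameters like `utm_*`, `gclid`, and `fbclid` from a result URL before handing it to the user:

```python
# Illustrative sketch only -- not SearXNG's real implementation.
# Removes common ad/analytics tracking parameters from a result URL.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_KEYS = {"gclid", "fbclid", "msclkid"}  # assumed list for the example

def strip_tracking(url: str) -> str:
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k not in TRACKING_KEYS and not k.startswith("utm_")]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_tracking("https://example.com/p?id=7&utm_source=ads&gclid=abc"))
# -> https://example.com/p?id=7
```

The real cleanup in a metasearch frontend happens server-side, so the upstream engine never sees your browser execute its tracking code at all.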
But yes, practically you either have to trust the instance you’re using to some extent or give up some of the anonymity. I opted to self host and would recommend the same over using a public instance if you can, personally. And if privacy is your biggest concern, only use upstream search providers that are (or rather, claim to be) more privacy respecting like DDG or Qwant. My main use case is primarily as a better frontend to search without junk sponsored results and privacy is more of a secondary benefit.
FWIW, they have a pretty detailed discussion on why they recommend self-hosting here.
Companies are definitely selling your data. Use a VPN.
A VPN will not save you, they are easily worse for privacy in terms of user tracking. It centralises your entire web traffic in a single place for the VPN provider to track (and potentially sell).
You either trust the ISP or a VPN. It’s a tool, not a blanket of protection. Opsec and knowing how to move is most important.
But you pay more for what is essentially the same with a VPN. You have to buy a VPN subscription on top of your internet subscription, get less speed because your internet traffic is being routed through a different country and get no benefit to privacy. The only use case for a VPN is when you have to bypass georestrictions.
Been rocking self-hosted Searxng for the last 3 weeks now as my default search engine; it’s as good or better than DDG and certainly better than Google. Results I need are usually within the first three items, no extraneous shit.
I thought I’d just try it out, but it’s staying. The ability to tune the background engines is awesome. My search history is private (though I wasn’t that worried about DDG, there was no way in fuck I was using Kagi) since it’s running its searches via a VPN and returning me results locally.
Keep in mind that to protect your privacy you should also share your instance with others. All the searches are still linked to an IP which can be abused as well.
Yes, that’s the purpose of the VPN. It’s out there mixed in with everyone else that’s using that exit node.
Honestly, it’s not too much of a concern to me, I’m not doing anything illegal or naughty, it’s just making sure I’m not part of the dataset.
Yep, a VPN is a good solution.
How does it work self hosting? Is it querying other search engines or just maintaining a database on your server?
It’s a meta search engine: it aggregates results from multiple sources for your search query. So yes, it queries other search engines.
It’s all calls to other engines, which you can choose and tune. So it’s making those calls, filtering out shit like AI results, and then ranking it to return back to you. Seems to do a good job.
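As a toy sketch of what that aggregation step looks like (not SearXNG’s actual ranking algorithm): fan the query out to several engines, merge the result lists, and boost links that more than one engine returned.

```python
# Toy meta-search merge, for illustration only (engine names are made up).
# Each engine contributes 1/(rank+1) to a URL's score, so a link that
# appears near the top of several engines' lists floats to the top overall.
from collections import defaultdict

def merge_results(per_engine: dict) -> list:
    scores = defaultdict(float)
    for engine, urls in per_engine.items():
        for rank, url in enumerate(urls):
            scores[url] += 1.0 / (rank + 1)
    return sorted(scores, key=scores.get, reverse=True)

results = merge_results({
    "engine_a": ["https://a.example", "https://b.example"],
    "engine_b": ["https://b.example", "https://c.example"],
})
print(results[0])  # b.example appears in both lists, so it ranks first
```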
it’s as good or better than
It’s only as good as the search engines you select. Which ones have you selected?
Defaults are working fine, I might have added one or two.
What does it default to? Google+bing+DDG?
deleted by creator
which language are we talking?
Mojeek reminds me of early Google results, which only searched title and inurl. I like it.
You can use Mojeek with SearXNG. I tried it with nothing enabled but Mojeek and it returned no results; I wonder why that is? I could be wrong, but didn’t Mojeek also index results from Google and Bing? I’m wrong, they index their own results. I mean, Qwant is a search engine built in the EU and they index their own results.

qwant is bing, mainly
The homepage took 5 seconds to load. I’ll pass.
Let me tell you about waiting for AskJeeves to load up in the 90s.
Web crawler indeed
not sure what’s happening there for you; speed is one of the things people frequently say we do well
Not at all for me
I stopped using it not because of the results but because you couldn’t swipe back without it sending you to the base website.
On DuckDuckGo (and Google and others) the search is shown in the URL. Looking for frog, for example:
https://duckduckgo.com/?q=frog&t=fpas&ia=web

However, in SearXNG it just shows
https://searxng.world/search
Which I don’t have an issue with; however, when you click on a link and then go back to the search results, it has no idea what you searched for, as it’s not in the URL, and shows an error.

That aside, the UI is great. Icons don’t swap around on you like Google, and there are no annoying popups about ‘privacy’ like DDG. On the topic of search results, it was good enough for me. Not great, but then again there aren’t any good search engines right now.
Set it to do GET requests rather than POST.
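If I remember right, that lives in the instance’s settings.yml (it’s also exposed per-user as “HTTP Method” in the preferences page); the exact key may differ by version, so check your version’s docs, but it’s something like:

```yaml
# settings.yml sketch -- key location may vary by SearXNG version.
# GET puts the query in the URL, so the back button and history work,
# at the cost of the query being visible in the URL/server logs.
server:
  method: "GET"
```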
Can someone explain the meaning of the name and the people building this project please?
You’re trusting how information is filtered and funneled to you with a search tool, and that’s not a change to take lightly. Google sucks, but they have a lot to lose, a lot of eyes on them, and I generally know their base motivations.
Fortunately, you can read through the source code of SearxNG and even modify it - provided that you also publish the modified version to your users if you host it publicly.
You can run your own instance, public or private. Or you can use a public instance.
Internally, it uses other search engines, rather than crawling the entire web and indexing everything.
If you are on a desktop, you can run it locally and you are much less likely to be rate limited, but this comes at the cost of your IP still being visible to Google or whatever search engine you choose to scrape from.
IP addresses are not some super-secret PII. You don’t have to try to hide yours unless you don’t want to reveal the country you’re in. You can also proxy SearXNG through Tor, though Google wouldn’t work then, and of course search time increases as well.
It’s your queries + your IP combined with the rest of the data the net collects from you that identifies you.
Your IP is waay down the list on the fingerprint vector. Go to fingerprint.com, connect to a VPN, hard refresh the page. They’ll know you’re the same person. Nobody uses IP to fingerprint people. SearXNG mitigates a lot of the things that they actually use.
I completely agree with you; what I wrote was in haste. Essentially, what I wanted to say was that an individual running SearXNG does not get the anonymity benefits of a public instance, but it is still better because you are not directly using Google or whatever website. SearXNG kind of acts like a browser between you and them that does only limited conversation, with no JS-based fingerprinting. I also use SearXNG locally; I can’t stand the constant rate limiting of public servers, or how sometimes only a few engines are blocked and the variation in result quality is unacceptable to me. I just wanted to add that bit for transparency.
4get.ca has been great for me.
How is it compared to DDG?
I’ve never used DDG before, but I think it’s pretty similar, since DDG and SearX both pull results from Bing (SearX can also pull from Google, Qwant, etc. if you enable them). But SearX is decentralized and open source, while DDG at its core is not; you also have to trust DDG with your info (who knows? Maybe they lie about their privacy policy, but I hope not).
In terms of search results, ddg sometimes will find very specific searches better and has more bells and whistles. I still prefer SearXNG, and have been using my instance almost exclusively since setting it up.
It’s pretty nice. The REST API for running searches makes running SearxNG worth it, if nothing else.
Could you elaborate on what you mean by that?
Whoogle is a good option for self hosting as well
Have been using it for a while on my mobile, and so far I like it better than DDG or Startpage.
i get a lot of simpsons pictures in the image results for some rsn
Some raisin
Some resin
same here but with hentai on searx.be