It is well known that more and more SEO spam is being crafted specifically to poison Google search results, in an ongoing battle between the brightest heads of the blackhat SEO spam underworld and the greater geeks of Google officialdom.
Therefore my thesis is that one could run a greater risk of meeting malicious and suspicious results on a Google search page than when using MSN, DuckDuckGo, etc. I mean this relatively: Google sits on much, much more search data than all the other search engines combined, and many of the others draw on Google's results in some way or other.
My question is: are Google search results per se more malicious or suspicious, because miscreants and cybercrime circles craft their malware specifically to target Google?
I know we are being protected by Google Safe Browsing, WOT, Bitdefender TrafficLight, Dr.Web's URL checker, Netcraft, and last but not least Avast WebRep on the Google search page.
So does it make sense to use an alternative search engine for security reasons?
When I do a search, I don't click on the first result that pops up.
Regardless of the search engine used, always look at the most relevant results and then pick one from a site you trust, or if you're looking for software, pick the manufacturer's site.
Allow your brain to aid in selecting your search result. Don't just pick the first thing that pops up.
Seems sound advice to check unknown territory before deciding to go and land. When web reputation and pre-URL scans say it is probably OK to visit a search result, we can decide to venture out there. Even then I am always secured via Google Safe Browsing and the Avast WebShield, or vice versa. For pre-scanning search results I have the Bitdefender TrafficLight and Dr.Web URL Checker extensions, and I am on, and also contributing to, WOT (as "luntrus"). Finally, when ready and secure enough to go, I can let ScriptSafe take over once I am on the search results. Once bitten, twice shy is a good way to go forward.
Depends what I’m searching for. eCommerce sites (IMHO) are far more likely to be spam, but OTOH I’m usually going direct to eBay or Amazon, sites that spend a huge amount of time and money protecting themselves.
For research I use DuckDuckGo (The search engine that doesn’t track you.) or Ixquick (Take a deep breath. You’re safe here.) Beaucoup and Dogpile get a workout when I’m desperate.
Yes, I use a legacy version of Avast, but the WebShield has always been useful for covering my backside: I am always surprised by how many sites have been hacked rather than spammed!
When all else fails, hug your teddy and google, but don’t be disappointed.
Agree with you that where your search query gets you depends to a certain extent on the content you are searching for. I am an adept of the late FRAVIA and his searchlores, and even published on his site on CGI security, so I guess I know a bit about what you are on about.
But more factors come into play, and it is not only a search-friendly and privacy-friendly search engine that protects you. As a constant second line of defence I put a great deal of trust in script blocking. I think the protection Giorgio Maone brought to Firefox with his NoScript is second to none; RequestPolicy is a good second. It is a pity that the average user will feel ill at ease with these extensions, because they haven't got enough insight into their workings to toggle them sensibly, but one can learn to work with them, fair enough.
Another uncertainty is that one does not know whether the search destination is a good site in a malign environment (a main cause of false positives with AV) or a bad site hosted within a reputable AS.
And then there are factors that are completely out of our hands. When I use the Secret Agent add-on in Firefox to modify HTTP requests and go somewhat stealthy, I often get messages that my requests are being hijacked (by my provider? by a tracker? by a main-domain web bug? unknown?).
The tracking and retention of browser data goes so far that when we use a Tor browser with script blocking and a decent ad blocker, Google stops working altogether: it decides I am no longer a human behind the keyboard and blocks me as if I were some bad bot or crawler, and the session stops working because it isn't being worked by a human being. This is the best proof that privacy has ceased to exist, and that web browsing was developed to become completely and utterly transparent to those parties that take an interest here; we can all imagine who those parties are (government and big commerce).
So the search engine user is entering the surfing ring with one arm tied behind his back, and it is an uneven fight. We need some clever browser developers to help us tip the scales back the other way here. ;D
It was a very sad day when Fravia left us. One can only hope he went to a better place.
Another uncertainty is that one does not know the search destination...
That's why I rely on WebShield and ScriptShield. They can intercept scripts and other nasties on the fly, so I don't have to wonder whether F7 (block JS in K-Meleon) would be a good or bad idea, because F7 must be used before you go to the site, or you have to reload when you discover the problem...
When I use the Secret Agent add-on in firefox to modify HTTP Requests to go kind of stealth...
Stealth mode in SRWare Iron was one of the very few attractions in a product only slightly more useful than Chrome.
Adblockers. I use the “hosts” file a lot, and I wish Avast! would leave it alone, or learn how to parse it. I’ve had to exclude it and my backups from scanning. Fortunately Avast! doesn’t (yet) bother eDexter! I suppose I could always get a HIPS app…
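For readers who haven't tried it: hosts-file blocking works by mapping unwanted ad and tracker domains to an unroutable address, so the browser never reaches them at all. A minimal sketch (the blocked domain names here are just placeholders, not a recommended list):

```
# /etc/hosts (on Windows: C:\Windows\System32\drivers\etc\hosts)
127.0.0.1   localhost

# Send ad/tracker lookups nowhere.
# 0.0.0.0 usually fails faster than 127.0.0.1, since nothing answers there.
0.0.0.0   ads.example.com
0.0.0.0   tracker.example.net
```

As I understand it, this is where a companion like eDexter comes in: with 127.0.0.1-style entries, a tiny local server can answer the blocked requests with a blank image instead of leaving broken-image icons all over the page.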
I like Personal Blocklist as an extension in Google Chrome.
There you can block domains/hosts from appearing in your Google search results.
I also like Tampermonkey a lot; it is meant to run your personal user scripts in Chrome.
That way I could keep Malware Script Detector v.o2b inside this browser,
as Google had disabled that extension half a year ago,
when they shut out everything that was not from their Google Web Store over our heads as insecure.
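For those who have never written one: a Tampermonkey user script is ordinary JavaScript with a metadata header telling the extension which pages to run it on. Here is a minimal sketch in the spirit of Personal Blocklist, hiding search-result links whose domains are on a personal blocklist. The domain names and the div-hiding heuristic are my own illustrative assumptions, not anything from Google's or Tampermonkey's APIs:

```javascript
// ==UserScript==
// @name         Hide blocked domains in search results
// @match        https://www.google.com/search*
// @grant        none
// ==/UserScript==

// Hypothetical personal blocklist; edit to taste.
var BLOCKED = ['spam.example.com', 'ads.example.net'];

// True when a hostname is a blocked domain or one of its subdomains.
function isBlocked(hostname) {
  return BLOCKED.some(function (d) {
    return hostname === d || hostname.endsWith('.' + d);
  });
}

// In the browser, hide the container of every link whose host is blocked.
// (Guarded so the pure matching logic above also runs outside a page.)
if (typeof document !== 'undefined') {
  Array.prototype.forEach.call(document.querySelectorAll('a[href]'), function (a) {
    if (isBlocked(a.hostname)) {
      var box = a.closest('div');
      if (box) box.style.display = 'none';
    }
  });
}
```

The `@match` line restricts the script to Google result pages, and `@grant none` keeps it in the page's own sandbox since no privileged Tampermonkey functions are needed.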