First and foremost, script blockers are good instruments in the hands of those who know which sites to block, or rather to keep blocked, and not to use at full functionality there. The rule should be: leave everything blocked until it is proven secure or known to be safe enough to unblock.
And how would one know where one can toggle and where not? There are many indicators of where not to venture out. When I see a site flagged by Google Safe Browsing and/or blacklisted by Yandex, I am interested to know why (that is why I do what I do), but I decide not to go there directly or circumvent the Google block; likewise, when Google says my connection is not private and attackers may try to intervene, I won't go out there and will nicely keep such sites blocked. ScriptSafe shows red, uMatrix stays enabled. Lift blocks on third-party links only if you know those sites to be clean; curiosity killed many a cat.
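For those who want to automate that first look-up before visiting, here is a minimal sketch of a query against the Google Safe Browsing v4 Lookup API. It assumes you have your own API key; the placeholder key, the client name and the test URL are only illustrative, and an empty answer merely means "not currently listed", not "proven clean".
[code]
# Minimal sketch: ask the Google Safe Browsing v4 Lookup API about one URL.
# Assumptions: API key, clientId/clientVersion and the test URL are placeholders.
import json
import urllib.request

API_KEY = "YOUR_SAFE_BROWSING_API_KEY"   # hypothetical placeholder
ENDPOINT = ("https://safebrowsing.googleapis.com/v4/threatMatches:find?key="
            + API_KEY)

def lookup(url_to_check):
    """Return the list of Safe Browsing matches for a URL (empty = not flagged)."""
    payload = {
        "client": {"clientId": "forum-example", "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING", "UNWANTED_SOFTWARE"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": url_to_check}],
        },
    }
    request = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        result = json.load(response)
    return result.get("matches", [])

if __name__ == "__main__":
    hits = lookup("http://example.com/")
    print("flagged by Safe Browsing" if hits else "not currently flagged")
[/code]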
Now, sometimes with a clean site we have to allow some additional functionality to make that site work. At urlquery.net, for instance, you have to allow google-analytics.com, or else you will not get the scan results. This Norse scan site may have issues, as I find (a quick way to peek at such headers yourself is sketched right after these findings):
HTTP Server: Apache HTTP Server 2.2.22 (Outdated)
Operating System: Ubuntu 12.10 (Quantal Quetzal) (Unsupported)
PHP Version: 5.4.6-1ubuntu1.8 (Outdated)
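To see for yourself what version information a server volunteers, a simple HEAD request is usually enough. This is only a rough sketch: the example.com URL is a placeholder, some servers refuse HEAD requests, and a banner can be stripped or faked, as discussed below.
[code]
# Rough sketch: show which version-revealing headers a web server sends back.
# The URL is a placeholder; swap in the site you are curious about.
import urllib.request

def server_banner(url):
    """Return the Server / X-Powered-By headers of a site, if it exposes them."""
    request = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(request, timeout=10) as response:
        headers = response.headers
    return {name: headers[name]
            for name in ("Server", "X-Powered-By")
            if headers.get(name)}

if __name__ == "__main__":
    # A leak like the findings above would come back looking something like
    # {'Server': 'Apache/2.2.22 (Ubuntu)', 'X-Powered-By': 'PHP/5.4.6-1ubuntu1.8'}
    print(server_banner("http://example.com/"))
[/code]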
This indication need not be the final word, as some website admins deliberately present an outdated server version to mislead would-be attackers. But in all cases it is better server policy simply to make sure the server header version and other extensive info is not spread globally and to hackers; for Apache, for instance, directives like ServerTokens Prod and ServerSignature Off trim what the server gives away, and a website or server admin who knows what he is doing knows how to adjust his server configuration accordingly. Alas, many do not and are incompetent security-wise, and that means a threat to the website's visitors and to themselves. There is more to this than one would think. Whenever I see a warning for a website because of malware, scam, fraud or phishing from WOT, and especially when there is a user report from someone who knows what he or she is reporting, or a listing from a third party (malware resources), I won't even consider visiting such a site. Red ball from Bitdefender: don't even try to go there. Dr.Web URL link checker detections: a big no-no. Sites on their malicious website lists: do not venture out there.
And the Avast block when malscripts are detected has not been equalled by anyone; it is very accurate. If it is blocked, do not go there.
When you want to know why, ask in the viruses and worms section and some of us may know.
There is a remote chance of false positives, but as a rule I only visit and allow non-flagged websites. And always remember: the clean site of one minute ago can be the malicious site of the next minute. There are many more aspects to this, but the above is a good rule of thumb for advanced users who want to use script blocking and request blocking.

polonus