In all honesty I have never been much concerned about mixed content pages, as there is nothing I can do about the situation aside from blocking the mixed content. I generally avoid HTTPS other than for known sites; as I have said before, those who try to browse forcing everything over HTTPS don’t help themselves.
The network shield would still be working even if the original parent page is HTTPS, so if the mixed content comes from a malicious site it would still alert. Whether the mixed content was intentional or the result of a hacked site, your browser should at least warn you about it before you proceed with entering any logon details, etc.
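To make the idea concrete: mixed content simply means an HTTPS parent page pulling in plain http:// subresources. A minimal, hypothetical Python sketch of how one might scan fetched HTML for such references (the tag/attribute list is my own simplification, not any browser's or shield's actual logic):

```python
from html.parser import HTMLParser

# Attributes that typically pull in subresources (an assumption for
# illustration; real browsers track many more cases than this).
RESOURCE_ATTRS = {"src", "href", "action"}

class MixedContentScanner(HTMLParser):
    """Collect plain-http resource URLs referenced from a page."""

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in RESOURCE_ATTRS and value and value.startswith("http://"):
                self.insecure.append((tag, value))

def find_mixed_content(html_text):
    scanner = MixedContentScanner()
    scanner.feed(html_text)
    return scanner.insecure

page = ('<html><body><img src="http://ads.example.com/banner.png">'
        '<script src="https://cdn.example.com/app.js"></script></body></html>')
print(find_mixed_content(page))  # only the plain-http image is flagged
```

The https:// script passes while the http:// image is flagged, which is exactly the asymmetry the browser warnings discussed here are about.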
I would also say that Firefox with NoScript and RequestPolicy is likely to afford you some sort of protection, as there is little point in trying to harvest this data if it can’t be collected/sent to the originator of the scam.
Thank you for pointing out to us all that the HTTP part of mixed content on websites is being protected by the avast shields. That must be a reassuring thought for folks who find themselves from time to time on public networks in public places, for instance at an airport, where they could fall victim to a Man-in-the-Middle set-up, which is more easily performed under such circumstances. Good to know that in-browser script protection like NoScript and RequestPolicy will protect against malicious JavaScript injections there. In Google Chrome the user is warned when visiting a mixed-content site (see attached gif). Good that we have pointed this out to forum users; hopefully they will watch their clicks…
Well, that is an assumption on my part and not a certainty: I don’t know whether, being effectively wrapped up in the parent HTTPS connection, the mixed content might get past the web shield. So that would have to be something answered by the developers.
As for the network shield, I don’t believe the connection method/protocol matters, as it is basically looking at the domain name. So hopefully that would give some limited protection.
Firefox is a little clearer in its warning about mixed content (as are IE8 and most likely IE9), and the warning should be enabled by default.
If the capture and transmission of the data require scripts or cross-site requests, then NoScript and RequestPolicy (I don’t know about BetterPrivacy) could well help in blocking that capture/transmission.
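RequestPolicy’s core idea, roughly, is default-deny for cross-site requests: a page may only load from or send to its own host unless the user has approved the destination. A hypothetical sketch of that rule (the whitelist format and function names are mine, not the extension’s actual internals):

```python
from urllib.parse import urlparse

# Hypothetical whitelist of (origin host, destination host) pairs the user
# has approved; illustration only, not RequestPolicy's real rule storage.
ALLOWED = {("example.com", "cdn.example.com")}

def allow_request(page_url, request_url):
    """Default-deny: permit only same-host or explicitly approved requests."""
    origin = urlparse(page_url).hostname
    dest = urlparse(request_url).hostname
    return dest == origin or (origin, dest) in ALLOWED

print(allow_request("https://example.com/", "https://example.com/app.js"))      # True (same host)
print(allow_request("https://example.com/", "https://cdn.example.com/lib.js"))  # True (whitelisted)
print(allow_request("https://example.com/", "http://evil-harvester.net/log"))   # False (blocked)
```

Under a policy like this, injected script could still run, but its attempt to send harvested data to a third-party host would be refused, which is the point made above about the data never reaching the scammer.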
Chrome tracks mixed scripting more precisely than Firefox or IE (as seen from the attachment in my previous posting). Here the user can also install the Mixed Content Protection extension: http://userscripts.org/scripts/source/69977.user.js (it only works for elements inside the body tag).
Because of this thread I have now installed that in Google Chrome.
I don’t see anything in the image that you posted other than a URL/address bar (in the middle of a huge 800x600 white space), and nothing about a warning or tracking.