Visualizing eight years of independent reviews

Posted on August 14, 2015 - 12:24 by ccondon

StopBadware has been performing independent reviews of websites blacklisted by our data providers for more than eight years. As we've explained in the past, a manual review done by our staff is not always necessary: if a webmaster requests a StopBadware review of a site on Google's Safe Browsing blacklist, the first step in our review process is an automated request for Google to rescan the site in search of malicious code. If Google's automated systems don't find anything suspicious, that site will come off Google's blacklist without our ever having to touch it. When Google still finds malware, or when one of our other data providers is the blacklisting party, a member of our website testing team uses a variety of tools to scour the site for malicious code and other bad behavior.
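For readers who think in code, that triage boils down to something like the minimal Python sketch below. Every name in it (request_google_rescan, manual_review, and so on) is a hypothetical stand-in for illustration, not our actual tooling.

```python
from dataclasses import dataclass

@dataclass
class RescanResult:
    found_badware: bool

def request_google_rescan(url: str) -> RescanResult:
    # Hypothetical stand-in for the automated rescan request sent to Google.
    return RescanResult(found_badware=False)  # placeholder outcome

def manual_review(url: str) -> str:
    # Hypothetical stand-in for a human tester scouring the site.
    return "queued_for_manual_review"

def triage_review_request(url: str, provider: str) -> str:
    """Route a review request to the cheapest resolution path."""
    if provider == "google":
        # First step: ask Google's automated systems to rescan the site.
        if not request_google_rescan(url).found_badware:
            # Clean rescan: the site comes off Google's blacklist
            # without a human ever touching it.
            return "closed_automatically"
    # Google still finds malware, or another data provider
    # (ThreatTrack Security, NSFocus) did the blacklisting.
    return manual_review(url)

print(triage_review_request("http://example.com", "google"))
```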

As our home page proclaims in red, we've helped de-blacklist more than 171,000 websites since 2007. Before we shutter operations as an independent nonprofit next month, we want to give our community a better idea of what goes into that number. 

Since we started collaborating with Google, and later ThreatTrack Security and NSFocus, we've performed 53,167 manual reviews. We've also processed an additional 188,149 review requests that were resolved automatically through our integration with Google. Those aren't all unique requests, so simply adding the two figures together wouldn't yield an accurate total. Here's what all those review requests look like over time:

Why the decline? 

You'll undoubtedly notice that we received many more review requests early on than we do today. Better security awareness, wide availability of relatively low-cost security tools, and default use of things like Webmaster Tools all contribute to the decline we've experienced in review requests. We also have better ways of detecting and weeding out abusive requests than we used to. 

Unfortunately, something else that's contributed to the decline in review requests is malware distributors' wide-scale use of stealthier, more targeted methods like malvertising. When a resource is compromised only very briefly (e.g., through an infected ad network), even when blacklist operators are able to detect the infection and warn users away, the compromise is often resolved too quickly for StopBadware's Clearinghouse to reflect that the resource was ever blacklisted. Generally speaking, if something is blacklisted for fewer than six hours, we won't have a record of it in our Clearinghouse. On the one hand, this is good news: we want blacklists to operate as narrowly as possible, maximizing user protection while minimizing the penalty to site owners. On the other hand, it's bad news: malicious actors can effectively wield powerful technologies to spread malware in ways that are difficult to detect and counter.
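To make that six-hour threshold concrete, here's a small sketch of the recording rule; the constant and function names are illustrative assumptions, not the Clearinghouse's actual implementation.

```python
from datetime import datetime, timedelta

# Assumed threshold, per the rule of thumb above.
CLEARINGHOUSE_THRESHOLD = timedelta(hours=6)

def should_record(blacklisted_at: datetime, delisted_at: datetime) -> bool:
    # Only blacklisting events that outlast the threshold
    # leave a record in the Clearinghouse.
    return (delisted_at - blacklisted_at) >= CLEARINGHOUSE_THRESHOLD

start = datetime(2015, 8, 1, 9, 0)
# A malvertising-style compromise cleaned up in 90 minutes leaves no trace:
print(should_record(start, start + timedelta(minutes=90)))  # False
print(should_record(start, start + timedelta(hours=12)))    # True
```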

What's not included in this data? 

What you don't see in this chart are the tens of thousands of URLs we've reviewed in bulk for web hosting providers, AS operators, and other network providers over the years. We've worked with everyone from dynamic DNS companies and bulk subdomain providers to small resellers and abuse departments at big companies to clean up malicious resources on their networks and help remove them from blacklists. Most of this process is manual, and because it's initiated through trust and human communication rather than by clicking a button, bulk reviews aren't reflected in our public review data.

StopBadware's review process will continue to operate normally during and after our operations transfer to our research team at the University of Tulsa. Thanks to our research scientist, Marie Vasek, for putting this data together!

Key metrics for 2013

Posted on January 8, 2014 - 14:31 by ccondon

In 2013, StopBadware received requests for independent reviews of just under 40,000 blacklisted URLs. For those unfamiliar with our process, here’s some background on how it works. Because of the way our review request system interfaces with Google (one of three data providers whose blacklist data we track), it’s not necessary for us to manually test every URL submitted to us for review. Many reviews are closed automatically when Google’s systems re-scan the sites in question and do not find badware.

[Chart: StopBadware review requests, 2012-2013]

One of our key metrics is the number of reviews we have to perform manually, as opposed to the number of reviews that are closed automatically because their requestors have been successful in cleaning up infections. Despite the 69% increase in review requests for 2013, we manually tested 35% fewer URLs this year than last year.
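As a quick sanity check on those numbers, the stated 69% increase lets us back out the approximate 2012 request volume from the "just under 40,000" figure above:

```python
# Rough arithmetic from the figures in the post; 40,000 is an
# approximation of "just under 40,000", not an exact count.
requests_2013 = 40_000
implied_2012 = requests_2013 / 1.69
print(round(implied_2012))  # ~23,669 review requests in 2012
```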

Caveats to this data: These figures generally do not include bulk review requests, which come from hosting providers, bulk subdomain providers, and network operators who contact us in good faith and request review of a large number of URLs. Bulk review requests can range from a few dozen to several thousand sites at a time, and we receive them regularly*. We also did not manually test any URLs blacklisted by ThreatTrack Security for several months in 2013 due to a technical issue with their de-listing process.

Review results

[Charts: StopBadware review results 2012; StopBadware review results 2013; Types of badware in StopBadware reviews 2012; Types of badware in StopBadware reviews 2013]

Incremental progress

In addition to the encouraging number of reviews that closed automatically in 2013, we noted this year that our clickthrough traffic from Firefox warnings fell by over 50%. This, contrary to common intuition, is a good thing. Firefox users get to StopBadware by choosing to ignore a “Reported attack site” warning and then clicking a button labeled “This isn’t an attack site” on a toolbar Mozilla shows on infected sites. So Internet users who find their way to StopBadware’s Firefox landing page have not only ignored a malware warning; they’ve indicated they put little stock in the intel behind the warning.

We’ve communicated to Web users and webmasters for years that it’s an unfortunate but commonplace occurrence for legitimate websites to be infected with malware without the knowledge of their owners. It’s a tough message to impart successfully, and it’s tough to measure how well it’s sunk in for the general Internet populace. The drastic drop in warning clickthroughs is a very good sign, especially when combined with the review numbers for 2013.

We’re used to seeing dramatic headlines about malware and lack of security, but despite the news, we still see indications of incremental progress. Not all is lost; in fact, change both positive and negative seems to come much the same as it always has. Progress is slow and won by hard work, but it's work we intend to keep doing with help from our partners and friends. Thanks for your support this year!

 
*As of May 2013, StopBadware changed its bulk review policy. We now process bulk reviews of URLs blacklisted by Google only.