It’s been 30 years since The Terminator graced big screens with its dystopian view of the future, and (spoiler alert) it didn’t go well for the human race. James Cameron’s sci-fi thriller starring Arnold Schwarzenegger was pretty game-changing for its time, but little did anyone know that the apocalyptic vision of The Terminator would actually come true. Sort of.
Spam’s been around for a while now, and this nasty blight on modern society was bad enough before the bots got involved. But now that spam is automated, it’s gotten worse, and you might be surprised to learn just how much of modern spam is controlled by botnets. Hint: it’s a big number.
Incapsula, an Internet security firm, published a study for 2012 revealing that 51% of website traffic was generated by non-humans, and 31% of all traffic was identified as damaging intruders. These are alarming numbers, and Incapsula points out that a good portion of the traffic visiting our sites is made up of “shady non-human visitors including hackers, scrapers, spammers and spies of all sorts who are easily thwarted, but only if they’re seen and blocked.” If you use Google Analytics to assess your website’s activity, this is a frustrating development, too: Analytics counts visitors via a JavaScript snippet that most bots never execute, so you won’t see the malicious and automated activity reflected in your site’s traffic metrics.
Pretend that you’re a T-1000 terminator and jump ahead a year. Incapsula published its 2013 report in December, and the numbers are even more dismaying. Bot visits climbed 10.5 percentage points, from 51% to a staggering 61.5% of all website traffic (a relative increase of roughly 21%, which is the figure the infographic on Incapsula’s blog cites). And in case you were wondering, that leaves only 38.5% of Internet traffic generated by live human beings.
If the numbers are accurate, that’s a pretty disturbing testament to the current state of the Internet. Incapsula’s methodology was to observe 1.45 billion visits over a 90-day period, captured from 20,000 sites on Incapsula’s network. The traffic came from 249 countries and territories and covered sites ranging from small free websites to large enterprise platforms. The breakdown looks like this:
- 31% – Search engines and other ‘benevolent’ bots
- 20.5% – Impersonators/spies. Dangers include marketing intelligence gathering, Layer 7 DDoS attacks (service degradation and website downtime), and bandwidth consumption
- 5% – Scrapers. Dangers include content theft and duplication, theft of email addresses for spamming, and reverse engineering of pricing and business models
- 4.5% – Hacking tools. Dangers include data theft, malware injection and distribution, website and server hijacking, and website defacement or content deletion
- 0.5% – Spammers. Dangers include posting of irrelevant content that could (and probably will) annoy legitimate users, posting of malicious links for malware propagation and phishing, and turning a website into a ‘link farm’, which can get it delisted from search engines
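To make the categories above concrete, here is a minimal sketch of how a server log might be bucketed by user-agent string. The patterns and category names are illustrative assumptions for this article, not Incapsula’s actual classifier, which draws on far richer signals than the user-agent alone:

```python
import re

# Illustrative user-agent substrings for a few of the buckets above.
# These patterns are examples only; real classifiers use many more signals.
CATEGORY_PATTERNS = {
    "good_bot": re.compile(r"googlebot|bingbot|slurp", re.I),
    "scraper": re.compile(r"scrapy|httrack|curl|wget", re.I),
    "hacking_tool": re.compile(r"sqlmap|nikto|nmap", re.I),
}

def classify(user_agent: str) -> str:
    """Return a coarse traffic bucket for a user-agent string."""
    for category, pattern in CATEGORY_PATTERNS.items():
        if pattern.search(user_agent):
            return category
    # Anything else is either a human browser or an impersonator
    # pretending to be one; the user-agent alone can't tell them apart.
    return "human_or_impersonator"

print(classify("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(classify("sqlmap/1.7"))
print(classify("Mozilla/5.0 (Windows NT 10.0)"))
```

Note the last bucket: as the report’s “impersonators” category shows, a user-agent header is trivially forged, which is exactly why this kind of string matching can only ever be a first pass.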
The report points out that “The bulk of [the] growth is attributed to increased visits by good bots (i.e., certified agents of legitimate software, such as search engines) whose presence increased from 20% to 31% in 2013,” and suggests that this growth could be attributed to the evolution of web-based services and the increased activity of existing bots.
The report also notes that spamming has declined. While roughly 31% of traffic still comes from malicious bots, spam-bot activity decreased from 2% of traffic in 2012 to 0.5% in 2013. This could be due to Google’s anti-spam campaign and the Penguin 2.0 and 2.1 updates. The report cites a 75% decrease in automated link-spamming activity, a significant drop.
Another interesting point the report highlights is an “8% increase in the activity of ‘Other Impersonators’ – a group which consists of unclassified bots with hostile intentions.” The common factor here is that all of its members try to assume the identity of someone else. “Some of these bots use browser user-agents while others try to pass themselves as search engine bots or agents of other legitimate services. The goal is always the same – to infiltrate their way through the website’s security measures.”
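One practical defense against the search-engine impersonators described above is the reverse-then-forward DNS check that Google itself documents for verifying Googlebot: the PTR record for the visiting IP must end in googlebot.com or google.com, and a forward lookup of that hostname must resolve back to the same IP. A minimal sketch in Python (the injectable lookup arguments are a testing convenience of this sketch, not part of any standard API):

```python
import socket

def is_genuine_googlebot(ip: str,
                         reverse=socket.gethostbyaddr,
                         forward=socket.gethostbyname) -> bool:
    """Verify a visitor claiming to be Googlebot.

    Google's documented check: the PTR record for the IP must end in
    googlebot.com or google.com, and a forward lookup of that hostname
    must resolve back to the same IP.
    """
    try:
        hostname, _, _ = reverse(ip)  # reverse (PTR) lookup
    except OSError:  # no PTR record or DNS failure
        return False
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        return forward(hostname) == ip  # forward lookup must round-trip
    except OSError:
        return False
```

A forged `Googlebot` user-agent passes no part of this check, because the impersonator’s IP won’t reverse-resolve into Google’s domains. The same reverse/forward pattern works for other major crawlers that publish their hostnames.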
The takeaways from this report shouldn’t make anyone feel good about what’s happening here. If the numbers continue in this direction, it will have a significant impact on the way we view and react to our web traffic. “The 8% increase in the number of such bots highlights the increased activity of [career] hackers, as well as the rise in targeted cyber-attacks. This is also reflective of the latest trends in DDoS attacks, which are evolving from volumetric Layer 3-4 attacks to much more sophisticated and dangerous Layer 7 multi-vector threats.”