The rapid increase in web-based services has also led to an increase in non-human Internet traffic, accompanied by growing sophistication in hacking techniques.
While website visits by non-human entities accounted for half of all Internet traffic in 2012, that figure rose substantially during 2013, according to the Bot Traffic Report 2013 from cloud-based application delivery platform Incapsula. Total bot traffic for 2013 was up 21% on the 2012 volume, reaching 61.5% of all web traffic.

So are all these machine-generated website visits malicious? Not quite. In fact, the bulk of the growth is attributed to greater activity by ‘good’ bots, i.e. certified agents of legitimate software such as search engines. Site-‘crawling’ visits by these good bots rose 55% on the previous year to account for 31% of all traffic. Incapsula attributes this growth mainly to the expansion of web-based services, pointing in particular to newly established search engine optimisation (SEO) services that crawl a site 30-50 times a day or more.
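For site operators, the visit rate of well-behaved crawlers can at least partly be managed through a robots.txt file. A minimal illustrative sketch follows; note that the Crawl-delay directive is respected by some crawlers (e.g. Bingbot) but ignored by others, including Googlebot, and the bot name in the second group is hypothetical:

```text
# robots.txt - illustrative sketch only
User-agent: *
Crawl-delay: 10
# asks compliant crawlers to wait ~10 seconds between requests

# "ExampleBot" is a hypothetical crawler name, not a real service
User-agent: ExampleBot
Disallow: /
```

Compliant ‘good’ bots read this file before crawling; the malicious bots discussed below, of course, simply ignore it, which is why robots.txt is a traffic-management tool rather than a security measure.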
Four types of malicious bots
Incapsula also notes increased activity by existing bots: the visit patterns of some ‘good’ bots – e.g. search-engine crawlers – consist of recurring cycles, and in some cases these cycles are getting shorter to allow higher sampling rates, which in turn generates additional bot traffic. The overall proportion of malicious bots remains unchanged at 31%, and Incapsula identifies four types.

First come the ‘Spammers’. This is the only area where non-human traffic noticeably declined – down 1.5 percentage points of the total – the most plausible explanation being Google’s anti-spam campaign targeting manipulative SEO techniques. Spammers seek out websites that allow posts and comments and then plant irrelevant content and links that could harm users or even damage their systems. Moreover, if the Spammers turn a site into a ‘link farm’, the site may end up blacklisted by search service providers.

The second category is ‘Hacking Tools’ designed to attack websites. These bad bots steal data – credit-card details, for example – inject and distribute malware, hijack websites and servers, and sometimes even delete content.
Hackers becoming more sophisticated
Thirdly there are the ‘Scrapers’, whose general aim is to pirate information. They steal and duplicate content, harvest email addresses for spamming, and even reverse-engineer companies’ pricing and business models. The most common targets are travel-industry websites, news sites, e-commerce stores and forums.

Finally, those that Incapsula dubs ‘Other Impersonators’ attack just about anyone. This fourth type of malicious visitor – a range of unclassified bots, all with hostile intentions – skews market intelligence gathering, causes service degradation and, by saturating bandwidth as part of its tactics, may even bring websites down entirely. In 2013 the activity of these ‘Other Impersonators’ rose by 8%.

The common denominator for this group is that all these attackers try to assume someone else’s identity. Some of these bots rely on fake browser user-agent strings, while others pass themselves off as search-engine bots or agents of other legitimate services in order to slip through a website’s security shield. In terms of functionality and capability, such ‘Impersonators’ usually sit higher in the bot hierarchy: they can be automated spy bots, human-like Distributed Denial of Service (DDoS) agents, or a Trojan-activated ‘barebones’ browser. One way or another, these are the hallmark tools of top-tier hackers, and advances in these techniques mean their attacks are becoming more frequent and more serious.