Why Manage Bots? Here Are Five Reasons.

By Anand Hariharan

According to Internet Retailer magazine, Thanksgiving weekend was the largest online shopping weekend ever. Saturday and Sunday, Nov. 24-25, generated $6.4 billion in online sales, according to Adobe Analytics. Just a day later, Cyber Monday turned out to be the biggest online shopping day in history! Overall, online sales grew by over 20% from the same period last year. And we’re not even close to being done: I fully expect the 2019 holiday shopping season to leave last year’s records in the dust.

With more shoppers buying online than ever before and e-commerce spending running into the billions, e-commerce businesses are “natural” targets for malicious cyber-attacks. It’s quite likely, in fact, that your site is currently under surveillance by hackers and cybercriminals looking for a way to either steal credit card data or hijack your site (and hold it for ransom).

Attackers are becoming smarter, more resolute, and more sophisticated, employing bots and botnets to do their dirty work at larger scale and against more websites. (Read this blog on how typical Magento attacks are executed.) In fact, it’s safe to assume that almost every major e-commerce website saw a bot attack during Cyber Week.

Up to half of your online storefront’s traffic could be bots, but not all of them are bad. Search engine crawler bots (from Google, Bing, Baidu, and others), for example, index your site based on keywords and intent. If you blocked all bots indiscriminately, you could see a significant drop in your search rankings over time.

Enter Bot Management.
Bot Management begins with identifying and classifying bots into various types – good, bad, and malicious. Malicious bots, without a doubt, need to be denied access to your site and application infrastructure. Good bots, on the other hand, need to be managed optimally to ensure minimal load on your infrastructure, and maximum available capacity for your real users. This is an ongoing and iterative process, and one that needs a comprehensive strategy.
Here are the top five reasons you need a bot management solution:

1. Bot attacks are becoming more frequent.

According to an industry report, 21.45% of the traffic on an average e-commerce site in 2017 was comprised of bad and malicious bots. To learn more about the different kinds of attacks hackers can execute using these bots (or botnets), read my previous blog on bot classification.

The overall number of bot attacks has been rising: last year it grew by 9.5% across industries. Account takeover attacks now occur two to three times a month on an average site, and up to three times more frequently in the immediate aftermath of a breach.

The important thing is that whether executed covertly or by brute force (like a DDoS attack), the impact of an attack is always devastating. It can damage an online storefront’s brand, reputation, revenue, and shopper retention. Prevention is always better than cure.

2. Your competitors are deploying bots to steal pricing data.

Competitors often resort to deploying scrapers to steal valuable content and pricing information to gain an unfair advantage. If they can gain real-time access to your product availability and pricing information, they can change their pricing to lure customers away from you. A sophisticated bot management solution can identify these scrapers and redirect them to an alternative website or fake application backend, or simply drop them from your website.

3. Identification of bots is difficult.

Good bots usually announce themselves. However, bad and malicious bots can masquerade as good bots, or humans. The more sophisticated ones can even mimic human behavior to some degree to evade detection. This makes it hard to tell one from the other. In fact, e-commerce has the highest proportion (22.9%) of such “sophisticated” bots, according to an industry report.

The most rudimentary way of identifying bots is by checking user agents and IP addresses. Relying on user agents alone is problematic because they are trivial to forge. For instance, it’s easy for a bot to claim that it’s from Google; unless its IP address is validated as actually belonging to Google, the claim means nothing.
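As a sketch of that validation step, the standard technique is a reverse DNS lookup on the claimed crawler’s IP, a check that the resulting hostname falls under the crawler’s domain, and a forward lookup to confirm the hostname resolves back to the same IP. The function below is illustrative (the name `verify_crawler_ip` and the injectable resolver hooks are ours, added so the logic can be exercised without live DNS):

```python
import socket

def verify_crawler_ip(ip, allowed_suffixes=(".googlebot.com", ".google.com"),
                      reverse_dns=None, forward_dns=None):
    """Return True if `ip` reverse-resolves into an allowed crawler domain
    and the hostname forward-resolves back to the same IP."""
    # Resolver hooks are injectable so the logic can be tested offline;
    # by default they use the system resolver.
    reverse_dns = reverse_dns or (lambda addr: socket.gethostbyaddr(addr)[0])
    forward_dns = forward_dns or socket.gethostbyname
    try:
        hostname = reverse_dns(ip)
    except OSError:
        return False  # no PTR record: cannot be a verified crawler
    if not hostname.endswith(allowed_suffixes):
        return False  # hostname is outside the crawler's domain
    try:
        return forward_dns(hostname) == ip  # forward lookup must round-trip
    except OSError:
        return False
```

A spoofed bot that merely sets a Googlebot user agent fails the round-trip and can be treated as suspect.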

A more sophisticated way of identifying malicious bots is behavioral analysis. However, when you rely on people or static policies to distinguish human behavior from bots, you are limited by human biases and user knowledge, and you cannot adapt dynamically to changes in shopping behavior or bot attack patterns.

The smart way involves the use of machine learning to classify user behavior as “human” or “not human”. In other words, you’re using machines (and not people and pre-built policies) to identify and combat machines. If behavior has been identified as potentially non-human, you can issue challenges that only humans are capable of performing, for instance, typing out a number code (on display), solving a picture grid, or simply checking the “I am not a robot” box. If behavior is confirmed to be non-human, you can take actions to block or redirect the bot.
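To make the idea concrete, here is a deliberately simple rule-based scorer over session features, not the machine-learning approach itself: a real system would train a model on labeled sessions, and the feature choices and thresholds below are illustrative assumptions, not anyone’s production values.

```python
def score_session(events):
    """Classify one session as 'human' or 'likely-bot'.

    events: list of (timestamp_seconds, path) tuples, in order.
    Toy heuristic: bots tend to hit many distinct URLs at machine
    speed with no dwell time between requests.
    """
    if len(events) < 2:
        return "human"  # too little data to judge
    duration = events[-1][0] - events[0][0]
    rate = len(events) / max(duration, 1e-9)        # requests per second
    unique_paths = len({path for _, path in events})
    if rate > 5 or (unique_paths == len(events) and rate > 1):
        return "likely-bot"
    return "human"
```

Sessions flagged `likely-bot` are the ones to challenge with a CAPTCHA-style test; confirmed non-human traffic can then be blocked or redirected, as described above.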

Other bot identification techniques include IP reputation-based filtering and browser tests.

All the above techniques, besides being iterative, involve specialized technology and considerable expertise, and cannot typically be executed in-house.

4. Good bots need to be taken care of, while imposing the least burden on your application infrastructure.

Good bots, once identified, need to be managed in a way that imposes the least burden on the application infrastructure, leaving as much capacity as possible for real human traffic.

When trusted good bots are requesting pages, these can be delivered from your cache without accessing the backend. This improves performance by ensuring available capacity at the application backend for real human visitors.
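A minimal sketch of that crawler-aware caching idea follows. The class name, the `ttl` window, and the `render_page` hook are all illustrative, not a description of any particular vendor’s system:

```python
import time

class CrawlerCache:
    """Serve verified good bots from cache so the origin backend only
    renders each page once per TTL window; humans always hit the origin."""

    def __init__(self, render_page, ttl=300):
        self.render_page = render_page   # expensive backend call
        self.ttl = ttl                   # cache lifetime in seconds
        self._store = {}                 # path -> (timestamp, html)

    def get(self, path, is_verified_bot, now=None):
        now = time.time() if now is None else now
        if is_verified_bot:
            cached = self._store.get(path)
            if cached and now - cached[0] < self.ttl:
                return cached[1]         # cache hit: backend untouched
        html = self.render_page(path)    # humans (and cache misses) hit origin
        self._store[path] = (now, html)
        return html
```

The effect is that a crawler re-fetching the same URL within the TTL consumes no backend capacity, which is exactly the headroom you want to preserve for real shoppers.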

If you can manage responses to search engine crawlers outside of your infrastructure through an intelligent caching system, such as the one we have at Webscale, you’re doing a good job of making your backend more efficient and potentially saving some operational costs as well.

5. Your key metrics might be skewed.

Key e-commerce metrics, such as web traffic and conversion rates, can be significantly skewed if you don’t accurately account for bots. This leads to poor benchmarking and misaligned investments, budgets, and website design choices; in general, bad business decisions.

By filtering out bots from real human visitors, merchants can boost conversion rates, and gather truly accurate data on clean traffic and genuine users, thus facilitating better-informed decisions.
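The skew is easy to see in a worked example. Assuming sessions have already been labeled by a bot-detection step (the `is_bot` and `converted` field names here are illustrative), a conversion rate computed over all traffic versus over human traffic only can differ substantially:

```python
def conversion_rate(sessions):
    """Conversion rate over human sessions only.

    sessions: list of dicts with boolean 'is_bot' and 'converted' flags.
    """
    humans = [s for s in sessions if not s["is_bot"]]
    if not humans:
        return 0.0
    return sum(s["converted"] for s in humans) / len(humans)
```

For instance, with 100 sessions of which 40 are bots and 3 humans convert, the naive all-traffic rate is 3%, while the clean human-only rate is 5%; decisions keyed to the 3% figure would understate how well the site actually converts.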

Security is an arms race

Security has been, is, and will always remain an arms race. Cyber-attackers keep getting more and more sophisticated. If you build a higher wall, they’ll find a way to get over it, and you’ll need to go build a bigger one.

In that sense, security is a very iterative process. Attackers are using sophisticated systems and machines, so you need to be as well. Relying on an in-house team of security personnel to fight hackers who use sophisticated bots is the cyber-equivalent of bringing a knife to a gunfight.

It’s advisable to talk to experts and trusted third parties that tackle cyber security for a living, so you can go back and focus on what you know and do best.

If you’re looking to build a business case for Bot Management or implement a comprehensive security solution to stay ahead of hackers (and competitors), fill out this form or drop us an e-mail at sales@webscalenetworks.com.

Anand Hariharan

Anand Hariharan is the Vice President of Products at Webscale. He is a product management, marketing, and business development leader with significant success in growing cloud-based businesses across different industries and geographies.