Because applications need to be both secure and fast, the industry is moving toward mitigating bad bots. With nearly 25% of all web traffic generated by bad bots, we must be able to detect and block them. Of course, this ratio depends on your market: gambling companies and airlines see approximately 54% and 44% of their traffic coming from bad bots, respectively.
An essential part of the technological evolution is creating systems, machines and applications that autonomously create, collect and communicate data. This automation frees IT staff to focus on other tasks. Bots currently generate more than half of all internet traffic, but unfortunately every evolution brings with it some form of abuse. ‘Bad’ bots pursue a variety of goals, including web scraping, web application DDoS and clickjacking. While simple script-based bots are not much of a challenge to detect and block, advanced bots dramatically complicate mitigation by using techniques such as mimicking user behavior, rotating through dynamic IP addresses, and operating behind anonymous proxies and CDNs.
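To make the gap between simple and advanced bots concrete, here is a minimal sketch of the kind of naive defense that only catches script-based bots: a sliding-window rate limiter keyed by client IP. The class name, thresholds, and window size are illustrative assumptions, not anything from the article; a bot rotating dynamic IPs or hiding behind a proxy, as described above, would evade it entirely.

```python
import time
from collections import defaultdict, deque

# Illustrative only: a naive rate-based detector. An advanced bot that
# rotates IP addresses or throttles itself to human-like speeds will
# pass this check, which is exactly why such heuristics are not enough.
class RateBasedBotDetector:
    def __init__(self, max_requests=20, window_seconds=10.0):
        self.max_requests = max_requests
        self.window_seconds = window_seconds
        self.history = defaultdict(deque)  # client IP -> request timestamps

    def record_request(self, client_ip, now=None):
        """Record a request; return True if the client exceeds the rate limit."""
        now = time.monotonic() if now is None else now
        q = self.history[client_ip]
        q.append(now)
        # Drop timestamps that have fallen out of the sliding window.
        while q and now - q[0] > self.window_seconds:
            q.popleft()
        return len(q) > self.max_requests
```

A real deployment would combine many such signals (headers, TLS fingerprints, behavioral telemetry) rather than relying on request rate alone.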
CAPTCHA stands for “Completely Automated Public Turing test to tell Computers and Humans Apart”.
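The challenge/response shape of that Turing test can be sketched in a few lines. This is a hypothetical plain-text arithmetic challenge, easy for a human to answer; note that without image distortion, audio, or behavioral signals it is also trivially solvable by a script, so it only illustrates the flow, not a production CAPTCHA.

```python
import random

# Hypothetical sketch: generate a challenge a human can answer, then
# verify the response. Real CAPTCHAs make the challenge hard for
# machines (distorted images, audio, behavioral analysis).
def make_challenge(rng=None):
    rng = rng or random.Random()
    a, b = rng.randint(1, 9), rng.randint(1, 9)
    return f"What is {a} + {b}?", a + b

def verify(answer, expected):
    try:
        return int(answer) == expected
    except (TypeError, ValueError):
        return False
```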