Simply put, the days when firewalls and a large enough pipe to the internet were enough to protect your network have long since passed. Any organization or website is a potential target, and with a given attack likely to overwhelm homegrown defenses, most companies are moving their mitigation tools offsite. The cost of downtime – upwards of $9,000 per hour for small businesses and $690,000 for large companies – is simply too great to risk going it alone.
It seems hardly a week can pass without some cloud-based security service provider announcing the latest expansion of its cloud infrastructure. The cadence has taken on something of an arms-race mentality among these providers, perhaps in response to a sense that this is what the market wants to see in a service provider. After all, X+1 Points of Presence (POPs) is better than X, right?
Well, the real answer is that most confounding of answers: it depends. In this case, the dependency is a question of what specific problem you’re trying to solve.
Successfully protecting against web-based attacks is like trying to win a game whose rules keep changing… only nobody tells you what the new rules are! Static cloud security services cannot help you win the web security game. Only cloud security services that continuously and automatically adapt to the rapidly evolving threat landscape and to the assets they protect can ensure you are well prepared for anything thrown at you… even as the rules continuously change!
Virtualization of existing technologies is an evolutionary step in the development of cloud designs. The cloud is supposed to be an architecture that delivers applications and data in a reliable and fault-tolerant manner. The benefits that we want to derive are not new. We are just applying them to a different business model. We created the cloud to deliver applications and data anytime, anywhere, and to any device. We need to reconfigure existing processes and technologies to support the evolving cloud architecture.
The hackers are winning.
Or said more accurately, strong security is losing . . . sometimes to itself.
That seemed to be a general undertone of last week's RSA Conference. No one actually came out and said it in those words, but there was an undeniable degree of humility to many of the messages passing through the halls of the Moscone Center this year.
Mike Geller from Cisco's CTO office and Ehud Doron of Radware's CTO office presented the revolutionary concept of Network-as-a-Sensor to fight DDoS attacks at Cisco Live Berlin 2016.
There are two approaches to detecting DDoS attacks: on-premise (also sometimes called in-line) and cloud (out of path). When a DDoS protection solution is deployed on-premise, organizations benefit from immediate, automatic attack detection and mitigation. Within seconds of the start of an attack, the online services are well protected and the attack is mitigated.
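To make the "within seconds" detection idea concrete, here is a minimal sketch of how an in-line appliance might flag a volumetric flood. This is an illustrative toy, not Radware's or Cisco's actual detection logic: the class name `RateDetector` and the threshold and window values are assumptions chosen for the example; real products use far more sophisticated behavioral baselines.

```python
from collections import deque

class RateDetector:
    """Toy sliding-window detector: flags a flood when the request
    rate over the last `window` seconds exceeds `threshold` req/s.
    (Illustrative only -- not a production DDoS detection algorithm.)"""

    def __init__(self, threshold=1000, window=5):
        self.threshold = threshold  # requests per second considered abnormal
        self.window = window        # look-back period in seconds
        self.events = deque()       # timestamps of recent requests

    def observe(self, timestamp):
        """Record one request; return True if the current rate looks like an attack."""
        self.events.append(timestamp)
        # Discard events that have aged out of the window
        while self.events and self.events[0] < timestamp - self.window:
            self.events.popleft()
        rate = len(self.events) / self.window
        return rate > self.threshold

detector = RateDetector(threshold=100, window=1)
# Simulate a burst of 500 requests arriving within the same second
alerts = [detector.observe(0.0) for _ in range(500)]
print(any(alerts))  # a burst this dense trips the threshold
```

Because an in-line device sees every packet as it arrives, a check like this runs on live traffic and can raise an alert in seconds; an out-of-path cloud service, by contrast, only engages once traffic is diverted to it.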
Sometimes it feels terrible to be right. In our recent Global Application & Network Security Report we predicted an increase in complex encrypted attack vectors and stressed the importance of putting in place adequate defenses that can scale and inspect encrypted traffic. Just last week, we got a vivid example of the increasing threat posed by encrypted attack vectors. A high-profile attack hit an organization that had a combination of on-premise and cloud-based DDoS protection, yet the organization's site still went down, in large part because the attack "hid" from the cloud-based resources by using encryption.
Information security professionals can hardly be blamed for a recent over-emphasis on cloud-based solutions for both network and application protection. It's a natural reaction to the seemingly endless stream of news about large volumetric attacks. Add to that the natural migration of many applications to the cloud, and the cloud momentum really gets rolling.