The success of an online business depends in large part on the user experience. After all, competitors are only a single click away. A broad spectrum of services impacts user experience from an infrastructure and application perspective: think of page load times, availability, and feature richness. Agility in the delivery infrastructure and continuous delivery of applications have become essential to the success of an online business. Hyperscale cloud providers such as Google, Amazon, eBay, and Netflix have been leading the way in highly scalable, agile infrastructure and continuous delivery of applications, and are considered the gold standard for the practice of online business.
DevOps grew out of the requirement to keep up with customer demand through continuous delivery of updates and features to applications. DevOps is a piece of the puzzle that ultimately leads to a better user experience. Its premise is to enable greater speed and agility in bringing changes from development to production, so as to keep up with customer demands. DevOps blends organizational ideas, processes, and software tools in the areas of development, deployment, and delivery of applications. For hyperscale cloud providers, DevOps enables multiple application updates per day. Most online enterprises will not need multiple updates per day, but to keep up with a fast-moving market and demanding customers, several updates per month should not be exceptional. Independent of the number of changes per day or month, DevOps has become second nature for agile, high-performing enterprises and a foundational element of their business success. Until recently, the security practice has lagged behind this promise of agile and speedy deployment and has proven hard pressed to keep up with the speed and scale of modern application delivery.
The implementation of DevOps consists of an automated chain of tools that progresses an application from source code through compile, test, and deployment to final delivery in production. The automation chain comprises many different tools that touch on application development, testing, and infrastructure management. Developers and infrastructure engineers join hands and automate the chain by integrating their individual processes, bringing applications from development into production within minutes, not days or months. The philosophy of DevOps is to automate the boring stuff: investing time up front to make things easier and more consistent over the long haul. While providing agility, automating these tedious tasks also eliminates human error. The front end of the DevOps chain emphasizes building and testing to produce working applications; the tail end is about delivering that application and putting in place the infrastructure required for it. Think of provisioning compute resources in the form of containers and/or virtual machines, network resources and network virtualization, and application delivery controllers to improve scale. This tail end is commonly referred to as “infrastructure as code” and is mostly achieved through tools such as Puppet, Chef, Ansible, and Salt.
With business demand for DevOps and Agile, traditional security processes have become a major roadblock and, sadly, are sometimes easier to bypass altogether. More than anything else, the user experience and trust of your online business will depend on security. What if your application is brought down for hours or even days through a DDoS attack? What if your application is breached and personal information or credit card data of your customers is leaked online?
Traditional security operates from the position that once an application has been delivered, its security posture can then be determined by security staff. It is hardly ever the case that all the information needed to render security policies is available. As the value creation process speeds up to provide iterative value that closely maps to customer demands, a one-time determination and test is even less adequate, and potentially leads to destructive outcomes where security breaks the application. Out of the need to balance these security requirements with DevOps speed evolved a new operational framework called DevSecOps. The DevSecOps model provides a framework to add security checks to the integration and deployment pipelines and relies on the idea that “everyone is responsible for security”.
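As a minimal illustration of such a pipeline security check (the severity ranking and threshold below are assumptions, not taken from any particular tool), a gate might refuse to promote a build when a scanner reports blocking findings:

```python
# Sketch of a DevSecOps gate: a pipeline step that blocks promotion to
# production when a (hypothetical) scanner reports high-severity findings.

SEVERITY_THRESHOLD = "high"  # assumed policy: block on high or critical
RANK = {"low": 1, "medium": 2, "high": 3, "critical": 4}

def security_gate(findings, threshold=SEVERITY_THRESHOLD):
    """Return True (pass) only if no finding meets the blocking threshold."""
    blocking = [f for f in findings if RANK[f["severity"]] >= RANK[threshold]]
    return len(blocking) == 0

scan = [{"id": "CVE-2017-0001", "severity": "medium"},
        {"id": "SQLI-LOGIN", "severity": "high"}]
print("deploy" if security_gate(scan) else "block")  # block
```

Running such a gate on every commit, rather than once at delivery, is what moves the security decision into the pipeline where DevSecOps wants it.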
The DevSecOps movement has led to a number of new tools and technologies that will help enterprises drive their security needs across the DevOps pipeline and create a better, more agile security practice. Let’s consider a few of these new technologies and situate them in their most effective place in the application delivery chain:
- Runtime Application Self-Protection (RASP) technology is built into an application and can detect and prevent application attacks in real time. RASP prevents attacks by “self-protecting,” reconfiguring automatically without human intervention in response to threats, faults, and the like. RASP comes into play when the application executes (runtime), causing the application to monitor itself and detect malicious input and behavior. RASP technology is intimately integrated with the application execution environment and currently exists for Java virtual machines and the .NET Common Language Runtime. RASP certainly adds another layer of protection to consider for greenfield applications built on top of supported execution environments, but it does not provide a solution for legacy applications and does not (yet) protect applications written in Python, Go, Ruby, PHP, or other run-times that are gaining popularity in the cloud application development community. RASP targets the very specific segment of enterprises that agree on a common run-time for all their applications and invest in porting those applications to it. There is still a large number of enterprises, financial institutions for example, that depend on legacy applications and code they prefer not to touch and which, because of their ‘ancient’ nature in information-technology years, were typically not designed with a security mindset.
- Dynamic Application Security Testing (DAST) solutions are black-box testing technologies designed to detect vulnerabilities in applications in their running state. DAST tools analyze the application’s behavior based on varying inputs, usually from a pre-defined vulnerability list, to check whether the application can be exploited.
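To make the RASP idea concrete, here is a toy Python sketch of an application monitoring its own inputs at runtime; the patterns, decorator, and response string are hypothetical illustrations, not a real RASP product:

```python
import re
from functools import wraps

# Toy illustration of the RASP idea: the application monitors its own
# inputs at runtime and refuses execution on suspicious values.
# The patterns and decorator are hypothetical, not a real RASP product.

SUSPICIOUS = [re.compile(p, re.IGNORECASE)
              for p in (r"union\s+select", r"<script", r"\.\./")]

def self_protecting(handler):
    @wraps(handler)
    def guarded(*args):
        for value in args:
            if any(p.search(str(value)) for p in SUSPICIOUS):
                return "403 blocked by runtime self-protection"
        return handler(*args)
    return guarded

@self_protecting
def search(query):
    return f"results for {query}"

print(search("shoes"))                       # results for shoes
print(search("' UNION SELECT password --"))  # 403 blocked by runtime self-protection
```

A real RASP product hooks the execution environment itself (the JVM or CLR) rather than wrapping individual handlers, which is exactly why it is tied to specific run-times.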
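The DAST approach can likewise be sketched in a few lines: feed payloads from a pre-defined vulnerability list to a running application (modeled here as a callable rather than a live HTTP endpoint) and flag responses that suggest exploitability. The payloads and the reflection-based check are simplified assumptions:

```python
# Black-box sketch of DAST: send payloads from a pre-defined vulnerability
# list to a running application and flag responses that indicate
# exploitability. The payloads and echo-based check are simplified
# illustrations; real DAST tools probe live HTTP endpoints.

PAYLOADS = ["<script>alert(1)</script>", "' OR '1'='1", "../../etc/passwd"]

def vulnerable_app(user_input):
    # A deliberately naive app that reflects input unsanitized.
    return f"<html>You searched for: {user_input}</html>"

def dast_scan(app, payloads=PAYLOADS):
    findings = []
    for payload in payloads:
        response = app(payload)
        if payload in response:      # reflected verbatim -> likely exploitable
            findings.append(payload)
    return findings

print(dast_scan(vulnerable_app))
```

Because the scan only observes behavior from the outside, it needs no access to source code, which is what makes DAST a black-box technique.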
DevSecOps puts less emphasis on the actual running application and is less concerned with attack detection and policy enforcement. As businesses integrate SAST (Static Application Security Testing), DAST, and RASP tools into their tool-chain to provide security excellence for the in-house developed application, there are still the infrastructure and the third-party services the application depends on, such as SSO, to consider. Whatever the reach of SAST, DAST, and RASP, complex attacks by highly skilled ‘human’ hackers, combining and leveraging a multitude of attack vectors and unknown zero-days, can still result in a breach of applications. More is required to detect and mitigate attacks, and none of the above-mentioned tools provide the ability to enforce policies on the application, leaving it vulnerable and open to scanning, probing, and bot scraping.
The Positive Security Model of a Web Application Firewall
Automating network services through the DevOps process has been top of mind for most enterprises and vendors, and most vendors are integrating their products with the most popular automation tools such as Ansible, Salt, Puppet, and Chef. Deployment is only one part of the integration; automated policy management is probably even more critical, as it proves more tedious to keep up to date and in sync with an application’s dynamic nature. As features get added to applications, as applications get scaled out or in, and as applications move from one part of the datacenter to another or even from one datacenter to the next, security policies need to move and adapt to reflect the latest state of the application, or the delivery will effectively fail. Infrastructure services provided by ADCs and firewalls are good candidates for such integrations, as their per-application policies can be described by a limited set of rules and can easily be converted into templates and replicated across different applications.
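As a rough sketch of such template-driven policy management (the rule syntax, field names, and application record below are hypothetical), per-application policies can be re-rendered whenever the application scales or moves, so the policy always reflects the application's current state:

```python
from string import Template

# Sketch of template-driven policy management: per-application security
# policies expressed as a small rule template and re-rendered whenever
# the application scales or moves. Rule syntax and fields are hypothetical.

RULE_TEMPLATE = Template(
    "permit tcp any host $vip eq $port  # app: $app"
)

def render_policy(app):
    return [RULE_TEMPLATE.substitute(app=app["name"], vip=vip, port=app["port"])
            for vip in app["vips"]]

webshop = {"name": "webshop", "port": 443, "vips": ["10.0.1.10"]}
print(render_policy(webshop))

# After a scale-out event, re-render so the policy tracks the application:
webshop["vips"].append("10.0.1.11")
print(render_policy(webshop))
```

Tools like Ansible, Salt, Puppet, and Chef apply this same pattern at scale: a declarative template plus the application's current inventory yields the policy, instead of hand-edited rules drifting out of sync.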
Up to this point we did not cover the granular policy enforcement provided by Web Application Firewalls (WAF) for online applications. We covered the application’s security and are confident we ironed out most vulnerabilities based on known coding patterns and best practices. We were able to deploy the application, scale it, move it, and protect access to it at the network level. Now it is time to consider how to protect our business from external, application-level threats. Even when applications are built with security in mind and scanned and tested by the full suite of static and dynamic assessment tools, we should still be nervous about bringing them online without some minimal barrier of access control provided by a positive security model.
Web Application Firewalls provide the ultimate front-line protection for online applications, whatever their brand, form, deployment model or color… There are many options to consider ranging from cloud based services to on-prem appliances, ADC integrated services, virtual appliances up to modules loaded in the web servers. Let’s not get involved in the deployment models and offerings but consider WAF in general from a continuously changing application point of view.
Breaking down even a single, simple web application into all its possible outcomes, and all intended or unintended permutations of those outcomes, at the granularity of input fields and query parameters, is cumbersome at the least. Describing the application at the granularity required by a web application firewall to implement a positive security policy almost becomes a new development task. Describing the intent of the application, and not the how, in a domain-specific language that is a subset or superset of YAML and/or Jinja2, and completely unrelated to the native programming languages developers are accustomed to, is not something you would likely introduce without resistance from the development teams. As a consequence, using automation tools to generate and adapt a positive security model for the WAF from a description of intent upon deploying the application is pretty much off the table.
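A toy sketch illustrates the granularity a positive security policy demands: every endpoint and every parameter must be explicitly described, and anything not matching is denied by default. The endpoints and patterns below are hypothetical examples:

```python
import re

# Sketch of a positive security model: every parameter of every endpoint
# must be explicitly described, and anything not matching is rejected.
# Endpoints and patterns are hypothetical; the point is the per-field
# granularity such a policy requires for even a simple application.

POLICY = {
    "/search": {"q": re.compile(r"^[\w \-]{1,64}$")},
    "/account": {"id": re.compile(r"^\d{1,10}$")},
}

def allowed(path, params):
    spec = POLICY.get(path)
    if spec is None:
        return False                      # unknown endpoint: deny by default
    return all(key in spec and spec[key].fullmatch(value)
               for key, value in params.items())

print(allowed("/search", {"q": "running shoes"}))   # True
print(allowed("/search", {"q": "' OR '1'='1"}))     # False
print(allowed("/admin", {"cmd": "drop"}))           # False
```

Multiply those two entries by every page, field, cookie, and header of a real application, and keep them current across every release, and the maintenance burden described above becomes apparent.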
Continuous Security Delivery for Online Business
As programmatic description and code analysis fall short of providing full protection from attacks in real time, a new technology needs to be incorporated that can protect continuously delivered applications. Continuous Security Delivery is a dynamic security technology providing automatic policy generation and an adaptive positive security policy, applying techniques and ideas borrowed from supervised machine learning to analyze web applications as they are being used by actual customers. The adaptive positive security policy is the result of real-time analysis, interpretation, and learning from actual production traffic, and of a continuously refining and extending model of the application, while applying only the most relevant prevention modules and defenses to specific parts or pages of that application. Accurate classification, together with a set of adaptive, intelligent modules activated depending on the classification, provides superior attack detection and mitigation while preventing false positives. To further reduce false positives that would be introduced through application changes, incoming requests that fail the positive security policy are reconsidered by modeling a probability combining user reputation and device fingerprinting, allowing the model to learn changes while it is protecting.
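The re-consideration step described above could be sketched as follows; the weights, threshold, and decision labels are illustrative assumptions, not a description of any specific product:

```python
# Sketch of the re-consideration step: a request that fails the positive
# policy is not immediately blocked but scored by a probability combining
# user reputation and device fingerprint history. Weights, threshold, and
# decision labels are hypothetical illustrations.

def legitimacy_score(reputation, known_fingerprint):
    # reputation in [0, 1]; known_fingerprint marks a device previously
    # seen behaving legitimately. The 0.7/0.3 weighting is assumed.
    return 0.7 * reputation + 0.3 * (1.0 if known_fingerprint else 0.0)

def reconsider(request, policy_failed, threshold=0.8):
    if not policy_failed:
        return "allow"
    score = legitimacy_score(request["reputation"], request["known_fp"])
    if score >= threshold:
        return "allow-and-learn"   # likely a legitimate application change
    return "block"

print(reconsider({"reputation": 0.95, "known_fp": True}, policy_failed=True))   # allow-and-learn
print(reconsider({"reputation": 0.20, "known_fp": False}, policy_failed=True))  # block
```

The "allow-and-learn" branch is what lets the model absorb a legitimate application change without raising a false positive, while low-reputation violations are still blocked.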
Continuous Security Delivery allows automated deployment of Web Application Firewall services to protect online resources with a continuously and automatically adapting positive security model. It enables continuously delivered applications to be secured without having to worry about false positives or limiting oneself exclusively to less secure negative security models.
The technology as such should be agnostic to the deployment model or make of the service. Even if in reality that proves not to be the case, it is not the purpose of this blog to differentiate existing solutions. I would, however, urge anyone responsible for deploying or running a WAF to evaluate and test for Continuous Security Delivery technology and seriously consider it a critical part of successfully delivering an online business.
Download Radware’s DDoS Handbook to get expert advice, actionable tools and tips to help detect and stop DDoS attacks.
Recognized Cyber Security and Emerging Technology thought leader with 20+ years of experience in Information Technology. As the EMEA Cyber Security Evangelist for Radware, Pascal helps execute the company's thought leadership on today’s security threat landscape. Pascal brings over two decades of experience in many aspects of Information Technology and holds a degree in Civil Engineering from the Free University of Brussels. As part of the Radware Security Research team Pascal develops and maintains the IoT honeypots and actively researches IoT malware. Pascal discovered and reported on BrickerBot, did extensive research on Hajime and follows closely new developments of threats in the IoT space and the applications of AI in cyber security and hacking. Prior to Radware, Pascal was a consulting engineer for Juniper working with the largest EMEA cloud and service providers on their SDN/NFV and data center automation strategies. As an independent consultant, Pascal got skilled in several programming languages and designed industrial sensor networks, automated and developed PLC systems, and led security infrastructure and software auditing projects. At the start of his career, he was a support engineer for IBM's Parallel System Support Program on AIX and a regular teacher and presenter at global IBM conferences on the topics of AIX kernel development and Perl scripting.