Apps control our lives today. We pay our bills, do our shopping, communicate with our doctors, buy our groceries, hail a taxi, and even order lunch through ‘apps.’ If you can think of it, there is an app for it. And these apps live on our phones, on our desktops, in web portals, and even in our internal networks. However, all of these apps create new and different security challenges for an organization’s network. The speed and complexity inherent in these technological advances expose application vulnerabilities, security risks, and skills deficiencies that can compromise sensitive data, devalue the brand, and affect financial performance.
Radware, in conjunction with Ponemon, investigated the impact of applications on network and application security. We surveyed more than 600 information security professionals across six continents. The intent was to uncover the challenges that these new technologies and rapid-fire application deployments present, ascertain how organizations in different industries identified application-layer and API vulnerabilities, measure the impact that bots have on organizations, and identify how companies combat application layer attacks.
The results were alarming.
Think about it for a second: with how many organizations have you shared your address, banking details, or credit card? What about medical information? Now take a breath and consider that 45% of them were hacked. To be more specific, only 27% of healthcare respondents are confident they could safeguard patients’ medical records. This is not surprising when 52% do not inspect the traffic flowing to and from APIs, and 56% do not track data once it leaves the company, however sensitive it might be.
Do you use your apps on a smartphone or in a desktop web browser? Consider this: we found that while mobile applications see higher consumption, the businesses that roll them out invest less in their security than they do for web services. Moreover, nearly 25% of mobile applications undergo changes daily, if not hourly – more than double the rate of their web application counterparts.
Now let’s look at what happens when a company is attacked: organizations report that it takes them quite a while to react to global cyber campaigns. While hackers create new malware and attack tools every day, and CVEs are disclosed all the time, nearly two-thirds of respondents have little to no confidence that they could rapidly adopt security patches and updates without operational impact.
To overcome this complexity, organizations are turning more and more to automation. More than 70% indicated they are already using (28%) or planning to add (43%) machine-learning-based security solutions within approximately 24 months.
The need for automation is driven by the astonishing amount of bot-generated traffic. Nearly 30% of total internet traffic consists of bad bots – spammers, scrapers, scanners, botnets used for DDoS, and so on. Unfortunately, many organizations – a third, to be exact – still cannot distinguish the bad bots from the good ones (such as search engine crawlers or chatbots). Not only is this a security issue, it also means that businesses invest in resources to accommodate these bad bots.
Some bots are sophisticated enough to imitate human behavior and bypass security controls. Advanced solutions that leverage behavioral analytics and fingerprinting technologies can help organizations detect and block these attacks.
These bots are able to evade CAPTCHAs and IP rate-based detection, and to overcome in-session termination. In the retail industry, bots are an even greater problem: web scraping attacks plague retailers by stealing intellectual property, copying websites, undercutting prices, holding mass inventory in limbo, or buying out inventory to resell goods through unauthorized channels at a markup.
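To make the behavioral-analytics idea concrete, here is a deliberately naive sketch – not Radware's actual detection logic – of one behavioral signal: simple bots tend to fire requests at near-constant intervals, while human browsing produces irregular timing. The function name, thresholds, and sample data are all hypothetical.

```python
from statistics import pstdev

def looks_automated(request_times, min_requests=5, jitter_threshold=0.05):
    """Naive behavioral check: flag a client whose inter-request gaps
    show almost no variance (machine-like regularity).
    `request_times` is a sorted list of request timestamps in seconds."""
    if len(request_times) < min_requests:
        return False  # not enough data to judge
    gaps = [b - a for a, b in zip(request_times, request_times[1:])]
    # A population standard deviation near zero means metronome-like traffic.
    return pstdev(gaps) < jitter_threshold

# A scripted client hitting the server every 100 ms exactly:
bot_times = [0.1 * i for i in range(10)]
# A human browsing with irregular pauses between clicks:
human_times = [0.0, 1.4, 1.9, 4.2, 4.8, 9.1]

print(looks_automated(bot_times))    # → True
print(looks_automated(human_times))  # → False
```

Real products combine many such signals (mouse movement, TLS and browser fingerprints, navigation patterns) precisely because any single heuristic like this one is trivial for a sophisticated bot to defeat with randomized delays.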
This chart gives a sense of the prevalence of the application attacks organizations suffered:
However, when asked which three attack types are the most challenging to defend against, respondents indicate that while most cope well with the OWASP Top 10 vulnerabilities, their top concerns are application-layer DDoS, encrypted web attacks, and API manipulation:
APIs are indeed an emerging concern, as modern applications rely more and more on integration with third-party services. The same vulnerabilities that affect applications apply to APIs as well, but APIs are harder to monitor. The rapid evolution of FaaS (Functions-as-a-Service) is driving API adoption. FaaS (a.k.a. serverless architecture) offers a model where the operational unit is a set of function containers rather than a web server. These functions are APIs exposed to the client-side application, which may invoke them upon relevant client-side events. A simple example would be an application on your mobile device that uses the GPS function or reads your Facebook profile. Access violations, protocol attacks, unvalidated redirects, parameter manipulation, and malformed JSON/XML payloads are just a handful of API abuse examples.
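As a minimal illustration of guarding against parameter manipulation and malformed payloads, the sketch below validates an incoming JSON body against a strict allow-list before any business logic runs. The endpoint, field names, and types are hypothetical, and real services would typically use a schema library rather than hand-rolled checks.

```python
import json

# Hypothetical payment-transfer API: only these fields, only these types.
ALLOWED_FIELDS = {"user_id": int, "amount": float}

def validate_payload(raw):
    """Reject malformed JSON, unexpected or missing fields, and wrong
    types -- a first line of defense against parameter manipulation.
    Returns the parsed payload on success, None on any violation."""
    try:
        payload = json.loads(raw)
    except json.JSONDecodeError:
        return None  # malformed JSON
    if not isinstance(payload, dict):
        return None  # e.g. a bare array or string
    if set(payload) != set(ALLOWED_FIELDS):
        return None  # injected extra parameters, or missing ones
    for field, expected in ALLOWED_FIELDS.items():
        if not isinstance(payload[field], expected):
            return None  # type confusion (e.g. string where int expected)
    return payload

print(validate_payload('{"user_id": 7, "amount": 19.99}'))
# → {'user_id': 7, 'amount': 19.99}
print(validate_payload('{"user_id": 7, "amount": 19.99, "is_admin": true}'))
# → None (an injected "is_admin" flag is rejected)
```

The strict "exactly these fields" rule is the key design choice: an attacker probing a serverless function cannot smuggle in undocumented parameters that a permissive parser would silently pass through.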
Let’s look at financial services for a concrete example: 72% share usernames and passwords, and 58% share payment details, via APIs. Yet 51% do not encrypt that traffic, potentially exposing valuable customer data in transit.
The #1 concern, however, was HTTP/S DDoS. Attackers are slowly moving away from easy-to-mitigate network-layer DDoS attacks and starting to leverage tools that are simpler to run yet harder to defend against, such as Layer 7 HTTP and HTTPS flood attacks. HTTP flood attacks are among the most advanced threats facing web servers because it is hard to distinguish legitimate from malicious traffic, which poses a challenge to rate-based detection solutions.
We are also seeing more attackers attempt HTTPS floods as the tools become more popular and widely available. Low & Slow application-layer denial-of-service attacks use slow traffic that appears to be legitimate HTTP requests to pass undetected through traditional detection and mitigation systems. In addition, these tools keep a persistent foothold by sending standard HTTP keep-alive headers, forcing the server to maintain the open connections.
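Because Low & Slow traffic looks legitimate request-by-request, a common server-side countermeasure is to judge connections by how they behave over time: headers that never complete, or bodies that trickle in below a minimum rate, get dropped. The sketch below illustrates that idea with hypothetical thresholds; it is a simplified model, not any vendor's mitigation logic.

```python
import time

HEADER_TIMEOUT = 10.0    # seconds allowed to deliver complete request headers
MIN_BYTES_PER_SEC = 50   # minimum acceptable transfer rate once established

def should_drop(conn_start, bytes_received, headers_complete, now=None):
    """Guard against Low & Slow attacks: drop connections that either
    never finish their headers (Slowloris-style) or drip data in
    below a minimum rate (slow-POST-style)."""
    now = time.monotonic() if now is None else now
    elapsed = now - conn_start
    if not headers_complete and elapsed > HEADER_TIMEOUT:
        return True   # headers still open long past the deadline
    if elapsed > 1.0 and bytes_received / elapsed < MIN_BYTES_PER_SEC:
        return True   # transfer rate too low to be a normal client
    return False

# A client that sent only 40 bytes in 15 seconds, headers still incomplete:
print(should_drop(conn_start=0.0, bytes_received=40,
                  headers_complete=False, now=15.0))  # → True
```

The trade-off is that thresholds must be generous enough not to cut off genuinely slow clients (mobile users on poor links), which is exactly why these attacks are hard to mitigate with simple limits.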
This threat weighs heavily on application availability. An outage, or even a degradation, for a retailer during the holiday season can result in tremendous financial losses. Although this is well known, more than half (53%) of retailers are not confident in their ability to provide 100% uptime for their application services, and 30% say they lack the ability to secure sensitive data during these peak periods.
All these application threats jeopardize the efforts organizations invest in end-to-end automation of the application development, rollout and update cycle. This continuous delivery approach is gaining greater adoption among organizations that seek agility in creating and launching new services.
They rely on automated tools and methods that accelerate operational efficiency, but security, in many cases, is left behind. While 62% acknowledge that it increases the attack surface, fewer than half say they integrate security into their continuous delivery process.
We learned that applications are vulnerable and data is exposed to a greater extent than we imagined, largely because organizations sincerely doubt they can protect them. Humans are bound to make errors and are reaching the limit of their capacity to control and analyze every event in their application environments, while AI is still emerging and not yet trusted – which leaves a big gap, and a high risk, for all of us to live with.