The Evolution of Application Development

Like most things that significantly affect the lives of hundreds of millions of people, application development has gone through an array of stages. Naturally, each has introduced many challenges, some more vexing than others. In fact, the challenges became so taxing that in 2001 a group of noted developers came together to write the Agile Manifesto. Its goal was to address the problems they had with the traditional, slow and cumbersome approach to application development: the waterfall method.

Ultimately, the waterfall method meant products couldn't go to market in a timely fashion, which meant revenue lagged as well. It also frustrated developers, whose coding progress would grind to a halt while application iterations were inspected by operations personnel. In short, the waterfall method was slow and frustrating; it inhibited innovation and prevented companies from remaining competitive in the marketplace. The agile framework addressed these and other issues beautifully.

Pre-Agile Application Development — A Brief History

Application development began with monolithic code. To give you an idea of what this meant in practice, consider a few synonyms for monolithic: rigid, unbending, inflexible and intractable. If those don't sound like traits to aspire to in application development, you're right.

Monolithic code was difficult to test and, in short, required long development cycles. What followed were dedicated, embedded modules written within applications. Testing became easier, and development headed in a promising direction: reusability. Best practices advanced to include reusability and portability, which ushered in the reuse of both proprietary and open-source module code.

Reusability made developing similar applications easier. The downside was that unmanaged code introduced unknown, and sometimes unpatchable, vulnerabilities. However, testing for these vulnerabilities had become easier, and best of all, productivity gains greatly accelerated.

Containers and Microservices Ushered in More Benefits

Today’s movement to containers and microservices has truly revolutionized application development. With it, developers create continuous integration and continuous delivery (CI/CD) pipelines, resulting in many application development benefits:

  • Faster deployments
  • Easier movement of applications between computing environments
  • Simpler testing
  • Rapid scalability
  • Faster, more nimble ability to address business needs

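To make the idea concrete, here is a purely illustrative sketch of how an application gets containerized for such a pipeline. The base image, file names and start command below are hypothetical placeholders, not taken from any specific project:

```dockerfile
# Hypothetical Dockerfile for a small Python web service.
# A CI/CD pipeline would rebuild and test this image on every commit,
# then push it to a registry for deployment.
FROM python:3.12-slim

WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code
COPY . .

# Start the service (placeholder entry point)
CMD ["python", "app.py"]
```

Because the image bundles the application with its dependencies, the same artifact that passes tests in CI can be deployed unchanged to any environment, which is what enables the faster deployments and easier movement between computing environments listed above.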
However, it also introduced some intrinsic risks.

But First, the Benefits

One could argue that cloud computing gave CI/CD its reason for existence. Cloud users demand quick feature parity with on-premises applications and rapid feature delivery under agile development models, rather than large releases at long intervals. These needs drove new application delivery methodologies, such as containers, microservices and serverless deployment, which in turn introduced greater risk.

Research conducted by Radware and Enterprise Management Associates identified a very interesting set of benefits and problems. Over 45% of respondents said their organizations have deployed a third or more of their applications in a container/microservices architecture. Another 45% indicated that they are currently testing the waters on either how to deploy applications in a container or microservices architecture or are planning a migration within the next 12 months. It’s a breakneck pace for changing application architecture!

Why is adoption so fast? It’s in the impressive numbers:

  • 68% of organizations that deployed applications in container/microservices architectures say they have seen an increase in security effectiveness, and
  • 61% identified an increase in operational efficiency.

And Now, the Risks

Unfortunately, it’s not all good news. Fifty-two percent of respondents said their operational costs increased, and 57% said they believe their application risk profile increased, as well. So, why did these increases occur? More importantly, can they be reduced? The answer is yes to both.

Operational costs increased due to retooling and education once CI/CD tools were deployed, the same pattern seen in previous shifts in programming technique. As more developers become well-versed and skill sets are enhanced, the spike in education costs will subside accordingly. Similarly, once organizations settle on a single or primary tool for container management, and likewise for microservices management, those costs will stabilize.

It Will Take Some Time, But Be Well Worth the Wait

Decreasing an application’s risk profile will most likely take a little longer but should also mirror previous trends in application deployment.

Delivering and securing containers and microservices is relatively new. Application developers and information security personnel are not yet certain of, or agreed on, how best to protect these applications, and standards and best practices are still evolving. Vulnerabilities are still being discovered in 10-year-old software, so expecting new methodologies to be 100% secure overnight is unreasonable. To satisfy due diligence, it’s important to always prepare, evolve and apply the necessary resources.

Though there may be a few hiccups along the way, a strong, steady application development state should eventually become the norm. The benefits are too great. Only a major, unfixable vulnerability should slow momentum.

Read Radware’s Web Application Security Report to learn more.
