
Application Delivery

SCADA Part 2: Mission critical, highly vulnerable, almost un-protectable.

February 16, 2017 — by Daniel Lakier


Hey folks, I’m back with my second installment on protecting the un-protectable:

Last week we discussed the SCADA environment and some of the unique business and technology challenges we face when trying to secure it against both availability and cyber security hazards. The questions you are all asking yourselves now are, "How did we get here?" and "Why would anyone build anything this insecure?" The answer is simple: we never anticipated these networks would communicate with the outside world. PCD and SCADA environments were meant to be "closed loop" and therefore air-gapped (and if you're air-gapped, you don't need security, right? Ask Iran about the Natanz nuclear facility). If you think about it, that was a perfectly reasonable assumption at the time. Why would factory machinery, a power plant, or an oil rig ever need to access the internet? I could go on and on. However, this paradigm changed for two reasons.

Application Delivery

When the Application Stops Flowing, What Next?

February 14, 2017 — by Frank Yue


Don’t you hate it when you have a problem, but have no idea what is causing it? The water in my house stopped running recently. I have a well with a pump and a fairly complex system of pipes going through a water filtration and softening system. I had no idea why the water was not flowing, but it was obviously a major issue.

I checked the pipes and they all seemed OK. I cleaned the filter and verified that the water filtration system was in good order. I cannot physically inspect the pump because it is dozens of feet down a 4-inch well shaft, but I did power-cycle it to confirm that it seemed to be working properly. Ultimately, I had to call a plumber/well specialist who, after inspecting the entire system, determined that my water pressure tank and switch needed to be replaced.

Application Delivery

Use Application Delivery Technologies to Accelerate and Automate the Boring Tasks

February 8, 2017 — by Frank Yue


Playing a stringed musical instrument like a guitar means that the different strings need to be adjusted regularly to play the correct notes. Guitar players tighten and loosen the strings to tune the guitar. Traditionally, the strings were tuned by ear, which meant the person tuning the guitar had to know, with considerable accuracy, what sound each string should make.

With modern technology, there are tuners that can generate tones, so one no longer needs to know exactly what each note sounds like. And today there are tuners that will automatically adjust the tension of the strings to create the right tone with no human intervention. This is a great benefit for guitar players because they like to play music, not spend a lot of time and effort tuning their instrument every time they want to play.

Application Delivery

SCADA: Mission critical, highly vulnerable, almost un-protectable.

February 7, 2017 — by Daniel Lakier


In today’s world, when most of us think about IT infrastructure, we think about the traditional environments that have firewalls, switches, routers, standard operating systems and all the associated security. We think of internet applications like Facebook, LinkedIn, eBay, Salesforce and Amazon.

What we don’t think of is the SCADA environment: the networks and systems embedded in all our critical infrastructure, such as transportation systems, power plants, water treatment facilities, factories, mining and oil production. Most of us just assume these networks are like all other IT environments, that they face the same risks and deal with that risk in the same way. I’m here to tell those of you who think that way that they don’t and they can’t. There are technical reasons why they can’t and business reasons why they won’t. They are, to some extent, the un-protectable networks.

Application Delivery, Virtualization

Application Virtualization – Seeing the Forest Instead of Trees

February 2, 2017 — by Frank Yue


Virtualization of the application environment is on every business’s mind. Terms like hypervisors, virtual machines, and software defined [insert your own popular term here: networks|data centers|storage] are being thrown around the technology industry like hot potatoes. While IT organizations focus on virtualizing specific applications, they often forget to consider how this piece fits into the overall trend of virtualizing the entire IT infrastructure.

Application Delivery, Security

Web Internet Companies and Carriers are Deciding to Just be Friends

December 22, 2016 — by Mike O'Malley


As the Carrier vs. cloud competitor discussion has raged over the past few years, it seems a truce has been called in the last few months. Rewind a few years and the Web Internet Companies (WICs) and Carriers were mortal enemies fighting over the same space. As such, Carriers moved to buy or build their own Cloud data center operations; Verizon’s $1.4 billion acquisition of Terremark in 2011 is just one such example.

Application Delivery, Virtualization

Automation – Virtualizing the Human Factor

December 8, 2016 — by Frank Yue


Everyone is forgetting to virtualize the most important element within the IT environment – the humans. Virtualization through cloud, software defined networking (SDN), and software defined data centers (SDDC) is the latest craze in internet architectures. IT organizations are moving away from proprietary hardware towards commercial off-the-shelf (COTS) platforms that can perform a variety of tasks.

The hardware and software have been virtualized, but the “humanware” is racing to catch up to support the capabilities of the virtual infrastructures. Manipulating virtual networks by hand is not efficient. Organizations lose many of the benefits of these virtualized application delivery architectures when human-driven manual processes are still used to support them.
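As a rough illustration of what replacing one of those manual steps can look like, here is a minimal Python sketch. The controller URL, endpoint path and payload fields are hypothetical placeholders invented for this example (no real product API is implied); the point is simply that the provisioning workflow, not a person, registers a new virtual server with its load-balancing pool.

# Minimal automation sketch. The controller URL, endpoint path and payload
# fields below are hypothetical placeholders, not a real product API.
import os
import requests

CONTROLLER = "https://adc-controller.example.local/api"  # hypothetical endpoint
API_TOKEN = os.environ.get("ADC_API_TOKEN", "")          # injected by the orchestrator

def add_server_to_pool(pool_name: str, address: str, port: int) -> None:
    """Register a freshly provisioned virtual server with a load-balancing pool."""
    response = requests.post(
        f"{CONTROLLER}/pools/{pool_name}/members",
        json={"address": address, "port": port},
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()  # fail loudly so the workflow can roll back

if __name__ == "__main__":
    # Called by the provisioning workflow right after a VM is spun up,
    # so no human has to log in and edit the pool by hand.
    add_server_to_pool("web-pool", "10.0.0.42", 8080)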

Application Delivery, Security

HTTP/2 is Here – What Now?

November 16, 2016 — by Prakash Sinha


Hypertext Transfer Protocol (HTTP) is the protocol used primarily for communication between the user’s browser and the websites the user is accessing. Introduced in 1991, with a major revision to HTTP 1.1 in 1999, the HTTP protocol has many limitations. In 2009, engineers at Google redesigned the protocol in a research project called SPDY (pronounced “speedy”) to address some of HTTP 1.1’s limitations.

Websites in the early ’90s, when HTTP was introduced, were markedly different from today’s websites. In February 2015, the Internet Engineering Task Force (IETF) introduced a new version, HTTP/2, to keep up with the evolution that the internet has undergone since the early ’90s.
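For readers who want to see which protocol version a server actually negotiates, here is a minimal sketch using the third-party Python httpx library (chosen purely for illustration; the post itself does not reference it). The client offers HTTP/2 during the TLS handshake and falls back to HTTP 1.1 if the server does not support it.

# Requires: pip install "httpx[http2]"
import httpx

# Ask the client to offer HTTP/2 via ALPN during the TLS handshake.
# If the server does not negotiate it, the request falls back to HTTP/1.1.
with httpx.Client(http2=True) as client:
    response = client.get("https://www.google.com/")
    print(response.http_version)   # e.g. "HTTP/2" when negotiated
    print(response.status_code)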

Application Delivery

Hybrid Cloud – It is Not the Migration that Hurts

November 15, 2016 — by Frank Yue


The last time I moved was 10 years ago. At the time, I told myself it would be the last time I moved. The packing and relocation of my belongings was not too much trouble. The main problem was trying to get everything sorted out and put into its proper place in the new home.

When businesses migrate applications to the cloud, the process is similar. They are very familiar with the applications and their data. The problem is that they need to understand how the applications will behave in the new environment. Where does the data reside and how do the clients access the application’s new home?

Application Delivery, Virtualization

Hybrid Cloud is Not the Goal – A Case for Full Virtualization

November 10, 2016 — by Frank Yue


In the world of cloud and virtualization, the buzzword of the day is ‘hybrid’. Everyone wants a hybrid cloud environment because they want the benefits of the cloud without relinquishing control of their applications and infrastructure. IT departments want the cost savings, along with the agility and elasticity that cloud technologies bring, but they are not comfortable with a complete migration of their applications and data to a managed infrastructure.

Today, there are two primary use cases where companies are putting applications and data into the cloud. The difference between the two depends on whether an existing application or a new one is being deployed. In the former, there is legacy infrastructure and support that has to be accounted for, while the latter provides a greenfield opportunity to create an ideal infrastructure from the start.