The Patchwork Cloud: Portability of Security in Cloud Computing

Tuesday, May 15, 2012

Rafal Los


In a previous post on this topic, I talked about how, when you're thinking about cloud, you really need to take a model-driven approach.  This post continues that thinking and discusses portability as a key requirement for cloud adoption and, of course, security.

We understand cloud computing is being presented as a fantastic way to use technology and virtualization in new and innovative ways, but to do that you need a few key properties in place - one of which is portability.

When you're thinking about deploying applications in cloud-based environments, you have to be able to separate the implementation details from the metadata that describes that implementation - this is what's called a model-driven approach.  A model-driven approach gives you the ability to be portable not only across various environments but across cloud providers as well.
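To make that separation concrete, here is a minimal sketch in Python - with entirely hypothetical class, field, and provider names - of a workload described once as provider-neutral metadata, from which provider-specific deployment payloads (including the security policy) are generated:

```python
# A minimal sketch of the model-driven idea: the workload is described once as
# provider-neutral metadata (the "model"), and provider-specific details are
# generated from it. Class names, flavors, and provider keys are hypothetical.
from dataclasses import dataclass, field

@dataclass
class WorkloadModel:
    """Provider-neutral description of an application deployment."""
    name: str
    image: str                  # logical image name, not a provider image ID
    cpu: int
    memory_mb: int
    allowed_ports: list = field(default_factory=list)  # the security policy travels with the model

def render_for_provider(model: WorkloadModel, provider: str) -> dict:
    """Translate the single model into a provider-specific request payload."""
    if provider == "private-openstack":
        return {
            "server": {"name": model.name, "imageRef": model.image,
                       "flavor": f"{model.cpu}vcpu-{model.memory_mb}mb"},
            "security_group_rules": [{"port": p, "protocol": "tcp"}
                                     for p in model.allowed_ports],
        }
    if provider == "public-cloud":
        return {
            "instance_name": model.name, "machine_image": model.image,
            "size": {"cpus": model.cpu, "ram_mb": model.memory_mb},
            "firewall": [{"allow": f"tcp:{p}"} for p in model.allowed_ports],
        }
    raise ValueError(f"unknown provider: {provider}")

web = WorkloadModel(name="web-tier", image="ubuntu-web", cpu=2,
                    memory_mb=4096, allowed_ports=[80, 443])
print(render_for_provider(web, "private-openstack"))
print(render_for_provider(web, "public-cloud"))
```

The point isn't the field names; it's that nothing provider-specific lives in the model itself, so the same description - and its security policy - can move wherever it needs to run.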

Today's agile businesses, which view cloud computing as an opportunity to gain agility and competitive advantage, are thinking about how to keep the cloud from becoming a single point of failure, as it has been for many.  Think of how many organizations put their faith in single-cloud solutions in the past - Amazon is one example - and lost not only valuable time but, in some cases, entire components of their business.

This exposed many of the failures of single-sourced solutions on single-sourced platforms.  Many of today's agile and risk-averse businesses are multi-sourcing their cloud computing environments, meaning that they may have not only multiple vendors but also multiple private and public cloud environments for dev-test and production of their critical applications.  How does portability play into this?  The answer is itself a question: "How does the security of a compute environment move across implementations?"

Achieving high portability requires a few things.  First, you need to standardize on well-adopted technology and open standards - security standards as well as cloud standards and technologies.  HP has chosen to standardize on OpenStack, which means greater portability of workloads, and even of security policies, from one environment and cloud to another.

Other vendors are doing this too... customers fear vendor lock-in and are thus looking for cloud vendors/partners who provide openness and thereby portability across their various platforms.

What are we talking about here when it comes to portability?  First is the acknowledgement that security isn't (exclusively) about the perimeter anymore and hasn't been for quite some time.  The move to cloud computing environments hastens this awareness.  When an application is built and packaged, that packaging can happen at the individual application level or at the deployment level.

Think about it this way - you can bundle up the virtual workload (the virtual machines) complete with security policies, application deployment, and so on, or you can do the same thing at the application level.  Either way, we're forced to acknowledge that the perimeter has vanished and that firewalls around the outside of the castle simply don't cut it anymore.
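As a rough illustration of those two packaging levels - the structures and field names below are invented for the example, not any real bundle format - a deployment-level bundle carries the virtual machines together with their security policy, while an application-level bundle carries a single artifact with its own runtime policy:

```python
# Invented example structures showing the two packaging levels. In both cases
# the security policy is part of the package itself rather than something
# bolted onto a perimeter firewall.
deployment_bundle = {
    "virtual_machines": [
        {"name": "web-01", "image": "web-tier-v42"},
        {"name": "db-01", "image": "db-tier-v17"},
    ],
    "security_policy": {
        "ingress": [{"to": "web-01", "port": 443, "protocol": "tcp"}],
        "internal": [{"from": "web-01", "to": "db-01", "port": 5432}],
    },
}

application_bundle = {
    "application": "orders-service",
    "artifact": "orders-service-1.3.0.war",
    "runtime_policy": {
        "authentication": "required",
        "allowed_callers": ["web-tier"],
        "data_classification": "confidential",
    },
}
```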

Being able to claim portability is a difficult thing.  The only way you can achieve high levels of portability is when the environments you're deploying to, or building for, are consistent.  As an example, OpenStack provides an API with rate limiting and authentication.

Having the ability to rate-limit and authenticate API calls against the environment is critical, not just because you need to know who's making requests of resources, but also because you need a way to keep one customer from impacting several others and creating a service degradation condition.  I talked about this at last week's HP TechDay in Sydney.
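Here's a sketch, in Python, of what a client that authenticates and then respects rate limiting might look like; the endpoint URLs, token handling, and status codes are simplified placeholders rather than a particular OpenStack release's API.

```python
# Sketch of a client that authenticates before calling the cloud API and backs
# off when the provider signals rate limiting. Endpoints and response shapes
# are hypothetical placeholders, not a specific OpenStack version.
import time
import requests

AUTH_URL = "https://cloud.example.com/v2.0/tokens"      # hypothetical endpoint
API_URL = "https://cloud.example.com/compute/servers"   # hypothetical endpoint

def get_token(username: str, password: str, tenant: str) -> str:
    """Exchange credentials for an API token so every request is attributable."""
    resp = requests.post(AUTH_URL, json={
        "auth": {"passwordCredentials": {"username": username,
                                         "password": password},
                 "tenantName": tenant}})
    resp.raise_for_status()
    return resp.json()["access"]["token"]["id"]

def list_servers(token: str, max_retries: int = 3) -> dict:
    """Call the API, honouring rate-limit responses instead of hammering the service."""
    for attempt in range(max_retries):
        resp = requests.get(API_URL, headers={"X-Auth-Token": token})
        if resp.status_code in (413, 429):                 # provider says: slow down
            wait = int(resp.headers.get("Retry-After", 2 ** attempt))
            time.sleep(wait)
            continue
        resp.raise_for_status()
        return resp.json()
    raise RuntimeError("rate limited on every attempt; giving up")
```

The two pieces work together: authentication ties each request to a tenant, and the back-off loop is the client-side half of the provider's protection against one customer degrading service for everyone else.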

OpenStack goes a long way toward creating a good security environment - but to take advantage of it, you need a consistent implementation of OpenStack.  Therein lies the next issue: making sure your providers are consistent in how they implement even an open platform like OpenStack.

Portability is important not just across your various cloud providers but also internally, as most organizations start to build internal private clouds while their consumption of public clouds rises as well.  Being able to run a build-test environment on your private cloud and then push a ready package to the public cloud is critical to speed of deployment and to resiliency.

It's also critical when adopting a rapid-release development mentality like DevOps or similar approaches.  The ability to spread a single packaged workload or application across multiple cloud providers matters as well, in case you need that level of resiliency and failure-recovery capability.
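A small sketch of that multi-provider idea, with a hypothetical deploy() stub standing in for whatever provider-specific call would actually push the package:

```python
# The same packaged workload is pushed to every configured cloud, so losing one
# provider doesn't take the application down. The provider list and deploy()
# stub are hypothetical placeholders for real provider-specific calls.
providers = ["private-openstack", "public-cloud-a", "public-cloud-b"]

def deploy(bundle: dict, provider: str) -> bool:
    """Placeholder for a provider-specific deployment call."""
    print(f"deploying {bundle['application']} to {provider}")
    return True

def deploy_everywhere(bundle: dict) -> dict:
    """Attempt the same bundle on every provider and report where it landed."""
    results = {p: deploy(bundle, p) for p in providers}
    if not any(results.values()):
        raise RuntimeError("no provider accepted the workload")
    return results

deploy_everywhere({"application": "orders-service",
                   "artifact": "orders-service-1.3.0.war"})
```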

I'm going to put together a post on exactly what portability means in technical implementation in the near future, so watch this blog for more information!

Cross-posted from Following the White Rabbit
