
NASA’s Chris Kemp calls OpenStack the “Linux of the cloud,” and predicts a public cloud future.

Chris Kemp, NASA’s CTO for IT, closed out yesterday’s Cloud/Gov conference in DC with a discussion of Nebula, NASA’s open-source cloud-in-a-shipping-container, and the impact it has had on the agency. Kemp drew the most enthusiastic response of any of the day’s speakers, including whoops from some of the government employees and vendors in the audience, and for good reason: Nebula has become the gravitational center of cloud standards efforts within and outside the government.

“While (the National Institute of Standards and Technology) is talking about standards, there are de facto standards that are evolving right now,” Kemp said. And Nebula, he said, “is a reference implementation of what NIST is doing.”

The Nebula project’s code has become the core of OpenStack, the open-source cloud infrastructure project, and is now maintained by a community that includes Intel, AMD, Dell, Rackspace, and an army of other technology companies. “There are over 1,000 developers that have submitted features and bug fixes,” Kemp said, “and over 100 companies. If you’re interested in doing a cloud, you can download OpenStack today. It’s the Linux of the cloud: it gives you an environment you can actually develop on and meet a requirement, and build your environment on, on a platform that’s compatible with everything in the industry.”
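The article doesn’t show what “download OpenStack today” looks like in practice, but the OpenStack community maintains DevStack, a script for standing up a single-machine test installation. The sketch below is illustrative only; the repository URL and settings vary by OpenStack release, and the passwords shown are placeholders for a throwaway test host, not production values.

```
# Illustrative DevStack local.conf for a disposable test machine.
# First obtain DevStack itself:
#   git clone https://opendev.org/openstack/devstack
#   cd devstack
[[local|localrc]]
ADMIN_PASSWORD=secret              # placeholder credentials for the demo services
DATABASE_PASSWORD=$ADMIN_PASSWORD
RABBIT_PASSWORD=$ADMIN_PASSWORD
SERVICE_PASSWORD=$ADMIN_PASSWORD
# Then run ./stack.sh from the devstack checkout to bring up a minimal cloud.
```

DevStack is a development and evaluation tool, not a production deployment method, which is consistent with Kemp’s framing of OpenStack as “an environment you can actually develop on.”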

Kemp said he believed the public cloud could be made as secure as private clouds, but that private clouds were a “necessary stepping stone”: they demonstrate that cloud environments can be completely secured, moving agencies toward the day when NASA doesn’t have to be in the IT business at all. And by building a private cloud, agencies do the majority of the work required to reach the point where they can move to a public cloud infrastructure.

“Once you virtualize an application, you’re more than halfway there,” Kemp said.  “Every agency that builds a private cloud takes us 90% of the way to where we’ll be able to put everything in the public cloud.”

Still, Kemp said, it will be decades before agencies are able to make that jump completely. “We’ve only scratched the surface of this. We still have mainframe systems running that were coded in the ’70s. They’re systems we just haven’t taken the time to make run in Oracle or SQL Server. Moving something to cloud is a thousand times bigger a challenge.” So far, the only applications written to take advantage of the cloud’s features are those that were built for the cloud to begin with, such as Google’s apps and Zynga’s game platforms.

Kemp emphasized that cloud infrastructure and data center consolidation were not synonymous.  “One thing that I hope happens is that you treat data center consolidation and cloud as separate things. If you’re virtualizing existing applications, you need the support of commercial systems. But if you’re doing really pioneering development, and can’t use Amazon, then you need something like (Nebula).”


NIST Sketches Map to Secure Public Cloud for Feds

NIST has just published a draft set of guidelines for government agencies to follow to ensure security and privacy compliance when they use public cloud services, such as Google Apps for Government. Written by NIST computer scientists Wayne Jansen and Timothy Grance, NIST Draft Special Publication 800-144 is the product of several years of work examining cloud computing, and comes just as the federal government is instituting a “cloud first” policy for new IT projects. While the publication is still a draft open for comment, its recommendations apply not just to federal agencies, but to any public-sector organization considering public cloud services.
Entitled “Guidelines on Security and Privacy in Public Cloud Computing” (PDF), the document takes a deep dive into all of the concerns, precautions, and policies that agencies at every level should weigh when considering public cloud services, and even offers up the City of Los Angeles’ conversion to Google Apps for Government as a case study of what can go right and wrong in a cloud migration. Google had to make some extraordinary contractual concessions to meet the city’s needs, and it’s not certain it could be as flexible for many other state and local governments’ requirements.
At a high level, Jansen and Grance’s recommendations fall into four major categories:
  1. “Carefully plan the security and privacy aspects of cloud computing solutions before engaging them.” Before even looking at cloud solutions, an organization should make certain that they fully understand the privacy and security requirements of the data that will be handled.  Not doing due diligence on all of the potential privacy and security issues in advance can lead to roadblocks to deployment later—or worse, major breaches in security or exposure of citizens’ private data. The City of Los Angeles was caught by surprise when it found that its cloud solution wasn’t in alignment with federal data protection regulations for public safety data, for example.
  2. “Understand the public cloud computing environment offered by the cloud provider and ensure that a cloud computing solution satisfies organizational security and privacy requirements.” The financial advantage of cloud services usually comes from the same advantage that drove Henry Ford’s model of manufacturing efficiency: “You can have any color, as long as it’s black.” Most public cloud services, whether infrastructure-as-a-service, platform-as-a-service, or software-as-a-service, were not built with public-sector regulatory requirements in mind. Agencies need to analyze the gaps between what cloud providers offer and what their own privacy and security demands require, and then determine whether the cost of getting that sort of solution from a cloud provider makes going forward with a project financially feasible.
  3. “Ensure that the client-side computing environment meets organizational security and privacy requirements for cloud computing.” Just because the application and data are secure at the back end in the provider’s cloud doesn’t ensure the overall security of the solution. It’s easy to overlook the client side, which can create a number of potential security problems, especially if SaaS applications include support for mobile devices. It’s important to address issues such as locking down smartphones and other mobile devices so that, if they’re lost or stolen, cached credentials can’t be used to gain access. And there’s also the question of how the public cloud service will integrate with the identity management and authentication standards already in use in the organization.
  4. “Maintain accountability over the privacy and security of data and applications implemented and deployed in public cloud computing environments.” Outsourcing the infrastructure doesn’t mean an organization is outsourcing responsibility. Public cloud should be handled like any managed service or outsourcing arrangement—agencies need to be able to ensure that security and privacy practices are applied consistently and appropriately in the cloud just as they are to internal IT resources. That means that agencies should ensure that they have visibility into the operation of the cloud service, including the ability to monitor the security of the cloud assets, and continually assess how well security and privacy standards and practices are implemented within the cloud infrastructure.