
State, Local Agencies Should Examine NIST's Public Cloud Guidelines

(This post was originally published on the Virtual Integrated System Blog)

As I mentioned in a recent post, the National Institute of Standards and Technology recently published a document outlining the risks of cloud computing and offering policies and procedures to help reduce those risks. While the guidelines aren’t official federal policy yet, they are a good starting point for agencies at any level of government thinking about using public clouds as a part of their cost-cutting and consolidation of IT services.

The core guidelines of the NIST document come down to four main steps in preparing for a public cloud solution:

  1. “Carefully plan the security and privacy aspects of cloud computing solutions before engaging them.” Before even looking at cloud solutions, an organization should fully understand the privacy and security requirements of the data that will be handled. Skipping that due diligence can lead to roadblocks later or, worse, to major security breaches and the exposure of citizens’ private data. The City of Los Angeles, for example, was caught by surprise when it found that its cloud solution didn’t align with federal data protection regulations for public safety data.
  2. “Understand the public cloud computing environment offered by the cloud provider and ensure that a cloud computing solution satisfies organizational security and privacy requirements.” Most public cloud services, whether infrastructure-as-a-service, platform-as-a-service, or software-as-a-service, were not built with public sector regulatory requirements in mind. Agencies need to analyze the gaps between what cloud providers offer and what their own privacy and security demands require, and then determine whether the cost of closing those gaps still makes the project financially feasible.
  3. “Ensure that the client-side computing environment meets organizational security and privacy requirements for cloud computing.” Just because the application and data are secure at the back end in the provider’s cloud doesn’t make the overall solution secure. It’s easy to overlook the client side, which can create a number of security problems, especially if SaaS applications include support for mobile devices. It’s important to consider how to lock down smartphones and other mobile devices so that, if they’re lost or stolen, they can’t reach internal resources through cached credentials. There’s also the question of how the public cloud service will integrate with the identity management and authentication standards already in use in the organization.
  4. “Maintain accountability over the privacy and security of data and applications implemented and deployed in public cloud computing environments.” Outsourcing the infrastructure doesn’t mean an organization is outsourcing responsibility. Public clouds should be handled like any other managed service or outsourcing arrangement: agencies need to ensure that security and privacy practices are applied as consistently in the cloud as they are to internal IT resources. That means agencies should have visibility into the operation of the cloud service, including the ability to monitor the security of cloud assets and continually assess how well security and privacy standards and practices are implemented within the cloud infrastructure (a rough sketch of what that kind of monitoring might look like follows this list).
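On that last point, the practical test of accountability is whether the provider gives the agency enough visibility to watch its own assets. As a purely illustrative sketch, not something the NIST document prescribes, here is roughly what a minimal audit-log check could look like. The endpoint, token, and event names are hypothetical placeholders for whatever audit API a given provider actually exposes:

```python
# Illustrative monitoring sketch; the endpoint, token, and event names below are
# hypothetical placeholders, not a real provider's API.
import ipaddress
import requests

AUDIT_LOG_URL = "https://cloud-provider.example/api/v1/audit-events"  # hypothetical endpoint
API_TOKEN = "REPLACE_WITH_AGENCY_SERVICE_TOKEN"                       # hypothetical credential
AGENCY_NETWORKS = [ipaddress.ip_network("203.0.113.0/24")]            # example agency range


def fetch_events(since_iso8601):
    """Pull recent audit events from the provider's (hypothetical) log API."""
    resp = requests.get(
        AUDIT_LOG_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        params={"since": since_iso8601},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("events", [])


def is_internal(source_ip):
    """True if the event came from an address inside the agency's own networks."""
    addr = ipaddress.ip_address(source_ip)
    return any(addr in net for net in AGENCY_NETWORKS)


def flag_suspicious(events):
    """Flag privileged actions performed from outside agency networks."""
    privileged = {"user.create", "role.grant", "data.export"}  # hypothetical event names
    return [
        e for e in events
        if e.get("action") in privileged and not is_internal(e.get("source_ip", "0.0.0.0"))
    ]


if __name__ == "__main__":
    for event in flag_suspicious(fetch_events("2011-01-01T00:00:00Z")):
        print(f"REVIEW: {event['action']} by {event.get('actor')} from {event.get('source_ip')}")
```

The specific checks matter less than the principle: if the contract and the provider's APIs don't expose this kind of data at all, the agency has no way to verify that its standards are actually being met.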

 

At the end of the day, after assessing how well public cloud providers can handle the requirements of government applications, agencies may find that much of what they thought could be moved to a public cloud environment is better suited to a private cloud service.


McNealy’s Monday Morning Quarterbacking on Solaris and Linux … shows he still doesn’t get it.

Scott “Privacy Is Dead” McNealy told an audience at an event in Silicon Valley that Sun could have won out over Linux if the company had consistently pushed Solaris x86 forward instead of pussyfooting around. “Google today would be running on Solaris,” he said.

Um, no.

Solaris was, and is, a great operating system, to be sure. But Linux did not succeed because of Sun’s failure to commit to Intel.  Linux succeeded because of the open-source model, and the ability of IT people all over the world to try it without license restrictions.

If Sun had open-sourced Solaris early, it might well have put a dent in Linux’s success. But that’s a big if. Considering how much internal wrangling, legal finagling, and patent-exchanging it took to open-source Solaris on the timeline Sun actually managed, and even then under the somewhat restrictive terms of a custom-rolled open-source license that split Solaris off to some degree from other open-source communities, it’s doubtful McNealy could have pulled it off much earlier. It wasn’t until 2005 that Sun cleared the legal hurdles to open-source Solaris.

There are so many other “woulda, shoulda, coulda” moments in Sun’s history. McNealy deserves credit for recognizing early what would become cloud computing, which he called “application dial-tone.” But Sun had multiple opportunities to redefine the market with open source early on, with both Java and Solaris. The company’s toe-dips with its investments in OpenOffice (via its acquisition of StarOffice), GNOME, MySQL, and other open-source projects came only after Linux had already become a major threat. And honestly, Sun did those things to put a thumb in Microsoft’s eye.

So, McNealy can look back and replay the game all he wants. But it won’t change the fact that Sun was caught up in SPARC, and failed to leverage Solaris and Java to transition into an open-source-driven software services company that also sold hardware. And that’s why Larry Ellison owns Sun now.
