State & Local, tech, virtualization

Virtualized Desktops at the Library Check Out IT Savings

(This post originally appeared on the Virtual Integrated Systems public sector blog.)

My wife is a librarian at a county public library. Not to brag, but she excels at helping her patrons find what they’re looking for, either in the stacks or in databases. But when she became a librarian, she wasn’t expecting the degree to which she’d be called upon to provide another service: tech support.

Public libraries have long been a core service of local governments, but their mission has changed significantly over the last decade as more of our lives have moved online. Libraries are now where people who don’t have PCs at home or work go to do everything from checking their e-mail to applying for jobs, and librarians are asked to help with basic computer literacy at least as often as they’re asked a research question or asked to recommend a book.

But managing the configuration and security of public computers at the library can be an expensive undertaking. With budgets shrinking, adding more computers or even maintaining the ones that are in place can be difficult. The cost of adding software licenses for operating systems and applications can quickly outstrip the basic hardware cost. And with patrons bringing removable media to the library and accessing potentially malicious sites, the security risks are high.

Given their limited number of PCs, libraries have to restrict how long patrons can use the systems, and the software used to meter usage and assign computers often creates difficulties for both the patrons and the librarians who serve them.

Then there’s the question of how to better serve patrons who bring their own technology to the library and want to use resources such as databases. While some libraries offer Web portals to access these services, the cost of setting up such systems can be prohibitive for mid-sized and smaller public libraries, especially when budgets are tight.

The City of Staunton, Virginia, for example, faced many of these problems with the operation of its public library, according to Kurt Plowman, the city’s CTO. Mounting maintenance problems and malware issues left the library’s computers unusable as much as 50 percent of the time.

“Our resources were stretched thin, so spending several hours a week fixing software problems and replacing parts was becoming a never-ending nightmare,” he said recently. “The public library was overdue for a solution.”

Plowman turned to desktop virtualization as a solution, using thin clients from Pano Logic to replace the library’s aging desktops. The city used VMware to serve up virtual desktops on demand to the terminals, clearing each session when it ended and preventing users from storing any data on a shared hard disk.

With virtual desktops, each user gets a fresh, controlled configuration, locked down from potential security threats. That means fewer helpdesk calls, fewer frustrated patrons, and much lower desktop support costs for the city, which is considering expanding the virtual desktop model to other departments of city government.
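The mechanics are worth spelling out. Below is a minimal, hypothetical Python sketch of the non-persistent session model, not the VMware and Pano Logic stack Staunton actually runs: every patron session starts from a clone of a locked-down golden image, and the clone, along with anything saved to it, is discarded when the session ends.

```python
import copy
import uuid

# Hypothetical illustration of a non-persistent desktop pool; the names and
# structure are invented for clarity, not taken from VMware or Pano Logic.
GOLDEN_IMAGE = {"os": "patched build", "apps": ["browser", "office suite"], "files": []}

class DesktopPool:
    def __init__(self, golden_image):
        self.golden_image = golden_image
        self.sessions = {}  # session id -> disposable clone of the golden image

    def check_out(self):
        """Start a patron session from a fresh clone of the locked-down image."""
        session_id = str(uuid.uuid4())
        self.sessions[session_id] = copy.deepcopy(self.golden_image)
        return session_id

    def check_in(self, session_id):
        """End the session; the clone and anything saved to it is thrown away."""
        self.sessions.pop(session_id, None)

pool = DesktopPool(GOLDEN_IMAGE)
sid = pool.check_out()
pool.sessions[sid]["files"].append("resume.doc")  # the patron's changes live only in the clone
pool.check_in(sid)                                # the next patron starts from the golden image again
```

Because nothing persists past check-in, a misconfiguration or infection lasts exactly one session, which is where the support savings come from.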

The City of Staunton was recognized for this solution with a Governor’s Technology Award for Innovation in Local Government at the Commonwealth of Virginia Technology Symposium in September 2010.

Standard
cloud computing, NASA, sticky, tech, virtualization

NASA’s Chris Kemp calls OpenStack the “Linux of the cloud” and predicts a public cloud future

Chris Kemp, NASA’s CTO for IT, closed out yesterday’s Cloud/Gov conference in DC with a discussion of Nebula, NASA’s open-source cloud-in-a-shipping-container, and the impact it has had on the agency. Kemp drew more enthusiasm from the audience than any other speaker, including whoops from some of the government employees and vendors in the room, and for good reason: Nebula has become the gravitational center of cloud standards efforts both inside and outside the government.

“While (the National Institute of Standards and Technology) is talking about standards, there are de facto standards that are evolving right now,” Kemp said. And Nebula, he said, “is a reference implementation of what NIST is doing.”

The Nebula project’s code has become the core of the OpenStack initiative, the open-source cloud infrastructure software project, and now is maintained by a community that includes Intel, AMD, Dell, Rackspace, and an army of other technology companies. “There are over 1000 developers that have submitted features and bug fixes,” Kemp said, “and over 100 companies.  If you’re interested in doing a cloud, you can download OpenStack today.  It’s the Linux of the cloud–it gives you an environment you can actually develop on and meet a requirement, and build your environment on, on a platform that’s compatible with everything in the industry.”
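To make “you can download OpenStack today” concrete, here is roughly what booting an instance looks like with the current openstacksdk Python library (a later tool than the nova-era clients of 2011); the cloud name, image, flavor, and network below are placeholders for whatever your deployment defines.

```python
import openstack

# Connect using credentials defined for a cloud named "demo" in clouds.yaml.
# The image, flavor, and network names are placeholders for your deployment.
conn = openstack.connect(cloud="demo")

image = conn.compute.find_image("cirros")
flavor = conn.compute.find_flavor("m1.small")
network = conn.network.find_network("private")

# The same call works against any OpenStack cloud, public or private.
server = conn.compute.create_server(
    name="openstack-hello",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.status)
```

That portability across providers is the compatibility Kemp is pointing at.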

Kemp said he believed the public cloud could be as secure as private clouds, but that private clouds were a “necessary stepping stone” to the day when NASA no longer has to be in the IT business, needed to demonstrate that cloud environments can be completely secure. And by moving to a private cloud, agencies are doing the majority of the work required to get to the point where they can move to a public cloud infrastructure.

“Once you virtualize an application, you’re more than halfway there,” Kemp said.  “Every agency that builds a private cloud takes us 90% of the way to where we’ll be able to put everything in the public cloud.”

Still, Kemp said, it will be decades before agencies are able to make that jump completely. “We’ve only scratched the surface of this. We still have mainframe systems running that were coded in the ’70s. They’re systems we just haven’t taken the time to make run in Oracle or SQL Server. Moving something to cloud is a thousand times bigger a challenge.” The only applications written to take advantage of cloud features so far are those that were written for the cloud to begin with, such as Google’s apps and Zynga’s game platforms.

Kemp emphasized that cloud infrastructure and data center consolidation were not synonymous.  “One thing that I hope happens is that you treat data center consolidation and cloud as separate things. If you’re virtualizing existing applications, you need the support of commercial systems. But if you’re doing really pioneering development, and can’t use Amazon, then you need something like (Nebula).”

Standard
Enterprise IT, virtualization

Virtual Integrated System Blog – Government – Are Feds Putting the Cloud Before the Horse?

The Obama administration has made “Cloud First” a key part of its strategy for creating a more efficient government IT infrastructure. But simply adopting cloud-based services for new IT acquisitions isn’t going to make the IT management situation any easier.

“We’re hearing from many agencies that the OMB (Office of Management and Budget) will consider cloud as a core element as part of a larger strategy around IT services,” said Kevin Smith, Dell’s marketing director for the Virtual Integrated System architecture, at a recent Ziff Davis Enterprise eSeminar on cloud in the government. “But there has been some criticism, from those who feel that agencies should continue to create open, manageable and uniform infrastructures before they start shifting to cloud platforms.”

As I’ve mentioned here previously, the federal government has been pretty aggressive about creating open cloud standards. The Federal CIO Council’s FedRAMP initiative and the National Institute of Standards and Technology’s SAJACC efforts have laid the groundwork for security and interoperability standards for cloud services, and NASA’s contributions to the OpenStack open-source cloud infrastructure initiative have done a lot to create an open implementation of cloud computing that others can build on. But all the interoperability in the world doesn’t help get the government’s application infrastructure out of its sprawling population of data centers and into a shared cloud environment.

Federal IT initiatives will not be well-served by simply buying software-, platform-, or infrastructure-as-a-service from an approved Apps.gov provider and then throwing virtual servers into the cloud. For federal IT managers trying to consolidate their internal applications in accordance with OMB’s data center goals, adding externally hosted cloud resources to the management stack just creates another degree of complexity, and another set of processes and management tools the IT staff has to master.

Read the rest of this post at:

Virtual Integrated System Blog – Government – Are Feds Putting the Cloud Before the Horse?

Standard
General Services Administration, Google, Unisys, virtualization

Why You Won’t See Many Repeats of GSA/Google Apps Deal For Now

In case you missed it, the GSA recently announced that it had awarded Unisys (and its subcontractors Google, Tempus Nova, and Acumen Solutions) a contract to implement Google Apps for Government for up to 17,000 GSA employees. With the deal (and a projected 50% savings on collaboration systems over the next 5 years), GSA has jumped out in front on the whole “cloud first” policy thing.

Yeah, yeah. Won’t see that again anytime soon. Here’s why. First of all, Google Apps is a public cloud solution, residing out on the naked Internet. While it may be FISMA compliant, all the data is still going to live in Google’s data centers, and there are no private pipes to them; you have to traverse the public Internet to get there. Depending on which agency you are, or what regulations you need to comply with, that may be an automatic non-starter.

In Minnesota, for example, when the state government was looking at cloud-based collaboration, it ended up going with Microsoft because state regulations are strict about state data not traversing public networks.

Google isn’t offering Google Apps for Government as an appliance, like it does with its search engine. There’s no hint that it’s considering offering it in a customer’s private cloud deployment model at all. It may be FISMA compliant, but it’ll never be DOD 5200 or 5015 or 5xxx.anything compliant as long as it’s in the public cloud. Ideally, Google could offer a government private cloud version via Apps.gov. But I don’t see Google letting the government try to run Google Apps on anybody’s cloud hardware but its own, because it has tweaked the heck out of its hardware environment to support it. I can only imagine how long it would take to load my inbox if Gmail were running on a 64x overprovisioned virtual server in the secure data center of the lowest bidder.

And then there is FedRAMP.  While GSA is accepting Google’s FISMA compliance for now, and certifying it as usable, it will have to go through certification with FedRAMP all over again.  And more cautious agencies will no doubt wait until there’s more clarity about the FedRAMP process, and what will and won’t get certified by it, before they actually go out and contract someone to provide a public cloud service.

On the plus side, it would seem that Google Apps could play well with the never-ending directory services juggling that agencies (and especially DOD) have to do.  Google has Google Apps Directory Sync to connect to LDAP for provisioning. There are ways to turn Google Apps into a managed directory service as well, which I would imagine would be interesting for organizations that create and dissolve communities of interest regularly for collaboration, many of which have agency acronyms with three letters.  But again, the lack of a private cloud option sort of makes that moot.
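To give a sense of what that directory plumbing looks like in practice: Google Apps Directory Sync is a packaged tool, but the LDAP side it reads from is ordinary. Here is a hedged sketch, using the ldap3 Python library with placeholder host, credentials, and base DN, of the kind of query a provisioning sync runs against an agency directory.

```python
from ldap3 import Server, Connection, ALL, SUBTREE

# Placeholder host, bind credentials, and base DN; substitute your directory's values.
server = Server("ldap.example.gov", get_info=ALL)
conn = Connection(server,
                  user="cn=sync-service,ou=service,dc=example,dc=gov",
                  password="change-me",
                  auto_bind=True)

# Pull every person with a mail attribute: the set of accounts a sync tool
# would provision (or deprovision) in the hosted collaboration suite.
conn.search(search_base="ou=people,dc=example,dc=gov",
            search_filter="(&(objectClass=person)(mail=*))",
            search_scope=SUBTREE,
            attributes=["mail", "givenName", "sn"])

for entry in conn.entries:
    print(entry.mail, entry.givenName, entry.sn)
```

The same query, scoped to a different base DN or group filter, is what would define a new community of interest; it is the hosting model, not the directory integration, that holds things up.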

In many ways, it’s a shame that Google hasn’t found a way to provide a private cloud service.  Google Apps could take on a significant percentage of the collaboration needs of many government agencies as-is, if only they could run it in a private cloud configuration, or find some FISMA-compliant hosting sites to handle it for them. DOD has been stumbling over how to “SaaS-ify” email for a few years now.   But I’m sure someone will take advantage of the opening…eventually.

Standard
Disaster Recovery, State & Local, virtualization

Reaching Out to Other Schools to Make Virtualization a Better Fit

While virtualization may be too much for smaller schools to bite off on their own, collaborating with other school districts, universities, or other learning institutions can make it more accessible. It can also increase the payoff of virtualization for everyone involved. But it takes planning and some standardization to get the maximum benefit for all.

Take, for example, the K-12 Disaster Recovery Consortium (DRC), launched by Alvarado Independent School District’s Executive Director of Technology Services Kyle Berger. Berger purchased storage virtualization technology from Compellent, but soon realized that he wasn’t able to get the same sort of disaster recovery benefit as large private sector customers.

“Big businesses that have storage area networks have them replicating across their businesses’ locations around the world,” he said. Because of the threat of hurricanes hitting the Gulf coast of Texas, Berger said, he saw a need for disaster recovery capabilities to support not just his schools, but school districts further south in the state. “We wanted the ability to offload those school districts in case of emergency,” he explained. “I had a fellow IT director in the southern part of the state whose disaster plan was actually unscrewing his racks, putting them in a truck, and driving them north.”

Read the rest at the Virtual Integrated System Blog.

Standard