State & Local, tech, virtualization

Virtualized Desktops at the Library Check Out IT Savings

(This post originally appeared on the Virtual Integrated Systems public sector blog.)

My wife is a librarian at a county public library. Not to brag, but she excels at helping her patrons find what they’re looking for, either in the stacks or in databases. But when she became a librarian, she wasn’t expecting the degree to which she’d be called upon to provide another service: tech support.

Public libraries have long been a core service of local governments, but their mission has changed significantly over the last decade as more of our lives have moved online. Libraries are now where people without PCs at home or work go to do everything from checking their e-mail to applying for jobs, and librarians are asked to help with basic computer literacy at least as often as they're asked to answer a research question or recommend a book.

But managing the configuration and security of public computers at the library can be an expensive undertaking. With budgets shrinking, adding more computers or even maintaining the ones that are in place can be difficult. The cost of adding software licenses for operating systems and applications can quickly outstrip the basic hardware cost. And with patrons bringing removable media to the library and accessing potentially malicious sites, the security risks are high.

Given their limited number of PCs, libraries have to restrict the amount of time patrons can use systems. The software they use to meter usage and assign computers can often create difficulties both for the patrons and the librarians who serve them.

Then there’s the question of how to better serve patrons who bring their own technology to the library and may wish to use resources such as databases. While some libraries offer Web portals to access these services, the cost of setting up such systems can be prohibitive for mid-sized and smaller public libraries, especially when budgets are tight.

The City of Staunton, Virginia, for example, faced many of these problems with the operation of its public library, according to Kurt Plowman, the city’s CTO. Mounting maintenance problems and malware issues left the library’s computers unusable as much as 50 percent of the time.

“Our resources were stretched thin, so spending several hours a week fixing software problems and replacing parts was becoming a never-ending nightmare,” he said recently. “The public library was overdue for a solution.”

Plowman turned to desktop virtualization as a solution, replacing the library’s aging desktops with thin clients from Pano Logic. The city used VMware to serve up virtual desktops on demand to the terminals, clearing each session at its end and preventing users from storing any data on a shared hard disk.

With virtual desktops, each user gets a fresh, controlled configuration, locked down from potential security threats. That means fewer helpdesk calls, fewer frustrated patrons, and much lower desktop support costs for the city, which is considering expanding the virtual desktop model to other departments of city government.

The City of Staunton was recognized for this solution with a Governor’s Technology Award for Innovation in Local Government at the Commonwealth of Virginia Technology Symposium in September 2010.


cloud computing, NASA, sticky, tech, virtualization

NASA’s Chris Kemp calls OpenStack the “Linux of the cloud” and predicts a public cloud future.

Chris Kemp, NASA’s CTO for IT, closed out yesterday’s Cloud/Gov conference in DC with a discussion of Nebula, NASA’s open-source cloud-in-a-shipping-container, and its impact on the agency. Kemp drew the most enthusiastic response of any speaker, including whoops from some of the government employees and vendors in the audience, and for good reason: Nebula has become the gravitational center of cloud standards efforts both within and outside the government.

“While (the National Institute of Standards and Technology) is talking about standards, there are de facto standards that are evolving right now,” Kemp said. And Nebula, he said, “is a reference implementation of what NIST is doing.”

The Nebula project’s code has become the core of the OpenStack initiative, the open-source cloud infrastructure software project, which is now maintained by a community that includes Intel, AMD, Dell, Rackspace, and an army of other technology companies. “There are over 1,000 developers that have submitted features and bug fixes,” Kemp said, “and over 100 companies. If you’re interested in doing a cloud, you can download OpenStack today. It’s the Linux of the cloud: it gives you an environment you can actually develop on and meet a requirement, and build your environment on, on a platform that’s compatible with everything in the industry.”
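As a purely illustrative aside (not part of Kemp’s talk): with later OpenStack client tooling, pointing the standard command-line client at a downloaded OpenStack deployment amounts to supplying endpoint and credential details in a `clouds.yaml` file. The cloud name, endpoint, project, and credentials below are all placeholders:

```yaml
# clouds.yaml -- client-side configuration for an OpenStack cloud.
# Every value here is a placeholder; substitute your own deployment's
# Keystone auth endpoint, project, region, and credentials.
clouds:
  my-openstack-cloud:
    auth:
      auth_url: https://cloud.example.gov:5000/v3
      project_name: demo-project
      username: demo-user
      password: change-me
    region_name: RegionOne
```

With a file like this in place, the `openstack` client can target the deployment by name, e.g. `openstack --os-cloud my-openstack-cloud server list`.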

Kemp said he believed the public cloud could be as secure as private clouds, but that private clouds were a “necessary stepping stone”: a way to demonstrate that cloud environments can be completely secure before the day when NASA no longer has to be in the IT business. And by moving to a private cloud, agencies do the majority of the work required to get them to the point where they can move to a public cloud infrastructure.

“Once you virtualize an application, you’re more than halfway there,” Kemp said.  “Every agency that builds a private cloud takes us 90% of the way to where we’ll be able to put everything in the public cloud.”

Still, Kemp said, it will be decades before agencies are able to make that jump completely. “We’ve only scratched the surface of this. We still have mainframe systems running that were coded in the ’70s. They’re systems we just haven’t taken the time to make run in Oracle or SQL Server. Moving something to the cloud is a thousand times bigger a challenge.” The only applications written to take advantage of the cloud’s features so far are those that were written for the cloud to begin with, such as Google’s apps and Zynga’s game platforms.

Kemp emphasized that cloud infrastructure and data center consolidation were not synonymous. “One thing that I hope happens is that you treat data center consolidation and cloud as separate things. If you’re virtualizing existing applications, you need the support of commercial systems. But if you’re doing really pioneering development, and can’t use Amazon, then you need something like (Nebula).”