Enterprise IT, virtualization

Virtual Integrated System Blog – Government – Are Feds Putting the Cloud Before the Horse?

The Obama administration has made “Cloud First” a key part of its strategy for creating a more efficient government IT infrastructure. But simply adopting cloud-based services for new IT acquisitions isn’t going to make the IT management situation any easier.

“We’re hearing from many agencies that the OMB (Office of Management and Budget) will consider cloud a core element of a larger strategy around IT services,” said Kevin Smith, Dell’s marketing director for the Virtual Integrated System architecture, at a recent Ziff Davis Enterprise eSeminar on cloud in government. “But there has been some criticism from those who feel that agencies should continue to create open, manageable and uniform infrastructures before they start shifting to cloud platforms.”

As I’ve mentioned here previously, the federal government has been pretty aggressive about creating open cloud standards. The Federal CIO Council’s FedRAMP initiative and the National Institute of Standards and Technology’s SAJACC efforts have laid the groundwork for security and interoperability standards for cloud services, and NASA’s contributions to the OpenStack initiative have done a lot to create an open-source implementation of cloud computing that others can build on. But all the interoperability in the world doesn’t help get the government’s application infrastructure out of its sprawling population of data centers and into a shared cloud environment.

Federal IT initiatives will not be well-served by simply buying software-as-a-service, platform-as-a-service, or infrastructure-as-a-service from an approved Apps.gov provider and then throwing virtual servers into the cloud. For federal IT managers trying to consolidate their internal applications in accordance with OMB’s data center goals, adding externally hosted cloud resources to the management stack just creates another degree of complexity in the process, and another set of processes and management tools for IT staff to master.

Read the rest of this post at:

Virtual Integrated System Blog – Government – Are Feds Putting the Cloud Before the Horse?.

Air Force, Boeing, Defense Department, EADS, Policy

Gates: Just say no to “corporate food fights” on tanker procurement

Audio: Gates on Air Force KC-X competition: This morning, in an address to the Air Force Association conference, Secretary Gates announced that the Air Force will be the contracting authority for the KC-X Tanker, but with oversight from contracting officials at the Department of Defense. He expects the release of the Draft RFP for the KC-X Tanker “soon.” “We are committed to the integrity of the selection process and cannot afford the kind of letdowns, parochial squabbles and corporate food fights that have bedeviled this effort over the last number of years.”

Cyberdefense and Information Assurance, Other Federal Agencies

Navy’s NGEN schedule requires an NMCI holding pattern

At today’s Navy Next Generation Enterprise Network (NGEN) Industry Day in DC, the Navy NGEN program team announced that the holder of the Navy’s current intranet program contract, Hewlett-Packard’s EDS, would be approached with a sole-source contract to continue to maintain its outsourced Navy Marine Corps Intranet (NMCI) while the Navy continues with the herculean task of getting a whole new network procurement program in place.

The NMCI contract expires in September of 2010. At current projections, because of the size and required oversight for the NGEN contract, NGEN won’t be ready to begin deployment until at least mid-2011. And there’s the small matter of being able to migrate from NMCI, the infrastructure of which is owned by EDS.

So, to bridge the gap, the Navy announced that it would award EDS a sole-source contract to continue supporting the network through the transition period, which is expected to last about 28 months after the contract start. The contract will also include terms guaranteeing the government continued access to the network during the transition, and a government-use license for all of the intellectual property NGEN bidders will need to figure out how to connect to NMCI.

Defense Department, Other Federal Agencies, Policy, tech

Attention (to the) Deficit

At the state level, there’s been at least a partial sigh of relief over the stimulus package (no…not THAT stimulus package). But now comes the knife–President Obama says he’s going to cut the deficit in half by the end of his term with a combination of tax increases (well, non-renewed tax cuts to wealthier Americans) and budget cuts. The biggest piece of the reduction is predicted to be the savings from the draw-down of troops in Iraq and the resulting reduction in GWOT (that’s Global War On Terror) outlays.

But there are sure to be some serious cuts elsewhere. Considering the ongoing (and expanding) cost of Afghanistan, and the fact that the Defense budget itself will be fairly static for at least the next two fiscal years aside from GWOT dollars, the cutting will have to come from other accounts.

Arguably, that could be *good* for government IT spending, because improvements in efficiency through new technology will be key to getting the deficit down in a down economy. But the question is, where to start?

There’s a quick and dirty answer to that: procurement reform. The current approach to developing and purchasing just about anything, but particularly technology, is slow, odious and inefficient.

It’s not that the regulations prevent the government from buying things intelligently–as Charlie Croom said last year in an interview I did with him, “There’s nothing in the FAR that says you have to be stupid.” But there has to be a fundamental change in the culture of development and acquisition: incentives for reducing scope, investing in real technology standards (de facto, not arbitrary), and giving vendors more flexibility in solving problems.

A study published by Steve O’Keefe’s MeriTalk, Red Hat and DLT suggests that there are billions to be saved in a shift to the latest crop of de facto standard technologies:

Over three years, the potential savings would be US$3.7 billion for using open-source software; $13.3 billion for using virtualization technologies; and $6.6 billion from cloud computing or software-as-a-service, the study said.

While it’s a vendor study–and a study sponsored by vendors who stand to make money from a shift to open standards and the like–there’s still plenty to chew on there. Sure, there are regulatory hurdles to leap before agencies can use some of these technologies, but the main barrier to adopting these approaches is cultural.
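Taken at face value, the study’s three line items add up to a sizable pot. A quick back-of-the-envelope tally of the figures quoted above (in billions of US dollars, over the three-year window):

```python
# Projected three-year federal savings from the MeriTalk/Red Hat/DLT
# study, as quoted above, in billions of US dollars.
savings = {
    "open-source software": 3.7,
    "virtualization": 13.3,
    "cloud computing / SaaS": 6.6,
}

total = sum(savings.values())
print(f"Projected three-year savings: ${total:.1f}B")  # about $23.6B
```

Roughly $23.6 billion over three years, with virtualization accounting for more than half of the projected total.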
