Defense Department, Other Federal Agencies

Lots of DISA puns are possible, but I refuse to cloud the issue.

Yes, I’ve been scarce lately. Getting paid by the word will do that to you. Last Friday, I attended the Defense Information Systems Agency’s Forecast to Industry, which is basically where all the vendors come to find out what DISA is looking to buy in the near future.

The big thing at DISA these days is the idea of turning itself into an on-demand IT services provider, where customers–the military services, the White House, and the intelligence community–can go to connect, collaborate, communicate and integrate. In other words, their strategy is to turn themselves into a giant private cloud computing provider for the national leadership, “warfighters” and the agencies that support them.

That vision is a long way off. Right now, DISA is just trying to get its customers to embrace the idea of standardized, virtualized servers that can be provisioned quickly (in a matter of minutes), and to let DISA host their common services, like email and human resources self-service applications, in a virtualized cloud of servers distributed across DISA’s data centers. And the agency is providing services like Forge.mil, a software collaboration environment (currently hosted on Navy servers, but about to be pulled into DISA’s Defense Enterprise Computing Centers on RACE, the agency’s virtualization platform).

John Garing, DISA’s director of strategic planning, was former DISA director Charlie Croom’s CIO. He and DISA CTO Dave Mihelcic have been thinking a lot about the direction DISA needs to move in to serve DOD more quickly, and the model Garing has latched onto looks a lot like Salesforce.com: create a core set of application services that customers can use to get common tasks done (SOA, software-as-a-service, applications and servers on demand); give them a way to develop new services on top of that platform and quickly test, certify and deploy them (the Forge.mil platform, which will soon include TestForge); and get software vendors and integrators to build to the standardized environment, lowering certification, testing and deployment costs and letting customers save by paying for just what they use.

“Forge is the embryo from where that’s going to grow,” Garing told me. “In the middle of June, we had the four services here, and went through all this stuff, including Forge and RACE. Out of that session, Gen. Sorenson (the Army’s CIO and G6) asked us to provide those capabilities forward so that smart lieutenants in Afghanistan could use those tools.”

The “smart lieutenants” are the types of people who put together things like CAVNET, the collaborative website that the Army’s 1st Cavalry used to share ground truth in Iraq. By putting collaboration tools like Forge.mil–which can potentially be used for a lot more than collaborative software development–within reach of officers in Afghanistan over a satcom link, or in a command post at Bagram, for example, DISA could create the opportunity for grassroots, community creation of the next great thing to help save lives and speed the mission.

There’s just one problem with moving DOD to a virtualized cloud infrastructure: the current state of DOD IT is a giant cluster of purpose-bought systems, with point-to-point integration and a whole raft of legacy operating systems and interfaces that would drive most IT managers to tears. The recent Navy Consolidated Afloat Networks and Enterprise Services RFP, for example, apparently asks for backward compatibility with everything back to DOS 3.2, according to one contractor familiar with it, because there are still systems in place that rely on software written with hooks for those legacy operating systems.

Contractors & Vendors, Cyberdefense and Information Assurance, Other Federal Agencies, Policy

Navy's NGEN schedule requires an NMCI holding pattern

At today’s Navy Next Generation Enterprise Network (NGEN) Industry Day in DC, the NGEN program team announced that the holder of the Navy’s current intranet contract, Hewlett-Packard’s EDS, would be approached with a sole-source contract to continue maintaining the outsourced Navy Marine Corps Intranet (NMCI) while the Navy continues the herculean task of getting a whole new network procurement program in place.

The NMCI contract expires in September of 2010. At current projections, because of the size and required oversight for the NGEN contract, NGEN won’t be ready to begin deployment until at least mid-2011. And there’s the small matter of being able to migrate from NMCI, the infrastructure of which is owned by EDS.

So, to bridge the gap, the Navy announced that it would award EDS a sole-source contract to continue supporting the network through the transition period, which is expected to last about 28 months after the contract start. The contract will also include terms under which the government will have continued access to the network during the transition, and will obtain a government-use license for all of the intellectual property NGEN bidders need to figure out how to connect to NMCI.

Other Federal Agencies, People

DC CTO to be White House CIO

Just reported for Internetnews:

DC CTO named as Obama Administration’s CIO

The Obama administration has named Vivek Kundra to be the first-ever White House-level chief information officer.

Kundra is currently chief technology officer for Washington, DC’s city government, and has risen in the public eye because of his innovative approaches to managing the city’s technology projects. As the first-ever Federal Chief Information Officer, he’ll be responsible for managing the entire federal government’s technology portfolio and budget, and overseeing its enterprise architecture.

I’ll link when the story is live.

Other Federal Agencies, People

Nicely done! Nicely takes over as ODNI CIO while Gorman is on leave

The irrepressible Bob Brewin reports over on NextGov that Sherrill Nicely, deputy chief information officer at the Office of the Director of National Intelligence, “has assumed the role of acting CIO while Patrick Gorman, the acting CIO, is on leave. ODNI did not say why Gorman was on leave or for how long.”

Gorman was acting CIO. Does that mean that Nicely is now the acting acting CIO?

Cyberdefense and Information Assurance, Other Federal Agencies, tech

NIST puts head into cloud computing

FCW reports that NIST is assembling a cloud computing team to develop ways to assess the security of applications built with a cloud computing architecture.

“The team will give our customers a sense of what kinds of risks they may be taking on by moving into that new territory,” [Ron Ross, a senior computer scientist and information security researcher at NIST] said today at the SaaS/Gov 2009 conference produced by the Software and Information Industry Association and market research firm Input.

Other Federal Agencies, Policy

DHS Cyber-czar says Federal cyber goals would be nice, thanks

Rod Beckstrom, director of DHS’s National Cybersecurity Center, told Homeland Security conference attendees that “the end state (of a cyber security effort) is not adequately discussed,” and that the Federal government needs to set some concrete goals. “Do we want a stable Internet for commerce, for communication, for intelligence, for information-sharing or for the warfighter to have an electronic advantage in war? We need clearer directives.”

Defense Department, Other Federal Agencies, Policy, tech

Attention (to the) Deficit

At the state level, there’s been at least a partial sigh of relief over the stimulus package (no…not THAT stimulus package). But now comes the knife–President Obama says he’s going to cut the deficit in half by the end of his term with a combination of tax increases (well, non-renewed tax cuts for wealthier Americans) and budget cuts. The biggest piece of the reduction is predicted to come from the savings of the draw-down of troops in Iraq and the resulting reduction in GWOT (that’s Global War On Terror) outlays.

But there are sure to be some serious cuts elsewhere. Considering the ongoing (and expanding) cost of Afghanistan, and that the Defense budget itself is going to be fairly static for at least the next two fiscal years aside from GWOT dollars, the knife will have to fall on other accounts.

Arguably, that could be *good* for government IT spending, because improvements in efficiency through new technology will be key to getting the deficit down in a down economy. But the question is, where to start?

There’s a quick and dirty answer to that: procurement reform. The current approach to developing and purchasing just about anything, but particularly technology, is slow, odious and inefficient.

It’s not that the regulations prevent the government from buying things intelligently– as Charlie Croom said last year in an interview I did with him, “There’s nothing in the FAR that says you have to be stupid.” But there has to be a fundamental change in the culture of development and acquisition–there has to be incentive for reducing scope, investing in real technology standards (de facto, not arbitrary), and increasing flexibility for vendors in solving problems.

A study published by Steve O’Keeffe’s MeriTalk, Red Hat and DLT suggests that there are billions to be saved in a shift to the latest crop of de facto standard technologies:

Over three years, the potential savings would be US$3.7 billion for using open-source software; $13.3 billion for using virtualization technologies; and $6.6 billion from cloud computing or software-as-a-service, the study said.
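
For scale, those three line items add up to roughly $23.6 billion over three years. A quick back-of-the-envelope sketch (the dollar figures are the study’s; the category labels are my shorthand):

```python
# Projected three-year savings from the MeriTalk/Red Hat/DLT study,
# in billions of US dollars. Labels are shorthand, not the study's wording.
savings = {
    "open-source software": 3.7,
    "virtualization": 13.3,
    "cloud / software-as-a-service": 6.6,
}

# Sum the three figures and round away floating-point noise.
total = round(sum(savings.values()), 1)
print(f"Combined projected savings: ${total}B over three years")
```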

While it’s a vendor study–and one sponsored by vendors who stand to make money from a shift to open standards and the like–there’s still plenty to chew on. Sure, there are regulatory hurdles to clear before some of these technologies can be used, but the main barrier to adopting these approaches is cultural.
