General Services Administration, Google, Unisys, virtualization

Why You Won’t See Many Repeats of GSA/Google Apps Deal For Now

In case you missed it, the GSA recently announced that it had awarded Unisys (and its subs: Google, Tempus Nova, and Acumen Solutions) a contract to implement Google Apps for Government for up to 17,000 GSA employees. With the deal (and a projected 50% savings on collaboration systems over the next five years), GSA has jumped out in front on the whole “cloud first” policy thing.

Yeah, yeah. We won’t see that again anytime soon. Here’s why. First of all, Google Apps is a public cloud solution, residing out on the naked Internet. It may be FISMA compliant, but all the data is still going to live in Google’s data centers, and there are no private pipes to those data centers; you have to traverse the public Internet to reach them. Depending on which agency you are, or which regulations you need to comply with, that may be an automatic non-starter.

For example, when Minnesota’s state government was looking at cloud for collaboration, it ended up going with Microsoft because state regs are strict about state data not traversing public networks.

Google isn’t offering Google Apps for Government as an appliance, like they do with their search engine, and there’s no hint that they’re considering offering it in a customer’s private cloud deployment model at all. They may be FISMA compliant, but they’ll never be DOD 5200 or 5015 or 5xxx.anything compliant as long as they’re in the public cloud. Ideally, they could offer a government private cloud version via Apps.gov. But I don’t see Google letting the government try to run Google Apps on anybody’s cloud hardware but their own, because they’ve tweaked the heck out of their hardware environment to support it. I can only imagine how long it would take to load my inbox if Gmail were running on a 64x overprovisioned virtual server in the secure data center of the lowest bidder.

And then there is FedRAMP. While GSA is accepting Google’s FISMA compliance for now, and certifying the service as usable, it will have to go through certification all over again under FedRAMP. And more cautious agencies will no doubt wait until there’s more clarity about the FedRAMP process, and about what will and won’t get certified by it, before they actually go out and contract with someone to provide a public cloud service.

On the plus side, it would seem that Google Apps could play well with the never-ending directory services juggling that agencies (and especially DOD) have to do. Google offers Google Apps Directory Sync, which connects to an LDAP directory for provisioning. There are also ways to turn Google Apps into a managed directory service, which I imagine would be interesting for organizations that regularly create and dissolve communities of interest for collaboration, many of which have agency acronyms with three letters. But again, the lack of a private cloud option sort of makes that moot.
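For the curious, here’s a minimal sketch of the pattern a tool like Google Apps Directory Sync automates: read user entries out of an agency LDAP directory, then hand them to a provisioning step on the Google side. This is an illustration only, written in Python with the ldap3 library; the host, base DN, and credentials are hypothetical placeholders, and the Google-side call is stubbed out rather than using Google’s actual provisioning API.

# Sketch of an LDAP-to-Google-Apps provisioning sync (illustrative only).
# Assumes the ldap3 library; the host, base DN, bind account, and password
# below are hypothetical, not a real agency directory.
from ldap3 import Server, Connection, ALL

LDAP_HOST = "ldap.example.gov"           # hypothetical directory server
BASE_DN = "ou=people,dc=example,dc=gov"  # hypothetical search base

def fetch_users(conn):
    # Pull every person entry, with the attributes a sync would need.
    conn.search(BASE_DN, "(objectClass=inetOrgPerson)",
                attributes=["uid", "mail", "cn"])
    return [(e.uid.value, e.mail.value, e.cn.value) for e in conn.entries]

def provision(users):
    # Stand-in for the Google side: the real tool pushes these accounts
    # to Google Apps through Google's provisioning interfaces.
    for uid, mail, cn in users:
        print("would provision %s <%s> (uid=%s)" % (cn, mail, uid))

server = Server(LDAP_HOST, get_info=ALL)
with Connection(server, user="cn=sync,dc=example,dc=gov",
                password="change-me", auto_bind=True) as conn:
    provision(fetch_users(conn))

The real tool layers scheduling, exclusion rules, and conflict handling on top of that basic loop, but the core of it is just this: enumerate the authoritative directory, then reconcile the hosted accounts against it.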

In many ways, it’s a shame that Google hasn’t found a way to provide a private cloud service. Google Apps could take on a significant percentage of the collaboration needs of many government agencies as-is, if only it could run in a private cloud configuration, or if Google could find some FISMA-compliant hosting sites to handle it. DOD has been stumbling over how to “SaaS-ify” email for a few years now. But I’m sure someone will take advantage of the opening…eventually.

Army, Joint Combatant Commands, Raytheon

Video – Raytheon’s First Person Shooter

http://www.youtube.com/v/QfyxJvGG4Jg&hl=en&fs=1&rel=0

A (poorly recorded) video of Raytheon’s demonstration at AUSA of the company’s Counter-IED trainer — a full-immersion simulation the company has developed for squad-level training, putting troops in a highly realistic 3-D environment that physically stresses them in ways similar to actual patrols.

Sorry for the quality — this was recorded on an iPod Nano.

Army, Lockheed Martin, Sensors, tech

Q&A: CERDEC’s Charlie Maraldo on C4ISR On-the-Move ’09 and the Persistent Surveillance Testbed

At last month’s C4ISR On-the-Move Event ’09 exercise, the Army’s Communications-Electronics Research, Development and Engineering Command (CERDEC) hosted an additional event – the Persistent Surveillance Testbed, run out of Naval Air Engineering Station Lakehurst. In addition to the Lockheed Airborne Multi-Intelligence Lab, CERDEC tested two other ISR platforms – an internal electronic intelligence and electronic warfare project called Sledgehammer, and a prototype acoustic Hostile Fire Indicator (HFI).

Last week, I spoke with Charlie Maraldo, a special projects manager with the Intelligence and Information Warfare Directorate (I2WD) at CERDEC, about the Lockheed AML test and the other elements of the Persistent Surveillance Testbed. Here’s the transcript:

Charlie Maraldo: Today, we can network disparate types of systems — sensor systems, EW systems, ISR systems — ingest their data into DCGS-A, normalize it in a database that is then accessible via other tools out on the data enterprise, and then allow that information to be posted, pulled, or otherwise sent down to warfighters, you know, right down to the edge. That was our objective, and AML was a part of that, and a big part. So let’s talk about that for a little bit.

So, Lockheed Martin has a CRADA with RDECOM and I2WD, and as part of that CRADA we have an ongoing technical exchange of information with them. They made us aware several months ago that they were developing a testbed capability, which was the AML. It’s a capital asset of theirs — we don’t have any control over it and can’t tell them what to do with it — it’s a solely Lockheed Martin entity. But we talked about ways that we could cooperate using it, and one idea was to have them participate in the C4ISR On-the-Move demo, as a sub-element of our Persistent Surveillance Testbed capstone demonstration that we were running at I2WD, which was part of the C4ISR On-the-Move E09 demo. So that’s what we did.

Continue reading

Army, Contractors & Vendors, Lockheed Martin, Sensors

Q&A – Lockheed’s Airborne Multi-Intelligence Lab

Last month, Lockheed Martin brought an independently developed test aircraft, called the Airborne Multi-Intelligence Lab (AML), to the Army’s C4ISR On-the-Move exercise, which took place at and near Ft. Dix and Lakehurst, New Jersey.

[Photo: Lockheed Martin’s Airborne Multi-Intelligence Laboratory]

The AML is a repurposed used Gulfstream III corporate jet equipped with a large radome and commercial electronics racks. The aircraft is designed for testing the integration of multiple sensors and open-architecture intelligence, surveillance and reconnaissance systems: analysts aboard aggregate data from the sensors and pass it to operators on the ground.

I spoke with Lockheed’s Jim Quinn, vice president, and John Beck and Mark Wand, both with Lockheed’s business development group. Here’s the interview:

Jim Quinn: A little over 10 or 11 months ago, Lockheed Martin made some decisions, investment decisions in particular, that looked at where the customer set was going — some of their higher-priority needs. This was driven both internationally and domestically by the importance of intelligence, surveillance and reconnaissance in supporting operations around the globe.

We recognized that a lot of the difficulty our customers were having was in trying to take advantage of multiple sensors, and in fusing and correlating that data in a way that provided meaningful and actionable intelligence, whether to warfighters on the edge or to a command post or ground station trying to turn that information into usable knowledge.

We know that a lot of the platforms and sensors in operation around the world do that in a single-int fashion. They are dedicated platforms that collect a single form of intelligence, whether it be synthetic aperture radar, FLIR (forward-looking infrared) and other kinds of electro-optic sensors, or a SIGINT (signals intelligence) sensor, and then usually that data is transported by data link to some sort of ground station, and in many cases those ground stations are dedicated to the platform and the sensor they are affiliated with. So we recognized the value of having, at our customers’ disposal and for our own experimentation, a platform that could plug-and-play various sensors in a multi-intelligence configuration. That would allow us to investigate how we take multiple inputs from sensors, and then either cross-cue them or show the benefit of merging and synthesizing that data onboard the platform, and then pushing it down to the users on the ground, whether that is a ground station or a user on the edge.

So we made an investment and procured a used Gulfstream III on the aircraft market, with partners we worked with in industry. We constructed a first set of sensors, and perhaps more importantly, we put on the aircraft a hardware and software infrastructure that allowed those sensors to be plugged and played over time — that is, we could configure the hardback of the aircraft and its software infrastructure to take a sensor from various suppliers, whether one of our own or from a supplier in industry that wanted to partner with us, put it onboard the aircraft, and do that very, very quickly.

Continue reading
