If recent events are any indication, 2019 could bring big changes to agency cloud migration plans. The Federal government appears to be rethinking its role in owning and operating its own data centers, and seriously questioning whether that job is best left to government employees.

“It’s not necessarily an inherent core competency of the Federal government to run high-tech data centers,” said Department of Health and Human Services CTO and Acting CIO Ed Simcox.

That acknowledgment came during the FITARA scorecard hearing held by the House Oversight and Government Reform Committee on Dec. 12, 2018. Under FITARA, each major Federal agency receives a scorecard, with grades from A to F, tracking its progress on IT acquisition and modernization.

Despite a seven-month period of near-government-wide improvement, one FITARA category, the Data Center Optimization Initiative (DCOI), still plagued agencies with middling grades.

New updates to the DCOI metrics are on the way, following a draft and a period of public comment at the end of last year. With those updates come potentially significant changes in how the government tracks the efficiency, and in turn the viability, of the facilities it operates without vendor assistance.

Former Rep. Darrell Issa, R-Calif., who co-authored the FITARA legislation, noted during the December hearing that data centers left in government’s hands struggle mightily with timely upkeep, patching, OS updates, and everything that contributes to strong cybersecurity.

He called attention to his own chamber, noting that the House of Representatives had even requested extended support for an obsolete operating system it was unable to migrate away from in time. He contrasted that with the opportunities afforded by the cloud.

“You’re getting updated on a moment-by-moment basis,” Issa said of service from commercial cloud providers like Amazon Web Services (AWS). Real-time patching and security updates help keep government servers running efficiently and securely. Simcox concurred with the sentiment.

“There are companies that are very, very good at this,” he said. “They leverage economies of scale associated with doing that work, and they offer what I think is extremely important, and that is elasticity. So the amount of compute power and storage and things that you need to run the business of government are available on demand.”

The impetus for government’s move to the cloud, including its new Cloud Smart policy, is clear at this point. And the lagging government data centers? The problem goes back to those optimization metrics. Government-owned data centers deemed essential due to high server utilization aren’t being earmarked for migration to the cloud.

Issa also questioned government-operated data centers that he said “might be considered fully-utilized but are not being managed at the same level as the best contracts would be” in commercial cloud environments.

“Fully-utilized sites with sufficient bandwidth moved to a single cloud still gain huge efficiencies: efficiencies in peak load, efficiencies in the ongoing maintenance,” he said.

Issa acknowledged that there are some “tough nuts” that aren’t easily consolidated and migrated to the cloud, but indicated that there are disagreements over the concept of utilization and the use of metrics as a justification for keeping facilities running.

“Just because we have a data center at full capacity doesn’t necessarily mean we should still be in the business of running data centers in the Federal government,” Simcox added.

The updated DCOI metrics should provide more clarity on where government can operate more efficiently. The draft update removes the facility utilization metric entirely and revises the metrics around server utilization.

The Defense Department has long struggled with its data center efforts, earning failing grades in each of the last four FITARA scorecards. DoD is now poised to adopt an enterprise commercial cloud to alleviate those woes.

As the Pentagon considers its own cloud migration strategy, the issue of security is again coming to the forefront, in light of significant breaches of U.S. Navy contractor and subcontractor networks uncovered last December.

The Wall Street Journal reported that Chinese hackers had gained access to maintenance data, missile plans, and “highly sensitive, classified information about advanced military technology.” Hackers targeted universities hosting military research labs, and contractors big and small.

“It’s extremely hard for the Defense Department to secure its own systems,” said former Homeland Security Advisor Tom Bossert. “It’s a matter of trust and hope to secure the systems of their contractors and subcontractors.”

Control sets like FedRAMP have helped agencies leverage commercial cloud technology securely and with confidence. A recent Gartner report, FedRAMP Demystified, states, “Seven years after FedRAMP launched, there are now two kinds of cloud providers–those with a FedRAMP seal of approval, and those without.” However, the research goes on to note that “in the DoD environment, impact levels fall into four SRG categories,” and therefore, we believe, many cloud providers maintain both FedRAMP and SRG authorizations.

AWS has achieved FedRAMP High and hosts a Secret Region that has received the highest security authorization, IL-6, from DoD. Other cloud providers are trying to keep pace and meet the same stringent requirements; however, no other provider has received an IL-6 authorization.

As one of several means to improve its cyber posture, the Pentagon is expected to award various commercial cloud contracts in 2019. With the threat of breaches continually looming, leveraging leading commercial technology is critical to securing our nation’s most sensitive data and helping protect our country.

Source: Gartner, Inc., “FedRAMP Demystified,” Katell Thielemann, December 17, 2018.

Kate Polit
Kate Polit is MeriTalk's Assistant Copy & Production Editor covering the intersection of government and technology.