For agencies to achieve the data center of tomorrow, they must first understand the data that they have today, according to industry experts speaking at MeriTalk’s Data Center Brainstorm on Thursday.

“The root of the problem is understanding the data that you have in your environment today to make the informed decisions to go in the direction that you need to go in the future. The technology of tomorrow and the technology of today can get you that data center of tomorrow […] but it doesn’t really mean anything if you don’t understand the data,” said Brian Houston, chief technology officer for Hitachi Data Systems Federal. “Understanding the data that you have today and making informed decisions will give you the data center of tomorrow.”

Doug Bourgeois, managing director of Federal technology strategy and architecture at Deloitte, explained that agencies should strive for a heterogeneous model of data centers and clouds, because different commercial services are better at addressing different needs.

“Not all clouds are created equal, not all data centers are created equal. They’re built to a specific purpose,” Bourgeois said. “In my opinion, the data center of tomorrow is by and large commercial.”

“The very primary thing is being able to understand where I am today and what I’m capable of. That’s the challenge. Once you get there and you start to build a template from that, you’ll find it’s much, much easier,” said Anthony Vicinelly, Federal technology director at Nlyte Software, adding that sometimes previous investments can actually help accomplish a mission more easily than buying something new.

Vicinelly explained that customers often assume they can simply keep plugging new technology into their existing systems, rather than setting priorities for what they actually need.

“Where I’ve seen people become very successful is that there’s a decision process of: Should we buy that or should it be procured from the cloud?” Vicinelly said.

Bourgeois suggested that agencies purchasing something new from a vendor go into the process already knowing what their environment looks like and what problems they typically encounter. That way, they can make sure the provider they work with knows how to address those problems and needs.

Vicinelly also championed the Data Center Optimization Initiative (DCOI) as a driver of innovation in agency data centers.

“I think that the DCOI initiative has helped people focus on where they can have optimization, where they can have innovation within their data centers,” said Vicinelly. “And I think it’s caused a conversation, given the opportunity for a conversation between the IT side of the house and the facilities side of the house, as far as data centers are concerned.”
