Artificial intelligence (AI) and machine learning (ML) are driving breakthroughs in innovation, but Federal agencies struggle to keep up with the growth of data that these technologies produce. According to a MeriTalk study, 88 percent of Federal IT leaders are frustrated with their agency’s data management strategies.

Many agencies have used public cloud services to process and store their data, but cost and accessibility remain challenges for teams innovating with AI, especially when those technologies operate at edge locations.

Scott Beliveau, branch chief of Advanced Analytics at the United States Patent and Trademark Office (USPTO), recently shared that it’s essential for Federal agencies to have a data management strategy in order to leverage emerging technologies. “As innovation grows, data grows. So, we must have a data management strategy to leverage all that [incoming] data,” he said.

Where and how agencies store that data directly impacts management costs, efficiency, and the ability to support the growing use of emerging technologies such as AI and ML. Teams must consider their mission and decide which solutions will provide the flexibility, security, and analytics they need to keep up.

Public Cloud Isn’t Cutting It

Public cloud services are a strong solution for data storage in urban, highly connected areas, and when agencies aren’t downloading massive volumes of data for analysis. However, Gartner reported that by 2022, more than 50 percent of enterprise-generated data will be created and processed outside the central data center or public cloud.

Data collection at the edge often occurs in remote areas, leading to latency when public cloud services are used to collect, access, and move data. Agencies may also be stymied by egress charges assessed when they download data for analysis, noted Mike Lamb, product manager of solution infrastructure at ViON Corp.

“Large egress charges that agencies incur when they pull data down from the public cloud contradict the effectiveness we expect from public clouds,” Lamb said in a recent interview with MeriTalk. “This leads to agencies feeling a lack of control when it comes to where and how they access their data.”

Agencies need an agile, scalable system to manage and access data from the edge to core to cloud – whenever and wherever they carry out their mission.

Object Storage Provides Fast Access to Data

“Agencies need to explore using modern technologies, such as object storage, to achieve the cloud-like convenience they’re looking for, closer to the source of the data,” said Rob Renzoni, director of technical sales for the Americas at Quantum.

Object storage enables agencies to build a private cloud storage environment on-premises and unlock edge computing capabilities. With object storage, data and its metadata are bundled into a single object with a unique identifier. Unstructured data can then be consolidated into a single storage area – a data lake – where teams use APIs to make calls against their data. That ease of access makes the approach ideal for AI and ML workloads, which require data that is readily available.
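As a rough illustration – not any particular vendor's API – the object model described above, where data and metadata are bundled under a unique identifier and queried through API calls, can be sketched in a few lines of Python:

```python
import uuid


class ObjectStore:
    """Toy model of an object store: each object bundles data and
    metadata under a unique identifier in a single flat namespace."""

    def __init__(self):
        self._bucket = {}

    def put(self, data: bytes, metadata: dict) -> str:
        key = str(uuid.uuid4())  # unique identifier attached to the object
        self._bucket[key] = {"data": data, "metadata": metadata}
        return key

    def get(self, key: str) -> dict:
        return self._bucket[key]

    def query(self, **filters) -> list:
        # API-style call against the "data lake": find objects whose
        # metadata matches all of the given filters
        return [
            key
            for key, obj in self._bucket.items()
            if all(obj["metadata"].get(f) == v for f, v in filters.items())
        ]


store = ObjectStore()
key = store.put(b"sensor reading", {"source": "edge-node-7", "type": "telemetry"})
print(store.get(key)["metadata"]["source"])  # edge-node-7
print(store.query(type="telemetry") == [key])  # True
```

Because metadata travels with the data itself, an AI/ML pipeline can locate relevant objects by attribute rather than by file path, which is the property that makes this model a natural fit for a data lake.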

Renzoni pointed to 911 call centers as an example. “With the adoption of 5G, there’s going to be a tremendous increase in data that’s flowing into 911 call centers. A private cloud and object storage – with an AI/ML workflow analyzing the data – can provide actionable insights faster than one person or team ever could.”

Data Management Must Be Mission-Centric

As agencies look to adopt new data management and storage solutions, Renzoni and Lamb offer advice and urge leaders to keep their technology mission-centric and secure.

Do your homework. “When you are designing any architecture for a workflow, you really need to do your homework,” Renzoni advises. “The first thing to look at is ease of use. Secure, commercial-off-the-shelf products and services are readily available, so agencies shouldn’t need to invest in custom solutions.” Not all solutions scale easily, he added; agencies should ensure their technology capabilities can grow along with their needs.

Enable flexibility for different types of data. Cost and accessibility depend on how agencies are using their data, and flexibility is crucial. “Agencies need to make sure that their data is stored in the proper storage tier based on their needs – from active data that needs to be stored in a high-performance tier, down to inactive data that can be stored more cost effectively with object storage or a tape library,” Lamb says.

Prioritize data security. Data security must be inherent within data management solutions, whether through keyless encryption, object immutability, or system partitioning. “Some legacy technologies are also resurfacing,” Renzoni says. “For example, tape libraries are a truly air-gapped way to provide effective protection against ransomware and malware and to mitigate the damages of successful attacks.”

Start small and grow. As agencies increase their use of AI and ML, their needs may change. Lamb encourages teams to embrace this evolution. “Anticipating those changes is difficult, but it’s an important exercise. Leverage as-a-Service solutions to start small and grow as your needs change,” he says.
