Reflecting on the last nearly six months of the coronavirus pandemic – and his two decades of Federal IT experience – Booz Allen Hamilton Vice President of Digital Transformation Dan Tucker is looking ahead to more government investment in open data architectures, cloud services, and human-centric digital products as Feds adapt to the shifting telework and service delivery landscape.

Dan Tucker, vice president of digital transformation at Booz Allen Hamilton

At Booz Allen, Tucker leads the company’s digital platform capability team and has worked with Federal agencies including the Treasury Department to scale capabilities during the ongoing public health crisis. Working with Treasury’s Fiscal Service, Tucker’s team helped the agency reimagine USASpending.gov to account for trillions of dollars in Federal COVID-19 relief spending, with new visualizations and data downloads available.

The foundation of these Treasury Department investments, however, rests on open data standards and government interoperability that Tucker suggests will help ensure Federal agility and innovation for the future. We talked with him last week to find out more about which government modernization investments are leading Feds through COVID-19 service delivery and telework.

The interview below has been edited for length and clarity.

MeriTalk: During the COVID-19 pandemic, adapting to a “new normal” to smoothly continue government operations has taken center stage as agencies act on continuity of operations plans. How can they also foster innovation during these unstable times?

Tucker: Certainly, there’s a requirement for public sector innovation and a need to accelerate the pace of addressing mission needs we haven’t seen in generations. In the world of a global pandemic, the priorities have shifted and we’re seeing necessity become either the mother of invention or the mother of adoption.

Whether it is product centricity, human-centered design, open data standards, more use of collaboration platforms, maturing distributed development models, data sharing – all these things are on the roadmap, but the timing of getting there has been up in the air. Now, when you have citizens, healthcare providers, state and local governments, people seeking benefits, and frontline workers needing up-to-the-minute information or public sector mission capabilities to be delivered, they need that trusted data and they need the user centricity. The need is amplified, so it’s almost as if they have no choice. They have to invest now in the innovation.

MeriTalk: Are we ever going to go back to the way it was? What does that mean for the modernization that agencies are already in the middle of?

Tucker: There are mission needs, particularly in the classified space, for which there is a justified requirement for physical, on-premises presence. There are networks that need to be accessed physically for incredibly good reasons, and so those operations will persist and there will always be on-prem, but I think what’s been proven is it’s been possible to accelerate the timeline.

There were regulatory requirements to make data available, to have it published, and you couldn’t go through a traditional application development, deployment, and testing lifecycle that was going to take three times as long as what they’ve actually been able to do it in. It proved to the agencies and the ecosystem that you can create these open data standards and you can publicize the information. You can develop iteratively and deploy iteratively.

MeriTalk: As Federal agencies are building back public trust during and after the pandemic, what role will data transparency and management play in ensuring that accountability?

Tucker: It’s probably not lost on you or anyone that only a fraction of Americans are extremely confident that any commercial organization or company or the government has their data completely secure and protected. There’s always a balance between the power of data sharing and open data standards, and data privacy. What I’m bullish on is the innovation and modernization that we’ve seen around these open data architectures and cloud native data ops, where you’re building security and you’re building data privacy into how the data is acquired, how it’s processed, how it’s stored, how it’s accessed.

It’s been a combination of policy and technical innovation. Last year, the Federal Data Strategy was released, and it really laid out what I thought was a very clear and codified action plan for meeting this north star around how government can accelerate implementation of better data sharing and open standards. What we’ve seen over the past five, six months has started to prove some of that out and accelerated some of the tenets of the Federal Data Strategy.

MeriTalk: How does legislation like the DATA Act fit into this?

Tucker: Prior to the DATA Act, the main platform for reflecting all this information, USASpending.gov, only provided a summary of Federal grants and other assistance awards and contracts. The DATA Act, when that was enacted, required Treasury and OMB to: 1) establish governmentwide data standards for the spending information that agencies then report to Treasury and OMB and GSA; and then 2) required publication of the standardized spending data for access and download.

We were really proud to partner with Treasury and the Fiscal Service on that because, coming out of that, the data is now available for anyone to download and use. Normally, it’s the Federal budget, but now you have the additional dollars coming out of the CARES Act to add to that as well. It’s a tremendous amount – we’re talking about hundreds of different data sources, terabytes and terabytes of data, hundreds of millions of transactions. Pulling that in doesn’t get done without open data standards and an innovative data management pipeline to bring all that together.
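The mechanics behind what Tucker describes – many agency feeds conforming to one governmentwide standard before they can be merged and published – can be illustrated with a small sketch. This is purely hypothetical Python (the field names and sources are invented for illustration, not drawn from the actual USASpending.gov pipeline): each raw record is validated and coerced against a shared schema, and nonconforming rows are rejected rather than silently mixed in.

```python
# Illustrative sketch only: normalizing heterogeneous agency records against a
# shared data standard before merging. Field names here are hypothetical, not
# the actual DATA Act schema.

REQUIRED_FIELDS = {"award_id": str, "agency_code": str, "amount": float}

def normalize(record: dict) -> dict:
    """Coerce one raw record to the shared standard, or raise ValueError."""
    out = {}
    for field, ftype in REQUIRED_FIELDS.items():
        if field not in record:
            raise ValueError(f"missing required field: {field}")
        try:
            out[field] = ftype(record[field])
        except (TypeError, ValueError):
            raise ValueError(f"bad value for {field}: {record[field]!r}")
    return out

def merge_sources(*sources):
    """Combine records from many feeds into one standardized list, quarantining bad rows."""
    merged, rejected = [], []
    for source in sources:
        for rec in source:
            try:
                merged.append(normalize(rec))
            except ValueError as err:
                rejected.append((rec, str(err)))
    return merged, rejected

# Two pretend agency feeds: one clean record, one with an unparseable amount.
feed_a = [{"award_id": "A-1", "agency_code": "020", "amount": "150000.00"}]
feed_b = [{"award_id": "G-7", "agency_code": "075", "amount": "not a number"}]

merged, rejected = merge_sources(feed_a, feed_b)
print(len(merged), len(rejected))  # prints: 1 1
```

The point of the sketch is the design choice Tucker alludes to: once every source must conform to one published standard, adding a hundredth feed is the same work as adding the second.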

MeriTalk: From your work with Federal agencies, what investments have you noticed pay off the most, for the telework transition or to aid service delivery to citizens during the crisis?

Tucker: The agencies, more and more, are placing an imperative and placing investment in product-centric design and human-centric design. That’s everything from experience measurement to customer research to service design, and that whole feedback loop around customer service as well.

One thing that we’ve done with a number of agencies that has generated a really positive return on investment is just the use of contemporary chatbots for that tier zero/tier one assistance and help for people seeking information. Agencies only have so much money to spend on call centers. That makes it easier for people to get that information.
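The tier-zero pattern Tucker mentions can be sketched very simply: before a question ever reaches a paid call-center agent, match it against a small FAQ and escalate only when nothing fits. The example below is a minimal, hypothetical Python sketch (the FAQ entries and keyword-overlap scoring are invented for illustration; production chatbots typically use NLU platforms rather than this approach).

```python
# Hypothetical sketch of tier-zero chatbot triage: score a question against a
# small FAQ by keyword overlap, escalating to a human when nothing matches.

FAQ = {
    "check status": "You can check your application status on the agency portal.",
    "reset password": "Use the 'Forgot password' link on the sign-in page.",
    "office hours": "Call center hours are 8 a.m. to 6 p.m. Eastern, weekdays.",
}

ESCALATE = "Let me connect you with an agent."  # tier-one handoff

def answer(question: str) -> str:
    q = question.lower()
    best_key, best_score = None, 0
    for key in FAQ:
        # Count how many keywords of this FAQ entry appear in the question.
        score = sum(1 for word in key.split() if word in q)
        if score > best_score:
            best_key, best_score = key, score
    return FAQ[best_key] if best_key else ESCALATE

print(answer("How do I reset my password?"))
# prints: Use the 'Forgot password' link on the sign-in page.
```

The economics follow directly from the structure: every question resolved in the first branch never consumes call-center time, which is why agencies see the return on investment Tucker describes.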

MeriTalk: More generally, what are some of the biggest lessons learned about the role of technology in the government’s pandemic response and service delivery so far?

Tucker: What I’ve seen is that we’ve been able to deliver and meet mission needs as effectively or more so through distributed development, through distributed delivery, so I think that has proven out. If there was some hesitancy or scrutiny or perception that this couldn’t get done at the pace or at the efficiency that it could with physical colocation, hopefully that myth has been busted.

The second lesson learned is the pace of innovation continues to accelerate and the pace of adoption continues to accelerate. It was a necessity for states, local governments, and the Federal government to be able to combine datasets that might have been siloed in different agencies. Through necessity, people have figured out ways to open up their data architectures and to strike that balance between data privacy and data sharing.

Over the course of the coming years, I would imagine that there will be probably more product centricity. When you think about the need to share data, and to share services across the Federal government to meet big problems that we face as a country or as a globe, there’s just more and more need to share information and shared services. It’s not just open data architectures but it’s open services, and connectivity across those services to develop different insights. There’s more of a push within agencies and then across agencies for more open architecture, data sharing, application sharing.

MeriTalk: Where should agencies be targeting future investments to prepare for the possibility of other crises or recovering from this crisis?

Tucker: There’s still a lot of on-prem data work, and that requires a lot of expensive, labor-based licensing, operations, and maintenance. Moving forward, continuing the migration to cloud infrastructures, to those cloud-native data management techniques, and to cloud-native application architectures that allow for that interoperability – that’s going to be needed.

Government is most effective when this data is shared and systems are interoperable. That’s going to prepare us best for any future challenges that we face – everything from open data architectures to cloud-native data services to human-centric digital products.

Katie Malone is a MeriTalk Staff Reporter covering the intersection of government and technology.