As the U.S. government advances to a war-fighting posture that increasingly integrates emerging technologies, the Department of Defense (DoD) is modernizing computing infrastructure to support new capabilities. At the same time, virtualized, mission-critical applications must run uninterrupted. In a recent interview, Travis Steele, chief architect at Red Hat, discussed best practices for virtualization today, how virtualization and containers can coexist, and how a common platform for both creates a pathway to modernization.

MeriTalk: Virtualization as a concept and technology is about 20 years old. Where is it used most effectively today, especially among DoD organizations?

Steele: Virtualization is pervasive. It’s embedded. In the DoD, hundreds of thousands of virtual machines (VMs) – if not more – contribute to IT operations across all agencies and all classification levels. Many support mission-critical applications and systems.

MeriTalk: Why and how should VMs be used in application development and deployment? What are some best practices enterprise users should follow?

Steele: Certainly, there are use cases that make sense for virtualization. For example, if you need to run multiple operating system environments on a single or shared computing resource, virtualization is great. The same goes for legacy applications that often require older versions of software, binaries, and resources. Then there are your typical large and monolithic applications that are not cloud native and have a lot of tightly coupled resources and services – those are good for virtualization, too, because everything works together. As a best practice, enterprise organizations should consider a technology platform that supports both traditional virtualized workloads and containerized workloads to drive a single, unified development lifecycle and workflow.

MeriTalk: What factors should enterprises consider in determining the right virtualization technology provider, especially military customers?

Steele: Above all, security and interoperability – especially for DoD organizations. They should examine whether the virtualization provider can meet stringent DoD compliance requirements, from the hardware through all of the software. Adaptability to zero trust architectures is also important. Interoperability is a big factor as well – can that virtualization infrastructure run on any hardware, anywhere at the tactical edge, while meeting dynamic mission needs? That’s a big component for the military. Other key questions to ask are: Does the virtualization provider support cloud-native development and delivery capabilities? Are integrated development tools available? More broadly, does this virtualization technology offer a pathway to modernization?

MeriTalk: What does the pathway to modernization look like where virtualization is pervasive?

Steele: Let’s use the DoD as an example. Today, the geopolitical climate is dynamic, especially in the face of great power competition. As a result, the government is hyper-focused on approaches to outpace the adversary. However, pathways to modernization seem insurmountable at times because of significant technical debt. On the one hand, existing systems, applications, and operations have to run 24×7. On the other hand, IT has to modernize to deliver new applications and systems faster, in multiple formats, across multiple platforms and locations – on everything from a tank to an F-22 Raptor to various mobile devices. And everything has to be interoperable.

In many cases, if they haven’t already, DoD IT operations and application providers are going to turn to containers to better leverage emerging capabilities such as artificial intelligence (AI) and other novel technologies. Because containers have a much smaller footprint than VMs, they increase agility and lower cost. They also enable faster code development and release cycles across a distributed, highly scalable computing capability. This modern approach helps DoD organizations take advantage of AI across the enterprise, expand deployment options, and accelerate data processing and decision-making at the edge.

At the same time, virtualized environments must be supported throughout the modernization journey, because they are essential to the continuous operations of existing applications. The good news is that virtualization can work side by side with containers, given the right technology platform.

MeriTalk: Many times, VMs and containers are hosted on different platforms. What are the challenges associated with that?

Steele: One of the challenges is that your virtualization environment is running on a legacy hosting platform. To maintain it, you need licensing, tools, and support. And if you’re also running a modern Kubernetes container platform – for example, an OpenShift environment – and you’re only using it for containers, then you are doubling your workload and cost. I’d say the majority of DoD organizations today have these two environments on some level.

MeriTalk: How does Red Hat help the DoD and other government organizations overcome the challenges of running multiple platforms and modernize for the future?

Steele: We help the DoD and others reduce the complexities and costs of multiple environments by operating them on the same platform – Red Hat OpenShift, our hybrid cloud application platform designed to streamline the application lifecycle, from development to delivery to management of app workloads. Virtualization capabilities are included in the subscription, so you don’t have to carry that cost separately – and your virtualization skills are transferable to the OpenShift environment.

Perhaps most importantly for this conversation, OpenShift provides the opportunity to migrate, move, or even deploy new VMs in OpenShift at enterprise scale. They run side by side on the same platform as the containers. This allows DoD organizations to reduce their operational risk, to run virtualized, mission-critical systems and applications 24×7, and to take advantage of the pathway to modernization with containers and microservices.

OpenShift really is a bridge to next-generation capabilities like DevSecOps and GitOps, which is especially important for DoD given the complexity of its platforms and shifting agile combat needs. OpenShift delivers that modernization in a secure, trusted, and reliable way. I cannot emphasize enough how much it reduces operational risk – and provides many deployment options.
