Microsoft Azure is significantly expanding its hardware-backed confidential computing, adding protected environments for virtual machines (VMs), containers, and graphics processing units (GPUs) without requiring intricate specialized code.
The perennial challenge enterprises face in the public cloud is that it is, inherently, public. While applications run in isolated virtual machines and data resides in dedicated virtual storage, the risk of data exposure persists, especially in multitenant environments. Concerns about data security and regulatory compliance often lead businesses to keep sensitive data on-premises, forfeiting the scalability and global reach the cloud offers.
Keeping data on-premises has economic costs of its own, including expensive egress charges when cloud-hosted applications need that data, as well as the need to staff a robust security organization. Confidential computing, built on modern silicon advances, offers a viable middle ground.
Advancements in Confidential Computing:
Microsoft’s journey into confidential computing began with Intel’s Software Guard Extensions (SGX) to the processor instruction set, which formed the foundation of Azure’s early confidential computing offerings. Since then the market has progressed significantly, moving from working with encrypted chunks of memory to encrypting the entire working memory of VMs and hosted services. Notably, the scope now extends to a diverse array of silicon, including support from AMD and Arm.
Another significant development is Nvidia’s integration of confidential computing features into its GPUs. This allows machine learning models to be built from confidential data while protecting the data used in mathematical modeling. With GPUs operating at scale, the cloud becomes a supercomputer, and adding confidential computing capabilities to GPUs allows that compute to be partitioned and shared securely.
Simplifying Confidential Computing on Azure:
Microsoft Azure’s confidential computing support has advanced alongside the hardware. The platform initially focused on providing protected, encrypted memory for data. Updates announced at Ignite 2023 expand the protection to entire environments for VMs, containers, and GPUs. Notably, no specialized code is needed: code and data can now run together in secure, isolated, and encrypted spaces.
This approach lets you run the same applications on both regulated and unregulated data, simply by targeting the appropriate VM hosts. Confidential VMs and containers also make it possible to migrate on-premises applications to the cloud while staying within regulatory compliance.
Azure Confidential VMs with Intel TDX:
The newly introduced Azure confidential VMs, built on the latest Xeon processors with Intel’s Trust Domain Extensions (TDX), support attestation techniques that verify VM integrity. Key management is flexible: you can manage your own keys or rely on the underlying platform. With OS support for Windows Server and Linux, these VMs target demanding workloads, especially those that need large amounts of memory.
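The attest-then-release pattern behind this can be sketched in a toy form: a key manager checks a signed measurement of the VM’s state before handing over a data key. Everything below (the HMAC “signature”, the report shape) is illustrative only; real deployments use hardware-signed reports verified by a service such as Microsoft Azure Attestation.

```python
import hashlib
import hmac
import secrets

# Stand-in for the hardware root of trust that signs attestation reports.
ATTESTATION_SIGNING_KEY = secrets.token_bytes(32)

def issue_report(measurement: bytes) -> tuple[bytes, bytes]:
    """The platform 'signs' a measurement of the VM's boot state (toy form)."""
    tag = hmac.new(ATTESTATION_SIGNING_KEY, measurement, hashlib.sha256).digest()
    return measurement, tag

def release_key(report: tuple[bytes, bytes],
                expected_measurement: bytes,
                data_key: bytes) -> bytes:
    """The key manager verifies the report before releasing the data key."""
    measurement, tag = report
    good = hmac.new(ATTESTATION_SIGNING_KEY, measurement, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, good):
        raise PermissionError("report signature invalid")
    if measurement != expected_measurement:
        raise PermissionError("VM measurement does not match policy")
    return data_key

# A VM whose measured boot state matches policy gets its key released.
expected = hashlib.sha256(b"known-good kernel + initrd").digest()
report = issue_report(expected)
key = release_key(report, expected, data_key=secrets.token_bytes(32))
```

The point of the pattern is that the key never reaches a VM whose measured state differs from policy, which is what lets regulated data be decrypted only inside a verified environment.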
Microsoft has begun rolling out a preview of these new confidential VMs in one European and two US Azure regions, with an additional European region scheduled for early 2024.
Azure Confidential VMs with GPU Support:
A notable evolution is the addition of GPU support to confidential VMs, significantly expanding the available compute. Based on Nvidia H100 GPUs, which are widely used for training AI models, these confidential VMs let you use private information as a training set: for example, training a product-evaluation model on prototype components, or building diagnostic tools from medical data.
Rather than embedding a GPU in a VM and encrypting the whole thing, Azure keeps the two separate, using encrypted messaging to link the encrypted GPU with the confidential computing instance. Each operates within its own trusted execution environment (TEE), keeping data secure on both sides. Conceptually, it is like using an external GPU over Thunderbolt or another PCI bus.
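That encrypted link can be sketched conceptually: the two TEEs share a session key, and everything crossing the untrusted bus is sealed, authenticated ciphertext. The keystream cipher below is a teaching toy standing in for the AES-based protection real hardware uses; it is a sketch of the idea, not Nvidia’s actual protocol.

```python
import hashlib
import hmac
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from the session key (toy cipher)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def seal(key: bytes, plaintext: bytes) -> tuple[bytes, bytes, bytes]:
    """Encrypt and authenticate a buffer before it crosses the bus."""
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    return nonce, ct, tag

def unseal(key: bytes, nonce: bytes, ct: bytes, tag: bytes) -> bytes:
    """Verify integrity, then decrypt inside the receiving TEE."""
    if not hmac.compare_digest(tag, hmac.new(key, nonce + ct, hashlib.sha256).digest()):
        raise ValueError("ciphertext tampered with on the bus")
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

session_key = secrets.token_bytes(32)      # negotiated between the two TEEs
batch = b"training batch of confidential records"
nonce, ct, tag = seal(session_key, batch)  # CPU-side TEE encrypts before transfer
assert ct != batch                         # a bus observer sees only ciphertext
restored = unseal(session_key, nonce, ct, tag)  # GPU-side TEE decrypts internally
```

An observer on the bus sees only ciphertext and tags; tampering with the data in transit causes the receiving TEE to reject it rather than process corrupted input.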
Confidential Containers on Kubernetes:
Extending confidential computing further, Microsoft’s managed Kubernetes service, Azure Kubernetes Service (AKS), introduces support for confidential containers. Rather than occupying full VMs, these containers run on host servers built on AMD’s hardware-based confidential computing extensions. AKS’s confidential containers, an implementation of the open-source Kata Containers project, use utility VMs (UVMs) to host secure pods.
Because the confidential containers run inside UVMs, the same AKS host can support both confidential and standard containers, with hardware support provided through the underlying Azure hypervisor. Like confidential VMs, these containers accommodate existing workloads, including existing Linux containers brought in unchanged.
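Scheduling a pod onto the Kata-based confidential runtime is, in principle, a one-line change to the pod spec: selecting the confidential runtime class. A minimal sketch, assuming the runtime class name `kata-cc-isolation` from AKS’s confidential containers preview (verify the name exposed by your cluster; the container image here is illustrative):

```python
import json

# Sketch of a pod manifest that opts into the Kata-based confidential runtime.
# The runtimeClassName routes the pod into a utility VM (UVM) instead of a
# shared kernel; everything else is an ordinary pod spec, unchanged.
pod = {
    "apiVersion": "v1",
    "kind": "Pod",
    "metadata": {"name": "cc-demo"},
    "spec": {
        "runtimeClassName": "kata-cc-isolation",  # assumed AKS preview name
        "containers": [
            {
                "name": "app",
                "image": "python:3.12",  # illustrative stock image
                "command": ["python", "-c", "print('hello from a UVM')"],
            }
        ],
    },
}

# Kubernetes accepts JSON manifests directly (kubectl apply -f pod.json).
print(json.dumps(pod, indent=2))
```

The rest of the workload is untouched, which is what makes it practical to bring existing Linux containers into a confidential environment.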
In conclusion, the latest additions to Azure’s confidential computing capabilities remove barriers to migrating regulated workloads to the cloud, offering a new on-ramp to scalable, burstable secure computing environments. Additional configuration and management steps are still needed around key management and attestation, but these align with standard practices for handling sensitive information both on-premises and in the cloud.
Confidential Computing as a Security Imperative:
The integration of these features positions confidential computing as an essential tool for working with sensitive and regulated information. By building these capabilities into Azure and ensuring support in the underlying silicon, Microsoft makes the cloud an increasingly attractive option, particularly for regulated industries such as healthcare and finance.