As businesses continue to embrace cloud technology, public cloud adoption remains a key driver of digital transformation.
While many choose the hybrid infrastructure model that hosts services and resources both on-premises and in the cloud, others take a multi-cloud approach, using services from numerous cloud providers at once.
Managing resources across different environments can be costly, time-consuming, and confusing. Consequently, the rise of hybrid and multi-cloud use has led large cloud providers to focus on giving their users the means to manage all their workloads from a single control center.
Both Microsoft and Google have joined this trend, developing Azure Arc and Anthos, respectively, to solve these problems. Microsoft’s Azure Arc helps customers simplify the use of different environments by projecting traditional Azure resources onto any type of infrastructure, regardless of its location. Google’s Anthos takes a similar approach, extending Google Kubernetes Engine (GKE) to on-premises data centers and other clouds.
Hybrid Kubernetes management
When managing a hybrid or multi-cloud environment, it’s important to understand how applications are configured and handled, how secure they are, and how the client monitors the infrastructure’s health and productivity.
Both Azure Arc and Google Anthos address these elements, but a comparison of their core components highlights some essential differences.
Multi-cluster management
While both Azure Arc and Anthos implement Kubernetes, its use is optional in Azure Arc. This is because Azure Arc also manages resources beyond clusters, such as servers and edge devices, enabling deployment on virtually any infrastructure.
In Anthos, however, Kubernetes is a core part of the tool. It even provides additional tools for converting virtual and bare-metal workloads to containers.
Azure Arc uses Kubernetes to deploy and manage container-based applications by attaching and configuring clusters. When a Kubernetes cluster is attached to Azure Arc, it appears as a separate identity and can be tagged like any other Azure resource. To connect a Kubernetes cluster to Azure located on any cloud or on-premises infrastructure, the cluster administrator must deploy agents.
These agents are responsible for securely connecting to Azure, collecting logs and metrics, and watching for configuration requests. This process transforms the management of multi-cloud environments into the more familiar management of Azure objects and resources.
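As a sketch of what attaching a cluster looks like in practice, the Azure CLI's `connectedk8s` extension deploys the Arc agents onto whichever cluster the current kubeconfig context points at. The cluster and resource group names below are placeholders:

```shell
# Install the Azure CLI extension for Arc-enabled Kubernetes (one-time setup).
az extension add --name connectedk8s

# Connect the cluster in the current kubeconfig context to Azure Arc.
# This deploys the Arc agents into the azure-arc namespace on the cluster.
az connectedk8s connect \
  --name my-cluster \
  --resource-group my-arc-rg

# Verify that the agent pods are running on the cluster.
kubectl get pods --namespace azure-arc
```

Once the command completes, the cluster shows up in the Azure portal as a `connectedClusters` resource that can be tagged and queried like any other.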
Google Anthos also relies on Kubernetes as its primary computing environment, using Google Kubernetes Engine (GKE) on Google Cloud, GKE on-premises, or GKE on AWS to manage Kubernetes hybrid environments.
Application and configuration management
How you configure your environments determines the tools and services you can use in the future.
Anthos Config Management integrates with Anthos clusters — either on-premises or in the cloud — and allows you to deploy and monitor configuration changes stored in a central Git repository. You can use GitOps workflows in any of the available environments: AWS, GCP, or on-premises.
With Git being the central point for your configurations, you can change your environment with simple pull requests to ensure compatibility and alignment. Additionally, any YAML or JSON that can be applied with kubectl commands can also be merged with Anthos Config Management and applied to your Kubernetes clusters.
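A minimal Anthos Config Management setup is a single custom resource that points the cluster at that central Git repository. This is a sketch using the classic `ConfigManagement` resource; the repository URL and directory are hypothetical placeholders:

```yaml
# config-management.yaml -- applied with:
#   kubectl apply -f config-management.yaml
apiVersion: configmanagement.gke.io/v1
kind: ConfigManagement
metadata:
  name: config-management
spec:
  # Git repository that holds the desired cluster configuration.
  git:
    syncRepo: https://github.com/example-org/anthos-config  # hypothetical repo
    syncBranch: main
    secretType: none      # use ssh or token auth for private repositories
    policyDir: "config"   # directory within the repo to sync from
```

Applying the same resource (with the same repository) to every cluster, whether on GCP, AWS, or on-premises, is what keeps the fleet's configuration aligned.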
Azure Arc enables Kubernetes to deploy configurations or applications using the GitOps approach. It relies on a Kubernetes Operator — in this case, the Flux Operator — to listen to changes in the Git repository.
With a cluster-level GitOps configuration, the goal is to establish a baseline for the "horizontal" or "management" components deployed on your Kubernetes cluster, which your applications then build on.
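On the Azure side, the equivalent step is creating a GitOps configuration on an Arc-connected cluster through the CLI, which deploys the Flux operator into the cluster. This is a sketch; the cluster, resource group, and repository names are placeholders:

```shell
# Install the CLI extension for GitOps configurations (one-time setup).
az extension add --name k8s-configuration

# Create a cluster-scoped GitOps configuration on an Arc-connected cluster.
# The Flux operator is deployed into the cluster and continuously reconciles
# it against the referenced Git repository.
az k8s-configuration create \
  --name cluster-config \
  --cluster-name my-cluster \
  --resource-group my-arc-rg \
  --cluster-type connectedClusters \
  --operator-instance-name cluster-config \
  --operator-namespace cluster-config \
  --repository-url https://github.com/example-org/arc-config \
  --scope cluster
```

The `--scope cluster` flag is what makes this a cluster-level baseline rather than a namespace-scoped configuration owned by a single application team.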
Security and compliance
Security is of the utmost importance for any digital infrastructure, and while Azure Arc and Anthos differ in how they protect you, both provide a secure infrastructure.
Azure Policy extends Gatekeeper v3, an admission controller webhook for Open Policy Agent (OPA). This allows you to apply security policies and safeguards at scale to all your clusters across all your environments. Azure Policy also lets you manage and report on the compliance state of all your Kubernetes clusters from one place.
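To make the Gatekeeper layer concrete, here is a minimal constraint of the kind Azure Policy generates under the hood. It assumes the `K8sRequiredLabels` constraint template from the open-source Gatekeeper policy library is already installed on the cluster; the label name is an arbitrary example:

```yaml
# Rejects any namespace created without an "owner" label.
# Assumes the K8sRequiredLabels ConstraintTemplate from the
# Gatekeeper policy library is installed on the cluster.
apiVersion: constraints.gatekeeper.sh/v1beta1
kind: K8sRequiredLabels
metadata:
  name: ns-must-have-owner
spec:
  match:
    kinds:
      - apiGroups: [""]
        kinds: ["Namespace"]
  parameters:
    labels: ["owner"]
```

With Azure Policy, you assign the equivalent built-in policy definition once at the subscription or resource-group level, and it is pushed to every Arc-connected cluster in scope.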
Monitoring and logging
Access to logs for applications and infrastructure components is critical for effectively running and maintaining a production infrastructure.
Azure Monitor for Containers provides numerous monitoring features that show the health and performance of your Azure Arc clusters. It collects memory and processor metrics from controllers, nodes, and containers available in Kubernetes through the Metrics API.
These metrics are then saved in a metrics store, and your log data is written to the logs store associated with your Log Analytics workspace. This integration provides all the Kubernetes-related performance metrics, telemetry, and metadata you need, including from any applications deployed on the clusters.
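One way to enable this on an Arc-connected cluster is to deploy the monitoring agent as a cluster extension; the sketch below assumes the `k8s-extension` CLI extension is available, and the cluster and resource group names are placeholders:

```shell
# Enable Azure Monitor for Containers on an Arc-connected cluster by
# deploying the monitoring agent as a cluster extension. A Log Analytics
# workspace is resolved automatically unless one is specified explicitly.
az k8s-extension create \
  --name azuremonitor-containers \
  --cluster-name my-cluster \
  --resource-group my-arc-rg \
  --cluster-type connectedClusters \
  --extension-type Microsoft.AzureMonitor.Containers
```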
For Anthos on Google Cloud, logs from workloads running inside your clusters are automatically enriched with relevant labels, such as pod and cluster names.
Other services
Although the core of both platforms is built on the same general services and practices, Google Anthos and Azure Arc each offer some unique multi-cloud management capabilities.
Google Anthos Service Mesh
Anthos Service Mesh is a networking layer for managing and configuring communication between different environments and services. It is based on Istio, an open-source implementation of the service mesh infrastructure layer.
Anthos Service Mesh uses sidecar proxies to enhance network security, reliability, and visibility. Rather than building networking functions into each application, they are handled by a common out-of-process proxy, which is delivered as a separate container running in the same pod.
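In upstream Istio, which Anthos Service Mesh is based on, sidecar injection is typically switched on per namespace; note that Anthos Service Mesh itself may use revision-specific labels, so treat this as a generic Istio sketch with a placeholder namespace:

```shell
# Turn on automatic sidecar injection for a namespace; Istio's admission
# webhook then adds the proxy container to every newly created pod.
kubectl label namespace demo istio-injection=enabled

# After rolling existing workloads so their pods are recreated, each pod
# should list two containers: the application and the sidecar proxy.
kubectl get pods --namespace demo \
  -o jsonpath='{range .items[*]}{.metadata.name}{": "}{range .spec.containers[*]}{.name}{" "}{end}{"\n"}{end}'
```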
Google Anthos Serverless
Another Anthos-specific service is Cloud Run, which manages how your services run both in the cloud and on-premises. It also handles resource optimization, horizontal scaling, and integration with networking and Anthos Service Mesh. Powered by the Knative open-source project, Cloud Run lets you move containers from development to production quickly.
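Because Cloud Run builds on Knative, a deployable service can be expressed as a short Knative manifest. This is a sketch with a hypothetical service name and container image:

```yaml
# A minimal Knative Service, the resource Cloud Run for Anthos builds on.
# Deployed with: kubectl apply -f service.yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: hello
spec:
  template:
    spec:
      containers:
        - image: gcr.io/example-project/hello:latest  # hypothetical image
          ports:
            - containerPort: 8080
```

Knative then handles request routing, revisioning, and scaling the service (including scale-to-zero) without any Deployment or Service objects being written by hand.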
Azure Arc for Servers
Alongside Kubernetes clusters, Azure Arc also organizes other resource types, like Windows and Linux servers. Servers running outside of Azure — such as on-premises physical machines, VMware virtual machines, AWS EC2 instances, or edge devices — can all be projected as Azure resources.
Using Azure Policy and resource tags, these resources can then be managed like Azure-native virtual machines, complete with updates, change tracking, monitoring, and more.
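Projecting a server works through the Connected Machine agent, which runs on the machine itself. The sketch below uses placeholder IDs and names; the subscription and tenant values would come from your own Azure environment:

```shell
# Run on the machine itself (after installing the Connected Machine agent)
# to project it into Azure as an Arc-enabled server resource.
azcmagent connect \
  --resource-group my-arc-rg \
  --location eastus \
  --subscription-id "<subscription-id>" \
  --tenant-id "<tenant-id>"

# Once connected, the machine can be tagged like any Azure resource:
az resource tag \
  --tags env=prod datacenter=onprem \
  --resource-group my-arc-rg \
  --name my-server \
  --resource-type "Microsoft.HybridCompute/machines"
```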
Azure Arc Data Services
Azure Arc also gives you the ability to run Azure data services on-premises, in multi-cloud, and in edge environments using the Kubernetes-centered infrastructure of your choice.
Summary
Google Cloud Anthos and Microsoft Azure Arc share similar technical approaches. At their core, both platforms allow companies to host workloads in hybrid and multi-cloud infrastructures. Both also leverage Kubernetes and containers to provide a seamless experience anywhere: on-premises, in their own public cloud, or in a competitor’s cloud.
Yet, even though Microsoft and Google approached some problems from similar perspectives, both have also implemented a few unique solutions that are only available on their individual platforms.
Dive deeper into the world of multi-cloud management and make an informed decision between Azure Arc and Google Anthos.
Start a conversation with us