Lifting and shifting virtual machines into Google Compute Engine (GCE) offers many benefits, including enhanced agility, significantly reduced overhead compared to a data center, and a familiar usage and management environment.
However, there are also significant advantages to modernizing VMs using Google’s Migrate for Anthos, which orchestrates and simplifies the process of moving VMs into Google Kubernetes Engine (GKE) containers.
Despite its name, Migrate for Anthos does not require an Anthos subscription. The tool is free for all customers migrating workloads to Google Cloud Platform (GCP), whether or not they intend to deploy Anthos. (But Anthos is great, and you might want to take a look at it!)
1. Choose the Right Migration Path for the Right Workloads
Historically, enterprises had only two choices when migrating on-prem workloads to the cloud: migrate first and containerize later, or containerize first and migrate later. Neither was a particularly attractive option, especially for large enterprises with hundreds, perhaps thousands of apps, all of which would have to be manually refactored to support containerization.
Migrate for Anthos works alongside Migrate for Compute Engine to give organizations freedom of choice. Instead of shoehorning every workload into the same migration strategy, organizations can tailor their digital transformation journey to fit their specific needs and timelines. VMs that are well suited to containerization can be modernized and migrated into GKE. Those that are better left as VMs can be lifted and shifted into GCE; the organization always has the option to migrate and modernize them at a later time.
2. Migrate and Modernize in One Fell Swoop
Migrate for Anthos combines migration and modernization into one step, removing complexity by doing a lot of the “heavy lifting” behind the scenes. With just a few clicks, and without the need for code rewrites or other complex manual processes, VMs can be moved from on-prem data centers, GCE, or other clouds directly into containers in GKE. This makes modernization a realistic option even for enterprises with small IT teams and, in some cases, for VMs that were previously written off as “too complicated to upgrade.”
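As a rough sketch, here is what that workflow looks like with the `migctl` command-line tool that ships with Migrate for Anthos. The source, project, VM, and migration names below are placeholders, and exact subcommands and flags vary by product version, so treat this as illustrative rather than a copy-paste recipe:

```
# Register a Compute Engine source to migrate from (names are hypothetical).
migctl source create ce my-ce-source --project my-project --zone us-central1-a

# Create a migration plan for a specific VM, targeting a container image.
migctl migration create my-migration --source my-ce-source \
  --vm-id my-vm --intent Image

# Check progress, then pull down the generated artifacts: a container
# image plus Kubernetes YAML ready to deploy to GKE.
migctl migration status my-migration
migctl migration get-artifacts my-migration
```

The key point is that the tool generates both the container image and the deployment manifests for you; no application code is rewritten along the way.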
3. Get the Benefits of Containerization
Enterprises benefit greatly from migrating and modernizing apps because containers offer many advantages over VMs, including:
- Containers virtualize at the operating system level, with multiple containers running directly atop the OS kernel. This makes them far more lightweight and agile than VMs, while still giving each application a sandboxed view of the OS that is logically isolated from other applications.
- Developers can create consistent, predictable environments that are isolated from other applications, allowing them to spend less time configuring and debugging code for different environments and more time developing and shipping new apps and functionality.
- Once written, containers can run pretty much anywhere, on virtually every OS and in nearly any data environment, including public clouds, on-prem, bare metal, within VMs, and on local machines.
- Applications are siloed from each other unless they are explicitly connected, and they don’t run directly on the host OS, providing an additional layer of application security.
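That run-anywhere portability comes from packaging the application together with its dependencies in a single image. A minimal, hypothetical example (app name and base image are illustrative):

```dockerfile
# Everything the app needs travels with it: base OS layer,
# language runtime, dependencies, and application code.
FROM python:3.9-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
# The resulting image runs unchanged on GKE, on-prem, or on a laptop.
CMD ["python", "app.py"]
```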
4. Take Advantage of the Benefits of GKE
Kubernetes makes everything associated with container orchestration easier, with features such as automated rollouts and rollbacks, automatic scaling, service health monitoring, and declarative management. Enterprises are eagerly embracing Kubernetes, with adoption increasing from 27% in 2018 to 48% in 2019.
However, it can be tricky for companies to configure, deploy, and maintain Kubernetes on their own. GKE, Google’s fully managed Kubernetes solution, makes it easy for businesses to get up and running with Kubernetes right away, without having to install, manage, or operate their own Kubernetes clusters. Cloud admins can also take advantage of advanced cluster management features, such as:
- GCP load balancing for GCE instances
- Automatic scaling of a cluster’s node instance count
- Node pools to designate subsets of nodes within a cluster
- Security-optimized node kernels with automatic upgrades
- Node auto-repair
- Stackdriver logging and monitoring tools
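To give a concrete sense of how little setup this requires, a cluster with autoscaling, auto-repair, auto-upgrade, and a dedicated node pool can be created with a couple of `gcloud` commands (cluster name, zone, sizes, and machine type below are placeholders):

```
# Create a GKE cluster with node autoscaling, auto-repair,
# and auto-upgrade enabled.
gcloud container clusters create my-cluster \
  --zone us-central1-a \
  --num-nodes 3 \
  --enable-autoscaling --min-nodes 1 --max-nodes 5 \
  --enable-autorepair \
  --enable-autoupgrade

# Add a separate node pool for workloads that need more memory.
gcloud container node-pools create high-mem-pool \
  --cluster my-cluster \
  --zone us-central1-a \
  --machine-type n1-highmem-4
```

Google manages the control plane throughout; the admin only describes the desired cluster shape.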
5. Incorporate Modern Services Into Legacy Apps
Legacy apps that are migrated into GKE can easily be upgraded with modern functionality through a large library of add-ons, such as Istio, an open-source service mesh that provides a uniform way to connect, manage, and secure services without having to rewrite application code.
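For example, once Istio is installed in a cluster, a migrated workload can be brought into the mesh just by labeling its namespace; Istio then injects its Envoy sidecar proxy into new pods automatically, so the legacy app gains traffic management, mutual TLS, and telemetry with no code changes. The namespace and deployment names here are hypothetical:

```
# Opt the namespace into automatic sidecar injection.
kubectl label namespace legacy-apps istio-injection=enabled

# Restart the workload so its pods are recreated with the sidecar attached.
kubectl rollout restart deployment my-legacy-app -n legacy-apps
```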