Google this week launched the beta of Cloud Composer, a managed Apache Airflow service for building workflows that promises to simplify management of integrated DevOps processes. Apache Airflow, an incubating project at the Apache Software Foundation, enables IT teams to programmatically author, schedule and monitor workflows as directed acyclic graphs (DAGs), which visualize pipelines running in production, track their progress and help troubleshoot issues.
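For readers unfamiliar with Airflow, a minimal sketch of a DAG definition follows. The task names and schedule here are hypothetical placeholders, but defining tasks in Python and wiring them into a graph is Airflow's core idiom (shown with Airflow 1.x-style imports, current at the time of the beta):

```python
# A minimal, hypothetical Airflow DAG: three shell tasks chained
# into a directed acyclic graph that runs once per day.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

default_args = {
    "owner": "airflow",
    "retries": 1,
    "retry_delay": timedelta(minutes=5),
}

# The DAG object groups tasks and defines their schedule.
dag = DAG(
    dag_id="example_pipeline",
    default_args=default_args,
    start_date=datetime(2018, 1, 1),
    schedule_interval="@daily",
)

extract = BashOperator(task_id="extract", bash_command="echo extract", dag=dag)
transform = BashOperator(task_id="transform", bash_command="echo transform", dag=dag)
load = BashOperator(task_id="load", bash_command="echo load", dag=dag)

# The >> operator declares dependencies: extract -> transform -> load.
extract >> transform >> load
```

Those dependency arrows are what Airflow's scheduler enforces and what its web UI renders as the graph used to monitor and troubleshoot a pipeline.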
James Malone, a product manager for Google Cloud, said Cloud Composer leverages Apache Airflow not only to make workflows simpler to manage, but also to make them portable across public clouds, private clouds and on-premises IT environments.
Cloud Composer also includes integration with the Google Developers Console and the Cloud software development kit (SDK), a cloud identity-aware proxy, Stackdriver logging and monitoring, identity and access management (IAM) and Python support, along with simplified DAG management and a streamlined Airflow runtime.
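In practice, a Composer environment reads its DAG definitions from a Cloud Storage bucket, so deploying a workflow amounts to copying a file. The sketch below uses the google-cloud-storage Python client to do that; the bucket name and file paths are hypothetical placeholders for the ones a real environment would report:

```python
# Hedged sketch: publish a DAG file to a Cloud Composer environment.
# Composer picks up DAGs from the dags/ folder of its Cloud Storage
# bucket; the bucket name below is a hypothetical placeholder.
from google.cloud import storage

BUCKET = "us-central1-example-composer-bucket"  # hypothetical

client = storage.Client()
bucket = client.bucket(BUCKET)

# Upload the local DAG definition; Airflow's scheduler detects it
# automatically on its next scan of the dags/ folder.
blob = bucket.blob("dags/example_pipeline.py")
blob.upload_from_filename("example_pipeline.py")
```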
Google plans to add support for autoscaling and availability in additional Google Cloud regions over time, Malone said, as well as to integrate Cloud Composer with Kubernetes to extend the reach of Apache Airflow workflows to containers.
Pricing for Cloud Composer is consumption-based, metered per vCPU-hour of compute, per GB-month of storage and per GB of network traffic transferred per month.
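As a purely illustrative sketch of how those three dimensions compose into a bill, the snippet below uses hypothetical unit rates, not Google's published prices:

```python
# Purely illustrative unit rates -- NOT Google's actual prices.
VCPU_HOUR_RATE = 0.075         # hypothetical $ per vCPU-hour
STORAGE_GB_MONTH_RATE = 0.035  # hypothetical $ per GB-month
EGRESS_GB_RATE = 0.12          # hypothetical $ per GB transferred

def monthly_cost(vcpu_hours: float, storage_gb_months: float,
                 egress_gb: float) -> float:
    """Sum the three consumption dimensions Cloud Composer meters."""
    return (vcpu_hours * VCPU_HOUR_RATE
            + storage_gb_months * STORAGE_GB_MONTH_RATE
            + egress_gb * EGRESS_GB_RATE)

# e.g., a hypothetical 3-vCPU environment running a 730-hour month:
print(f"${monthly_cost(3 * 730, storage_gb_months=20, egress_gb=50):.2f}")
```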
Malone said Cloud Composer will play a critical role in enabling DevOps teams to construct pipelines spanning hybrid cloud computing environments. Today, IT administrators typically automate processes with scripts they write by hand in a text editor, and those scripts are difficult to share among multiple administrators. Cloud Composer and Apache Airflow make it easier to share workflows that automate processes across DevOps teams. Just as importantly, those workflows will be accessible to IT administrators who lack the programming skills required to develop a script.
Those workflows will also play a critical role in unifying DevOps and DataOps around a common set of continuous integration/continuous deployment (CI/CD) processes, Malone added.
Most organizations today manage multiple clouds in isolation from one another. But as organizations look to build applications faster, they want to deploy those applications on the best available platform without having to develop a unique set of pipelines for each cloud environment. Having an application locked into a specific cloud platform is not only economically unacceptable, but also fundamentally inefficient from an application development and deployment perspective.
Apache Airflow is not the only tool that can be employed to construct pipelines. But Malone noted that because it is an open source Apache project, organizations can be assured that workflows they develop can run anywhere now and into the future. Google chose Apache Airflow to build Cloud Composer largely because an active development community is emerging around this open source project, he said.
The way IT platforms are managed is transforming rapidly. The days when servers and applications were managed in isolation are over. IT administrators are being asked to manage fleets of servers and storage systems running more applications than ever. Thankfully, the tools required to achieve that goal are becoming a lot more accessible to everyone in IT.