Introduction to Airflow in Python
Delivering data on a schedule can be a tedious manual process: you write scripts, add complex cron jobs, and try various ways to meet an ever-changing set of requirements, and it's even trickier to manage everything when working with teammates. Airflow removes this headache by adding scheduling, error handling, and reporting to your workflows. In this course, you'll master the basics of Airflow and learn how to implement complex data engineering pipelines in production. You'll also learn how to use Directed Acyclic Graphs (DAGs), automate data engineering workflows, and implement data engineering tasks in an easy and repeatable fashion, helping you to maintain your sanity.
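To give a flavor of what a scheduled workflow looks like in code, here is a minimal sketch of an Airflow DAG definition. It assumes Airflow 2.x import paths; the DAG id, start date, schedule, and command are illustrative, not part of the course material.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Hypothetical DAG: runs a simple report command once per day.
with DAG(
    dag_id="daily_report",              # illustrative name
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",         # Airflow handles the scheduling
    catchup=False,                      # don't backfill past dates
) as dag:
    generate_report = BashOperator(
        task_id="generate_report",
        bash_command="echo 'generating report'",  # placeholder command
    )
```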
1. Intro to Airflow
In this chapter, you’ll gain a complete introduction to the components of Apache Airflow and learn how and why you should use them.
2. Implementing Airflow DAGs
What’s up DAG? Now it’s time to learn the basics of implementing Airflow DAGs. Through hands-on activities, you’ll learn how to set up and deploy operators, tasks, and scheduling.
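As a rough sketch of the kind of pipeline this chapter covers, the example below wires operators into tasks and attaches a schedule. Airflow 2.x import paths are assumed, and the DAG id, task ids, commands, and cron expression are made up for illustration.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def transform_data():
    # Placeholder for real transformation logic.
    print("transforming data")


with DAG(
    dag_id="etl_pipeline",                  # illustrative name
    start_date=datetime(2023, 1, 1),
    schedule_interval="0 6 * * *",          # every day at 06:00, cron syntax
    catchup=False,
) as dag:
    extract = BashOperator(task_id="extract", bash_command="echo 'extracting'")
    transform = PythonOperator(task_id="transform", python_callable=transform_data)
    load = BashOperator(task_id="load", bash_command="echo 'loading'")

    # The bitshift operator sets task order: extract, then transform, then load.
    extract >> transform >> load
```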
3. Maintaining and monitoring Airflow workflows
In this chapter, you’ll learn how to save time by using Airflow components such as sensors and executors, and how to monitor and troubleshoot your Airflow workflows.
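For instance, a sensor can hold downstream work until a condition is met. The sketch below uses FileSensor to wait for a file before processing it; it assumes Airflow 2.x and the default fs_default filesystem connection, and the file path and task ids are invented for the example.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.sensors.filesystem import FileSensor

with DAG(
    dag_id="sensor_example",                # illustrative name
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Re-checks every 30 seconds until salesdata.csv appears
    # (path is resolved against the fs_default filesystem connection).
    wait_for_file = FileSensor(
        task_id="wait_for_salesdata",
        filepath="salesdata.csv",
        poke_interval=30,
    )
    process = BashOperator(task_id="process", bash_command="echo 'processing'")

    wait_for_file >> process
```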
4. Building production pipelines in Airflow
Put it all together. In this final chapter, you’ll apply everything you've learned to build a production-quality workflow in Airflow.
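As a hint of what "production-quality" can mean in practice, the sketch below adds retries, retry spacing, and an SLA through default_args so failures are retried and late runs are flagged. Airflow 2.x is assumed, and the owner, DAG id, and command are illustrative only.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash import BashOperator

# Defaults applied to every task in the DAG.
default_args = {
    "owner": "data_eng",                    # illustrative owner
    "retries": 2,                           # retry failed tasks twice
    "retry_delay": timedelta(minutes=5),    # wait 5 minutes between retries
    "sla": timedelta(hours=1),              # flag runs that exceed one hour
}

with DAG(
    dag_id="production_pipeline",           # illustrative name
    default_args=default_args,
    start_date=datetime(2023, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    run_pipeline = BashOperator(
        task_id="run_pipeline",
        bash_command="echo 'running the full pipeline'",  # placeholder command
    )
```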