Data Engineering - Apache Airflow
Eman Ali Mughal
4:06:44
Description
Apache Airflow is a workflow orchestration tool, primarily used in Big Data-related projects.
What You'll Learn?
- Gain a basic understanding of workflows and Apache Airflow's core features and concepts, and become familiar with the Airflow UI, XComs, Variables, Connections, and Hooks
- Learn how to install Apache Airflow on Linux / Ubuntu in two different flavours: a Docker container installation and a vanilla installation
- Deepen your knowledge of ETL pipelines and learn how to orchestrate them as Airflow DAGs (see the sketch after this list)
- The DAGs in this course also include working examples with Apache Spark, Elasticsearch, Snowflake, and an e-commerce DAG
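To give a sense of what an orchestrated ETL pipeline looks like as an Airflow DAG, here is a minimal sketch assuming Airflow 2.4+; the DAG id, task ids, and the placeholder extract/transform/load callables are illustrative assumptions, not code from the course.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


# Placeholder callables standing in for real ETL steps (assumed names).
def extract():
    print("extract data from the source system")


def transform():
    print("transform the extracted data")


def load():
    print("load the transformed data into the target store")


with DAG(
    dag_id="example_etl_pipeline",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    # The >> operator defines the orchestration order: extract -> transform -> load.
    extract_task >> transform_task >> load_task
```

Once this file is placed in the Airflow DAGs folder, the scheduler picks it up and the pipeline appears in the Airflow UI, where each task's state can be monitored.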
More details
Airflow is an extensive tool with many different features. I will explain in detail how we can benefit from Airflow in our Big Data-related projects. In this Apache Airflow course I will start with the basic concepts, then move on to installation, and later deep dive into advanced concepts. In the end, I will build a few Big Data-related projects and create Airflow DAGs to give you a flavour of how all the components of Airflow work together.
Please note that Apache Airflow is not a data flow tool but a workflow orchestration tool, with a clean user interface and a rich set of components. When data flows through many different processes it is difficult to monitor each one from a CLI-based environment. With Apache Airflow, task states are colour-coded so failures are easy to spot, and we can restart a pipeline from the task where it failed.
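As a rough illustration of that failure handling, the sketch below shows a task with automatic retries and a failure callback; the DAG id, task id, retry values, and the print-based notification are assumptions for illustration only, not the course's configuration.

```python
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def notify_on_failure(context):
    # A real deployment might send an email or Slack alert here;
    # printing is only a stand-in for illustration.
    print(f"Task {context['task_instance'].task_id} failed")


with DAG(
    dag_id="example_failure_handling",  # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    # Airflow retries this placeholder task automatically, and in the UI a
    # failed run can be cleared and re-run from this task alone rather than
    # restarting the whole pipeline.
    load_task = PythonOperator(
        task_id="load_to_warehouse",  # assumed task id
        python_callable=lambda: None,
        retries=3,
        retry_delay=timedelta(minutes=10),
        on_failure_callback=notify_on_failure,
    )
```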
I will show you different use cases from the Big Data universe in which we can leverage Apache Airflow.
After completing Part 1 of this course, you will find yourself equipped with a new tool for solving your Big Data use cases.
Who this course is for:
- This course is intended for Data Engineers, Big Data Architects, and students who have no prior knowledge of Apache Airflow.
Udemy
- Language: English
- Training sessions: 34
- Duration: 4:06:44
- Release Date: 2024/06/25