
What is your primary use case for Apache Airflow?

Miriam Tover - PeerSpot reviewer
PeerSpot user

21 Answers

Mikalai Surta - PeerSpot reviewer
MSP
Top 5
Feb 22, 2024

We use Apache Airflow for the orchestration of data pipelines.

FB
Real User
Top 20
Jan 15, 2024

Our use cases are a bit complex, but primarily for data extraction, transformation, and loading (ETL) tasks.

Punit_Shah - PeerSpot reviewer
Reseller
Top 10
Dec 22, 2023

We utilize Apache Airflow for two primary purposes. Firstly, it serves as the tool for ingesting data from the source system application into our data warehouse. Secondly, it plays a crucial role in our ETL pipeline. After extracting data, it facilitates the transformation process and subsequently loads the transformed data into the designated target tables.
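A two-stage setup like the one described (ingest from a source system, then transform and load into target tables) is typically authored as a single DAG. The sketch below uses the TaskFlow API, assuming Apache Airflow 2.4+; the schedule, sample data, and table logic are hypothetical placeholders, not the reviewer's actual pipeline.

```python
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def warehouse_etl():
    @task
    def extract():
        # Placeholder: pull rows from the source system application.
        return [{"id": 1, "amount": 100}, {"id": 2, "amount": 250}]

    @task
    def transform(rows):
        # Placeholder: apply business transformations.
        return [{**r, "amount_usd": r["amount"] / 100} for r in rows]

    @task
    def load(rows):
        # Placeholder: write into the designated target table.
        print(f"loading {len(rows)} rows into the target table")

    load(transform(extract()))


warehouse_etl()
```

Airflow infers the extract → transform → load dependency chain from the values passed between the decorated functions.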

Luiz Cesar Gosi - PeerSpot reviewer
Real User
Top 5 Leaderboard
Oct 19, 2023

We use Apache Airflow for data orchestration.

SabinaZeynalova - PeerSpot reviewer
Real User
Top 5
Sep 22, 2023

We use Apache Airflow for the automation and orchestration of model deployment, training, and feature engineering steps. It is a model lifecycle management tool.
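Orchestrating feature engineering, training, and deployment as ordered steps, as described here, can be sketched as three chained tasks. This assumes Airflow 2.4+, and every callable below is a hypothetical placeholder rather than a real model pipeline:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def build_features():
    print("building features")  # placeholder


def train_model():
    print("training model")  # placeholder


def deploy_model():
    print("deploying model")  # placeholder


with DAG(
    dag_id="model_lifecycle",
    schedule="@weekly",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    features = PythonOperator(task_id="feature_engineering", python_callable=build_features)
    train = PythonOperator(task_id="train", python_callable=train_model)
    deploy = PythonOperator(task_id="deploy", python_callable=deploy_model)

    features >> train >> deploy  # enforce the lifecycle order
```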

SUDHIR KUMAR RATHLAVATH - PeerSpot reviewer
Real User
Top 5
Jul 26, 2023

Apache Airflow is like a freeway. Just as a freeway lets cars travel quickly and efficiently from one point to another, Apache Airflow lets data engineers orchestrate their workflows just as efficiently. There are many scheduling tools on the market, but Apache Airflow has become the dominant one. With the help of Airflow operators, any task required for day-to-day data engineering work becomes possible. It manages the entire lifecycle of data engineering workflows.
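The operators mentioned here are prebuilt task types. A hedged sketch of mixing a shell step with a Python step in one workflow, assuming Airflow 2.4+; the command and callable are made up for illustration:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator


def summarize():
    print("summarizing pulled files")  # placeholder


with DAG(
    dag_id="daily_maintenance",
    schedule="@daily",
    start_date=datetime(2024, 1, 1),
    catchup=False,
) as dag:
    pull = BashOperator(task_id="pull_files", bash_command="echo pulling files")
    report = PythonOperator(task_id="summarize", python_callable=summarize)

    pull >> report  # pull_files runs before summarize
```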

Updated: March 2024.
AT
Real User
Top 20
Jun 28, 2023

We use Apache Airflow to send our data to a third-party system.

Ravan Nannapaneni - PeerSpot reviewer
Real User
Top 10
Mar 31, 2023

Apache Airflow is utilized for automating data engineering tasks. When creating a sequence of tasks, Airflow can assist in automating them.

Fadi Bathish - PeerSpot reviewer
Real User
Top 5
Feb 20, 2023

We use this solution to monitor BD tasks.

Joaquin Marques - PeerSpot reviewer
Real User
Top 5 Leaderboard
Dec 1, 2022

Our primary use case for the solution is setting up workflows and processes, which apply everywhere because most industries are built on them. We've deployed it for all kinds of workflows within the organization.

Mahendra Prajapati - PeerSpot reviewer
Real User
Top 5
Aug 31, 2022

Our primary use case for this solution is scheduling tasks. We capture the data from the SQL Server location and migrate it to the central data warehouse.

Nomena NY HOAVY - PeerSpot reviewer
Real User
Top 10
Jun 20, 2022

Currently, I am a lead data scientist. Our primary use cases for Apache Airflow cover all of our orchestration, from the basic big data lake to machine learning predictions. It is used for all the ML processes. It is also used for some ELT, to transform, load, and export all big data from restricted, unrestricted, and all-phase processes.

NK
Real User
Top 20
Oct 4, 2021

The primary use case is the orchestration and automation of ELT/ETL data pipelines.

Apache Airflow is great in this respect, and there are scheduling options to make it fully automated based on the use case.
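The scheduling options referred to here accept cron presets, raw cron expressions, or timedeltas. A sketch assuming Airflow 2.4+, where the `schedule` parameter replaced the older `schedule_interval`; the dag_ids are hypothetical:

```python
from datetime import datetime, timedelta

from airflow import DAG

common = dict(start_date=datetime(2024, 1, 1), catchup=False)

# Cron preset: run at the start of every hour.
hourly = DAG(dag_id="etl_hourly", schedule="@hourly", **common)
# Raw cron expression: run at 02:30 every day.
nightly = DAG(dag_id="etl_nightly", schedule="30 2 * * *", **common)
# Timedelta: run every six hours, relative to the start date.
rolling = DAG(dag_id="etl_rolling", schedule=timedelta(hours=6), **common)
```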

JR
Real User
Oct 5, 2021

I've used it at past companies to build a data warehouse for analytics (populating Redshift/Snowflake).

My current company is using it for similar purposes, but more to pull data from data sources across the company, join them into a central data repository (we're currently using Postgres), and build datasets with this data. Having this central data repository will help serve other use cases for us in the future.

JR
Real User
Mar 26, 2021

I'm a data engineer. In the past, I used Airflow for building data pipelines and to populate data warehouses. With my current company, it's a data product or datasets that we sell to biopharma companies. We are using those pipelines to generate those datasets.

MW
Real User
Top 20
Feb 11, 2021

There are a few use cases we have for Apache Airflow, one being government projects where we perform data operations on a monthly basis. For example, we'll collect data from various agencies, harmonize the data, and then produce a dashboard. In general, it's a BI use case, but focusing on social economy. We concentrate mainly on BI, and because my team members have strong technical backgrounds we often fall back to using open source tools like Airflow and our own coded solutions. For a single project, we will typically have three of us working on Airflow at a time. This includes two data engineers and a system administrator. Our infrastructure model is hybrid, based both in the cloud and on-premises.

CP
Real User
Jan 15, 2021

We mainly used the solution in banking, finance, and insurance. We are looking for some opportunities in production companies, but this is only at the very early stages.

AJ
Real User
Dec 23, 2020

Our primary use case is to integrate with SLAs.

JP
Real User
Dec 22, 2020

We normally use the solution for creating a specific flow for data transformation. We have several pipelines, and because they're pretty well defined, we use the solution in conjunction with other tools that handle the mediation portion. With Airflow, we do the processing of that data.

SG
Real User
Apr 13, 2020

We are a technology, media, and entertainment-technology company. We are using Apache Airflow for architecting our media workflows. We are using it for two major workflows. We have had it set up for some time on our own cloud. Recently, we migrated the setup to AWS.

AN
Real User
Top 20
Sep 2, 2019

The primary use case for this solution is to automate the ETL process for the data warehouse.

Apache Airflow is an open-source workflow management system (WMS) that is primarily used to programmatically author, orchestrate, schedule, and monitor data pipelines as well as workflows. The solution makes it possible for you to manage your data pipelines by authoring workflows as directed acyclic graphs (DAGs) of tasks. By using Apache Airflow, you can orchestrate data pipelines over object stores and data warehouses, run workflows that are not data-related, and can also create and manage...