Thulani David Mngadi - PeerSpot reviewer
Data architect at Old Mutual
Real User
Top 5
Apr 8, 2024
Data flow feature is valuable for data transformation tasks
Pros and Cons
  • "The data flow feature in Azure Data Factory is valuable for data transformation tasks, especially for those who may not have expertise in writing complex code. It simplifies the process of data manipulation and is particularly useful for individuals unfamiliar with Spark coding."
  • "Azure Data Factory could benefit from improvements in its monitoring capabilities to provide a more robust feature set. Enhancing the ease of deployment to higher environments within Azure DevOps would also be beneficial, as the current process often requires extensive scripting and pipeline development."

What is our primary use case?

I can describe a scenario where I was tasked with developing a data integration system. Azure Data Factory served as the data integration platform responsible for creating pipelines to extract data from various sources, including on-premises and cloud-based systems. Alongside Data Factory, we utilized Azure Logic Apps for orchestration and Azure Key Vault for securely storing sensitive connection information. This combination enabled us to manage data extraction, transformation, and loading efficiently. The solution also involved Azure Data Lake for further data transformations, culminating in a comprehensive data processing engine that I played a key role in implementing.
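As a sketch of the Key Vault pattern mentioned above (the vault, secret, and service names here are hypothetical, not from this deployment), an ADF linked service can reference a stored secret instead of embedding the connection string:

```python
# Sketch only: an ADF linked-service payload whose connection string is
# read from Azure Key Vault at runtime instead of being stored inline.
# All names below are illustrative.

def sql_linked_service(kv_reference: str, secret_name: str) -> dict:
    """Build an Azure SQL linked-service definition that references a
    Key Vault secret for its connection string."""
    return {
        "name": "ls_azure_sql",
        "properties": {
            "type": "AzureSqlDatabase",
            "typeProperties": {
                "connectionString": {
                    "type": "AzureKeyVaultSecret",
                    "store": {
                        "referenceName": kv_reference,
                        "type": "LinkedServiceReference",
                    },
                    "secretName": secret_name,
                },
            },
        },
    }

ls = sql_linked_service("ls_key_vault", "sql-connection-string")
```

The payload this builds follows the shape ADF expects for an `AzureKeyVaultSecret` reference, so no credential ever lands in the pipeline definition itself.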

What is most valuable?

The workflow automation features in Azure Data Factory, particularly its low-code/no-code approach, are highly beneficial for accelerating development speed. This approach allows for quick creation of pipelines and offers customization options for integration needs, making it versatile for various use cases. Azure Data Factory supports a wide range of connectors, catering to the majority of integration needs. In addition to its orchestration and monitoring capabilities, the visual interface makes the tool user-friendly and easy to teach, facilitating adoption within teams. While the monitoring capabilities are sufficient out of the box, they may not be as comprehensive as dedicated enterprise monitoring tools. The monitoring features are manageable for production use, with the option to integrate Log Analytics or create custom dashboards if needed.

The data flow feature in Azure Data Factory is valuable for data transformation tasks, especially for those who may not have expertise in writing complex code. It simplifies the process of data manipulation and is particularly useful for individuals unfamiliar with Spark coding. While there could be improvements for more flexibility, overall, the data flow feature effectively accomplishes its purpose within the Azure ecosystem.

What needs improvement?

Azure Data Factory could benefit from improvements in its monitoring capabilities to provide a more robust feature set. Enhancing the ease of deployment to higher environments within Azure DevOps would be beneficial, as the current process often requires extensive scripting and pipeline development. The flexibility of the data flow feature could also be improved, particularly to support more dynamic, data-driven architectures. These enhancements would contribute to a more seamless and efficient workflow.

For how long have I used the solution?

I have been using Azure Data Factory for the past 3 years. 

Buyer's Guide
Azure Data Factory
March 2026
Learn what your peers think about Azure Data Factory. Get advice and tips from experienced pros sharing their opinions. Updated: March 2026.
885,264 professionals have used our research since 2012.

What do I think about the stability of the solution?

I find Azure Data Factory to be quite stable overall. I haven't encountered significant issues with its stability and consider it a reliable platform.

What do I think about the scalability of the solution?

In terms of scalability, there are a few aspects of Azure Data Factory that I find disappointing. For instance, the limitation of the self-hosted integration runtime to just four VMs restricts scalability, especially for handling large volumes of data. Improvements are needed in this area to support more than four VMs. The documentation regarding bandwidth support is also unclear, making it difficult to assess the full scalability potential. While Azure Data Factory scales well in the cloud in terms of compute power, the limitations of the self-hosted integration runtime are a concern for certain use cases. Overall, scalability is highly dependent on the specific use case and could benefit from enhancements to the self-hosted integration runtime.

How are customer service and support?

As for technical support from Microsoft, the response times and solutions can sometimes be delayed.

Which solution did I use previously and why did I switch?

Our organization did not switch from a previous solution to Azure Data Factory; rather, we implemented Azure Data Factory as a new solution to enable cloud data processing capabilities.

How was the initial setup?

The deployment was handled in-house, and initially, we had a team of five working on the project in 2019. 

What about the implementation team?

We have approximately six to seven data engineers from various departments utilizing the solution in our organization. The frequency of use varies depending on the workload and projects, but on average, someone is working on the platform at least several times a week, as our data engineers are involved in various tasks beyond Azure Data Factory.

What's my experience with pricing, setup cost, and licensing?

I am aware of the pricing of Azure Data Factory, but I prefer not to disclose specific details.

What other advice do I have?

In terms of handling complex data transformations and cleansing, Azure Data Factory is capable for simple to medium tasks, but for more complex tasks, we resort to custom coding solutions. Overall, I would recommend Azure Data Factory for data integration and management, and I would rate it an eight out of ten for its flexibility and ability to support third-party integrations.

Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
reviewer2394240 - PeerSpot reviewer
Complementary Worker On Assignment at a manufacturing company with 10,001+ employees
Real User
Top 20
Nov 5, 2024
Efficient data integration with seamless cloud orchestration
Pros and Cons
  • "The valuable feature of Azure Data Factory is its integration capability, as it goes well with other components of Microsoft Azure."
  • "Customer service is not satisfactory. Third-party personnel handle support and rely on a knowledge repository."

What is our primary use case?

We use Azure Data Factory to build data analytics products.

How has it helped my organization?

Azure Data Factory helps in data integration and data orchestration in a self-service way, and it is a native component to the Azure platform.

What is most valuable?

The valuable feature of Azure Data Factory is its integration capability, as it goes well with other components of Microsoft Azure.

What needs improvement?

I'm not confident in highlighting any potential room for improvement with Azure Data Factory at this time. To the best of my knowledge, it is satisfactory as it is.

For how long have I used the solution?

I have been using Azure Data Factory for the past six years.

What do I think about the stability of the solution?

I haven't encountered any stability issues with Azure Data Factory. However, I am not deeply technical and cannot comment on specifics.

What do I think about the scalability of the solution?

Azure Data Factory is scalable enough to deal with medium to large-size projects.

How are customer service and support?

Customer service is not satisfactory. Third-party personnel handle support and rely on a knowledge repository. Resolution times are long, and their ability to resolve issues could be improved.

How would you rate customer service and support?

Positive

Which solution did I use previously and why did I switch?

In the past, Talend Data Integration Studio was used; however, Azure Data Factory was chosen for its better integration with other Microsoft Azure components.

How was the initial setup?

Azure Data Factory does not require an initial setup since it's a cloud-based service.

Which other solutions did I evaluate?

We previously considered Talend for the same use case.

What other advice do I have?

Azure Data Factory is specifically meant for data integration and nothing more. For reporting and other capabilities, different Microsoft tools should be used.

I'd rate the solution nine out of ten.

Which deployment model are you using for this solution?

Public Cloud

If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?

Microsoft Azure
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
Monalisha Nayak - PeerSpot reviewer
Senior Data Engineer at Shell
Real User
Top 5
Jul 30, 2024
Helps to pull data from on-premises systems and supports large data volumes
Pros and Cons
  • "The solution handles large volumes of data very well. One of its best features is its ability to integrate data end-to-end, from pulling data from the source to accessing Databricks. This makes it quite useful for our needs."
  • "The main challenge with implementing Azure Data Factory is that it processes data in batches, not near real-time. To achieve near real-time processing, we need to schedule updates more frequently, which can be an issue. Its interface needs to be lighter."

What is our primary use case?

My main use case for Azure Data Factory is to pull data from on-premises systems. Most data transformation is done through Databricks, but Data Factory mainly pulls data into different services.

What is most valuable?

The solution handles large volumes of data very well. One of its best features is its ability to integrate data end-to-end, from pulling data from the source to accessing Databricks. This makes it quite useful for our needs.

What needs improvement?

The main challenge with implementing Azure Data Factory is that it processes data in batches, not near real-time. To achieve near real-time processing, we need to schedule updates more frequently, which can be an issue. Its interface needs to be lighter. 

One specific issue is with parallel executions. When running parallel executions for multiple tables, I noticed a performance slowdown.
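The "schedule more frequently" workaround mentioned above amounts to a trigger with a short window. A minimal sketch of such a trigger definition, with hypothetical pipeline and trigger names:

```python
# Sketch only: approximating near-real-time processing in ADF by firing
# a tumbling-window trigger at a short interval. Names are illustrative.

def frequent_trigger(pipeline_name: str, minutes: int) -> dict:
    """Build an ADF tumbling-window trigger that fires every `minutes`
    minutes against the given pipeline."""
    return {
        "name": f"trg_{pipeline_name}_every_{minutes}m",
        "properties": {
            "type": "TumblingWindowTrigger",
            "typeProperties": {
                "frequency": "Minute",
                "interval": minutes,       # ADF's minimum is 5 minutes
                "startTime": "2024-01-01T00:00:00Z",
                "maxConcurrency": 1,       # serialize overlapping windows
            },
            "pipeline": {
                "pipelineReference": {
                    "referenceName": pipeline_name,
                    "type": "PipelineReference",
                },
            },
        },
    }

trg = frequent_trigger("pl_ingest_orders", 15)
```

Even at the minimum interval this is still micro-batching, not streaming, which is the limitation the review describes.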

For how long have I used the solution?

I have been working with the product for five years. 

What do I think about the stability of the solution?

We haven't faced any issues with the tool's stability. 

What do I think about the scalability of the solution?

The solution can handle large datasets. 

How are customer service and support?

I am satisfied with Microsoft's support. They provide solutions to our challenges. 

How would you rate customer service and support?

Positive

What's my experience with pricing, setup cost, and licensing?

The solution is cheap. 

What other advice do I have?

I rate the overall product an eight out of ten. 

Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
Ramya Kuppala - PeerSpot reviewer
Technical Manager at PalTech
Real User
Top 5
Jul 30, 2024
Provides orchestration and data flows for transformation for integration
Pros and Cons
  • "The data flows were beneficial, allowing us to perform multiple transformations."
  • "When we initiated the cluster, it took some time to start the process."

What is our primary use case?

We use the solution for building a few warehouses using Microsoft services.

How has it helped my organization?

We worked on a project for the textile industry where we needed to build a data warehouse from scratch. We provided a solution using Azure Data Factory to pull data from multiple files containing certification information, such as CSV and JSON. This data was then stored in a SQL Server-based data warehouse. We built around 30 pipelines in Azure Data Factory, one for each table, to load the data into the warehouse. The Power BI team then used this data for their analysis.

What is most valuable?

For the integration task, we used Azure Data Factory for orchestration and data flows for transformation. The data flows were beneficial, allowing us to perform multiple transformations. Additionally, we utilized web API activities to log data from third-party API tools, which greatly assisted in loading the necessary data into our warehouse.

What needs improvement?

When we initiated the cluster, it took some time to start the process. Most of our time was spent ensuring the cluster was adequately set up. We transitioned from using the auto integration runtime to a custom integration runtime, which showed some improvement.

For how long have I used the solution?

I have been using Azure Data Factory for four years.

What do I think about the stability of the solution?

When running the process server, we encountered frequent connection disconnect issues. These issues often stemmed from internal problems that we couldn’t resolve then, leading to repeated disruptions.

I rate the stability as seven out of ten.

What do I think about the scalability of the solution?

20 people are using this solution daily. I rate the scalability around eight out of ten.

How are customer service and support?

Customer service supported us whenever we needed it.

How would you rate customer service and support?

Positive

Which solution did I use previously and why did I switch?

We have used SQL Server.

How was the initial setup?

The initial setup is easy and takes four to five hours to complete.

What was our ROI?

They have reduced the infrastructure burden by 60 percent.

What's my experience with pricing, setup cost, and licensing?

Pricing is reasonable when compared with other cloud providers.

What other advice do I have?

We have used Azure Key Vault for authentication as part of the adoption. I can rate it around eight out of ten.

I recommend the solution.

Overall, I rate the solution a nine out of ten.

Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
PiyushAgarwal - PeerSpot reviewer
Associate Specialist at Synechron
Real User
Apr 11, 2023
We can integrate our Databricks notebooks and schedule them
Pros and Cons
  • "ADF is another ETL tool similar to Informatica that can transform data or copy it from on-prem to the cloud or vice versa. Once we have the data, we can apply various transformations to it and schedule our pipeline according to our business needs. ADF integrates with Databricks. We can call our Databricks notebooks and schedule them via ADF."
  • "I rate Azure Data Factory six out of 10 for stability. ADF is stable now, but we had problems recently with indexing on an SQL database. It's slow when dealing with a huge volume of data. It depends on whether the database is configured as general purpose or hyperscale."

What is our primary use case?

We are currently migrating from on-prem to the cloud, and our on-prem tables are getting data from upstream. We used ADF to build a pipeline to facilitate this migration. A team of 15-20 people currently uses ADF, and more will join once it goes live.

What is most valuable?

ADF is another ETL tool similar to Informatica that can transform data or copy it from on-prem to the cloud or vice versa. Once we have the data, we can apply various transformations to it and schedule our pipeline according to our business needs. ADF integrates with Databricks. We can call our Databricks notebooks and schedule them via ADF. 
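The Databricks integration described above boils down to a DatabricksNotebook activity inside a pipeline. A minimal sketch, with hypothetical notebook and linked-service names:

```python
# Sketch only: calling a Databricks notebook from an ADF pipeline.
# The notebook path, parameter, and linked-service names are illustrative.

def notebook_activity(name: str, notebook_path: str,
                      databricks_ls: str) -> dict:
    """Build a DatabricksNotebook activity payload for an ADF pipeline."""
    return {
        "name": name,
        "type": "DatabricksNotebook",
        "typeProperties": {
            "notebookPath": notebook_path,
            # Pass pipeline parameters through to the notebook.
            "baseParameters": {
                "run_date": "@pipeline().parameters.run_date",
            },
        },
        "linkedServiceName": {
            "referenceName": databricks_ls,
            "type": "LinkedServiceReference",
        },
    }

act = notebook_activity("RunTransform", "/Shared/transform_daily",
                        "ls_databricks")
```

Placing this activity in a scheduled pipeline is what lets ADF act as the scheduler for Databricks notebooks, as the review notes.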

For how long have I used the solution?

I have used Azure Data Factory for about six months.

What do I think about the stability of the solution?

I rate Azure Data Factory six out of 10 for stability. ADF is stable now, but we had problems recently with indexing on an SQL database. It's slow when dealing with a huge volume of data. It depends on whether the database is configured as general purpose or hyperscale. 

How was the initial setup?

I rate Azure Data Factory eight out of 10 for ease of setup. The deployment time depends on the data volume. Four million records will take longer than four thousand. Migrating our full load from on-prem to the cloud took around 16-18 hours because the volume was 17 million. 

What's my experience with pricing, setup cost, and licensing?

I rate ADF six out of 10 for affordability. The cost depends on the services we use. It's usage-based. 

What other advice do I have?

I rate Azure Data Factory seven out of 10. Companies that want to migrate from on-prem to the cloud have lots of options. I haven't explored them all, but Azure, GCP, and AWS are essentially all the same.

Which deployment model are you using for this solution?

Public Cloud

If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?

Microsoft Azure
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
Director - Emerging Technologies at Speridian Technologies
Real User
Top 5
Jul 30, 2024
Helps to orchestrate workflows and supports both ETL and ELT processes
Pros and Cons
  • "Data Factory allows you to pull data from multiple systems, transform it according to your business needs, and load it into a data warehouse or data lake."
  • "While it has a range of connectors for various systems, such as ERP systems, the support for these connectors can be lacking."

What is our primary use case?

Azure Data Factory is primarily used to orchestrate workflows and move data between various sources. It supports both ETL and ELT  processes. For instance, if you have an ERP system and want to make the data available for reporting in a data lake or data warehouse, you can use Data Factory to extract data from the ERP system as well as from other sources, like CRM systems.

Data Factory allows you to pull data from multiple systems, transform it according to your business needs, and load it into a data warehouse or data lake. It also supports complex data transformations and aggregations, enabling you to generate summary and aggregate reports from the combined data. Data Factory helps you ingest data from diverse sources, perform necessary transformations, and prepare it for reporting and analysis.
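The extract-and-load half of that flow can be sketched as a pipeline of Copy activities, one per source system. All dataset and pipeline names below are hypothetical:

```python
# Sketch only: an ingestion pipeline that lands several source datasets
# (e.g. ERP, CRM) in warehouse staging -- the extract/load part of the
# ELT flow described above. Names are illustrative.

def copy_activity(name: str, source_ds: str, sink_ds: str) -> dict:
    """One Copy activity moving a source dataset into a staging dataset."""
    return {
        "name": name,
        "type": "Copy",
        "inputs": [{"referenceName": source_ds, "type": "DatasetReference"}],
        "outputs": [{"referenceName": sink_ds, "type": "DatasetReference"}],
    }

def ingest_pipeline(sources: dict) -> dict:
    """Build a pipeline with one Copy activity per source system."""
    return {
        "name": "pl_ingest_sources",
        "properties": {
            "activities": [
                copy_activity(f"Copy_{src}", ds, f"ds_stg_{src}")
                for src, ds in sources.items()
            ],
        },
    }

pl = ingest_pipeline({"erp": "ds_erp_orders", "crm": "ds_crm_accounts"})
```

Transformation and aggregation steps (data flows, stored procedures, notebooks) would then run as downstream activities against the staged data.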

How has it helped my organization?

I have extensive experience building things independently, with over twenty years of experience in SQL, ETL, and data-related projects. Recently, I have been using Azure Data Factory for the past two years. It has proven to be quite effective in handling large volumes of data and performing complex calculations. It allows for the creation of intricate data workflows and processes faster. Azure Data Factory is particularly useful for enterprise-level data integration activities, where you might deal with millions of records, such as in SAP environments. For example, SAP tables can contain tens or hundreds of millions of records. Managing and maintaining the quality of this data can be challenging, but Azure Data Factory simplifies these tasks significantly.

What is most valuable?

It is a powerful tool and is considered one of the leading solutions in the market, especially for handling large volumes of data. It is popular among large enterprises.

What needs improvement?

While it has a range of connectors for various systems, such as ERP systems, the support for these connectors can be lacking. Take the SAP connector, for example. When issues arise, it can be challenging to determine whether the problem is on Microsoft's side or SAP's side. This often requires working with both teams individually, which can lead to coordination issues and delays. It would be beneficial if Azure Data Factory provided better support and troubleshooting resources for these connectors, ensuring a smoother resolution of such issues.

For how long have I used the solution?

I have been using Azure Data Factory for two years.

What do I think about the stability of the solution?

I rate the solution's stability a nine out of ten.

What do I think about the scalability of the solution?

It's pretty good. There are no issues with scalability.

How are customer service and support?

The support has been good.

How would you rate customer service and support?

Positive

How was the initial setup?

It is straightforward to set up. However, ensuring its security requires careful configuration, which can vary depending on the organization's requirements. While the basic setup is user-friendly and doesn’t necessarily require advanced technical skills, securing the environment involves additional steps to prevent unauthorized access and ensure that data is only accessible from permitted locations. This can be more complex depending on the specific setup and organizational needs.

Setting up the infrastructure typically takes about two to three weeks and usually requires the effort of two people.

What was our ROI?

Azure Data Factory serves several important purposes. One key reason for using it is to build an enterprise data warehouse. This is crucial for centralizing data from various sources. Another reason is to gain insights from that data. By consolidating data in a unified location, you enable data scientists and engineers to analyze it and generate valuable insights.

Customers use Azure Data Factory to bring their data together, creating opportunities to understand their data better and extract actionable insights. However, simply consolidating data is not enough; the actual value comes from how you analyze and utilize it. This involves deriving insights, creating opportunities, and understanding customers better, which can significantly benefit the organization.

What's my experience with pricing, setup cost, and licensing?

Pricing is fine. It's a pay-as-you-go option.

It is in the same price range as other major providers. However, costs can vary depending on enterprise agreements and relationships.

What other advice do I have?

Overall, I rate the solution a nine out of ten.

Disclosure: My company has a business relationship with this vendor other than being a customer. Partner
PeerSpot user
PeerSpot user
Data Architect at World Vision
Real User
Top 5Leaderboard
Dec 13, 2022
The good, the bad and the lots of ugly
Pros and Cons
  • "The trigger scheduling options are decently robust."
  • "There is no built-in pipeline exit activity when encountering an error."

What is our primary use case?

The current use is for extracting data from Google Analytics into Azure SQL Database as a source for our EDW. Extracting from GA was problematic with SSIS.

The larger use case is to assess the viability of the tool for larger use in our organization as a replacement for SSIS for our EDW and also as an orchestration agent to replace SQL Agent for firing SSIS packages using Azure SSIS-IR.

The initial rollout was to solve the immediate problem while assessing the tool's suitability for other purposes within the organization, and to establish the development and administration pipeline process.

How has it helped my organization?

ADF allowed us to extract Google Analytics data (via BigQuery) without purchasing an adapter.  

It has also helped with establishing how our team can operate within Azure using both PaaS and IaaS resources and how those can interact. Rolling out a small data factory has forced us to understand more about all of Azure and how ADF needs to rely upon and interact with other Azure resources.

It provides a learning ground for use of DevOps Git along with managing ARM templates as well as driving the need to establish best practices for CI.  

What is most valuable?

The most valuable aspect has been a large list of no-cost source and target adapters.

It is also providing a PaaS ELT solution that integrates with other Azure resources. 

Its graphical UI is very good and is even now improving significantly with the latest preview feature of displaying inner activities within other activities such as forEach and If conditions.   

Its built-in monitoring and ability to see each activity's JSON inputs/outputs provide an excellent audit trail.

The trigger scheduling options are decently robust.

The fact that it's continually evolving is hopeful that even if some feature is missing today, it may be soon resolved. For example, it lacked support for simple SQL activity until earlier this year, when that was resolved. They have now added a "debug until" option for all activities. The Copy Activity Upsert option did not perform well at all when I first started using the tool but now seems to have acceptable performance.  

The tool is designed to be metadata-driven for large numbers of patterned ETL processes, similar to what BIML is commonly used for in SSIS but much simpler to use. BIML now supports generating ADF code, although given ADF's capabilities I'm not sure BIML still holds the same value it did for SSIS.
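The metadata-driven pattern described above can be sketched as a small metadata table driving generation of one patterned pipeline per source table (all table, dataset, and pipeline names are hypothetical):

```python
# Sketch only: metadata-driven generation of patterned copy pipelines,
# the approach the review compares to BIML. Names are illustrative.

TABLE_METADATA = [
    {"schema": "sales", "table": "orders"},
    {"schema": "sales", "table": "customers"},
    {"schema": "finance", "table": "invoices"},
]

def pipeline_for(meta: dict) -> dict:
    """Generate one copy pipeline definition from a metadata row."""
    return {
        "name": f"pl_load_{meta['schema']}_{meta['table']}",
        "properties": {
            "activities": [{
                "name": f"Copy_{meta['table']}",
                "type": "Copy",
                "inputs": [{"referenceName": "ds_source_generic",
                            "type": "DatasetReference"}],
                "outputs": [{"referenceName": "ds_stage_generic",
                             "type": "DatasetReference"}],
            }],
        },
    }

pipelines = [pipeline_for(m) for m in TABLE_METADATA]
```

In practice the same effect is often achieved inside ADF itself with a Lookup activity feeding a ForEach over the metadata, rather than generating one pipeline per table.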

What needs improvement?

The list of issues and gaps in this tool is extensive, although as time goes on, it gets shorter. It currently includes:

1) Missing email/SMTP activity

2) Mapping data flows requires significant lag time to spin up spark clusters

3) Performance compared to SSIS: expect a copy activity to take roughly ten times as long as SSIS for a simple data flow between tables in the same database

4) It is missing the debug of a single activity. The workaround is setting a breakpoint on the task and doing a "rerun from activity" or setting debug on activity and running up to that point

5) OAuth 2.0 adapters lack automated support for refresh tokens

6) Copy activity errors provide no guidance as to which column is causing a failure

7) There's no built-in pipeline exit activity when encountering an error

8) Auto Resolve Integration runtime should never pick a region that you're not using (should be your default for your tenant)

9) IR (integration runtime) queue time lag. For example, a small table copy activity I just ran took 95 seconds of queuing and 12 seconds to actually copy the data. Often the queuing time greatly exceeds the actual runtime

10) Activity dependencies are always AND (OR not supported). This is a significant missing capability that forces unnecessary complex workarounds just to handle OR situations when they could just enhance the dependency to support OR like SSIS does. Did I just ask when ADF will be as good as SSIS?  

They need to fix bugs. For example:

1) The debug sometimes stops picking up saved changes for a period of time, rendering this essential tool useless during that time

2) Enable interactive authoring (a critical tool for development) often doesn't turn on when enabled without going into another part of the tool to enable it. Then, you have to wait several minutes before it's enabled which is time you're blocked from development until it's ready.  And then it only activates for up to 120 minutes before you have to go through this all over again. I think Microsoft is trying to torture developers

3) Exiting the inside of an activity that contains other activities always causes the screen to jump to the beginning of a pipeline requiring re-navigating where you were at (greatly slowing development productivity)

4) Auto Resolve Integration runtime (using default settings) often picks remote regions (not necessarily even paired regions!) to operate, which causes either an unnecessary slowdown or an error message saying it's unable to transfer the volume of data across regions

5) Copy activity often gets the error "mapping source is empty" for no apparent reason. If you play with the activity such as importing new metadata then it's happy again. This sort of thing makes you want to just change careers. Or tools. 

For how long have I used the solution?

I have been using this product for six months.

What do I think about the stability of the solution?

Production operation seems to run reliably so far, however, the development environment seems very buggy where something works one day and not the next. 

What do I think about the scalability of the solution?

So far, the performance of this solution is abysmal compared to SSIS. Especially with small tasks such as copying activity from one table to another within the same database. 

How are customer service and support?

Customer support is non-existent. I logged multiple issues only to hear back from 1st level support weeks later asking questions and providing no help other than wasting my time. In one situation it was a bug where the debug function stopped working for a couple of days. By the time they got back to me, the problem went away. 

How would you rate customer service and support?

Negative

Which solution did I use previously and why did I switch?

We have been and still rely on SSIS for our ETL. ADF seems to do ELT well but I would not consider it for use in ETL at this time.  Its mapping data flows are too slow (which is a large understatement) to be of practical use to us. Also, the ARM template situation is impractical for hundreds of pipelines like we would have if we converted all our SSIS packages into pipelines as a single ADF couldn't take on all our pipelines. 

How was the initial setup?

Initial setup is the largest caveat for this tool. Once you've organized your Azure environment and set up DevOps pipelines, the rest is a breeze. But this is NOT a trivial step if you're the first one to establish the use of ADF at your organization or within your subscription(s). Instead of learning just an ETL tool, you have to get familiar with and establish best practices for the entire Azure and DevOps technologies. That's a lot to take on just to get some data movements operational. 

What about the implementation team?

I did this in-house with the assistance of another team who uses DevOps with Azure for other purposes (non-ADF use). 

What's my experience with pricing, setup cost, and licensing?

The setup cost is only the time it takes to organize Azure resources so you can operate effectively and figure out how to manage different environments (dev/test/sit/UAT/prod, etc.). Also, how to enable multiple developers to work on a single data factory without losing changes or conflicting with other changes.

Which other solutions did I evaluate?

We operate only with SSIS today, and it works very well for us. However, looking toward the future, we will need to eventually find a PaaS solution that will have longer sustainability.

Which deployment model are you using for this solution?

Public Cloud

If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?

Microsoft Azure
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
VismayChawla - PeerSpot reviewer
DGM - Business Intelligence at a comms service provider with 1,001-5,000 employees
Real User
Top 10
Jan 17, 2025
Cloud integration and flexible data handling meet our needs effectively
Pros and Cons
  • "I find that the solution integrates well with cloud technologies, which we are using for different clouds like Snowflake and AWS."
  • "I do not have any notes for improvement."

What is our primary use case?

I'm a customer. I'm using Azure Data Factory.

What is most valuable?

I find that the solution integrates well with cloud technologies, which we are using for different clouds like Snowflake and AWS. It is much more flexible in terms of transferring data from on-premise or on cloud. There is no need to create different mappings for different tables. The platform has the capability to handle metadata efficiently. So all our needs are being fulfilled with the platform we have right now.
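The "no different mappings for different tables" behavior comes from parameterized datasets: one dataset definition serves every table, with schema and table names supplied at runtime. A minimal sketch, with hypothetical names:

```python
# Sketch only: a single parameterized ADF dataset, so no per-table
# mappings are needed. All names are illustrative.

def generic_sql_dataset(linked_service: str) -> dict:
    """Build one dataset definition whose schema/table come from
    parameters, serving every table through the same Copy activity."""
    return {
        "name": "ds_sql_generic",
        "properties": {
            "type": "AzureSqlTable",
            "linkedServiceName": {
                "referenceName": linked_service,
                "type": "LinkedServiceReference",
            },
            "parameters": {
                "schemaName": {"type": "String"},
                "tableName": {"type": "String"},
            },
            "typeProperties": {
                "schema": {"value": "@dataset().schemaName",
                           "type": "Expression"},
                "table": {"value": "@dataset().tableName",
                          "type": "Expression"},
            },
        },
    }

ds = generic_sql_dataset("ls_azure_sql")
```

A Copy activity then passes concrete schema/table values per invocation, which is what makes the metadata-driven handling the review praises possible.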

What needs improvement?

I do not have any notes for improvement.

For how long have I used the solution?

I have used it for almost six years.

What do I think about the stability of the solution?

The stability is quite good.

Which solution did I use previously and why did I switch?

We wanted to move on to the cloud.

How was the initial setup?

The initial setup is not difficult for technical people.

What other advice do I have?

I used to work on Informatica PowerCenter. Now we are using Azure Data Factory.

I rate it eight out of ten. 

Which deployment model are you using for this solution?

Public Cloud

If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?

Disclosure: My company does not have a business relationship with this vendor other than being a customer.
PeerSpot user
Buyer's Guide
Download our free Azure Data Factory Report and get advice and tips from experienced pros sharing their opinions.
Updated: March 2026