How do you or your organization use this solution?
Please share with us so that your peers can learn from your experiences.
Thank you!
We are currently using it as an ETL (Extract, Transform, and Load) tool. We use it to connect to various information providers, or to various sources in general, to extract data and then insert it into our storage devices, databases, or data warehouses.
We use this solution to perform ELT so that we do not need to keep code within a database.
My primary use case for Azure Data Factory is supporting data migration for advanced analytics projects.
We had an old, traditional data warehouse. We decided to move it into the cloud, and we used Azure Data Factory to rebuild the ETL process from SQL Server Integration Services to extract the data.
Our customers use it for data analytics on large volumes of data. They're basically bringing data in from multiple sources and doing ETL: extraction, transformation, and loading. Then they do initial analytics, populate a data lake, and after that, they take the data from the data lake into more complex on-premises analytics. The version depends on the customer's environment; sometimes we use the latest version, and sometimes we use previous versions.
We primarily use this solution to get data from a client's server, or from online sources, into Azure Data Lake. We create pipelines to orchestrate the data flow from source to target.
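To make the idea of a pipeline orchestrating data from a source into Azure Data Lake a little more concrete, here is a minimal, hypothetical sketch of the JSON definition behind a simple copy pipeline, expressed in Python. The pipeline, activity, and dataset names are placeholders and are not taken from the reviews above; a real setup would also define the corresponding linked services and datasets.

```python
import json

# Hypothetical Azure Data Factory pipeline definition: one Copy activity that
# moves data from a source dataset (e.g. files exposed by a client's server)
# into a dataset pointing at Azure Data Lake. All names are placeholders.
pipeline = {
    "name": "CopyClientDataToLake",
    "properties": {
        "activities": [
            {
                "name": "CopySourceToDataLake",
                "type": "Copy",
                "inputs": [
                    {"referenceName": "ClientSourceDataset", "type": "DatasetReference"}
                ],
                "outputs": [
                    {"referenceName": "DataLakeSinkDataset", "type": "DatasetReference"}
                ],
                "typeProperties": {
                    # Source/sink types depend on the connectors actually used;
                    # delimited text in, Parquet out is just one common pattern.
                    "source": {"type": "DelimitedTextSource"},
                    "sink": {"type": "ParquetSink"},
                },
            }
        ]
    },
}

# Emit the JSON that would be submitted to Data Factory
# (through the portal, the REST API, or the CLI).
print(json.dumps(pipeline, indent=2))
```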
I use Azure Data Factory in my company because we are implementing a lot of different projects for a big company based in the USA. We get certain information from different sources, for example, files in Azure Blob Storage. We migrate that information to other databases, validating and transforming the data along the way. After that, we load the data into databases in Azure Synapse and Azure SQL.
The solution is primarily used for data integration. We use it for data pipelines that get data out of legacy systems and provide it to Azure SQL Database. We mainly use the SQL data source providers.
My primary use case is getting data from sensors. The sensors are installed on various equipment across the plant, and they give us a huge amount of data; some readings are captured on a millisecond basis. We are able to feed that data into Azure Data Factory, and it has allowed us to scale up well. We are able to utilize that data for predictive maintenance of the equipment, as well as for predicting breakdowns. Specifically, we use the data to look at predictions for possible future breakdowns. At least, that is what we are looking to build towards.
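For readers wondering how frequent sensor loads like this are typically scheduled, below is a hypothetical sketch of a tumbling window trigger definition that would run an ingestion pipeline on a fixed interval. The trigger name, pipeline name, and 15-minute window are assumptions for illustration; the exact trigger schema should be checked against the current Data Factory documentation.

```python
import json

# Hypothetical schedule for landing batched sensor readings at a fixed interval.
# Names and the 15-minute window are placeholders; high-frequency sensor data is
# usually landed in batches rather than one reading at a time.
sensor_trigger = {
    "name": "SensorBatchTrigger",
    "properties": {
        "type": "TumblingWindowTrigger",
        "typeProperties": {
            "frequency": "Minute",
            "interval": 15,
            "startTime": "2024-01-01T00:00:00Z",
            "maxConcurrency": 1,
        },
        "pipeline": {
            "pipelineReference": {
                "type": "PipelineReference",
                "referenceName": "IngestSensorReadings",
            }
        },
    },
}

print(json.dumps(sensor_trigger, indent=2))
```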
We are not using this product specifically as a standalone data factory. We have adopted Synapse Analytics as the overall platform for our data warehousing solution; Azure Data Factory is one of its components, and we are using it for ETL.
Azure Data Factory is for data transformation and data loading. It works with your transaction systems, and we are using it for our HRMS (Human Resource Management System). It picks up all the transactional data and moves it into the Azure data warehouse. From there, we would like to create reports on our financial position and our resource utilization on projects. These are the reports that we need to build on top of the warehouse. The purpose of Azure Data Factory is more about transformations, so it doesn't need to have a good dashboard. But it has a fitting user interface for us to carry out our activities and debug actions, and I think that's good enough.
The primary use case of this solution is for data integration.
We use this solution for data integration. We use it to feed operational data into a data warehouse, and we also use it for creating connections between applications. Within our organization, there are a few thousand users of Azure Data Factory. We believe that the number of customers and the usage of this product will grow over the next few years. For this reason, we invest a lot of resources in building skills, and we make sure to hire consultants who know their way around Data Factory.
The primary use case is integrating data from different ERP systems and loading it into Azure Synapse for reporting. We use Power BI for the reporting side of it. We also have customers who are migrating to Azure Data Factory and we are assisting them with making the transition.
We are using this solution to gather information from SCADA systems, analyze it using AI and machine learning, and then send the results to our users. They receive and view the data through the Power BI interface.
I primarily use the solution for my small and medium-sized clients.
We are a tech services company, and this is one of the tools that we use when implementing solutions for our clients. I am currently managing a team that is working with Azure Data Factory. Our clients that use this solution are migrating their data from on-premises to the cloud. One of our clients is building an integrated data warehouse for all of their data using this solution. It is used to extract all of the data from different servers and store it in one place.
It's an integration platform; we migrate data across hybrid environments. We have data in our cloud environment and in on-premises systems, so we use it when we want to integrate data across those different environments. Getting data out of different hybrid environments used to be a problem for us.
The use cases are mostly related to logistics, finance, and back-office activities.
There was a need to bring in a lot of CRM and marketing data for some P&L analysis. We are connecting to the Salesforce cloud; specifically, to a solution built on Salesforce Core CRM for the pharmaceutical industry. We are using Azure Data Factory to connect to that source and bring in the various dimensions and transactions from it.
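As a rough illustration of what connecting Data Factory to a Salesforce source can involve, here is a hypothetical sketch of a Salesforce linked service definition and a copy-activity source that pulls one CRM object with a SOQL-style query. The connector type, credential fields, and object names are assumptions and would need to be verified against the current connector documentation; in practice the secrets would come from Azure Key Vault rather than being written inline.

```python
import json

# Hypothetical Salesforce linked service for Azure Data Factory.
# Credentials are placeholders only.
salesforce_linked_service = {
    "name": "SalesforceCrmLinkedService",
    "properties": {
        "type": "Salesforce",
        "typeProperties": {
            "environmentUrl": "https://login.salesforce.com",
            "username": "<service-account-user>",
            "password": {"type": "SecureString", "value": "<password>"},
            "securityToken": {"type": "SecureString", "value": "<security-token>"},
        },
    },
}

# Hypothetical copy-activity source reading one CRM object with a SOQL-style
# query; it would be paired with a sink dataset on the Azure side.
copy_source = {
    "type": "SalesforceSource",
    "query": "SELECT Id, AccountId, Amount, CloseDate FROM Opportunity",
}

print(json.dumps(
    {"linkedService": salesforce_linked_service, "copySource": copy_source},
    indent=2,
))
```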
We are working on data warehouse integration, which means I am working on some big data projects. I'm preparing data for licensing. One of the projects involves preparing data in Azure Data Lake, running transformation scripts, performing ETL processing, and populating the staging layer of the data warehouse. In short, I help with ETL use cases.
We used Azure Data Factory, Data Flow (private preview), and Databricks to develop data integration processes from multiple and varied external software sources into an OLTP application's Azure SQL database. The tools are impressively well integrated, allowing quick development of ETL, big data, data warehousing, and machine learning solutions, with the flexibility to grow and adapt to changing or enhanced requirements. I can't recommend it highly enough.