Ariel Lindenfeld - PeerSpot reviewer
Director of Content at PeerSpot (formerly IT Central Station)

When evaluating Data Integration, what aspect do you think is the most important to look for?

Hi peers,

When evaluating Data Integration tools, what aspects do you think are the most important to look for? Let the community members know what you think. 

Thank you for sharing your knowledge!

24 Answers
Arthur Kancelarovicz - PeerSpot reviewer
BI Specialist at an educational organization with 501-1,000 employees
Real User
08 July 21


In my experience:

1. Project budget;

2. Availability of the connections you need (natively preferred, without third-party drivers), including web service requirements and cloud storage;

3. Quick and easy to understand and start developing with;

4. The number of professionals in the market who know how to maintain it (human resources are volatile);

5. Performance when extracting, transforming, and storing the data (internal and big data, if needed);

6. Ability to record the last successful execution of a scheduled job, and to resume from the point of failure;

7. Versioning support (Git, source control, or an embedded option) for simultaneous development and an easy way to deploy across multiple environments.

Many other questions need to be answered, but the very first is always ROI.
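Point 6 above, resuming a scheduled job from the last successful point, can be sketched in a few lines. This is a toy, tool-agnostic illustration, not any vendor's mechanism; the `job_checkpoint.json` file name and the `process` callback are assumptions:

```python
import json
import os

CHECKPOINT_FILE = "job_checkpoint.json"  # hypothetical state file

def load_checkpoint() -> int:
    """Index of the next unprocessed batch (0 if no checkpoint exists)."""
    if os.path.exists(CHECKPOINT_FILE):
        with open(CHECKPOINT_FILE) as f:
            return json.load(f)["next_batch"]
    return 0

def save_checkpoint(next_batch: int) -> None:
    with open(CHECKPOINT_FILE, "w") as f:
        json.dump({"next_batch": next_batch}, f)

def run_job(batches, process):
    """Process batches in order, resuming after the last recorded success."""
    for i in range(load_checkpoint(), len(batches)):
        process(batches[i])      # may raise; checkpoint not yet advanced
        save_checkpoint(i + 1)   # record success before moving on
    if os.path.exists(CHECKPOINT_FILE):
        os.remove(CHECKPOINT_FILE)  # clean finish: clear the checkpoint
```

If `process` fails on batch N, the checkpoint still points at N, so the next run skips the batches that already succeeded instead of starting over.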



Kirill Slivchikov - PeerSpot reviewer
Owner at 7Spring Consult
Real User
30 September 19

Connections - what data sources and targets it can connect to.

Flexibility - can you code transformation rules in Java, C#, or Python?

Data Quality features.

Usability of tracing and monitoring instruments.

Stability, and support for "try-except" transformations.
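The "try-except" transformations Kirill mentions amount to row-level error routing: failed rows go to a reject stream instead of aborting the whole load. A minimal sketch in Python (the `transform_rows` helper is hypothetical, not any tool's API):

```python
def transform_rows(rows, transform):
    """Apply `transform` per row; failed rows go to a reject stream."""
    good, rejected = [], []
    for row in rows:
        try:
            good.append(transform(row))
        except Exception as exc:
            # instead of failing the job, capture the row and the reason
            rejected.append({"row": row, "error": str(exc)})
    return good, rejected
```

For example, `transform_rows(rows, lambda r: {"amount": float(r["amount"])})` keeps the parseable rows and diverts the malformed ones for later inspection.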

Brian Dandeneau - PeerSpot reviewer
Business Process and Strategy Specialist Advisor at NTTData
Top 5, Leaderboard
28 April 16

When evaluating data integration, think about versioning and auditability. 

Other ETL/ELT tools preach it; ODI lives and breathes it. Also, look at reusability. 12c especially has lots of cool reusable parts that will make development easy and quick. 

Security should also be at the top of the list. 

Can you lock someone down to a single job, or even a portion of that job? With ODI, you can. 

Are you looking for a data warehouse tool, or just something to copy a file from one place to another? Even though ODI can do both, I would say that you would be killing a fly with an atom bomb if you just need to shuffle files around. 

Think about what you need to "hook" into now and in the future. 

With ODI, you can create custom connections, so even if you forgot about something, you can most likely connect to it. I have even hooked it to iTunes reports.

Djalma Gomes, Pmp, Mba - PeerSpot reviewer
Managing Partner at Data Pine
Top 5
07 July 21

There are two types of data integration: one where you use some sort of ETL to load the adjusted data into another database, and one where you use a data virtualization tool to adjust the data but keep it in its original place.

Costs are totally different, and you need to really think through your business needs so that you don't buy into the salespeople's pitch.

Then, you need to think about the coexistence of validated and non-validated data. You will probably need both, since adjusting data can take a long time, depending on system and process reviews.

You will also need a data catalog to keep track of your data and maintain some governance over it.

And finally, you will need to think of a sustainable solution. You will probably prioritize the data to be integrated and cleansed, and the types of data and connectors may change over time (don't make the mistake of thinking your current data and connector needs will remain unchanged in the years to come).

Antonio Carlos Murayama - PeerSpot reviewer
Senior Sales Account Executive - Software at First Decision
06 July 21

The ability to serve hybrid environments, considering the varied platforms, databases, operating systems, and applications that run in isolation but require some kind of communication and integration. It can be open (open source), make extensive use of APIs, and be easy to use while offering performance, compatibility, secure authentication, and managed maintenance (DevOps). 

Anisa Patel - PeerSpot reviewer
Senior Consultant at RXP Services Limited
Real User
30 September 20

Ease of connecting to multiple source systems. Built-in testing facilities within the ETL pipeline.

User-friendly GUI.

Should include templates for generic tasks such as SCD1, SCD2, and Delta Load.
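As a rough illustration of what an SCD2 template automates, here is a toy Slowly Changing Dimension Type 2 merge in plain Python. The function name, row layout, and column names are illustrative assumptions, not any tool's API:

```python
def apply_scd2(dim, updates, key, attrs, today):
    """SCD Type 2: expire changed current rows, insert new versions."""
    current = {r[key]: r for r in dim if r["is_current"]}
    for upd in updates:
        old = current.get(upd[key])
        if old and all(old[a] == upd[a] for a in attrs):
            continue                   # unchanged: keep the existing version
        if old:
            old["is_current"] = False  # close out the previous version
            old["end_date"] = today
        dim.append({key: upd[key], **{a: upd[a] for a in attrs},
                    "start_date": today, "end_date": None,
                    "is_current": True})
    return dim
```

A template like this spares each developer from re-implementing the expire-and-insert bookkeeping for every dimension table; an SCD1 template would simply overwrite the attributes in place instead.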

Phil Wilkins - PeerSpot reviewer
Enterprise Integration Architect at Capgemini
Top 5, Leaderboard
07 November 19

I would be looking for things like:
- types of connections supported
- data transformation capabilities
- throughput
- can it support micro batching
- can a process be triggered by a data source
- security
- how does it work in a Hybrid scenario (assuming the organization isn't cloud-born)
- licensing and support costs (even open source has support implications - even if it's being patched by your own devs)
- expertise in the product, and the product's roadmap/life. If it's difficult to get expertise in a product (or at least support until your own team is competent), a problem can incur a lot of delays. If a product is approaching end of life, then skills with the product will disappear and you'll eventually need to change your solution.
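On the micro-batching item in the list above: the core idea is cutting a continuous source into small fixed-size batches so latency stays low without paying per-record overhead. A minimal sketch (the helper name is hypothetical):

```python
from itertools import islice

def micro_batches(source, batch_size):
    """Yield fixed-size micro-batches from any iterable source."""
    it = iter(source)
    while True:
        batch = list(islice(it, batch_size))  # take up to batch_size items
        if not batch:
            return                            # source exhausted
        yield batch
```

A real engine would also flush a partial batch after a time limit, so a slow trickle of records doesn't wait forever for the batch to fill.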

Nida Fatima - PeerSpot reviewer
Marketing Communications Manager at Astera
05 November 18

- Ease of use - The solution should offer the same level of usability to both IT and business users.
- Support for both batch and transaction-based integration
- Workflow automation - I would not want to spend my time scheduling and monitoring recurring jobs. Therefore, there should be support for time-based and event-based scheduling.
- Connectivity - Any business today works with a plethora of legacy and modern data sources. So the solution should offer out-of-the-box connectivity to a range of source and target databases.
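The time-based and event-based scheduling described above can be sketched with a toy scheduler supporting both trigger styles. Class and method names are illustrative, not any product's API; a real scheduler would use wall-clock time rather than explicit ticks:

```python
class JobScheduler:
    """Toy scheduler: jobs run on a timer tick or when an event fires."""

    def __init__(self):
        self._timed = []    # list of (interval_ticks, job)
        self._events = {}   # event name -> list of jobs
        self._tick = 0

    def every(self, ticks, job):
        """Register a time-based job (runs every `ticks` clock ticks)."""
        self._timed.append((ticks, job))

    def on(self, event, job):
        """Register an event-based job (runs when `event` fires)."""
        self._events.setdefault(event, []).append(job)

    def tick(self):
        """Advance the clock one tick; run any timed jobs that are due."""
        self._tick += 1
        for interval, job in self._timed:
            if self._tick % interval == 0:
                job()

    def fire(self, event):
        """Signal an external event (e.g. file arrival) to trigger jobs."""
        for job in self._events.get(event, []):
            job()
```

The point of having both styles built in is exactly what Nida says: you register the job once and stop hand-monitoring recurring loads.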

Kieran Connolly - PeerSpot reviewer
Senior Application Engineer at Swiss Re, an insurance company with 10,001+ employees
Real User
12 June 18

Flexibility - can you code complex business domain rules using VB or C++?
Connections - what data sources it connects with and how it connects to them.
Stability - will it crash in development mode?
Reuse - can you create and re-use modules in multiple projects and deploy to server tasks?

Theodore Omtzigt - PeerSpot reviewer
Real User
30 January 20

For advanced data integration flows that ingest time series and similar measurement data coming off a physical process (anything IoT), you stand to benefit from a characterization and resampling flow. Most ETL tools are database-oriented rather than oriented toward model characterization and model prediction. When dealing with sensor networks of any kind, ETL systems are not the right tool for the job.
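The resampling flow Theodore describes can be illustrated with a simple bucketed-mean downsampler for (timestamp, value) sensor readings. This is a toy sketch of the idea, not a full characterization pipeline:

```python
def resample_mean(samples, bucket_s):
    """Downsample (timestamp, value) pairs to per-bucket mean values.

    samples:  iterable of (timestamp_seconds, value) pairs
    bucket_s: bucket width in seconds
    """
    buckets = {}
    for ts, value in samples:
        start = int(ts // bucket_s) * bucket_s  # align to bucket boundary
        buckets.setdefault(start, []).append(value)
    # one (bucket_start, mean) pair per bucket, in time order
    return [(b, sum(vals) / len(vals)) for b, vals in sorted(buckets.items())]
```

A model-oriented flow would go further (interpolating gaps, fitting a characterization model, flagging outliers), which is exactly the machinery that database-centric ETL tools tend to lack.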

18 November 19

Ease of modelling and deployment, out-of-the-box connector availability, workflow automation, ETL capability, audit and control, transaction and batch processing, continuous sync, low code, visual interface.

Solution Architect at Larsen & Toubro Infotech Ltd.
Real User
13 September 19

Ease of use for ETL
Advanced ETL features for flexibility
Easy to test/debug
Templates/Pre-built functionalities

it_user621570 - PeerSpot reviewer
Director of Human Resources at a tech company with 51-200 employees
06 March 17

Ease of use for ETL
Advanced ETL features for flexibility
Easy to test/debug
Templates/Pre-built functionalities

PeerSpot user
Business Intelligence and Decision Support Team Leader at a university with 1,001-5,000 employees
31 July 15

Data profiling, ease of use, connectivity to different kinds of sources (unstructured data, flat files, common RDBMSs, SOAP, and JSON), and advanced data transformation capabilities.

PeerSpot user
Project Associate at a tech vendor with 1,001-5,000 employees
27 April 15

Less code, more productivity.

PeerSpot user
Owner at Crystal Solutions
07 April 15

Data quality, data governance, data profiling, and advanced ETL functions embedded; multiple native connectivity options for structured and unstructured data.

Gurcan Orhan - PeerSpot reviewer
Data Quality Software Development Manager at Yapı Kredi Bank
Real User
26 October 16

1. Flexibility. A DI tool should be like water, fitting the shape of each glass every time, with the ability to learn.
2. Ease of development, installation, implementing topology architecture.
3. Reusability of coding.
4. Ease of maintenance, management and operation.
5. Learning curve.
6. Ability to talk with related products (Data Quality, Replication, etc.) fully integrated and out-of-the-box.

PeerSpot user
BI Architect /Project Manager, Manufacturing Domain at Tata Consultancy Services
Real User
11 April 16

Ease of data extraction, ability to support complex integration between disparate systems, ability to feed data to different downstream systems, ability to perform data quality checks, and availability of out-of-the-box ETL functions.

PeerSpot user
Senior Director, Transition Services at a tech services company with 501-1,000 employees
30 September 15

Ease of use (modeling), flexible options for transformations and custom code, data source agnostic, efficient processing engine, real-time monitoring and solid debug tools, good reuse options (refactoring segments of a process into new projects or flows, etc.), good but flexible governance, and good documentation (or strong Google search results).

PeerSpot user
Technology Analyst at a consultancy with 5,001-10,000 employees
31 August 15

Data quality, data governance, and the ability to perform advanced data transformations in a much easier manner.

PeerSpot user
Owner at a tech services company
29 July 15

Data Quality, Governance, Data profiling, Flexibility and ease of use

PeerSpot user
Consultant - Oracle ACE with 51-200 employees
23 July 15

Reusability, Flexibility, Data Governance, Data Quality, Connectivity

PeerSpot user
Project Manager / Team Leader / Siebel Expert / Technical Architect at a tech vendor
10 March 15

Easy to use, Easy to manage

PeerSpot user
Database And BI Developer at a tech consulting company
22 December 14

In overview: data quality, data volume, frequency of update (schedule), and cross-object communication.

There could be other factors as well, but these are the ones I would evaluate first.

Related Questions
Netanya Carmi - PeerSpot reviewer
Content Manager at PeerSpot (formerly IT Central Station)
Sep 14, 2022
Why do you recommend this particular data integration tool?
Beth Safire - PeerSpot reviewer
Tech Blogger
04 September 22
I highly recommend Informatica PowerCenter as a data integration tool if you are looking for a solution for a large enterprise. I would say that it is one of the most stable and reliable platforms for data governance. Informatica PowerCenter is ideal for integrating and performing transformations on very large amounts of data in a very short amount of time. This is a very mature and complete product that easily performs complex data transformations. It is very stable and reliable and can easily handle large data sources at any given time. We have all types of different files coming into our systems, and Informatica PowerCenter is able to accept all these different sources, such as spreadsheets and databases, and then perform all types of transformations on their data.

One of the features that we find to be most valuable is the metadata repository, because it can easily understand the lineage of source-to-target mappings. After you provide the source and the target, mappings are done automatically, which makes it easy to use for the development team. We can use Informatica PowerCenter to transfer transactional databases to and from the data warehouse. Using this method, we can easily find our data reports and filter the data, which we use to create AI models. We also use it to create data warehouse applications, which can then be used for analytics, helping us generate business value.

Informatica PowerCenter has a very nice user interface. Our staff finds that it is really intuitive and user-friendly; it is easy to work with and can be picked up in just a few days. We are very satisfied with the technical support that Informatica PowerCenter offers. They are available 24/7, the support staff is knowledgeable and responsive, and there is also a good online knowledge base. The basic deployment is straightforward, but for systems with data integration services and other services, the deployment will take more time.

Some of the benefits and advantages of Informatica PowerCenter include:
- Performance: a very reliable and robust tool for data management and a data governance strategy. It can manage and transform huge quantities of data.
- Stability: the system is reliable and does not crash or stall.
- Customer support: available to customers 24/7, and it is very easy to find helpful information online.
- Ease of use: the system is user-friendly, and users are able to learn to use it effectively in a short amount of time.

The main downside of the product is its price. It is a bit on the expensive side and probably too pricey for mid- and small-sized companies. Another feature we would like to see integrated with the tool is monitoring; currently, there is no viable monitoring solution.

Informatica PowerCenter provides us with all the functionality that we need and more. I definitely recommend it. Users who are familiar with ETL tools will find it easy and worthwhile to use Informatica PowerCenter. I rate it a nine out of ten.
Deepak Damodarr - PeerSpot reviewer
Data Office Lead at a comms service provider with 501-1,000 employees
14 September 22
As in most platform selection situations, you should create a capability framework that will satisfy your use case and then use the budget criteria to shortlist the vendors.
Avigail Sugarman - PeerSpot reviewer
Community Manager at PeerSpot (formerly IT Central Station)
May 11, 2016
He wrote: "As with all ETL tools in general, it does not include a good data reporting feature." Do you agree with this assessment?
PeerSpot user
SQL DBA at a computer software company with 51-200 employees
29 September 14
To make complexity and system loads easier for end users to handle, companies adopt ETL tools to make the process simpler and less time-consuming. These are used especially in business intelligence and data warehouse projects. On the back of ETL tools there has been considerable growth in the market; Microsoft, for instance, introduced SQL Server 2005 with products like SSIS (SQL Server Integration Services). Yes, for improvement it needs to add field names for new groups. Overall, ETL tools are good to be considered reporting tools.
Brian Dandeneau - PeerSpot reviewer
Business Process and Strategy Specialist Advisor at NTTData
11 May 16
With cloud implementations there are plenty of analytics that you can reference when moving data using ETL tools. Even outside of the cloud space, I think there are plenty of options, as long as you are on Oracle products, to do this type of analytics. But really, what it boils down to is: do you have the relationship with the DBA to make these analytics happen and give you access to the dashboard? Basically, the option is there; it's just not an "in your face" solution. Would it be great to have that in the standard ETL tool? Sure, but even if you had it, would you know how to use it properly? I know what you are thinking: "But I know how to use it, so therefore I should have it." OK, maybe you do, but then you are in the low percentage along with me and a few other people. The ROI on that isn't there, which is why they haven't done this.