Ariel Lindenfeld - PeerSpot reviewer
Director of Community at PeerSpot

When evaluating Data Integration, what aspect do you think is the most important to look for?

Hi peers,

When evaluating Data Integration tools, what aspects do you think are the most important to look for? Let the community members know what you think. 

Thank you for sharing your knowledge!

PeerSpot user
24 Answers
BI Specialist at an educational organization with 501-1,000 employees
Real User
Jul 8, 2021


My experience says:

1. Project budget;

2. Availability of the needed connections (natively preferred, without third-party drivers) - think about web service requirements and cloud storage;

3. Easy and quick to understand and start developing with;

4. The number of professionals in the market who know how to maintain it (human resources are volatile);

5. Performance when extracting, transforming, and storing the data (internal and big data if needed);

6. The capacity to store the last successful execution of a scheduled job - and to resume from the point of failure;

7. Versioning (Git, source control, or an embedded one) for simultaneous development and an easy way to deploy to multiple environments.

Many other questions need to be answered, of course, but the very first is always ROI.
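Point 6 above - storing the last successful execution and resuming from the point of failure - can be illustrated with a simple checkpoint file. This is a minimal sketch of the idea, not any particular tool's mechanism; the checkpoint path and the batch-processing callback are hypothetical.

```python
import json
import os

CHECKPOINT = "job_checkpoint.json"  # hypothetical path

def load_checkpoint():
    """Return the index of the last successfully processed batch, or -1."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return json.load(f)["last_ok"]
    return -1

def save_checkpoint(i):
    with open(CHECKPOINT, "w") as f:
        json.dump({"last_ok": i}, f)

def run_job(batches, process):
    """Process batches in order, skipping those already completed."""
    start = load_checkpoint() + 1
    done = []
    for i in range(start, len(batches)):
        process(batches[i])   # may raise; the checkpoint is not advanced
        save_checkpoint(i)    # persist progress only after success
        done.append(i)
    return done
```

On a rerun after a crash, `run_job` skips every batch recorded as successful and picks up at the first one that was not.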



Owner at 7Spring Consult
Real User
Sep 30, 2019

Connections - what data sources and targets it can connect to.

Flexibility - can you code transformation rules in Java, C#, or Python?

Data Quality features.

Usability of tracing and monitoring instruments.

Stability, and the ability to build "try-except" transformations.
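A "try-except" transformation, as mentioned above, usually means routing rows that fail a transform to a reject channel instead of aborting the whole load. A minimal sketch of that pattern (an assumed, generic shape - not any specific tool's feature):

```python
def safe_transform(rows, transform):
    """Apply transform row by row; rows that raise go to a reject pile
    together with the error message, instead of crashing the load."""
    good, rejected = [], []
    for row in rows:
        try:
            good.append(transform(row))
        except Exception as exc:
            rejected.append((row, str(exc)))
    return good, rejected
```

The reject pile can then be written to an error table for review, which is how most tools surface failed rows to operators.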

Business Process and Strategy Specialist Advisor at NTTData
Top 5 Leaderboard
Apr 28, 2016

When evaluating data integration, think about versioning and auditability. 

Other ETL/ELT tools preach it, but ODI lives and breathes it. Also, look at reusability. 12c especially has lots of cool reusable parts that make development easy and quick. 

Security should also be at the top of the list. 

Can you lock someone down to a single job, or even a portion of that job? With ODI you can. 

Are you looking for a data warehouse tool, or just something to copy a file from one place to another? Even though ODI can do both, I would say you would be killing a fly with an atom bomb if you just need to shuffle files around. 

Think about what you need to "hook" into now and in the future. 

With ODI you can create custom connections, so even if you forgot about something, you can most likely connect to it. I have even hooked it up to iTunes reports.

Managing Partner at Data Pine
Top 5
Jul 7, 2021

There are two types of data integration: one where you use some sort of ETL to load the adjusted data into another database, and one where you use a data virtualization tool to adjust the data but keep it in its original place.

The costs are totally different, and you need to think through your business needs carefully so you don't just buy the salespeople's pitch.

Then, you need to think about the coexistence of validated and non-validated data. You will probably need both, since adjusting data can take a long time depending on the system and process reviews.

You will also need a data catalog to keep track of your data and apply some governance to it.

And finally, you will need to think of a sustainable solution. You will probably prioritize the data to be integrated and cleansed, and the types of data and connectors may change over time (don't make the mistake of thinking that the data and connectors you currently need will remain unchanged in the years to come).

Senior Sales Account Executive - Software at First Decision
Jul 6, 2021

The capacity to serve hybrid environments, considering the varied platforms, databases, operating systems, and applications that run in isolation but require some type of communication and integration. It should be open (OpenSource), make extensive use of APIs, and be easy to use while offering performance, compatibility, secure authentication, and managed maintenance (DevOps). 

Enterprise Integration Architect at Capgemini
Top 5 Leaderboard
Nov 7, 2019

I would be looking for things like:
- types of connections supported
- data transformation capabilities
- throughput
- can it support micro batching
- can a process be triggered by a data source
- security
- how does it work in a Hybrid scenario (assuming the organization isn't cloud-born)
- licensing and support costs (even open source has support implications - even if it's being patched by your own devs)
- expertise in the product, and the product roadmap/lifespan -- if it's difficult to find expertise in a product, or at least support until your own team is competent, a problem can incur a lot of delays. If a product is approaching end of life, skills with it will disappear and you'll eventually need to change your solution

Marketing Communications Manager at Astera
Nov 5, 2018

- Ease of use - The solution should offer the same level of usability to both IT and business users.
- Support for both batch and transaction-based integration
- Workflow automation - I would not want to spend my time scheduling and monitoring recurring jobs. Therefore, there should be support for time-based and event-based scheduling.
- Connectivity - Any business today works with a plethora of legacy and modern data sources. So the solution should offer out-of-the-box connectivity to a range of source and target databases.
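The event-based scheduling mentioned above often takes the form of a file-arrival trigger: a job fires when a file lands in a drop folder. A minimal polling sketch of that idea (real tools use OS file events or message queues; the file name, polling loop, and `max_polls` cap are assumptions for illustration):

```python
import os
import time

def watch_and_run(path, job, poll_seconds=1.0, max_polls=None):
    """Run `job` once a file appears at `path` (event-based trigger sketch).
    Returns True if the job ran, False if max_polls elapsed first."""
    polls = 0
    while max_polls is None or polls < max_polls:
        if os.path.exists(path):
            job(path)        # the arrival event triggers the workflow
            return True
        time.sleep(poll_seconds)
        polls += 1
    return False
```

Time-based scheduling is the complementary case, normally delegated to cron or the tool's built-in scheduler rather than hand-rolled loops like this.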

Senior Application Engineer at Swiss Re, an insurance company with 10,001+ employees
Real User
Jun 12, 2018

Flexibility - can you code complex business domain rules using VB or C++?
Connections - what data sources it connects with and how it connects to them.
Stability - will it crash in development mode?
Reuse - can you create and re-use modules in multiple projects and deploy to server tasks?

Real User
Jan 30, 2020

For advanced data integration flows that ingest time series and similar measurement data coming off a physical process (anything IoT), you stand to benefit from a characterization and resampling flow. Most ETL tools are database oriented rather than oriented toward model characterization and model prediction. When dealing with sensor networks of any kind, ETL systems are not the right tool for the job.
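The resampling step described above amounts to bucketing timestamped readings into fixed windows and aggregating each bucket. A minimal sketch with epoch-second timestamps and a mean aggregate (the function name and the choice of mean are illustrative assumptions):

```python
def resample_mean(readings, window):
    """Downsample (timestamp, value) pairs into fixed windows of `window`
    seconds, averaging the values in each bucket. A stand-in for the
    characterization/resampling step most ETL tools lack."""
    buckets = {}
    for ts, value in readings:
        key = ts - (ts % window)          # start of the containing window
        buckets.setdefault(key, []).append(value)
    return {k: sum(v) / len(v) for k, v in sorted(buckets.items())}
```

Production time-series stacks do the same thing with richer aggregates (min/max, percentiles) and gap handling for windows with no readings.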

Nov 18, 2019

Ease of modelling and deployment, out-of-the-box connector availability, workflow automation, ETL capability, audit and control, transaction and batch processing, continuous sync, low code, and a visual interface.

Solution Architect at Larsen & Toubro Infotech Ltd.
Real User
Sep 13, 2019

Ease of use for ETL
Advanced ETL features for flexibility
Easy to test/debug
Templates/Pre-built functionalities


it_user252576 - PeerSpot reviewer
Business Intelligence and Decision Support Team Leader at a university with 1,001-5,000 employees
Jul 31, 2015

Data profiling, ease of use, connectivity to different kinds of sources (unstructured data, flat files, common RDBMSs, SOAP, and JSON), and advanced data transformation capabilities.

it_user229764 - PeerSpot reviewer
Project Associate at a tech vendor with 1,001-5,000 employees
Apr 27, 2015

Less code, more productivity.

it_user219864 - PeerSpot reviewer
Owner at Crystal Solutions
Apr 7, 2015

Data quality, data governance, data profiling, advanced embedded ETL functions, and multiple native connectivity options for structured and unstructured data.

Senior Consultant at RXP Services Limited
Real User
Sep 30, 2020

Ease of connecting to multiple source systems. A built-in testing facility within the ETL pipeline.

A user-friendly GUI.

It should include templates for generic tasks such as SCD1, SCD2, and delta loads.
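The SCD2 template mentioned above boils down to expiring the current dimension row when an attribute changes and inserting a new current row. A minimal in-memory sketch of that logic (the row layout and field names are assumptions for illustration, not any tool's schema):

```python
def scd2_apply(dimension, incoming, today):
    """Slowly Changing Dimension Type 2 sketch: when an attribute changes,
    close the current row and insert a new current one, preserving history.
    `dimension` rows are dicts: key, attrs, valid_from, valid_to, current."""
    current = {r["key"]: r for r in dimension if r["current"]}
    for key, attrs in incoming.items():
        row = current.get(key)
        if row is None:                    # brand-new key: first version
            dimension.append({"key": key, "attrs": attrs,
                              "valid_from": today, "valid_to": None,
                              "current": True})
        elif row["attrs"] != attrs:        # changed: expire and re-insert
            row["valid_to"] = today
            row["current"] = False
            dimension.append({"key": key, "attrs": attrs,
                              "valid_from": today, "valid_to": None,
                              "current": True})
        # unchanged rows are left untouched
    return dimension
```

SCD1 is the simpler variant: overwrite the attributes in place and keep no history.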

Data Quality Software Development Manager at Yapı Kredi Bank
Real User
Oct 26, 2016

1. Flexibility. A DI tool should be like water, fitting the shape of each glass every time. The ability to learn!
2. Ease of development, installation, implementing topology architecture.
3. Reusability of coding.
4. Ease of maintenance, management and operation.
5. Learning curve.
6. Ability to talk with related products (Data Quality, Replication, etc.) fully integrated and out-of-the-box.

it_user422436 - PeerSpot reviewer
BI Architect /Project Manager, Manufacturing Domain at Tata Consultancy Services
Real User
Apr 11, 2016

Ease of data extraction, the ability to support complex integration between disparate systems, the ability to feed data to different downstream systems, the ability to perform data quality checks, and the availability of out-of-the-box ETL functions.

it_user320619 - PeerSpot reviewer
Senior Director, Transition Services at a tech services company with 501-1,000 employees
Sep 30, 2015

Ease of use (modeling), flexible options for transformations and custom code, data-source agnostic, an efficient processing engine, real-time monitoring and solid debug tools, good reuse options (refactoring segments of a process into new projects or flows, etc.), good but flexible governance, and good documentation (or strong Google search results).

it_user302505 - PeerSpot reviewer
Technology Analyst at a consultancy with 5,001-10,000 employees
Aug 31, 2015

Data quality, data governance, and the possibility of advanced data transformations in a much easier manner.

it_user278967 - PeerSpot reviewer
Owner at a tech services company
Jul 29, 2015

Data Quality, Governance, Data profiling, Flexibility and ease of use

it_user278157 - PeerSpot reviewer
Consultant - Oracle ACE with 51-200 employees
Jul 23, 2015

Reusability, Flexibility, Data Governance, Data Quality, Connectivity

it_user205893 - PeerSpot reviewer
Project Manager / Team Leader / Siebel Expert / Technical Architect at a tech vendor
Mar 10, 2015

Easy to use, Easy to manage

it_user173649 - PeerSpot reviewer
Database And BI Developer at a tech consulting company
Dec 22, 2014

As an overview: data quality, data volume, frequency of update (schedule), and cross-object communication.

There could be other factors as well, but these are primarily what I would evaluate.
