Once you figure it out, it is a powerful and simple ETL tool. Its stability has been very satisfactory.
DW Admin at a hospitality company with 1,001-5,000 employees
It collects business information for centralized reporting and analytics.
Pros and Cons
- "Once you figure it out, it is a powerful and simple ETL tool. Its stability has been very satisfactory."
- "The UI is outdated and old-fashioned, at least in our current version. Also, we have experienced some stability issues with the Workflow Monitor application."
What is most valuable?
How has it helped my organization?
Like any other ETL tool, it collects business information for centralized reporting and analytics.
What needs improvement?
The UI is outdated and old-fashioned, at least in our current version. Also, we have experienced some stability issues with the Workflow Monitor application.
For how long have I used the solution?
Our company has used it since 2002; I started using it in 2009.
Buyer's Guide
Informatica PowerCenter
April 2025

Learn what your peers think about Informatica PowerCenter. Get advice and tips from experienced pros sharing their opinions. Updated: April 2025.
856,873 professionals have used our research since 2012.
What do I think about the stability of the solution?
The integration service itself is very stable. The applications suffer from minor stability problems.
What do I think about the scalability of the solution?
The product scales very well, beyond our needs.
How are customer service and support?
The technical support team is eager to help, but the problems we have faced were usually too complex to resolve outright. However, there has always been a workaround.
Which solution did I use previously and why did I switch?
Our company has always used this solution.
How was the initial setup?
I was personally not involved with the initial setup, but from what I have heard, it was fairly straightforward.
What's my experience with pricing, setup cost, and licensing?
We have found the pricing very cost-effective. The licensing is CPU and data source-based.
In the new version, it is supposed to allow a variety of data sources with the standard license.
Which other solutions did I evaluate?
I was not involved with choosing the product.
What other advice do I have?
Choose the right tool for the right job.
For the purpose that this product was designed, I believe it is still the best in the market.
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
Data Warehousing and Business Intelligence Lead at Bank of America
It can work with any kind of database, including NoSQL ones, but object definitions do not get refreshed automatically when a database object changes.
What is most valuable?
- Able to access heterogeneous databases: It can work with any kind of database available in the market. In fact, it can be applied to NoSQL databases as well.
- SQL override allows users to write simple to complex queries inside a mapping for performance-tuning the load process; pre- and post-SQL can help control the load process to get the desired outputs.
- Error handling techniques: Informatica provides a suite of options to handle errors. From logging errors in database tables, to e-mail notifications with log files, to decision/assignment tasks for data control, it can handle errors at every stage of a workflow. There are also restart options for batched workflows.
- Debugging is another feature I found extremely useful. Before running a full load, you can run a subset of data from source to target and check the values generated by the logic at every transformation.
- While in debugger mode, target tables do not load any physical data. The best part of debugging in Informatica is that, in debugger mode, you can alter the logic in an expression transformation and check the value it generates. This lets you change the logic later if needed.
- Transformations: the types of transformations (SQL logic) provided are unmatched by any other ETL tool out there. Even a beginner can understand, with minimal effort, how to use a given transformation and what output to expect from it. Most are self-explanatory and produce the desired output, especially the Expression transformation (the heart of any mapping).
- Can be used to build any type of logic flexibly, given the number of logical functions (date/time/string, etc.). The functions have a syntax that is very easy to understand.
- Informatica provides all the functions, properly indexed in the expression editor with their syntax, allowing users to build logic with ease.
- Version control, as check-in and check-out are easy and allow space for adding comments. These comments can come in handy during deployment when writing queries.
How has it helped my organization?
Our organization had to import data from AS400, implement a sales cost analysis, and load data into a Data Mart, which was further consumed by QlikView. This was done with data from five different countries in different time zones. From developing the mappings, to replicating them for other countries against their respective databases, to loading the data, everything was automated and scheduled, with notifications about the loads. It changed the way we did business. Customer orders, modifications, and tracking were all smooth. Every morning, it all started with a single e-mail of consolidated load statistics.
What needs improvement?
There are three areas where they can make a significant improvement:
- Live feeds, where if a database object changes, the object definitions should automatically refresh as well. This would avoid re-importing objects. An auto-refresh would affect all the shortcut objects, but ultimately, if the object in the database has changed, the mapping will fail or produce incorrect data because a column's position has shifted or its name no longer exists.
- The GUI. Instead of having to open separate windows for the Designer, Workflow Manager, and Workflow Monitor, it would be nice if these three could be merged into three tabs, or built into a hierarchy of sub-tasks where, for example, a workflow opens a session and the session opens a mapping, rather than only the mapping properties as it does now. SAP BODS has that structure, and I would like to see something along those lines, where I don't have to refresh the mapping and session every time something changes.
- Version rollback. While version control is a good feature, it sometimes becomes a huge burden on the database; the Repository should have a way to keep the most current object and purge all other versions.
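The schema-drift problem described in the first point can at least be detected outside the tool. Below is a minimal sketch, assuming an SQLite source and a saved snapshot of the expected column definitions; all table and column names here are hypothetical, not from PowerCenter itself:

```python
import sqlite3

def table_columns(conn, table):
    """Return the (name, type) pairs currently defined for a table."""
    return [(row[1], row[2]) for row in conn.execute(f"PRAGMA table_info({table})")]

# Hypothetical expected definition, e.g. saved when the mapping was built.
expected = [("order_id", "INTEGER"), ("amount", "REAL")]

conn = sqlite3.connect(":memory:")
# The live table has gained an extra column since the snapshot was taken.
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, currency TEXT)")

drift = table_columns(conn, "orders") != expected
print("schema drift detected" if drift else "definitions match")  # → schema drift detected
```

Running a check like this before the nightly load would flag the mismatch before a mapping fails or loads the wrong column.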
For how long have I used the solution?
Starting with version 7.1, I have been using Informatica for 7+ years.
What was my experience with deployment of the solution?
Deployment can get complicated depending on the queries. Adding proper labels and comments during version control can make deployment very smooth. I did not come across any technical issues during deployment. The rollback feature adds a lot of value: with a single click, you can roll back all the objects if you notice any discrepancy between environments.
What do I think about the stability of the solution?
Informatica is a very stable tool. Only occasionally, where the Informatica server is remote, can connectivity be slow. I have had a few instances where the expression editor got grayed out when using RDP or docking my laptop, but a registry edit resolved the issue. Otherwise, this is a very stable, powerful, and robust tool.
What do I think about the scalability of the solution?
Informatica can handle extremely large volumes of data very well. With features like CDC, Incremental Load, Mapping Parameters and variables, dynamic look-up, pre and post SQL, Informatica provides flexibility in handling huge volumes of data with ease. Certainly a lot depends on optimized mappings and work-flows (batching and performance tuning).
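The incremental-load pattern mentioned above boils down to a high-watermark: pull only rows changed since the last successful run. A minimal sketch, assuming an SQLite source and a stored last-loaded timestamp (table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src (id INTEGER, updated_at TEXT);
    INSERT INTO src VALUES (1, '2024-01-01'), (2, '2024-02-01'), (3, '2024-03-01');
""")

def incremental_extract(conn, watermark):
    """Pull only rows changed since the last successful load."""
    rows = conn.execute(
        "SELECT id, updated_at FROM src WHERE updated_at > ? ORDER BY updated_at",
        (watermark,),
    ).fetchall()
    # The new watermark is the highest timestamp we just processed.
    new_watermark = rows[-1][1] if rows else watermark
    return rows, new_watermark

rows, wm = incremental_extract(conn, "2024-01-15")
print(rows, wm)  # only ids 2 and 3 qualify
```

In PowerCenter the same idea is typically carried by a mapping variable that persists between runs; this sketch only illustrates the mechanism.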
How are customer service and technical support?
Customer Service:
We've only had to contact customer service twice in 6+ years, and they were very good with their responses and very professional.
Technical Support:
I would say 9/10. For what I needed, technical support was able to resolve it in a timely fashion. I also appreciate their follow-ups.
Which solution did I use previously and why did I switch?
We were using a conventional RPG programming tool for analysis, but every time we added more tables for data analysis, profiling, quality, or manipulation, it turned into pages and pages of code. A user-friendly GUI like Informatica's provided the right kind of solution, and it was easy to migrate from programming to Informatica.
How was the initial setup?
The initial setup was a conventional data warehouse pattern. Later on, when we started implementing CRM, SCM, and ERP, it started getting a bit complex. However, breaking the projects down into multiple data models and organizing them kept the complexity manageable.
What about the implementation team?
We had a boot-camp training and an Informatica expert onsite for a few months. Later, we picked up a fair amount of technology and started implementing it in-house.
What was our ROI?
Informatica is 100% value for money. The kind of flexibility and stability it offers in dealing with heterogeneous data is amazing.
What's my experience with pricing, setup cost, and licensing?
It is not economical software, but if you are planning for a long-term, robust, end-to-end, enterprise-level tool that can handle any kind of data and any type of BI or data warehousing task, it does not require a lot of thinking. You can bank on Informatica for your solutions.
Which other solutions did I evaluate?
Data Stage
Ab Initio
MicroStrategy
What other advice do I have?
Informatica is a great product. If you spend a good amount of time researching what you want, have a proper SDLC in place, and work with the technicalities of Informatica, I am sure most projects can roll out into production in a timely fashion and produce results. In my experience, without a proper road map in place and without auditing change requests, business analysts will struggle with their requirements, which causes more bottlenecks and rework than the actual development. Having said that, no project is a walk in the park, but Informatica can be the icing on the cake if the foundation is good.
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
Business Intelligence Analyst at a comms service provider with 10,001+ employees
There were many performance problems during the implementation, but you will be happy with the final result.
What is most valuable?
With this product, we managed to implement all of our ETL, including those with many sources, and complex transformations.
How has it helped my organization?
With this product we can centralize all our ETL needs in one technology. One unique team with the right expertise is responsible for developing, implementing, and tuning it. We eliminated the variety of different solutions and different technologies. This means that the knowledge and skills are not spread-out.
What needs improvement?
One area for improvement is with huge ETLs, where the product has to extract large amounts of information. That's because some sources wait for others inside the map to finish, and only afterwards do they continue extracting data from other sources in the database.
For how long have I used the solution?
I've used it for six years.
What was my experience with deployment of the solution?
No issues encountered.
What do I think about the stability of the solution?
No, but after some errors the cache directory, where the temp files are kept, fills up, and you have to delete them manually or have a process that deletes unused files periodically.
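A periodic cleanup like the one described takes only a few lines of scripting. Here is a minimal sketch, assuming cache files untouched for seven days are safe to delete; the directory path and age threshold are assumptions to adapt to your environment:

```python
import os
import time

def purge_old_files(cache_dir, max_age_days=7):
    """Delete files in the cache directory not modified within max_age_days."""
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for name in os.listdir(cache_dir):
        path = os.path.join(cache_dir, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(name)
    return removed
```

Scheduled via cron (or a scheduler workflow), this keeps orphaned `.idx`/`.dat` leftovers from filling the cache volume; be careful not to delete cache files a running session still needs.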
What do I think about the scalability of the solution?
For better performance, the temp files should be located on a fast SSD disk.
How are customer service and technical support?
Customer Service:
3/10.
Technical Support:
3/10.
Which solution did I use previously and why did I switch?
We previously used Oracle Warehouse Builder. We switched because Warehouse Builder generated PL/SQL code and the transformation resided inside the database. With PWC, the transformation is on a different server.
How was the initial setup?
The initial setup was complex. There were many performance problems and some migrations took a long time to be implemented. After many meetings with different experts both problems were solved.
What about the implementation team?
We used a vendor team, who were 4/10.
Which other solutions did I evaluate?
We also looked at ODI (Oracle data integration).
What other advice do I have?
Be persistent in trying to implement all kinds of transformations with Power Center, you will be happy with the final result.
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
There are still gaps in these areas, but you can work around them:
- Conditional execution is not directly supported. We manage it with a second workflow that writes a file when the condition we want is met. The main workflow waits for the file to appear before running, then deletes it, and so on. The same file, or another, can also carry the parameter for the execution.
- You can have a log file for each mapping execution, and you can set the log level. There is also a tool to get reports from the repository (which has performance problems), but for some needs we had to write SQL directly against the repository, which is hard because there is no easy ER model for it.
- We don't use push-down. We designed and developed all our maps without that feature. We ran some tests a few months ago and found that it requires a redesign to get benefits (which was outside our planning), but we have heard it achieves really good performance.
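The trigger-file workaround in the first point can be sketched as follows: a waiting process polls for the file, reads the parameter it carries, and consumes the file before running. The path, timeout, and parameter format are hypothetical:

```python
import os
import time

def wait_for_trigger(path, timeout=10.0, poll=0.5):
    """Block until the trigger file appears, then consume it and return its contents."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        if os.path.exists(path):
            with open(path) as f:
                parameter = f.read().strip()
            os.remove(path)  # consume the trigger so the next run waits again
            return parameter
        time.sleep(poll)
    raise TimeoutError(f"no trigger file at {path}")
```

In PowerCenter itself the equivalent is an Event-Wait task watching for the file; this sketch just makes the handshake explicit.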
Data Quality and Conversion with 1,001-5,000 employees
Like ETL tools in general, it does not include a good data reporting feature.
What is most valuable?
- The possibility to create a data profiling mapping automatically from any imported source.
- The client installation includes a macro that can be imported into Excel. This contains standard documents for technical design and documentation in general. A technical design developed in Excel can be imported directly into PowerCenter using the PowerCenter Repository Manager, and the corresponding mapping is created automatically. This can save a lot of development time for simple or moderately complex mappings. The feature is called Mapping Analyst for Excel.
- The default error handling, automatically created, is quite good for those architectures that don't have a proper one or a budget for a custom one.
What needs improvement?
Like all ETL tools in general, it does not include a good data reporting feature.
Another issue is the component called "Union", which has some bugs:
- When adding new groups, the field names are lost
- When adding new fields, the field type must be reset for all fields in the "Union", or the mapping validation will fail
There is no way to handle non-matching records in a "Joiner", as there is in DataStage.
For how long have I used the solution?
4-5 years, different versions.
What was my experience with deployment of the solution?
No
What do I think about the stability of the solution?
No
What do I think about the scalability of the solution?
No
Which solution did I use previously and why did I switch?
I used other tools like IBM DataStage. The final decision is made by my clients, but what I see is that clients using DB2 prefer DataStage over Informatica, since both DB2 and DataStage are IBM products.
How was the initial setup?
Very straightforward.
What about the implementation team?
In-house
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
Strictly based on my experience with Informatica and DataStage, I find Informatica PowerCenter sufficiently advanced in ETL capability terms. As Informatica proclaims itself a leader in data integration, the tool is inherently not designed for reporting requirements. However, it still provides one of the best metadata schemas for monitoring and reporting on the performance of ETL processes.
Informatica was among the first to provide database-driven metadata access, making it easy to query and report on ETL processes.
With an SOA-driven architecture and big data integration in new versions, Informatica continues to maintain its lead among ETL platforms.
However, pricing and very-large-volume (billions of records) processing may be major considerations when planning and deciding on Informatica.
Manager of Data Analytics at a tech services company with 10,001+ employees
Costly yet effective and efficient tool for moderate-sized DW projects, with performance issues at very large data volumes
Informatica has grown steadily in the data integration domain, ranking in the leaders quadrant of Gartner's analysis. It has largely kept pace with industry demands, serving a large set of requirements through various products. Not surprisingly, Informatica as a data integration platform is ahead of many competitors in the domain, with the collective force of a product suite strengthening its position as a leading data integration platform.
Here's my assessment of Informatica PowerCenter suite:
- Ease of Development is HIGH. Intuitive GUI, along with drag & connect features, bode well for developers when pressed against stringent timelines.
- Component Architecture, based on SOA, smoothens scalability and flexibility.
- Ease of Integration with nearly all market-leading products and application standards, for both Source and Target: XML, RDBMS, message queues, flat files, SAP, TIBCO, mainframe, etc.
- Seamless integration with many widely used components: shell scripts in UNIX, Stored Procedures, FTP, VSAM, Salesforce.com, Control-M, Maestro, Autosys etc.
- Enables Real Time integration and Change Data Capture.
- Well organized Community Support, Partner support and documentation.
- People availability with relevant skill level for development is comparatively easier.
- License cost is reasonably HIGH.
- Performs well for DWHs of up to moderate data volumes. When data volumes grow beyond millions of records per day/week, performance degrades severely.
- The product release cycle is stiff, with frequent upgrades required to maintain a relevant support level. Upgrade costs for a data integration tool are rather tightly budgeted, and hence a challenging ordeal for infrastructure, development, and stakeholders.
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
Consultant at an energy/utilities company with 10,001+ employees
PowerCenter Express is not PowerCenter but it's good enough for small development
Informatica unveiled their newest product in the PowerCenter line, the PowerCenter Express, at Informatica World this year (find Smartbridge’s experience of the convention here).
The sales pitch is certainly catchy: free PowerCenter! When I first heard of it, I wasn't sure what to think – is this a marketing gimmick? What's the catch? But hey, at that price, it is easy enough to find out for one's self, and that is precisely what I did. And color me pleasantly surprised.
The Limitations of PowerCenter Express
To their credit, Informatica is upfront about the limitations of the product. This is a good thing – there's no easier way to shoot yourself in the foot than to sneak small print past your clients under the guise of a no-strings-attached free download.
As you could expect, PowerCenter Express is not PowerCenter – the free Express version can only process a quarter of a million rows per day – good enough for small development, but it is best considered a demo version. The paid version includes multi-user support, and removes the processing limitation, but is still limited to five users and no job parallelization.
If your company is already using PowerCenter, you are probably long past the point where you could realistically choose to downsize to Express. But if your company was too small for the behemoth that is PowerCenter, then Express may be exactly what you need.
I suspect that Informatica sees Express as a way to reach to clients that, until now, were too small to warrant their larger products – maybe a way to get them to dip their toe in the waters.
Do not think, however, that this is “PowerCenter lite”. Express is a product on its own right (the paid version, more so than the free). A small-to-medium company that finds itself in need of an ETL can do much worse than invest in Express. Even when I was building PowerCenter ETLs for a large bank, we seldom ran more than two or three medium jobs in parallel – the strain it puts on the source and target is just not worth the time savings – and the larger jobs usually ran on their own.
The lack of parallelization will hurt only if you have a large number of small jobs; and even then, serializing them shouldn't be more than a small inconvenience – although, not having faced the actual issue, do take this prediction with a grain of salt.
Express Installation
Grabbing a copy and installing it was simplicity itself. I have always felt that PowerCenter’s greatest strength is its ease of use, beyond even its connectivity. I’m happy to see Informatica expand the ease of use to the installation.
A stand-alone install program is all it takes to be up and running. I was building my first test mapping less than an hour after deciding to download Express, and ran it successfully in less than two hours (it wasn’t a very interesting mapping, admittedly, but it was a reasonably complex join of flat file data against a local database, aggregated and sent to a remote location – the kind of “simple” ETL that has been known to cause me headaches when attempted in unvarnished SQL).
One word of caution: Express is not a toy. Even the free version has a fully functional PowerCenter server. When turned on, my laptop went into permanent spin, and my memory and CPU use climbed several notches. I found myself turning it off just to give my poor laptop a break. It worked for testing, but if you are going to use it to develop an actual ETL, consider installing the server portion on an actual server.
PowerCenter vs. PowerCenter Express
PowerCenter Express is by no means ‘lite’.
As a long-time user of PowerCenter, this part is actually tricky to write. How many of the changes are "bad" and how many of them are just me being an old curmudgeon? It's difficult to say. The good news: you needn't worry. They did not strip PowerCenter down. Every transformation you can find in "classic" PowerCenter is in Express as well.
Express even includes a bunch of direct connections to social media to speed up your mapping development: Twitter, LinkedIn, Facebook, you name it. And I loved that they finally dropped the “source” and “target” – an unnecessary distinction, when most external entities end up being both. Express automatically assumes that, and the whole is more compact for it.
I am less happy about the lack of Sessions. They are not gone completely – the workflow is still a sequence of objects that are associated to mappings – but without my usual central point for redefining sources and targets, I was left scrambling to find where to do so. I suspect this is more my muscle memory that led me to looking in all the wrong places, though. As always, F1 brought up the help, and once I had read the manual, it became easy again.
There are a few other nits I could pick – I am not entirely convinced I like the new graphics, the ribbon or the “all in one” approach – and I cannot even guess at what other differences I would eventually find, if given enough time, but these are minor.
Express is PowerCenter, and the old approaches to mapping design will still work. It is still visual, intuitive, and easy to use.
So Does Express Pass the Test?
If Express’ name wasn’t attached to Informatica PowerCenter, I’d considered it a basic ETL, with potential for growth and useful mostly for small deployments.
The equation changes, though, when you consider that if you do outgrow the capabilities of Express, you can easily upgrade to PowerCenter. It is an interesting approach, and I could almost say Informatica has managed to square the circle.
This first visit to the tool has proven successful enough that, were I to be required to use Express as the ETL tool, nary a complaint would escape my lips – and those of you that have met me know how rare an occasion that is.
Disclaimer: The company I work for is partners with several vendors including Informatica
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
BI Expert at a healthcare company with 1,001-5,000 employees
Informatica PowerCenter vs. Microsoft SSIS - each technology has its advantages, but they also have similarities
Technology has made it easier for businesses to organize and manipulate data to get a clearer picture of what’s going on with their business. Notably, ETL tools have made managing huge amounts of data significantly easier and faster, boosting many organizations’ business intelligence operations.
There are many third-party vendors offering ETL solutions, but two of the most popular are Informatica PowerCenter and Microsoft SSIS (SQL Server Integration Services). Each technology has its advantages, but there are also similarities in how they carry out the extract-transform-load processes; they often differ only in terminology.
If you’re in the process of choosing ETL tools and Informatica PowerCenter and Microsoft SSIS made it to your shortlist, here is a short comparative discussion detailing the differences between the two, as well as their benefits.
Package Configuration
Most enterprise data integration projects require the capacity to develop a solution on one platform and test and deploy it in a separate environment without having to manually change the established workflow. To achieve this seamless movement between environments, your ETL technology should allow the dynamic update of the project’s properties using the contents of a parameter file or configuration.
Both Informatica and SSIS support this functionality, using different methodologies. In Informatica, every session can have one or more source and destination connections, of different kinds, the primary kind being relational connections. Every session can be set up dynamically by changing parameters contained in a parameter file.
The same thing can be achieved in SSIS via Configurations. Using the SSIS Configuration Wizard, configuration data is saved in XML files. Unlike Informatica wherein there can be multiple connections, SSIS only allows a single defined connection that can be applied across all tasks in a package.
In short, Informatica parameters are defined at the session level while SSIS configurations are set at the package level.
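For illustration, an Informatica parameter file scoped to a single session might look like the fragment below; the folder, workflow, session, and connection names are hypothetical, and the exact syntax should be checked against your PowerCenter version's documentation:

```
[SalesFolder.WF:wf_daily_load.ST:s_m_load_orders]
$DBConnection_Source=DEV_ORACLE
$DBConnection_Target=DEV_DW
$$LoadDate=2024-01-31
```

Pointing the same workflow at a different parameter file is what lets one build move unchanged from development to test to production.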
Data Staging
When you use SSIS, you use the Connection Manager to generate a connection defining the physical location of the file. Loading multiple files requires multiple connections. All information set in the Connection Manager can be incorporated in the configuration file and can be dynamically updated at run-time.
On the other hand, if you’re using Informatica, you will use the Workflow Manager tool to assign a location to each file. Every task that needs to access that file can be configured with the location and name of that specific file.
Value Extraction
One of the main functions of ETL tools is extracting meaning from the information currently being run, or supplementing that information with extra information obtained from the data already in the processing pipeline.
Both SSIS and Informatica have this functionality through the use of derived columns, or the capacity to derive new information from existing data. Informatica does this via its Expression transformation component, while Microsoft SSIS does it via the Derived Column transformation.
The logic used to complete both operations is the same and the syntax involved is also identical. The difference between the two technologies lies in the expression language utilized to obtain the new data and the notation style involved. Informatica uses Character, Conversion, Data, Numerical, Scientific, Special and Test. On the other hand, SSIS uses Mathematical, String, Date/Time, NULL, Type Casts and Operators.
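To make the comparison concrete, here is the same derived-column logic written in each tool's style; the column names are hypothetical, and the exact expressions should be checked against each product's reference:

```
Informatica Expression transformation:
  IIF(ISNULL(UNIT_PRICE), 0, UNIT_PRICE * QUANTITY)

SSIS Derived Column expression:
  ISNULL(UNIT_PRICE) ? 0 : UNIT_PRICE * QUANTITY
```

The logic is identical; only the conditional notation (function-style `IIF` versus C-style `? :`) differs.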
Sorting
Simply defined, sorting is having the ability to sort information into a chronological data set. While the order of the information may appear to be immaterial for loading into a relational data warehouse or database, it may matter for the other tasks later on in the transformation process.
The difference in how SSIS and Informatica carry out this functionality could hardly be subtler. Informatica’s Sorter and SSIS’ Sort can both order data and eliminate duplicates. In SSIS, de-duplication is done by setting the eliminate-duplicates option to TRUE; in Informatica, by selecting the distinct option.
Detection of Data Quality Issues
Similar to all data integration solutions, ETL technologies can be susceptible to data quality problems. Fortunately for users of Microsoft SSIS, it allows for the creation of checkpoints within the data transformation process that can reveal and repair data quality problems. SSIS has a feature called Fuzzy Lookup transform that pairs incoming “dirty” information – unexpected abbreviations, null fields, inserted or missing tokens, truncations, misspellings and other data abnormalities – with clean records contained in a reference table. There is also the Fuzzy Grouping tool that finds similarities among input rows and unites duplicate data.
Unfortunately, Informatica does not have the same functionality out of the box. Recreating this capability in Informatica would require human intervention to build a mapping table containing every reference value encountered in the input stream and its equivalent mapped value.
However, take note that even if you’re using SSIS, you may still need to manually intervene to detect and repair data quality issues. Even the most advanced algorithm may miss something so you still need to manually check for the accuracy and integrity of your data.
Modularity
Modularity is concerned with the manner in which the work units that make up an end-to-end ETL solution are created and reused.
There’s a slight difference between PowerCenter Informatica and Microsoft SSIS on how they build modular data integration and ETL solutions.
Informatica involves a bottom-up framework to ETL implementation by permitting a library of components – mappings, mapplets, transformations, targets, and sources – that can be employed across numerous worklets in the solution. A worklet is composed of a sequence of mapping instances.
On the contrary, SSIS uses a top-down approach wherein a general sequence of tasks is defined before setting the specifics of how these tasks are going to be carried out. Reusability of ETL components is achieved by creating libraries of packages, which can then be implemented together with a master package. A package is the counterpart of Informatica’s worklet.
Tracking Changes in Slowly Changing Dimensions
Slowly changing dimensions address the issue of capturing and documenting a history of modifications or changes to entities within a database that are not reflected in a System of Record for that particular data. A common example of slowly changing dimensions is an item moving to another product category in a department store. This will modify the said product’s category attribute but its SKU will remain unchanged.
Informatica and SSIS both have the functionality to track these changes, with very similar features. Informatica is equipped with a "slowly changing dimension wizard" that creates the sources, transformations, ports, and destinations needed to accommodate these requirements. SSIS also comes with a slowly changing dimension wizard that works similarly. Besides tracking slowly changing attributes, it can also recognize changes to attributes that are not supposed to change, known as fixed attributes, and you can configure the wizard to raise an error when this happens.
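The Type 2 history tracking that both wizards generate can be sketched in a few lines of Python. This is a simplified illustration under an assumed in-memory schema (natural key mapping to a list of versioned records), not the internal format of either product.

```python
from datetime import date

def apply_scd2(dimension, incoming, today):
    """Apply a Type 2 slowly-changing-dimension update.

    `dimension` maps a natural key (e.g. SKU) to a list of versions; each
    version is a dict with 'category', 'valid_from', and 'valid_to'
    (None means the version is current). Hypothetical schema for illustration.
    """
    for sku, category in incoming.items():
        versions = dimension.setdefault(sku, [])
        current = versions[-1] if versions else None
        if current and current["category"] == category:
            continue  # no change: leave the current version open
        if current:
            current["valid_to"] = today  # expire the old version
        # open a new current version recording the changed attribute
        versions.append({"category": category, "valid_from": today, "valid_to": None})
    return dimension

# A product moves from "Toys" to "Games": the old row is expired,
# a new current row is appended, and history is preserved.
dim = {}
apply_scd2(dim, {"SKU1": "Toys"}, date(2024, 1, 1))
apply_scd2(dim, {"SKU1": "Games"}, date(2024, 6, 1))
```

A fixed-attribute check, as in the SSIS wizard, would simply raise an error instead of appending a new version when a supposedly immutable column differs.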
Dimension Loading
Dimension loading requires a surrogate key: a substitute for the natural key, on which every join between fact tables and dimension tables is based. Informatica and SSIS generate surrogate keys in different ways.
Out of the box, Informatica PowerCenter includes a component called the Sequence transformation that can create a surrogate key. It produces an incremental value for every row in the pipeline, which can then be incorporated into a destination table via a surrogate key column.
SSIS does not have a Sequence transformation component. Instead, it uses its Script transform component to generate surrogate keys.
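What a sequence-style transformation does can be shown with a short Python sketch. The function name and row format here are hypothetical; this only illustrates the idea of attaching an incrementing surrogate key to each row flowing through a pipeline.

```python
import itertools

def add_surrogate_keys(rows, start=1):
    """Attach an incrementing surrogate key ('sk') to each row in the
    pipeline, in the spirit of a Sequence transformation (illustrative)."""
    counter = itertools.count(start)
    for row in rows:
        yield {"sk": next(counter), **row}

# Two incoming dimension rows receive surrogate keys 100 and 101.
out = list(add_surrogate_keys([{"name": "a"}, {"name": "b"}], start=100))
```

In practice the starting value would be seeded from the current maximum key in the destination table, so keys remain unique across loads.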
Fact Table Loading
Populating fact tables usually involves two processes: (1) aggregating the data to the required granularity and (2) retrieving the dimensional surrogate keys.
Informatica PowerCenter carries out these operations via a transformation called "Aggregator," which computes aggregates across groupings of values from selected input columns. SSIS has the same capability through a component called "Aggregate." The slight difference between the two is that SSIS offers only the most commonly used functions: Minimum, Maximum, Average, Count Distinct, Count, and Sum. Informatica has all of these plus extras such as Variance, Standard Deviation, Percentile, Median, First, and Last.
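The grouping-and-aggregating step can be sketched as follows. This is a minimal Python illustration of what such an Aggregator-style component computes (including Median, which the article notes SSIS's Aggregate lacks); the function and column names are hypothetical.

```python
import statistics
from collections import defaultdict

def aggregate(rows, group_key, value_key):
    """Group rows by `group_key` and compute common aggregate functions
    over `value_key`, in the spirit of an Aggregator transformation."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[group_key]].append(row[value_key])
    return {
        key: {
            "sum": sum(vals),
            "count": len(vals),
            "min": min(vals),
            "max": max(vals),
            "avg": sum(vals) / len(vals),
            "median": statistics.median(vals),  # an "extra" beyond SSIS's built-ins
        }
        for key, vals in groups.items()
    }

# Roll sales rows up to one summary row per category.
rows = [
    {"cat": "x", "amt": 10},
    {"cat": "x", "amt": 30},
    {"cat": "y", "amt": 5},
]
result = aggregate(rows, "cat", "amt")
```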
Which ETL Technology is Right for Your Business?
There are other aspects that illustrate the difference between Informatica PowerCenter and Microsoft SSIS such as Design Time Debugging, Collaborative Development, Security, Integration with Packaged Applications, and Web Services & Service Oriented Architecture. However, the things discussed above cover the basic concepts in ETL technology.
As businesses face growing challenges integrating data from an ever-increasing number of disparate systems, choosing an ETL solution that fits your needs is more crucial than ever.
As for the choice between Informatica PowerCenter and Microsoft SSIS, many analysts consider Informatica the leader in ETL technology, while research firm Forrester once called SSIS' price-to-performance ratio "downright seductive."
However, proclaiming a winner in this battle between two ETL technology giants depends greatly on your business requirements. There are pricing differences between the two technologies, as well as notable differences in their features, capabilities, and usability. It's up to you to analyze which technology is the perfect fit.
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
Using configuration files with SSIS is a really bad idea and is in fact deprecated by Microsoft as of 2012. Not only is using them no longer advisable, but the new method of job and package parameters is a significant advantage over many competing ETL tools. Managing parameters through files of any sort is a maintenance-heavy and problematic method, and I would not advise using any ETL tool that still requires it.
Head of Databases at a retailer with 501-1,000 employees
High performance and scalable data integration
Valuable Features:
• It supports large data volumes, so it meets our company's demands for security and performance.
• It helps us make better and more timely business decisions by providing the right information at the right time.
• Delivers data throughout our enterprise by accessing and integrating data in all formats.
• Provides security, scalability, and performance to our system and establishes a foundation for our enterprise-wide data integration initiatives.
• Enables our teams of developers, analysts, and administrators to work faster and better together, and reduces our IT costs by encouraging collaboration and minimizing development complexity.
Room for Improvement:
• It is very expensive. High end.
• We provided SQL training to our staff before using this product, because a good understanding of SQL is mandatory for using PowerCenter.
• GUI: A lack of clarity and consistency created problems for our staff, so it requires some improvement.
• We used the services of highly skilled specialists to install PowerCenter; novice staff can't install this product properly.
Other Advice:
Our organization uses it because it is one of the leading data integration tools on the market, supports large data volumes, and meets our demands for security and performance. PowerCenter serves as the data integration foundation for our enterprise integration initiatives, including data warehousing, data governance, data migration, SOA, and MDM. In my opinion, PowerCenter has the best data transfer technologies, including its standard, advanced, big data, real-time, data virtualization, and cloud editions.
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
GaryM, I understand that automatic refresh can cause data mapping errors, but what about identifying the changes, notifying the users, and letting the user decide how to apply them (update, ignore, create a copy of the mapping and edit ...)? When an object in the DB changes, the chances that a mapping will fail are high, depending on the type of change, so why not know about it and prepare for it beforehand? Either you run SQL periodically to identify DB changes and apply them in Informatica manually, or Informatica identifies them and lets users decide (when the repository reconnects).