We performed a comparison between Informatica PowerCenter, Quest SharePlex, and SSIS based on real PeerSpot user reviews.
"It is an excellent ETL tool."
"The support is valuable. There are also open-source ETL products, which work very well, but there is no support. When we face a production problem, being able to get support is valuable, and it brings efficiency. With an open-source solution, we can't engage anyone to resolve the problem as quickly as possible."
"It is easy to use, and it is quick for developing things. It is fairly powerful, and it can integrate with a lot of different platforms without much hassle."
"It has a Data Catalog that uses the Model repository."
"We can scale the product."
"To me, what's most valuable in Informatica PowerCenter is the flexibility in building the integration pipeline. Usually, you need to have a platform to be able to integrate with different technologies, including legacy data such as the mainframe. The platform should also be rich enough to transform the data per your business requirement, with no restrictions. Rich integration and rich transformation capabilities are the two key capabilities in Informatica PowerCenter. The solution also offers ease of use. Another valuable feature of Informatica PowerCenter is the drag-and-drop integration because it's GUI-based, similar to IBM and Oracle."
"It reduces a lot of legacy coding."
"One of the most valuable features for us is the metadata repository because it can easily show the lineage of source-to-target mappings. My company and I also find Informatica really easy to use—when a consultant joins our company, in just a few days to a few weeks, they can understand how to use it—so we prefer to use this ETL tool."
"The core features of the solution we like are the reliability of the data transfer and the accuracy of data read and write. The stability of the solution is also excellent."
"There are some capabilities within SharePlex where you can see how the data is migrating and if it still maintains good data integrity. For example, if there are some tables that get out of sync, there are ways to find them and fix the problem on the spot. Since these are very common issues, we can easily fix these types of problems using utilities, like compare and repair. So, if you find something is out of sync, then you can just repair that table. It basically syncs that table from source to target to see if there are any differences. It will then replicate those differences to the target."
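The compare-and-repair workflow the reviewer describes—diff a table on the source against its replica, then push only the differences to the target—can be sketched generically. This is an illustrative Python sketch using SQLite, not SharePlex's actual implementation; the function name, the assumption that the key is the first column, and the table layout are all hypothetical.

```python
import sqlite3

def compare_and_repair(src: sqlite3.Connection, tgt: sqlite3.Connection,
                       table: str, key: str) -> int:
    """Sync one table from source to target; return the number of rows repaired.

    Illustrative only: assumes the primary key is the first column of the table.
    """
    src_rows = {r[0]: r for r in src.execute(f"SELECT * FROM {table}")}
    tgt_rows = {r[0]: r for r in tgt.execute(f"SELECT * FROM {table}")}
    repaired = 0
    for k, row in src_rows.items():
        if tgt_rows.get(k) != row:  # row is missing or out of sync on the target
            placeholders = ", ".join("?" * len(row))
            tgt.execute(f"INSERT OR REPLACE INTO {table} VALUES ({placeholders})", row)
            repaired += 1
    for k in tgt_rows.keys() - src_rows.keys():  # row was deleted on the source
        tgt.execute(f"DELETE FROM {table} WHERE {key} = ?", (k,))
        repaired += 1
    tgt.commit()
    return repaired
```

In a real replication product the compare phase would also have to account for in-flight transactions; this sketch only captures the "find the differences, replicate them to the target" idea.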
"I like SharePlex's Compare and Repair tool."
"Because of the volume of the transactions, we heavily use a feature that allows SharePlex to replicate thousands of transactions. It's called PEP, Post Enhancement Performance, and that has helped us scale tremendously."
"The core replication and its performance. Performance is crucial, and SharePlex is by far the fastest. The way it handles replication to multiple targets along with basic filtering, as well as from multiple sources to a single target, is very efficient."
"The UI is very user-friendly."
"It is easy to set up the solution."
"The performance is better than doing it in some alternative ways. We don't have to worry about so much manual work."
"The debugging capabilities are great, particularly during data flow execution. You can look into the data and see what's going on in the pipeline."
"The workflow features have been very valuable. You can have automated workflows and all the steps are controlled. The workflow functionality of integration services is excellent."
"The initial setup was easy."
"There are many good features in this solution including the data fields, database integration, support for SQL views, and the lookups for matching information."
"The main value of any Microsoft product is the ease of use. You can achieve more with less time. That's what's beneficial for me. With many competitors, you might need to spend more time coming up with a solution because you have to focus on taking care of the product."
"What I didn't like about it is that the platform itself is not great at distributed processing. When you need high parallel processing, it has some inherent issues. We had to use Java transformation, and it did not go very well. I have heard that it is going to the cloud, but we haven't tried that."
"There is a need to buy a separate license if one wishes to connect with certain external systems, such as SAP or Salesforce."
"I would like to see an improvement in digital adoption."
"The solution must improve the integration with new services."
"What needs improvement in Informatica PowerCenter is the cloud experience because, nowadays, other companies, such as AWS, Azure, and Google, have more experience in the cloud. The pricing for Informatica PowerCenter on the cloud is also very expensive for customers, so some customers prefer open-source tools or lower-priced tools, such as Azure. From my point of view, Informatica must work on the pricing policy and review the policy on the cloud for Informatica PowerCenter or propose more tools with lower pricing. Clients want the automatic integration of Informatica PowerCenter with other tools. Currently, the integration process is manual, and you have to add other tools to facilitate the integration, especially with the DevOps methodology. You need scripts and tools for the integration, and you'll need to use other integration tools if you want automatic deployment for Informatica PowerCenter, so this is another area for improvement in the solution. What I'd like to see in the next release of the solution is for the integration with APIs to be simpler, because currently, the API integration feature of Informatica PowerCenter is very difficult. It's not intuitive. You have to facilitate API integration and the real-time streaming of messages in Kafka, for example, so that should be improved."
"It requires an established data center because there is no software-as-a-service option."
"The solution does not scale."
"We had stability issues, mostly with JVM size."
"I would like more ability to automate installation and configuration in line with some of the DevOps processes that are more mature in the market. That would be a considerable improvement."
"For its function in relation to replication (i.e. filtering), I'd give it a six or seven out of 10. GoldenGate has much more functionality by comparison."
"The reporting features need improvement. It would be very good for users to have a clear understanding of the status of replication."
"I would like the solution to have some kind of machine learning and AI capabilities. Often, if we want to improve the performance of posting, we have to bump up a parameter. That means we need to stop the process, come up with a figure that we want to bump the parameter up to, and then start SharePlex. Machine learning and AI capabilities for these kinds of improvement would tremendously help boost productivity for us."
"I don't know how easy it would be to change the architecture in an already implemented replication. For example, if we have a certain way of architecting for a particular database migration and want to change that during a period of time, is that an easy or difficult change? There was a need for us to change the architecture in the middle of the migration, but we didn't do it. We thought, 'This is possibly complicated. Let's not change it in the middle,' because we were approaching our cutover date. That was one thing we should have checked with support or covered in training."
"I would like to see more standard components out of the box, such as SFTP and data compression components."
"Video training would be a helpful addition."
"The solution could improve by having quicker release updates."
"The solution should be improved as per customer requirements."
"We'd like more integration capabilities."
"The performance of SSIS could improve when compared to Oracle Database."
"Generic processes should be used instead of custom code for each table."
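The reviewer's suggestion—one generic, metadata-driven process instead of hand-written code for each table—can be sketched as follows. This is a minimal Python illustration of the pattern, not SSIS itself; the `TABLE_META` control dictionary and table names are hypothetical (in SSIS this role is often played by a control table driving a single parameterized package).

```python
import sqlite3

# Hypothetical metadata describing each table to load. Adding a new table
# means adding one entry here, not writing a new loader.
TABLE_META = {
    "customers": ["id", "name", "email"],
    "orders":    ["id", "customer_id", "amount"],
}

def generic_load(conn: sqlite3.Connection, table: str, rows: list) -> None:
    """One loader for every table: the SQL is built from metadata, not hand-coded."""
    cols = TABLE_META[table]
    conn.execute(f"CREATE TABLE IF NOT EXISTS {table} ({', '.join(cols)})")
    conn.executemany(
        f"INSERT INTO {table} ({', '.join(cols)}) "
        f"VALUES ({', '.join('?' * len(cols))})",
        rows,
    )
    conn.commit()
```

The same function then loads any registered table, e.g. `generic_load(conn, "orders", [(1, 1, 250)])`, which is the "generic process" the reviewer is asking for.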
"SSIS is cumbersome despite its drag-and-drop functionality. For example, let's say I have 50 tables with 30 columns. You need to set a data type for each column and table. That's around 1,500 objects. It gets unwieldy adding validation for every column. Previously, SSIS automatically detected the data type, but I think they removed this feature. It would automatically detect if it's an integer, primary key, or foreign key column. You had fewer problems building the model."
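The automatic type detection the reviewer misses can be approximated with a small inference pass: guess each column's type from sample values instead of declaring all ~1,500 column data types by hand. This Python sketch is illustrative only—the type names and inference rules are assumptions, not SSIS's actual detection logic.

```python
def infer_type(values: list) -> str:
    """Guess a column type ('int', 'float', or 'str') from string samples."""
    def all_match(cast) -> bool:
        try:
            for v in values:
                cast(v)
            return True
        except ValueError:
            return False
    if all_match(int):
        return "int"
    if all_match(float):
        return "float"
    return "str"

def infer_schema(columns: dict) -> dict:
    """Map every column name to an inferred type in one pass over the samples."""
    return {name: infer_type(vals) for name, vals in columns.items()}
```

With a pass like this, the 50-tables-by-30-columns case becomes one call per table over sampled data rather than 1,500 manual declarations; a real implementation would also need to handle dates, nulls, and key detection.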