

Find out in this report how the two Cloud Data Integration solutions compare in terms of features, pricing, service and support, ease of deployment, and ROI.
I can estimate savings of around 40 to 60%.
I have seen a return on investment; my team was able to stay extremely small even though we had a lot of data integrations with many companies.
I can testify to the return on investment with metrics regarding time saved; we have increased our efficiency by about 20 to 30 percent due to the swift migration processes facilitated by the tool.
I have noticed a return on investment with Pentaho Data Integration and Analytics in terms of time savings and staff reduction.
When working with AWS GovCloud, we often did not get an answer in time because AWS seemed more focused on the commercial side.
I am happy with the technical support from AWS.
24/7 assistance is available for the Enterprise Edition.
The support team takes the time to understand our business requirements, offering appropriate recommendations.
Communication with the vendor is challenging.
Even if there was a failure, we could catch it and rerun it.
AWS's scaling involves a manual, human-driven approach, meaning it is not auto-scalable.
While scalability is good, latency exists due to our business nature.
It scales well until you reach a point where you need to perform a lot of operations; the issue arises when it runs out of memory handling some data.
Its ability to scale horizontally in cloud-native architectures or for massive real-time processing is limited.
Pentaho Data Integration handles larger datasets better.
For DMS version upgrades, we schedule downtime during business hours so that midnight workloads are not interrupted and morning business can run smoothly.
Performance issues arise due to reliance on a flowchart-based mechanism instead of scripts, which can lead to longer execution times.
I find that version 3.1 is the most stable version I have ever used.
It's pretty stable; however, it struggles when dealing with smaller amounts of data.
DMS works within the AWS ecosystem, but users also have to look at third-party solutions; Snowflake and Databricks are now bigger players.
Sometimes, those who implement the service face problems and resolve them, but I may not even know what problems they faced.
We should also explore more effective partitioning for parallel processing and fine-tuning database connections to reduce load times and improve ETL speed.
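The partitioning idea above can be sketched in a few lines. This is a minimal, hypothetical illustration, not the product's actual mechanism: it splits rows into partitions and hands each partition to a worker, with `transform` standing in for whatever per-row ETL logic applies.

```python
from concurrent.futures import ThreadPoolExecutor

def transform(rows):
    # Placeholder for the real per-partition ETL transform (illustrative only).
    return [r * 2 for r in rows]

def partition(rows, n_parts):
    # Split rows into roughly equal partitions for parallel workers.
    return [rows[i::n_parts] for i in range(n_parts)]

def run_parallel(rows, n_parts=4):
    parts = partition(rows, n_parts)
    with ThreadPoolExecutor(max_workers=n_parts) as pool:
        results = pool.map(transform, parts)
    # Flatten the per-partition results back into one list.
    return [row for part in results for row in part]
```

In a real pipeline, each partition might instead be a key range or date range in the source table, so that database reads are also parallelized rather than just the in-memory transform.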
Pentaho Data Integration and Analytics could be improved with better support for multiple environments, specifically the ability to define variables once and change their values per environment, such as production or development.
Pentaho Data Integration and Analytics could add real-time processing and automatic alerting, with notifications when a job fails or when data does not meet certain rules.
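The alerting behavior the reviewer asks for can be approximated with a thin wrapper around any job runner. This is a sketch under assumed names (`notify`, `run_with_alerts`, and the rule functions are all hypothetical), not a Pentaho feature:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl-alerts")

def notify(message):
    # Stand-in for a real channel (email, Slack, PagerDuty) -- hypothetical.
    log.error(message)

def run_with_alerts(job_name, job, rows, rules):
    """Run a job; alert if it fails or if output rows violate data rules."""
    try:
        result = job(rows)
    except Exception as exc:
        notify(f"{job_name} failed: {exc}")
        raise
    violations = [r for r in result if not all(rule(r) for rule in rules)]
    if violations:
        notify(f"{job_name}: {len(violations)} rows violated data rules")
    return result
```

The same pattern works whether `job` shells out to a scheduler, calls an API, or runs an in-process transform.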
I use the community version of Pentaho Data Integration and Analytics, so there are no additional costs.
The setup cost was minimal, and the pricing experience was pretty good.
The company covered it and had no problem paying for it because they saw that it was cost-effective in terms of performance afterwards.
AWS offers a way to build jobs that are scalable, expandable for new and current tables, and can be deployed quickly.
You can copy the database at first without impacting your current database, and then use CDC to copy incremental changes.
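That full-load-then-CDC pattern maps to a DMS replication task with `MigrationType='full-load-and-cdc'`. The sketch below only builds the request parameters (the ARNs and schema name are placeholders you would supply); the dict is then passed to boto3's DMS client.

```python
import json

def full_load_and_cdc_task(task_id, source_arn, target_arn, instance_arn, schema):
    """Build parameters for a DMS task that copies the database in full,
    then streams incremental changes via CDC."""
    table_mappings = {
        "rules": [{
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-schema",
            "object-locator": {"schema-name": schema, "table-name": "%"},
            "rule-action": "include",
        }]
    }
    return {
        "ReplicationTaskIdentifier": task_id,
        "SourceEndpointArn": source_arn,
        "TargetEndpointArn": target_arn,
        "ReplicationInstanceArn": instance_arn,
        "MigrationType": "full-load-and-cdc",  # full copy first, then CDC
        "TableMappings": json.dumps(table_mappings),
    }

# With credentials configured, the task would be created like this:
# boto3.client("dms").create_replication_task(**full_load_and_cdc_task(...))
```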
The scalability option is another valuable feature because AWS provides its own compute behind it, so I can scale up and scale down at any given point.
Pentaho Data Integration and Analytics has positively impacted my organization because it meant we didn't have to write a lot of custom API back-end processing logic; it did the majority of that heavy lifting for us.
It automates the data workflow, including extraction, cleansing, and loading into warehouses for BI reporting purposes, while also removing duplicates, validating data, and standardizing formats, enabling real-time decision-making.
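The cleansing steps described (deduplication, validation, format standardization) can be sketched as a single pass over incoming records. The field names (`email`, `name`) are illustrative assumptions, not the reviewer's actual schema:

```python
def clean(rows):
    """Deduplicate, validate, and standardize records before loading."""
    seen = set()
    out = []
    for row in rows:
        email = row.get("email", "").strip().lower()   # standardize format
        name = row.get("name", "").strip().title()
        if not email or "@" not in email:              # validate
            continue
        if email in seen:                              # remove duplicates
            continue
        seen.add(email)
        out.append({"email": email, "name": name})
    return out
```

In a visual ETL tool the same steps would typically be separate transform nodes (sort/unique, validator, string operations) rather than hand-written code.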
Pentaho Data Integration and Analytics has positively impacted my organization because it is easier to use, and my knowledge about this work facilitates the translation from the source to my final system.

| Company Size | Count |
|---|---|
| Small Business | 8 |
| Midsize Enterprise | 8 |
| Large Enterprise | 17 |

| Company Size | Count |
|---|---|
| Small Business | 18 |
| Midsize Enterprise | 17 |
| Large Enterprise | 31 |
AWS Database Migration Service facilitates database transfers with its automation, scalability, and cost-efficiency. Supporting real-time synchronization and schema transformations, it integrates with ETL tools and offers robust security, simplifying administration while focusing on data logic.
Highly effective for migrating databases like Oracle, SQL, and PostgreSQL from on-premises to cloud environments, AWS Database Migration Service supports live replication and Change Data Capture. It aids in seamless database replication and transformation, ensuring real-time data synchronization and secure AWS data storage. Users benefit from efficient workflows, reducing complex technical tasks during large data migrations. While praised for simplifying administration, areas for improvement include integration capabilities and pricing competitiveness. Enhanced handling of large-scale migrations, network bandwidth management, and third-party ecosystem support would further augment its potential.
What are the key features of AWS Database Migration Service?
In terms of industry-specific implementations, AWS Database Migration Service is widely used for industries requiring reliable and efficient data solutions such as finance, healthcare, and technology. It supports companies in maintaining real-time updates and securing sensitive information during cloud transitions, making it a key asset in streamlining database management and facilitating business transformation.
Pentaho Data Integration and Analytics offers an intuitive platform for data workflows, enabling users to easily manage ETL processes across diverse data formats, ensuring seamless automation and development.
With its drag-and-drop interface, Pentaho allows for efficient ETL workflows without extensive coding. It supports a multitude of data formats and sources such as SQL, NoSQL, Hadoop, CSV, and JSON. Advanced features like metadata injection and API integration enable seamless automation. However, improvements in big data performance, better cloud service integration, and enhanced real-time processing capabilities can enhance user experience. Additional connectors and improved documentation are sought after by many. Providing support for more programming languages and optimizing memory usage also presents opportunities for enhancement.
What are the key features of Pentaho Data Integration and Analytics?
Pentaho is employed across finance, healthcare, and retail industries for ETL processes. It's instrumental in integrating data from ERP, SAP systems, Excel, and APIs to develop comprehensive reports and data models. Companies rely on its capabilities for both on-premises and cloud deployments, improving data transparency and management.
We monitor all Cloud Data Integration reviews to prevent fraudulent reviews and keep review quality high. We do not post reviews by company employees or direct competitors. We validate each review for authenticity via cross-reference with LinkedIn, and personal follow-up with the reviewer when necessary.