

Find out in this report how the two Cloud Data Integration solutions compare in terms of features, pricing, service and support, ease of deployment, and ROI.
I advocate using Glue in such cases.
I have seen a return on investment; my team was able to stay extremely small even though we had a lot of data integrations with many companies.
I can testify to the return on investment with metrics regarding time saved; we have increased our efficiency by about 20 to 30 percent due to the swift migration processes facilitated by the tool.
I have noticed a return on investment with Pentaho Data Integration and Analytics in terms of time savings and staff reduction.
Upgrades occur every four months, and new developments coincide with version updates.
For complex Glue-related problems such as job failures or permission issues, their documentation is good, but having direct access to support helps cut down troubleshooting time significantly.
24/7 assistance is available for the Enterprise Edition.
The support team takes the time to understand our business requirements, offering appropriate recommendations.
Communication with the vendor is challenging.
It is beneficial to upgrade jobs, and we conduct extensive testing in development before migrating to production.
It can easily handle data from one terabyte to 100 terabytes or more, scaling nicely with larger datasets.
It scales well until you need to perform many operations at once, at which point it can run out of memory handling some of the data.
Its ability to scale horizontally in cloud-native architectures or for massive real-time processing is limited.
Pentaho Data Integration handles larger datasets better.
AWS Glue is highly stable, and I would rate its stability as nine.
Performance issues arise due to reliance on a flowchart-based mechanism instead of scripts, which can lead to longer execution times.
I find that version 3.1 is the most stable version I have ever used.
It's pretty stable; however, it struggles when dealing with smaller amounts of data.
Migrating jobs from version 3.0 to 4.0 can present compatibility issues.
With AWS, I gather data from multiple sources, clean it up, normalize it, de-duplicate it, and make it presentable.
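The gather, clean, normalize, and de-duplicate steps described above can be sketched generically. This is plain Python for illustration rather than the Glue API itself, and the field names are hypothetical:

```python
def clean(record):
    """Trim whitespace and drop empty fields."""
    return {k: v.strip() for k, v in record.items() if v and v.strip()}

def normalize(record):
    """Lower-case emails and standardize name casing."""
    out = dict(record)
    if "email" in out:
        out["email"] = out["email"].lower()
    if "name" in out:
        out["name"] = out["name"].title()
    return out

def deduplicate(records, key="email"):
    """Keep the first record seen for each key value."""
    seen, unique = set(), []
    for r in records:
        k = r.get(key)
        if k not in seen:
            seen.add(k)
            unique.append(r)
    return unique

raw = [
    {"name": "ada lovelace", "email": " Ada@Example.com "},
    {"name": "Ada Lovelace", "email": "ada@example.com"},
    {"name": "alan turing", "email": "alan@example.com", "notes": ""},
]
presentable = deduplicate([normalize(clean(r)) for r in raw])
```

In a Glue job the same logic would typically run as PySpark transformations over DynamicFrames or DataFrames, but the shape of the pipeline is the same.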
A simpler, more user-friendly setup would help speed up deployment.
We should also explore more effective partitioning for parallel processing and fine-tuning database connections to reduce load times and improve ETL speed.
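One common partitioning pattern is to split a key range into contiguous chunks and read them in parallel. Sketched here in plain Python with a stand-in fetch function; in Glue or Spark the equivalent would be a partitioned JDBC read:

```python
from concurrent.futures import ThreadPoolExecutor

def partition_bounds(lower, upper, num_partitions):
    """Split an inclusive id range [lower, upper] into contiguous chunks."""
    step = (upper - lower + 1) // num_partitions
    bounds, start = [], lower
    for i in range(num_partitions):
        end = upper if i == num_partitions - 1 else start + step - 1
        bounds.append((start, end))
        start = end + 1
    return bounds

def fetch_range(bounds):
    """Stand-in for a ranged query (WHERE id BETWEEN lo AND hi)."""
    lo, hi = bounds
    return list(range(lo, hi + 1))  # pretend these are rows

# Read four id-range partitions in parallel instead of one big scan.
with ThreadPoolExecutor(max_workers=4) as pool:
    rows = [r for chunk in pool.map(fetch_range, partition_bounds(1, 100, 4))
            for r in chunk]
```

The same range-splitting idea is what Spark's partitioned JDBC reads do under the hood with a partition column and lower/upper bounds.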
Pentaho Data Integration and Analytics could be improved with better support for multiple environments, specifically the ability to write my variables once and change their values per environment, such as production or development.
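One way to approximate this in PDI today is a per-environment kettle.properties file, with jobs and transformations referencing the variables rather than hard-coded values. The variable names and paths below are illustrative:

```properties
# $KETTLE_HOME/.kettle/kettle.properties — one copy per environment.
# Steps reference these as ${DB_HOST}, ${OUTPUT_DIR}, etc.
DB_HOST=prod-db.internal
DB_PORT=5432
OUTPUT_DIR=/data/prod/output
```

Swapping the file (or pointing KETTLE_HOME at a different directory) switches an unchanged job between development and production.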
Pentaho Data Integration and Analytics could add real-time processing and automatic alerting: notifications when a job fails or when data does not meet defined rules.
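Until such alerting is built in, a thin wrapper around a job run can approximate it. The notify function and the job callable here are hypothetical stand-ins for a real delivery channel and a real PDI job launcher:

```python
alerts = []

def notify(message):
    """Stand-in for email/Slack/webhook delivery."""
    alerts.append(message)

def run_with_alerts(job_name, job_fn, validators=()):
    """Run a job; alert on failure or on any data-rule violation."""
    try:
        result = job_fn()
    except Exception as exc:
        notify(f"{job_name} failed: {exc}")
        raise
    for check_name, check in validators:
        if not check(result):
            notify(f"{job_name}: rule violated - {check_name}")
    return result

rows = run_with_alerts(
    "daily_load",
    lambda: [{"id": 1, "amount": -5}],
    validators=[("non_negative_amounts",
                 lambda rs: all(r["amount"] >= 0 for r in rs))],
)
```

The same wrapper pattern works around any command-line job runner, including PDI's kitchen/pan scripts.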
Costing depends on resource usage, and cost optimization may involve redesigning jobs for flexibility.
AWS charges based on runtime, which can be quite pricey.
The smallest cost for a project is around €700, while the largest can reach up to €7,000 based on the scale of the usage.
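Because Glue bills by DPU runtime, a quick back-of-the-envelope calculation makes those costs concrete. The $0.44 per DPU-hour figure below is the commonly published Glue ETL rate in many regions, used here purely for illustration; check current regional pricing:

```python
def glue_job_cost(dpus, runtime_minutes, rate_per_dpu_hour=0.44):
    """Estimate an AWS Glue ETL job's cost: DPUs x hours x hourly rate."""
    return dpus * (runtime_minutes / 60) * rate_per_dpu_hour

# A 10-DPU job running for 30 minutes:
cost = glue_job_cost(10, 30)  # 10 * 0.5 * 0.44 = 2.20
```

Multiplied across daily schedules, this is why redesigning jobs to run shorter or on fewer DPUs directly reduces the bill.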
I use the community version of Pentaho Data Integration and Analytics, which incurs no additional costs.
The setup cost was minimal, and the pricing experience was pretty good.
The company covered the cost without hesitation because it proved cost-effective in terms of performance afterwards.
For ETL, I feel the performance is excellent. If I create jobs in a standard way, the performance is great, and maintenance is also seamless.
AWS Glue's most valuable features include its transformation capabilities, which provide data quality and shape for processing in ML or AI models.
AWS Glue has reduced efforts by 60%, which is the main benefit.
Pentaho Data Integration and Analytics has positively impacted my organization because it meant we didn't have to write a lot of custom API back-end processing logic; it did the majority of that heavy lifting for us.
It automates the data workflow, including extraction, cleansing, and loading into warehouses for BI reporting purposes, while also removing duplicates, validating data, and standardizing formats, enabling real-time decision-making.
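The validation and format-standardization steps mentioned here can also be sketched generically. Plain Python with a hypothetical schema, standardizing dates to ISO 8601 and rejecting incomplete records:

```python
from datetime import datetime

def standardize_date(value):
    """Coerce a few common date formats to ISO 8601 (YYYY-MM-DD)."""
    for fmt in ("%Y-%m-%d", "%d/%m/%Y", "%m-%d-%Y"):
        try:
            return datetime.strptime(value, fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    raise ValueError(f"unrecognized date: {value!r}")

def validate(record):
    """Reject records missing required fields; standardize the date."""
    if not record.get("id") or not record.get("order_date"):
        return None
    record["order_date"] = standardize_date(record["order_date"])
    return record

loaded = [r for r in map(validate, [
    {"id": 1, "order_date": "31/01/2024"},
    {"id": None, "order_date": "2024-02-01"},  # rejected: missing id
]) if r is not None]
```

In a real pipeline the rejected records would typically be routed to an error table for review rather than silently dropped.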
Pentaho Data Integration and Analytics has positively impacted my organization because it is easy to use, and my familiarity with the work makes translating data from the source into my final system straightforward.

| Company Size | Count |
|---|---|
| Small Business | 11 |
| Midsize Enterprise | 6 |
| Large Enterprise | 34 |

| Company Size | Count |
|---|---|
| Small Business | 18 |
| Midsize Enterprise | 17 |
| Large Enterprise | 31 |
AWS Glue is a serverless data integration service offering seamless integration with AWS services like S3, Redshift, and Athena. Known for its flexibility with data formats and automation of ETL tasks, AWS Glue enhances data management and transformation.
AWS Glue facilitates seamless data extraction, transformation, and loading for businesses, integrating with key AWS services and enabling efficient data pipeline automation. It's valued for its user-friendly GUI, scalability, and cost-effectiveness; it supports PySpark for complex datasets and includes a robust data catalog, real-time backup capabilities, and code generation. Despite its strengths, improvements are needed in documentation, training, and broader programming language support. Users face challenges with its complex interface and integration with non-AWS products, driving demand for enhancements in usability and performance.
What are AWS Glue's most important features?

Businesses leverage AWS Glue across industries for ETL processes, data integration, and transformation. It is used to optimize data lake and warehouse integration, enhancing data cataloging and real-time integration. Its serverless design enables efficient data processing in sectors like finance and healthcare, where handling complex, data-intensive tasks is crucial.
Pentaho Data Integration and Analytics offers an intuitive platform for data workflows, enabling users to easily manage ETL processes across diverse data formats, ensuring seamless automation and development.
With its drag-and-drop interface, Pentaho allows for efficient ETL workflows without extensive coding. It supports a multitude of data formats and sources such as SQL, NoSQL, Hadoop, CSV, and JSON. Advanced features like metadata injection and API integration enable seamless automation. However, improvements in big data performance, better cloud service integration, and enhanced real-time processing capabilities can enhance user experience. Additional connectors and improved documentation are sought after by many. Providing support for more programming languages and optimizing memory usage also presents opportunities for enhancement.
What are the key features of Pentaho Data Integration and Analytics?

Pentaho is employed across the finance, healthcare, and retail industries for ETL processes. It's instrumental in integrating data from ERP, SAP systems, Excel, and APIs to develop comprehensive reports and data models. Companies rely on its capabilities for both on-premises and cloud deployments, improving data transparency and management.
We monitor all Cloud Data Integration reviews to prevent fraudulent reviews and keep review quality high. We do not post reviews by company employees or direct competitors. We validate each review for authenticity via cross-reference with LinkedIn, and personal follow-up with the reviewer when necessary.