
AWS Data Pipeline [EOL] vs Skyvia comparison

 

Comparison Buyer's Guide

Executive Summary
Updated on May 11, 2025

Review summaries and opinions

We asked business professionals to review the solutions they use. Here are some excerpts of what they said:
 

Categories and Ranking

AWS Data Pipeline [EOL]
Average Rating
8.0
Number of Reviews
2
Ranking in other categories
No ranking in other categories
Skyvia
Average Rating
9.0
Reviews Sentiment
7.8
Number of Reviews
1
Ranking in other categories
Data Integration (54th), Cloud Data Integration (33rd)
 

Featured Reviews

Geoffrey Leigh - PeerSpot reviewer
A stable, scalable, and reliable solution for moving and processing data
We're only considering enhancing the presentation layer to give a more multidimensional OLAP view, which AWS seems to have decided on. Redshift with the data mart structure is like an OLAP cube. Oracle Analytics Cloud is overkill and is not what we need. I was looking at Mondrian, which used to be part of the open-source stack from another vendor. Still, I am also looking at some of the other OLAP environments, like Kaiser, and have considered going to Azure with Microsoft's Azure analysis cloud, but that's not multidimensional either, as SSAS used to be. We tried Mondrian, and it didn't perform how we expected. So, we are looking for something that performs as an OLAP in the cloud, particularly on AWS, though we might consider an Azure solution.
RH
The product works, is simple to use, and is reliable.
Error handling: This has caused me many problems in the past. When an error occurs, the event on the connection that is called does not seem to behave as documented. If I attempt a retry or opt not to display an error dialog, it does so anyway. In all fairness, I have never reported this. More importantly, a unique error code should be passed to the error event that identifies a uniform type of error, such as ecDisconnect or eoInvalidField. It is very hard to find out what any of the error codes currently passed actually mean; a list for each database engine would be great. Trying to catch an exception without displaying the UniDAC error message is impossible, no matter how you modify the parameters in the OnError of the TUniConnection object.

I have already implemented the following things myself. They are suggestions rather than specific requests.

Copy Datasets: This contains an abundance of redundant options. I think that a facility to copy one dataset to another in a single call would be handy.

Redundancy: I am currently working on this. I have extended the TUniConnection with an additional property called FallbackConnection. If the TUniConnection goes offline, the connection attempts to connect to the FallbackConnection. If successful, it then sets the Connection properties of all live UniDatasets in the app to the FallbackConnection and re-opens them if necessary. The extended TUniConnection holds a list of the datasets that were created; each dataset is responsible for registering itself with the connection. This is a highly specific feature that supports an offline mode found in mission-critical/point-of-sale solutions. I have never seen it implemented before in any DACs, but I think it is a really unique feature with a big impact.

Dataset to JSON/XML: A ToSql function on a dataset that creates a full SQL text statement, with all parameters converted to text (excluding blobs) and included in the returned string.

Extended TUniScript: TMyUniScript allows me to add lines of text to a script using the normal dataset functions: Script.Append, Script.FieldByName('xxx').AsString := 'yyy', Script.AddToScript, and finally Script.Post, then Script.Commit. The AddToScript builds the SQL text statement and appends it to the script using #e above.

Record Size Calculation: It would be great if UniDAC could estimate the size of a particular record from a query or table. This could be used to automatically set the packet fetch/request count based on the size of the Ethernet packets on the local area network. I believe this would increase performance and reduce network traffic when returning larger datasets. I am aware that this would also be a feature unique to UniDAC, but it would yield a massive performance enhancement. I would suggest setting the packet size on the TUniConnection, which would affect all linked datasets.
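The redundancy idea in the review above (a primary connection with a FallbackConnection, plus datasets that register themselves so they can be re-pointed and re-opened on failover) can be sketched language-agnostically. The reviewer's actual implementation is in Delphi against UniDAC's TUniConnection; the Python below is a minimal illustration of the same pattern, and every class and method name in it is hypothetical, not part of any UniDAC API.

```python
class Connection:
    """Stand-in for a database connection that can go offline."""
    def __init__(self, name, online=True):
        self.name = name
        self.online = online


class Dataset:
    """Stand-in for a dataset bound to a connection."""
    def __init__(self, connection):
        self.connection = connection
        self.is_open = False

    def open(self):
        if not self.connection.online:
            raise ConnectionError(f"{self.connection.name} is offline")
        self.is_open = True


class FailoverConnection:
    """Primary connection plus a fallback. Datasets register themselves,
    mirroring the reviewer's extended TUniConnection, so that on failover
    every live dataset is re-pointed at the fallback and re-opened."""
    def __init__(self, primary, fallback):
        self.primary = primary
        self.fallback = fallback
        self.datasets = []

    def register(self, dataset):
        # Each dataset registers itself with the connection.
        self.datasets.append(dataset)

    def ensure_online(self):
        if self.primary.online:
            return self.primary
        if not self.fallback.online:
            raise ConnectionError("both connections are offline")
        # Re-point every registered dataset at the fallback and
        # re-open the ones that were open.
        for ds in self.datasets:
            was_open = ds.is_open
            ds.connection = self.fallback
            if was_open:
                ds.open()
        return self.fallback


# Usage: simulate the primary going offline mid-session.
primary = Connection("primary")
fallback = Connection("fallback")
pool = FailoverConnection(primary, fallback)

orders = Dataset(primary)
pool.register(orders)
orders.open()

primary.online = False
active = pool.ensure_online()   # fails over; 'orders' now uses fallback
```

The design choice worth noting is the registration step: the connection cannot re-open datasets it does not know about, so responsibility for joining the list is pushed onto each dataset at creation time, exactly as the reviewer describes.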
 

Top Industries

By visitors reading reviews
Computer Software Company
27%
Financial Services Firm
20%
Educational Organization
6%
Government
6%
No data available
 

Company Size

By reviewers
Large Enterprise
Midsize Enterprise
Small Business
No data available
No data available
 

Questions from the Community

What do you like most about AWS Data Pipeline?
The most valuable feature of the solution is that orchestration and development are easier with the tool.
What is your experience regarding pricing and costs for AWS Data Pipeline?
I rate the pricing between six and eight on a scale from one to ten, where one is a low price and ten is a high price.
What needs improvement with AWS Data Pipeline?
The user-defined functions have shortcomings in AWS Data Pipeline. The user-defined functions could be one of the areas where I can write a custom function and embed it as a part of AWS Data Pipeli...
 

Comparisons

No data available
 

Also Known As

No data available
Skyvia, Skyvia Data Integration
 

Overview

 

Sample Customers

bp, Cerner, Expedia, FINRA, Hess, Intuit, Kellogg's, Philips, TIME, Workday
Boeing, Sony, Honda, Oracle, BMW, Samsung
Find out what your peers are saying about Amazon Web Services (AWS), Informatica, Salesforce and others in Cloud Data Integration. Updated: June 2025.