
NetApp Cloud Backup vs Teradata comparison

 

Comparison Buyer's Guide

Executive Summary (updated on Apr 20, 2025)

Review summaries and opinions

We asked business professionals to review the solutions they use. Here are some excerpts of what they said:
 

ROI

Sentiment score 8.5: Users report substantial cost savings, efficiency, ease of use, and reliable performance with NetApp Cloud Backup, enhancing operational continuity.
Sentiment score 8.1: Teradata boosts analytics speed by more than 100%, enhancing customer service and satisfaction, with high ROI and user approval.
 

Customer Service

Sentiment score 6.6: NetApp Cloud Backup is praised for its efficient customer service, knowledgeable support staff, and detailed documentation, despite occasional delays.
Sentiment score 7.1: Teradata's customer service is praised for expertise but criticized for delays, with ratings ranging from 6 to 10 out of 10.
The technical support from Teradata is quite advanced.
Customer support is very good, rated eight out of ten under our essential agreement.
 

Scalability Issues

Sentiment score 7.9: NetApp Cloud Backup is highly scalable, performs well in cloud environments, and integrates seamlessly with Microsoft Azure for disaster recovery.
Sentiment score 7.4: Teradata is praised for its scalability, speed, and flexibility, despite some complexity and cost challenges in cloud environments.
This expansion can occur without incurring downtime or taking systems offline.
Scalability is complex as you need to purchase a license and coordinate with Teradata for additional disk space and CPU.
 

Stability Issues

Sentiment score 8.7: Users commend NetApp Cloud Backup for its stability, efficient large workload handling, strong integration, and minimal downtime during data restoration.
Sentiment score 8.4: Teradata excels in stability with minimal downtime, robust architecture, 99.9% uptime, and reliable performance, despite minor issues with large datasets.
I find the stability to be almost a ten out of ten.
The workload management and software maturity provide a reliable system.
 

Room For Improvement

NetApp Cloud Backup users want better integration, KPI handling, ITSM support, and ease of use, and they want pricing that compares more favorably with competitors.
Teradata users seek better transaction processing, enhanced scalability, modern interface, cloud focus, advanced analytics, and improved support and documentation.
Unlike SQL and Oracle, which have in-built replication capabilities, we don't have similar functionality with Teradata.
 

Setup Cost

Enterprise buyers appreciate NetApp Cloud Backup's cost-effective pricing, though virtual NetApp hosting is pricier than general cloud hosting.
Teradata's high cost is justified by its superior performance, competitive total ownership costs, and flexible pricing models.
Initially, it may seem expensive compared to similar cloud databases; however, it offers significant value in performance, stability, and overall output once in use.
Teradata is much more expensive than SQL, which performs well and is cheaper.
 

Valuable Features

NetApp Cloud Backup is praised for its simplicity, seamless integration, efficiency, data protection, scalability, user-friendliness, reliability, and cost-effectiveness.
Teradata offers efficient, scalable data management with fast query performance, robust security, automation, and cloud flexibility for businesses.
The data mover has been valuable over the last two years, as it allows us to replicate data to our disaster recovery systems.
 

Categories and Ranking

NetApp Cloud Backup
Ranking in Backup and Recovery: 28th
Average Rating: 8.0
Reviews Sentiment: 7.2
Number of Reviews: 4
Ranking in other categories: Deduplication Software (10th), Disk Based Backup Systems (4th), Cloud Backup (22nd), Cloud Storage Gateways (5th)
Teradata
Ranking in Backup and Recovery: 20th
Average Rating: 8.2
Reviews Sentiment: 7.0
Number of Reviews: 76
Ranking in other categories: Customer Experience Management (6th), Data Integration (17th), Relational Databases Tools (8th), Data Warehouse (3rd), BI (Business Intelligence) Tools (10th), Marketing Management (6th), Cloud Data Warehouse (6th)
 

Mindshare comparison

As of May 2025, in the Backup and Recovery category, the mindshare of NetApp Cloud Backup is 0.2%, down from 0.4% the previous year. The mindshare of Teradata is 0.1%, essentially unchanged from the previous year. Mindshare is calculated from PeerSpot user engagement data.
 

Featured Reviews

Abbasi Poonawala - PeerSpot reviewer
Simplifies our backups with an agentless backup manager, but needs better integration with in-house applications
One area that can be improved is how we define the different KPIs, in particular the business KPIs. I have my own in-house application for the business KPIs. For example, our retention policy is a period of seven years, and I have to read parameters like that from other applications, so I need them to integrate well. NetApp Cloud Backup Manager should help get this integrated seamlessly with other applications, meaning that it would populate the data for the different parameters. These parameters could be the retention period, the backup schedule, or anything else. It might be an ITSM ticket, where a workflow is triggered somewhere and the ticket has been created for a particular environment, such as my development environment, an INT environment, or a UAT environment. This kind of process needs to integrate well with my own application, and there are some challenges. If it allows for consuming RESTful APIs, that's how we will usually integrate, but there are certain challenges when it comes to integrating with our own application around KPIs, whether they are business KPIs or technical KPIs. What I want is to populate that data from my own applications.
We have the headroom in the KPI, and we have the throughput, the volumes, the transactions per second, and so on, which are all defined. These are the global parameters; they affect all the lines of business. It's a central application that is consumed by most of the lines of business, and it's all around the KPIs. Earlier, it was based on Quest Foglight, an application that was taken up and customized. It was made in-house as a core service and used as a core building block. But our use of Quest Foglight has become outdated. There is no more support available, and it has been a legacy application in the organization for more than ten years. Now it comes down to the question: is this an investment, or will we need to divest ourselves of it? There has to be an option to remediate it out. One possibility is to integrate the existing application so that it gets completely decommissioned.
It would help if there were better ways of defining or handling the KPIs in the Cloud Manager, so that most of the parameters are not defined directly by me. Those would be the global parameters defined across all the lines of business. There are integration challenges here. I've spoken to the support team, who say they have the REST APIs, but the integration still isn't going as smoothly as it could. Most of the time, when things aren't working out, we need dedicated engineers brought in for the entire integration, and that becomes an additional challenge on top of everything. If the Cloud Manager isn't being fed all the parameters from the backup strategy, such as ITSM and incident tickets, backup schedules, or anything related to the backup policies, then it takes a while. Ideally, I would want it to be read directly from our in-house applications. This is more to do with our product processes; it's not our own choice to decide. The risk management team has mandated, as part of compliance, that we strictly enforce the KPIs, the headroom, and the rest of the global parameters defined for the different lines of business.
So if my retention period changes from seven years to, let's say, 10 or 15 years, then those rules have to be strictly enforced. Ultimately, we would like better support for ITSM. ITSM tools like ServiceNow and BMC Remedy are constantly adding new features, so they have to be upgraded over time, and NetApp has to provision for that and factor it in. Some AI-based capabilities are there now, and those have to be incorporated somehow. One last thing is that NetApp could provide better flash storage. Since they're already on block storage and doing well in that segment, it makes sense that they will have to step up when it comes to flash array storage. I have been evaluating NetApp's flash array storage solutions against others, like Toshiba's and Fujitsu's storage arrays, which are quite cost-effective.
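The integration pattern this reviewer describes, reading centrally mandated backup parameters (retention period, schedule, ITSM ticket references) from an in-house KPI application and feeding them to the backup manager over REST, can be illustrated with a short sketch. The Python example below is a minimal, hypothetical outline of that flow: the host names, endpoint paths, and field names are placeholders invented for illustration, not documented NetApp Cloud Backup Manager APIs, and only the widely used requests library is assumed.

# A minimal sketch of the REST-based integration the reviewer describes:
# pull global policy parameters from an in-house KPI service and push them
# to a backup manager per environment. All hosts, paths, and field names
# are hypothetical placeholders, not real NetApp Cloud Manager endpoints.
import requests

KPI_SERVICE = "https://kpi.example.internal"        # in-house KPI application (assumed)
BACKUP_MANAGER = "https://backup.example.internal"  # backup manager endpoint (assumed)
HEADERS = {
    "Authorization": "Bearer REPLACE_WITH_API_TOKEN",
    "Content-Type": "application/json",
}

def fetch_global_policy(line_of_business: str) -> dict:
    """Read the centrally mandated parameters (retention, schedule, ITSM reference)."""
    resp = requests.get(
        f"{KPI_SERVICE}/policies/{line_of_business}", headers=HEADERS, timeout=30
    )
    resp.raise_for_status()
    # Example payload: {"retention_years": 7, "schedule": "daily", "itsm_ticket": "INC0012345"}
    return resp.json()

def apply_backup_policy(environment: str, policy: dict) -> None:
    """Push the policy to the backup manager for one environment (DEV/INT/UAT)."""
    resp = requests.put(
        f"{BACKUP_MANAGER}/api/environments/{environment}/policy",  # hypothetical path
        headers=HEADERS,
        json={
            "retention_years": policy["retention_years"],
            "schedule": policy["schedule"],
            "itsm_ticket": policy.get("itsm_ticket"),
        },
        timeout=30,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    # Enforce the same global policy across the environments the reviewer mentions.
    policy = fetch_global_policy("retail-banking")
    for env in ("DEV", "INT", "UAT"):
        apply_backup_policy(env, policy)

If the KPI service changes the retention period from seven years to ten, re-running a job like this is one way the rule could be propagated automatically instead of being re-entered by hand in the Cloud Manager.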
SurjitChoudhury - PeerSpot reviewer
Offers seamless integration capabilities and performance optimization features, including extensive indexing and advanced tuning capabilities
We created and constructed the warehouse. We used multiple loading processes like MultiLoad, FastLoad, and Teradata TPump, but those are loading processes. Teradata is a powerful tool because, if we consider older technologies, its architecture of nodes and virtual processors is a unique concept. Later, other technologies adopted the node concept as well; Informatica did so from PowerCenter version 7.x, moving from a client-server architecture to a node-based one. We can have the database available 24/7, 365 days a year: if one node fails, other nodes take over its work. Informatica adopted all those concepts when it changed its architecture, and even Oracle databases have since adapted their architecture along the same lines. However, Teradata initially started with its own, different type of architecture, which major companies later adopted.
It has grown now, but from the beginning, whatever query we sent would be mapped onto a particular component. From there it goes to the virtual processor and down to the disk, where the actual physical data is stored. In between there is a map, which acts like a data dictionary; it holds information about each piece of data, where it is loaded, and on which virtual processor or node the data resides. Teradata comes with a four-node architecture, or however many nodes we choose, and the cost is initially determined by that, along with what type of data each node holds. It is a shared-nothing architecture: whatever task is given to a virtual processor will be processed by it, and if there is a failure, it is taken over by another virtual processor.
This solution has impacted query time and data performance. In Teradata there is a lot of joining, partitioning, and indexing of records. There are primary and secondary indexes, hash indexing, and other indexing processes. To improve query performance, we first analyze the query and tune it. If a join needs a secondary index, which plays a major role in filtering records, we might reconstruct that particular table with the secondary index. This tuning involves partitioning and indexing, and we use these tools and technologies to fine-tune performance.
When it comes to integration, tools like Informatica seamlessly connect with Teradata. We ensure the Teradata database is configured correctly in Informatica, including the proper hostname and properties for the load process, and we didn't find any major complexity or issues with integration. But these technologies are quite old now. With newer big data technologies, we've worked with a four-layer architecture, pulling data from a Hadoop lake into Teradata. We configure Teradata with the appropriate hostname and credentials and use BTEQ queries to load the data. Previously, we converted the data warehouse to a CLD model as per Teradata's standardized procedures, moving from an ETL to an EMT process. This allowed us to perform gap analysis on missing entities based on the model and retrieve them from the source system again. We found Teradata integration straightforward and compatible with other tools.
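The tuning techniques this reviewer mentions, distributing rows by a primary index, partitioning by date, and adding a secondary index to help a filtering join, can be sketched in plain Teradata SQL. The example below is a hypothetical illustration only: the table, columns, and connection details are invented, and execution assumes Teradata's teradatasql Python driver.

# A hypothetical sketch of the indexing and partitioning tuning the reviewer
# describes. Table/column names and connection details are invented examples;
# execution assumes the teradatasql driver (pip install teradatasql).
import teradatasql

DDL = """
CREATE TABLE sales_fact (
    sale_id     BIGINT NOT NULL,
    customer_id INTEGER NOT NULL,
    sale_date   DATE NOT NULL,
    amount      DECIMAL(12,2)
)
PRIMARY INDEX (sale_id)  -- rows are hashed across the virtual processors by sale_id
PARTITION BY RANGE_N (sale_date BETWEEN DATE '2024-01-01' AND DATE '2025-12-31'
                      EACH INTERVAL '1' MONTH)
"""

# A non-unique secondary index on the column the filtering join relies on.
SECONDARY_INDEX = "CREATE INDEX (customer_id) ON sales_fact"

QUERY = """
SELECT customer_id, SUM(amount) AS total_amount
FROM sales_fact
WHERE sale_date BETWEEN DATE '2025-01-01' AND DATE '2025-03-31'  -- partition elimination
GROUP BY customer_id
"""

with teradatasql.connect(host="tdhost.example.com", user="dbc_user", password="REPLACE_ME") as con:
    with con.cursor() as cur:
        cur.execute(DDL)
        cur.execute(SECONDARY_INDEX)
        cur.execute(QUERY)
        for row in cur.fetchall():
            print(row)

The primary index controls how rows are spread across the shared-nothing virtual processors, the RANGE_N partitioning lets date-bounded queries skip irrelevant partitions, and the secondary index is the kind of structure the reviewer describes rebuilding a table around when a join filters heavily on one column.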
 

Comparison Review

it_user232068 - PeerSpot reviewer
Aug 5, 2015
Netezza vs. Teradata
Originally published at https://www.linkedin.com/pulse/should-i-choose-net Two leading Massively Parallel Processing (MPP) architectures for Data Warehousing (DW) are IBM PureData System for Analytics (formerly Netezza) and Teradata. I thought talking about the similarities and differences…
 

Top Industries

By visitors reading reviews
NetApp Cloud Backup
Manufacturing Company: 16%
Computer Software Company: 14%
Government: 8%
Financial Services Firm: 6%
Teradata
Financial Services Firm: 26%
Computer Software Company: 11%
Manufacturing Company: 7%
Healthcare Company: 7%
 

Company Size

By reviewers
Reviewers are grouped by Large Enterprise, Midsize Enterprise, and Small Business; the percentage breakdown is not shown here, and no data is available for one of the solutions.
 

Questions from the Community

What's the 3-2-1 data protection that NetApp Cloud Backup offers?
Hi, the 3-2-1 data protection from this product is related to a backup strategy with the same name. I'm assuming you don't know about it so I'll tell you in a few words. In its essence, this backup...
Is NetApp Cloud Backup secure for backup?
I've just started using NetApp Cloud Backup but my initial reason behind choosing it in the first place is that they advertise their high-security approach. So basically, they give you ransomware p...
Is NetApp Cloud Backup expensive in your opinion?
It depends on how much exactly you count as expensive. For me, NetApp Cloud Backup isn't too expensive. I say that based on the services it provides and on the way it provides them. I think it's im...
Comparing Teradata and Oracle Database, which product do you think is better and why?
I have spoken to my colleagues about this comparison and in our collective opinion, the reason why some people may declare Teradata better than Oracle is the pricing. Both solutions are quite simi...
Which companies use Teradata and who is it most suitable for?
Before my organization implemented this solution, we researched which big brands were using Teradata, so we knew if it would be compatible with our field. According to the product's site, the comp...
Is Teradata a difficult solution to work with?
Teradata is not a difficult product to work with, especially since they offer you technical support at all levels if you just ask. There are some features that may cause difficulties - for example,...
 

Also Known As

NetApp Cloud Backup: No data available
Teradata: IntelliFlex, Aster Data Map Reduce, QueryGrid, Customer Interaction Manager, Digital Marketing Center, Data Mover, Data Stream Architecture
 

 

Sample Customers

NetApp Cloud Backup: Information Not Available
Teradata: Netflix