
What needs improvement with NetApp AltaVault?

Miriam Tover - PeerSpot reviewer

8 Answers

PM
Real User
Top 5 Leaderboard
Sep 27, 2023

The file-sharing capacity in the solution is currently 10 TB. Whenever that capacity is full, or close to full, I want the solution to increase it automatically without my having to intervene. That is the area I would flag for improvement.
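
As a rough illustration of the kind of automation the reviewer is asking for, here is a minimal sketch of a scheduled auto-grow check. The management endpoint, share name, and field names are hypothetical placeholders, not an actual AltaVault API:

```python
import requests  # the endpoints below are illustrative only

BASE_URL = "https://altavault.example.local/api"  # hypothetical management endpoint
SHARE = "file-share-01"
GROWTH_STEP_TB = 2      # how much capacity to add each time
THRESHOLD = 0.90        # grow when usage passes 90%

def check_and_grow():
    # Read current usage and capacity for the share (hypothetical endpoint).
    info = requests.get(f"{BASE_URL}/shares/{SHARE}").json()
    used_tb, capacity_tb = info["used_tb"], info["capacity_tb"]

    if used_tb / capacity_tb >= THRESHOLD:
        # Request a larger capacity instead of waiting for an admin to intervene.
        requests.patch(
            f"{BASE_URL}/shares/{SHARE}",
            json={"capacity_tb": capacity_tb + GROWTH_STEP_TB},
        )

if __name__ == "__main__":
    check_and_grow()
```

Run on a schedule, something along these lines would remove the need for manual intervention as the share approaches its 10 TB limit.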

Joseph Galvan - PeerSpot reviewer
Reseller
Top 10 Leaderboard
Jan 17, 2023

NetApp's own competing products complicate things. If I were taking a more modernized approach, I would probably be looking at FSx for ONTAP and Azure NetApp Files and going with a more cloud-native solution rather than hardware appliances. Where AltaVault will probably stick is with customers like those at the Port, which are so big that they need lots of buckets of active-tier and mid-tier retention plus very long-term retention, and with customers that cannot leverage cloud for that because of compliance reasons. While it's seemingly a strong product, some of NetApp's own products are a bit predatory toward it, and I'm just not seeing NetApp continue to invest in and develop it.

MF
Real User
Jan 5, 2022

The solution needs faster indexing.

Joseph Galvan - PeerSpot reviewer
Reseller
Top 10 Leaderboard
Dec 10, 2021

It's a good product, but it's becoming outdated, and NetApp is moving customers toward its other primary architecture.

AN
Real User
Top 20
Mar 1, 2021

There are still a few things around SharePoint that they need to correct in terms of how they back up and restore. That's something we had a challenge with; however, I know they've been correcting these problems, which is good. SharePoint backup is one of the things they need to look at.

Abbasi Poonawala - PeerSpot reviewer
Real User
Top 5 Leaderboard
Feb 22, 2021

One area that can be improved is how we define the different KPIs, in particular the business KPIs. I have my own in-house application for the business KPIs. For example, our retention policy is a period of seven years; I have to read these parameters from other applications, and I need them to integrate well. NetApp Cloud Backup Manager should help to integrate this seamlessly with other applications, meaning that it will populate the data for the different parameters. These parameters could be things like the retention period, the backup schedule, or anything else. It might be an ITSM ticket, where a workflow is triggered somewhere and the ITSM ticket has been created for a particular environment, like my development environment, an INT environment, or a UAT environment. This kind of process needs to integrate well with my own application, and there are some challenges. For example, if it allows for consuming RESTful APIs, that is how we would usually integrate, but there are certain challenges when it comes to integrating with our own application around KPIs, whether business KPIs or technical KPIs. What I want is to populate that data from my own applications.

We have the headroom in the KPI, and we have the throughput, the volumes, the transactions per second, and so on, which are all defined. These are the global parameters, and they affect all the lines of business. It's a central application that is consumed by most of the lines of business, and it's all around the KPIs. Earlier, it was based on Quest Foglight, an application that was taken up and customized. It was made in-house as a core service and used as a core building block. But our use of Quest Foglight has become a bit outdated. There is no more support available, and it has been there as a kind of legacy application in the organization for more than ten years now. Now it comes down to the question: is this an investment, or will we need to divest ourselves of it? So there has to be an option to remediate it out. In that case, one possibility is to integrate the existing application and have it completely decommissioned.

Here it would help if there were better ways of defining or handling the KPIs in the Cloud Manager, so that most of the parameters are not defined directly by me. Those would be the global parameters defined across all the lines of business. There are some integration challenges here, and I've spoken to the support team, who say they have the REST APIs, but the integration still isn't going as smoothly as it could. Most of the time, when things aren't working out, we need dedicated engineers to be brought in for the entire integration, and then it becomes even more of a challenge on top of everything. So if the Cloud Manager isn't being fed all the parameters from the backup strategy, such as the ITSM and incident tickets, the backup schedules, or anything related to the backup policies, then it takes a while. Ideally, I would want it to read those directly from our in-house applications. This has more to do with our own processes; it's not our choice to decide. The risk management team has mandated, as part of compliance, that we strictly enforce the KPIs, the headroom, and the rest of the global parameters defined for the different lines of business.

So if my retention period changes from seven years to, let's say, 10 or 15 years, those rules have to be strictly enforced. Ultimately, we would like better support for ITSM. ITSM tools like ServiceNow or BMC Remedy are already adding multiple new features, so they have to be upgraded over time, and that means NetApp has to provision for that and factor it in. Some AI-based capabilities are there now, and those have to be incorporated somehow. One last thing is that NetApp could provide better flash storage. Since they're already on block storage and doing well in that segment, it makes sense that they will have to step up when it comes to flash array storage and so on. I have been evaluating NetApp's flash array storage solutions against others, such as Toshiba's flash arrays and Fujitsu's storage arrays, which are quite cost-effective.
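
As a rough sketch of the kind of REST-based integration described above, the snippet below pushes a retention parameter from an in-house KPI application into a backup policy so the two systems never drift. The endpoints, hostnames, and field names are purely illustrative assumptions, not an actual NetApp Cloud Manager API:

```python
import requests  # both services below are hypothetical placeholders

KPI_APP = "https://kpi.example.internal/api"              # in-house KPI application
BACKUP_MGR = "https://backupmanager.example.internal/api"  # backup policy manager

def sync_retention(line_of_business: str) -> None:
    # Read the mandated retention period (e.g. 7, 10, or 15 years) from the KPI app.
    kpi = requests.get(f"{KPI_APP}/kpis/{line_of_business}").json()
    retention_years = kpi["retention_years"]

    # Push the same value into the backup policy so compliance rules stay enforced.
    requests.patch(
        f"{BACKUP_MGR}/policies/{line_of_business}",
        json={"retention_years": retention_years},
    )

if __name__ == "__main__":
    for lob in ["development", "INT", "UAT"]:
        sync_retention(lob)
```

The push could equally go the other way, with the backup manager polling the KPI application; the point is only that globally mandated parameters such as retention should have a single source of truth rather than being re-entered by hand.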

MF
Real User
Mar 16, 2020

In the past, there was a product called SnapProtect. If there were any sort of integration between this solution and the online arrays from that product, it would be perfect. It would allow for a lot of integration between on-prem and cloud, it would help customers with migrations, and it would make for a more complete platform. Many customers are looking for a single solution, or a single silo, from which they can manage everything. Instead of having different products, it would be nice if everything a client needed were in one solution.

PC
Real User
Jul 2, 2019

I would like to see more web support.
