Buyer's Guide
Cloud Access Security Brokers (CASB)
September 2022

Read reviews of Symantec CloudSOC CASB alternatives and competitors

Cloud Security & Governance at a financial services firm with 10,001+ employees
Real User
Integrates well and helps us protect sensitive information, but takes time to scan and apply policies and cannot detect everything we need
Pros and Cons
  • "The feature that helps us in detecting the sensitive information being shared has been very useful. In addition, the feature that allows MCAS to apply policies with SharePoint, Teams, and OneDrive is being used predominantly."
  • "It takes some time to scan and apply the policies when there is some sensitive information. After it applies the policies, it works, but there is a delay. This is something for which we are working with Microsoft."

What is our primary use case?

MCAS was onboarded for the purpose of detecting shadow IT. As the organization moved towards more SaaS solutions, we wanted to make sure that there was a way to monitor and govern the IT services coming up as shadow IT. We are a very big organization where a lot of services get onboarded, and some of them may go unnoticed. We wanted to detect shadow IT software being installed or shadow IT happening within a department or business unit.

We also wanted to make sure that the cloud access security broker provides a DLP kind of solution for Office 365. For example, if I am uploading a document with PI data, MCAS should scan it and make sure that the right classification is applied. When the right classification is applied, the document gets encrypted and the relevant information protection is applied. If the right classification is not applied, users are alerted so that they go and remediate the document, task, file, etc.
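
To make that workflow concrete, here is a rough, purely illustrative sketch of the decision logic described above. The function name, PI patterns, and label are all made up for illustration; this is not the actual MCAS policy engine or API.

```python
# Illustrative sketch only: hypothetical names, not the real MCAS/Defender for
# Cloud Apps API. It mirrors the workflow described above -- scan an uploaded
# document for PI data, check the sensitivity classification, and either apply
# protection or ask the owner to remediate.
import re

PI_PATTERNS = [r"\b\d{3}-\d{2}-\d{4}\b",   # example national-ID-like pattern
               r"\b\d{16}\b"]              # example card-number-like pattern
REQUIRED_LABEL = "Confidential"            # example classification (assumption)

def handle_upload(name, text, label, owner):
    """Return the action the policy would take for one uploaded document."""
    if not any(re.search(p, text) for p in PI_PATTERNS):
        return "no action: no PI data detected"
    if label == REQUIRED_LABEL:
        return f"apply encryption/information protection to {name}"
    return (f"alert {owner}: {name} contains PI data; "
            f"apply the '{REQUIRED_LABEL}' label or remediate")

print(handle_upload("payroll.xlsx", "ID 123-45-6789", "General", "alice@bank.example"))
```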

This is how we started with this solution last year. Going forward, as a strategic solution, we are also looking at using MCAS to govern the Office environment. We have started onboarding solutions like Microsoft Teams, SharePoint Online, OneDrive, and Exchange Online.

Our setup is a mixture of on-premises and cloud solutions. At this point in time, the major cloud providers are AWS and Azure, and we also have on-premises products such as Symantec DLP, Doc Scan, etc.

How has it helped my organization?

There are certain regulatory requirements in our bank for personal data and confidential information that need to be monitored from a security standpoint. It is a regulatory and standard requirement to have such a solution in place. 

MCAS is a dedicated solution for Office 365 and other productivity-related solutions, and it really helps to automate some of the processes. It would have been difficult for us to find a similar product. It gels well with some of the solutions or technologies that we have, especially with Microsoft Azure and Office 365.

From a security monitoring perspective, there is a productivity improvement and fewer human errors.

In terms of user experience, if users mistakenly include PI data or some other sensitive data, it can detect that and alert them. From that aspect, it is doing the job, but we are using it from a security standpoint. I'm more from a regulatory environment, and there are security requirements that are enforced by regulators. So, we cannot provide some of the end-user experience features, and there should always be a balance between the end-user experience and the security standpoint. MCAS is more of a backend security posture product. I won't position it as enhancing the user experience.

What is most valuable?

The feature that helps us detect sensitive information being shared has been very useful. In addition, the feature that allows MCAS to apply policies to SharePoint, Teams, and OneDrive is used predominantly.

It is a kind of unified solution. As compared to other solutions such as Netskope, Symantec, or McAfee, it provides a more unified reporting structure.

It also integrates with other technologies. We have Azure Information Protection, and it goes well with the solutions that we are already using.

What needs improvement?

It takes some time to scan and apply the policies when there is some sensitive information. After it applies the policies, it works, but there is a delay. This is something for which we are working with Microsoft.

It cannot detect all the things that are required as per our bank's standards. We are working with Microsoft to see how they are going to help us resolve this and, under NDA, which new features are coming, because we require a unified solution. We have other security solutions working on top of it, but we don't want to use multiple solutions and then end up with a human error. From a security perspective, the weakest link is human error. If certain features are monitored by MCAS, certain features are handled by Zscaler, and certain features are handled by Symantec DLP, it becomes difficult to synchronize from an operational standpoint. This is the situation we are in currently, but these issues come with new products or new cloud solutions. We have to slowly orchestrate and see how to unify the solutions. So, at present, it doesn't solve all the problems. There are many problems, but at least we have other solutions that are currently providing some mitigation.

It doesn't provide any way to scan Microsoft Teams when an external exchange of images is happening. You can always do the filtering on documents during the chat, but if there is an image, then some kind of OCR capability is required to detect it. At present, there is no way MCAS can go and detect those kinds of images and alert us. They could maybe integrate it with an existing OCR-capable product; this is something that we are absolutely looking into. There should also be a feature to immediately detect PI data being exchanged via chat.
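
For context, this is the kind of OCR step we have in mind. The sketch below uses an open-source OCR library purely as a stand-in to show the idea; it is not something MCAS provides today, and the PI pattern is just an example.

```python
# Sketch of the OCR-based detection we would like to see; pytesseract and the
# PI pattern below are stand-ins for illustration, not MCAS capabilities.
import re
import pytesseract
from PIL import Image

PI_PATTERNS = [r"\b\d{3}-\d{2}-\d{4}\b"]   # example PI pattern only

def image_contains_pi(path):
    """OCR an image shared in chat and check the extracted text for PI patterns."""
    text = pytesseract.image_to_string(Image.open(path))
    return any(re.search(p, text) for p in PI_PATTERNS)

if image_contains_pi("shared_screenshot.png"):
    print("Alert: possible PI data in an image shared via Teams chat")
```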

Its reporting capabilities could be better. Currently, to generate reports, you need to have Power Automate in place. If such capabilities were built into the product, it would be easier, because when we bring in Power Automate, we need to make sure that Power Automate also gets monitored from the DLP and governance standpoints. MCAS doesn't have many reporting capabilities, and it's really an operational nightmare to get all these things done at this point in time by using MCAS. These are some of the operational capabilities that our engineers require from this solution from the reporting perspective. Symantec and other solutions are more mature in this area. It could be because MCAS is still a relatively new product.
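
One workaround we have looked at (a minimal sketch, not our production setup) is to pull alerts directly from the Cloud App Security REST API with a small script instead of Power Automate. This assumes the tenant's portal URL, an API token generated in the portal, and the documented /api/v1/alerts/ endpoint with token authentication; verify all of these against your own tenant.

```python
# Minimal sketch: list alerts from the MCAS/Defender for Cloud Apps REST API for
# custom reporting. Endpoint, auth scheme, and response shape should be verified
# against your tenant's API documentation.
import requests

PORTAL = "https://mytenant.portal.cloudappsecurity.com"   # placeholder tenant URL
TOKEN = "..."                                              # API token from the portal

def fetch_alerts(limit=100):
    resp = requests.get(f"{PORTAL}/api/v1/alerts/",
                        headers={"Authorization": f"Token {TOKEN}"},
                        params={"limit": limit},
                        timeout=30)
    resp.raise_for_status()
    return resp.json().get("data", [])    # list endpoints wrap results in "data"

alerts = fetch_alerts()
print(f"{len(alerts)} alerts returned")
if alerts:
    print("sample record keys:", sorted(alerts[0].keys()))  # inspect before building the report
```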

For how long have I used the solution?

We onboarded Office 365 and cloud services less than two years ago. MCAS was one of the strategic, DLP-kind of solutions for Office 365 and other productivity products. Because the onboarding of cloud services happens in phases, not everything can be onboarded at the same time, and it requires the involvement of different security and project departments, MCAS was onboarded last year.

What do I think about the scalability of the solution?

From an enterprise perspective, it meets most of the interoperability requirements. So, scalability is there. I don't see an issue from the scalability perspective. Only some features are missing here and there.

Currently, it is almost serving the entire bank. In terms of the SaaS products that MCAS is monitoring and the number of users it is serving, we have onboarded around 40,000 users for Office 365 and other SaaS products. Eventually, it will be serving the entire bank, but at this point in time, it is only serving all Office 365 and SaaS product users. 

It is more of a cybersecurity solution for the bank to comply with all the security requirements and meet the security quotient. The end users don't see MCAS as a direct solution, but MCAS is providing security services for the bank behind all the services.

How are customer service and technical support?

We have proper help desk support. For example, if someone uploads a document that has PI data and there is an issue, it is highlighted to the user asking them to remediate it. The manager is also copied. The help desk takes care of such things. 

Once the solution is implemented, it is almost auto-run. From the support perspective, it is mostly about why a user got a particular alert, what was wrong with a document, etc. Such things are usually taken care of by the user, because users are responsible for what content they are allowed to load on a particular website, SharePoint site, or software. A robust change management process and help desk are already in place, and I don't see a big concern in this aspect.

Which solution did I use previously and why did I switch?

Previously, we didn't have any cloud products. We only had on-premises products. Our organization moved to the cloud around one and a half years ago, mainly because of the pandemic situation.

How was the initial setup?

It depends on the requirements. Certain requirements are really complex. The deployment itself is quite fast because MCAS is on the cloud, but there are a lot of requirements from the regulations and the bank's standards perspective.

It took us one week for the architecture and to decide things like whether we needed a reverse proxy. To gather all the requirements and get everything done in an enterprise environment, even a simple product like MCAS can typically take three to six months. That's because there are a lot of governance requirements, and we need to make sure there is no PI data and that the keys are encrypted somewhere in the user ID part.

In terms of the implementation strategy, at the high level, for Office 365 and SaaS solutions, we wanted a unified product to replace our existing one. From the strategy perspective, we wanted to go to the cloud. MCAS was able to integrate with most of our Office productivity tools. We procured the licenses and then went through the strategy of the bank and how the product can meet the needs. This was at a very high level. Of course, when we go into operations, we get operational challenges. That's why we need to have a longer time period to make a product coexist with the existing products.

What about the implementation team?

We have our own department, and they are trained in it. We also engage all sorts of vendors to provide us with the results. At least internally, we do not engage a third-party reseller or contractor.

It was more of an in-house implementation, but Microsoft helped us in coming up with a service design for Azure-related products including Office 365. Based on our requirements and infrastructure, they provided high-level architecture and design documents and told us about the things to be included or considered. We took that service design document and built our operations based on that and got it to work. So, the service design came from Microsoft, but hands-on was by our bank.

In terms of maintenance, this is actually managed by security folks and cybersecurity services. Currently, it is being managed by three people. There are only three operators. Of course, when there are new things to be implemented and new policies to be created, it goes to engineering. For changes, we need one more person on average. So, there are a total of four people.

What was our ROI?

I can't give a specific number. One of the returns on investment is that we will soon be getting rid of our on-premises infrastructure and maintenance. The CapEx costs and repeated hardware refresh cycles are gone. From that perspective, there are savings. All we need is the skill set to maintain and manage a particular cloud access security broker. Today, we have four people, and tomorrow, it could be eight people because of the increase in the number of applications. The bottom line is that we will get rid of all operational issues in terms of patching and fixing different systems. We don't have to patch the Windows systems, Linux systems, etc. All of these are taken care of and maintained in the cloud.

What's my experience with pricing, setup cost, and licensing?

I'm not totally involved in the pricing part, but I think its pricing is quite aggressive, and its price is quite similar to Netskope. 

Netskope has separate licensing fees or additional charges if you want to monitor certain SaaS services, whereas, with MCAS, you get 5,000 applications with their Office 365. It is all bundled, and there's no cost for using that. You only have the operational costs. In the country I am in, it is a bit difficult to get people with the required skill sets.

Which other solutions did I evaluate?

I have been here for just around one year. When I came, they were already using MCAS. In my previous organization, I made the decision to use MCAS for Office 365. For the entire cloud, I decided to use a dedicated cloud access security broker like Cisco. It really depends on the organizational requirements and how they want to size their IT department.

There are pros and cons. If you are totally on Microsoft products, MCAS has an integration. Otherwise, there are other products that may work better. Of course, you may still be dependent on some APIs from the cloud providers. It really depends on the organization's strategy.

What other advice do I have?

My advice would be that an organization should assess where they are today and then map out what they want from a cloud access security broker product. After that, they should decide whether MCAS or another product meets their requirements. This is important because you may have all the things in terms of interoperability, and a solution may be the best fit from an operational perspective, but if all of the requirements are not met, you may end up using multiple products. Therefore, an organization must assess its current IT infrastructure, where it wants to go, and what the key requirements are from a regulatory and IT governance standpoint. They also have to make sure the right skill set is available in the market. For example, in Singapore, if I want to implement Google Cloud, the available skill set is much smaller compared to the skill set for AWS.

From a vendor perspective, you should assess the reputation of the vendor and what kind of capabilities the vendor provides. For example, it's very obvious that Microsoft is very good at integrating its own products. They have now also started to integrate with others. These are some of the aspects you should consider before making a decision between product A and product B. There is no magic silver bullet.

From a security standpoint, overall, it has satisfied 80% of our requirements in terms of regulatory and bank standards. For 20% of our requirements, we still need additional products or features. They are currently not really there, and we are trying to find a solution for those gaps. In general, MCAS has a long way to go. It is definitely a good product that integrates with the Office 365 suite very well, but from a capability perspective, other products such as Skyhigh, McAfee, or Symantec have more features. It has the potential. A lot of features are lined up in MCAS, and eventually, they'll be there. These features are mentioned on Microsoft's website, and they are in development. I am looking forward to those.

In terms of data governance, we have a very good tool, and we just need to focus on how to govern the data, DLP policies, etc. We don't have to bother about the physical data center, physical network, or physical hosts. The entire layer below the server is gone, and we just have to focus on the identity and security aspects. We just need to focus on what kind of security we need to put in place and which policies we need to implement. We get better visibility by focusing on the key client endpoints using MCAS. The team is now really focused. Previously, every day, teams used to come up with issues like, "The network has this problem. The data has this problem, and the host has this problem." Now the focus is, "Hey, this MCAS DLP isn't doing the job." The focus is more on the product's capability.

I would rate Microsoft Cloud App Security a seven out of 10.

Disclosure: I am a real user, and this review is based on my own experience and opinions.
Olivier DALOY - PeerSpot reviewer
Group Information Systems Security Director - CISO at Faurecia
Real User
Secures users wherever they are and enables us to inspect SSL traffic, but we encountered too many issues
Pros and Cons
  • "The fact that it is a cloud proxy solution is another feature we like. For example, if you acquire a new company, you can use it to protect that new company without the need to install anything physically on their networks."
  • "We are now transitioning to another solution. The main reason for that is that managing all of the exceptions and troubleshooting all of the issues our users have had connecting to the internet has become too significant in terms of workload, compared to what we hope we will have with another solution."

What is our primary use case?

We use it to secure the internet connection of all of our users, ensuring that they can connect as transparently as possible to all of the websites that are, of course, not hazardous. And anything hazardous is prevented as much as possible.

How has it helped my organization?

We were looking for an isolation solution so that there would be no impact at all on the systems that we are responsible for protecting. We didn't want to wait until a first attack was successful and then find out what the impact was and how we should react to it. That's why we chose Menlo. Either you have access to something or don't have access to it. And if you do, we can ensure, 100 percent of the time, that there is nothing malicious that is going to impact our system in any way. And that's for the on-prem users who are connected to the corporate offices, as well as for the users who are roaming.

The primary benefit is that it secures users wherever they are, whether they are roaming, or they are using their PC at home, at work, or at the airport. We are able to do that, and we are even able to do it with companies that we recently acquired.

Another move forward was that we started inspecting SSL traffic, which was something we were not inspecting before. We were closing our eyes to what was happening in 98 percent of the traffic because it was encrypted. Today, we are not closing our eyes. Menlo enabled us to inspect more traffic and avoid relying on traffic that can clearly be hazardous. That may be one of the reasons we discovered new use cases that were difficult to test before and that we have had issues configuring Menlo to handle.

Another advantage is the ability to produce reports that help us to understand what our users are doing, even within the website. For example, are they posting files or are they downloading files? That is clearly an ability that we acquired with the solution as well.

And when it comes to isolation, we haven't seen any threats that have succeeded in coming in through Menlo. I have evidence, of course, that in some cases we were infected by malware, but it was not able to avoid Menlo's protection and connect back to the internet to get instructions from the command and control service. We have clearly demonstrated that those threats just cannot harm us.

What is most valuable?

The isolation is one of the most valuable features.

The fact that it is a cloud proxy solution is another feature we like. For example, if you acquire a new company, you can use it to protect that new company without the need to install anything physically on their networks. 

Also, the ability to rewrite the links in emails so that nobody can connect to a link without going through Menlo's protection is something we have found very valuable. 
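
As a rough illustration of that link-rewriting idea (this is not Menlo's actual URL scheme, just the general technique, with a hypothetical gateway address):

```python
# General link-rewriting technique: every href in an email is wrapped in a
# gateway URL carrying the original destination, so the click goes through the
# proxy's isolation first. The gateway address is hypothetical.
import re
from urllib.parse import quote

GATEWAY = "https://safelinks.example.com/redirect?url="    # hypothetical gateway endpoint

def rewrite_links(email_html):
    """Wrap every href in the email body with the gateway URL."""
    def _wrap(match):
        original = match.group(1)
        return f'href="{GATEWAY}{quote(original, safe="")}"'
    return re.sub(r'href="([^"]+)"', _wrap, email_html)

print(rewrite_links('<a href="https://unknown-site.example/invoice">Open invoice</a>'))
```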

The reporting feature, which involves a kind of programming language to query the logs or the data from the Menlo console, is also something we consider to be quite useful.

What needs improvement?

The solution should have no impact but it does have a bit of impact on end-users. For example, we encountered some issues in the downloads that took longer than they did without using Menlo. That is clearly not transparent for users. We expected not to have any latency when downloading anything from the internet with Menlo compared to without Menlo.

We are now transitioning to another solution. The main reason for that is that managing all of the exceptions and troubleshooting all of the issues our users have had connecting to the internet has become too significant in terms of workload, compared to what we hope we will have with another solution. In other words, we hope to get the same level of protection, while reducing the number of visible bugs, issues, latencies, impacts on performance, et cetera, that we have today with Menlo. We already solved most of them, but we still have too many such instances of issues with Menlo, even though it is protecting us for sure.

The weak point of the solution is that it has consumed far too much of my team's time, taking them away from operations and projects and design. It took far too much time to implement it and get rid of all of the live issues that we encountered when our users started using the solution. The good point is that I'm sure it is protecting us and it's probably protecting us more than any other solution, which is something I appreciate a lot as a CISO.

But on the other hand, the number of issues reported by the users, and the amount of time that has been necessary for either my team or the infrastructure team to spend diagnosing, troubleshooting, and fixing the issues that we had with the solution was too much. And that doesn't include the need to still use our previous solution, Blue Coat, that we have kept active so that whatever is not compatible or doesn't work with Menlo, can be handled by that other solution. It is far too demanding in terms of effort and workload and even cost, at the end of the day. That is why we decided to transition to another solution.

If we had known in the beginning that we would not be able to get rid of Blue Coat, we probably would not have chosen Menlo because we were planning to replace Blue Coat with something that was at least able to do the same and more. We discovered that it was able to do more but it was not able to replace it, which is an issue.

It is not only a matter of cost but is also a matter of not being able to reduce the number of partners that you have to deal with.

In addition, they could enhance the ability to troubleshoot. Whenever a connection going through Menlo fails for any reason, being able to troubleshoot what the configuration of Menlo should be to allow it through would help, as would knowing what level of additional risk we would be taking with that configuration.

For how long have I used the solution?

We have been using Menlo Security Secure Web Gateway for two years.

What do I think about the stability of the solution?

Now, the stability is quite good. I would rate it an eight out of 10.

What do I think about the scalability of the solution?

We have it deployed worldwide, in about 300 locations.

In the case where we acquired a new company with a significant number of systems, the ability to deploy Menlo to all of them, even if we were talking about 40,000 people, would not be an issue at all. 

One thing which could be a real issue is the ability of the solution, within the development plan of Menlo, to fit our needs. This is what led to our decision to remove Menlo.

Which solution did I use previously and why did I switch?

We were using Blue Coat Systems before. First, that was clearly not protecting users who were at home or roaming. Second, it was not possible to use it to protect companies that we acquired until they confirmed that they were going to implement Blue Coat appliances on their networks. So Menlo was a huge move forward.

How was the initial setup?

The initial setup was complex from the beginning, and even once it was in operation. We even needed to have an on-prem meeting with my team in charge of the implementation and the techs from Menlo to determine the best configuration settings to make it work and avoid issues as much as possible (which we still had afterward). It is not at all simple to deploy.

We had between five and 10 people involved in the setup. They were in charge of operations, meaning any changes to or troubleshooting on equipment that was live. Others were in charge of the implementation of this type of system, including defining the proper architecture and configuration and adapting and tuning the configuration.

A couple of years later, we still had a significant number of open tickets with their help desk due to issues connecting through Menlo.

It is deployed on the cloud. We were planning to use Menlo on-prem in China, but we are rerouting the traffic from China to Hong Kong and going from Hong Kong to the internet.

The maintenance is not lightweight. I don't know what portion of the time that we were spending on the tool was due to maintenance and what part was due to new issues that were raised by our users. The maintenance is a split responsibility between the local IT operations people and the people from my team.

What about the implementation team?

Our experience with their consultants was very good. 

Our only issue is that we kept asking them how they managed, with their other customers, the issues we were encountering. An area for improvement would be that when they meet their customers, they shouldn't let them think that they're troubleshooting something for the first time. There is no reason that they wouldn't have seen something similar with another customer.

They were not leveraging the experience they had with other customers enough to anticipate and prevent the issues on our networks; or, at least, when they happened, to solve them much quicker than they would have if they had never been seen before. We consider that as a lack. They need to learn how to let other customers benefit from the experience they had with us.

What was our ROI?

We haven't seen a decrease in the number of security alerts that our security ops team has to follow up on, but we were not even able to measure that before deploying Menlo. It's very hard to demonstrate the return on investment by looking at the decrease in the number of incidents compared to before, as we had nothing before that was truly able to demonstrate to us what was really happening. 

If we had implemented a solution from a Menlo competitor before, and we were moving to Menlo, that would have enabled us to compare both solutions. That is something we are going to do after we transition from Menlo to Skyhigh Security, even though the alerts will not, of course, have occurred at the same time. We will be comparing things that are a couple of months, or years, apart. We will try to demonstrate the different levels of protection provided by Menlo compared to Skyhigh. But that will happen half a year from now.

What's my experience with pricing, setup cost, and licensing?

The pricing is good. We were convinced that it was the right price for such a solution at that time. Again, we didn't know that we would have to keep Blue Coat. At that time, we were thinking that we would be able to get rid of Blue Coat, and for that reason, the price would be good.

Which other solutions did I evaluate?

We evaluated several other solutions, including Zscaler and the complete portfolio of Symantec as well.

We went with Menlo because of the connection to the execs of Menlo and the ability to talk to them. The size of the company, compared to Symantec, was definitely a factor, but the ability to get in touch with the right people as quickly as possible, and trust their strategy and their level of protection, were important. The ability to get a contract where they commit to protecting, 100 percent, against any threat, as long as you use isolation, was a clear improvement for us. And the fact that it was a cloud proxy solution, was another part of the decision.

What other advice do I have?

My advice is to pay attention to all of the use cases you have and try to understand what Menlo is or isn't addressing so that you don't discover that you still need to keep an old technology that may even be outdated. To do that, you need to be very clear about your use cases and how you will cover them with Menlo or if Menlo will not cover them.

While the solution provides a single console for security policy and management, which is an interesting feature, as long as you're able to connect through APIs to all your SaaS solutions, the fact that you use the very same SaaS solution or not is probably less important. I'm not saying it is not important that Menlo has a console, but it's a bit less important if you're using an orchestration automation solution. We also have Palo Alto Cortex XSOAR that we are using to automate and orchestrate.

Regarding the fact that Menlo secures the web, email, SaaS, and private applications, the latter, private applications, is very important, as is email, although probably less so. The magnitude of risk is higher for private applications that are exposed without protection on the internet. It depends on the use cases that you are looking to cover. If, for example, you don't have any private applications that you need to expose, then of course that type of protection is not important at all, but you still receive emails within which you need to rewrite the links. If you have both requirements, meaning a bunch of private applications that are exposed plus emails for which you need to rewrite links, in that case, rewriting the links is probably less important than ensuring the protection of your private applications.

It doesn't make sense to only perform partial protection. Everything you implement to secure the connections and the assets you are responsible for should, at some point, merge together. It should be SD-WAN and web gateways and probably even CASBs and email protection. All of that probably will tend to merge together and you can look forward to reducing costs and the number of partners.

Don't look at it as: "I have a new need, I want a new solution," because if you do that, you will end up with a huge number of vendors and solutions on your systems and it's going to be super difficult to ensure that you manage all of that consistently. Whereas if you really have a vendor that is at least addressing, if not all the possible needs, at least all of your needs, and you are able to manage that in a consistent way, even if you have to program something in your orchestration solution, you will be able to manage all of it in a consistent way and in a timely manner.

Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
Max Islam - PeerSpot reviewer
Associate Director at Cognizant
Real User
Top 20
Integration with Palo Alto platforms such as Cortex Data Lake and Autofocus gives us visibility into our attack surface
Pros and Cons
  • "Security is absolutely spot-on, really top-notch. It's the result of all the components that come together, such as the HIP [Host Information Profile] and components like Forcepoint, providing end-user content inspection, and antivirus. It incorporates DLP features and that's fantastic because Prisma Access makes sure that all of the essential prerequisites are in place before a user can log in or can be tunneled into."
  • "It's not really Prisma's fault, but when you try to create exceptions you don't really have those abilities. You cannot say, on the management platform, "Hey, for these users I want to create these exceptions." That is one thing that I have gotten some complaints about, and we have faced some challenges there."

What is our primary use case?

We could write a book about our use cases. It provides best-of-breed optimization in CASB and SASE together. Our primary use case is enabling users from all walks of life, and all over the planet, to have remote access in the most optimized way.

Prisma Access is a SASE-oriented solution, making it a hybrid SaaS offering. Of course, it's built on Google's high-capacity backbone, but it is provider-neutral.

How has it helped my organization?

With the centralized remote access solution we had before, F5, we used to see a lot of latency and a lot of intermittent disconnects. But our people have reported that they like Prisma Access so much better in terms of speed and how it operates. The user experience is so much better in terms of throughput. They don't see as much lag. Of course, there are users who don't have the most stable internet connection, but even for those users, by optimizing data reduction, it works better. We can't really help users who have some sort of wireless connection, because if their underpinning link is not good, this overlay won't do much. But for users who are using a satisfactory type of connectivity, even for people who are on 10 Mbps, it works well.

In addition, from an application accessibility standpoint, the integrated features that come with the QoS mean you can choose what types of applications get higher priority than others. It optimizes applications for QoS prioritization.

What is most valuable?

At the end of the day, the most valuable feature of Prisma Access is user accessibility and performance. For us, it all comes down to how well this product performs.

In addition to that, we feel that the security is absolutely spot-on, really top-notch. It's the result of all the components that come together, such as the HIP [Host Information Profile] and components like Forcepoint, providing end-user content inspection, and antivirus. It incorporates DLP features and that's fantastic because Prisma Access makes sure that all of the essential prerequisites are in place before a user can log in or can be tunneled into. Until these requirements are met at a satisfactory level, it doesn't let you in. Once users are onboarded, they are going through Palo Alto's firewall inspection. Users' traffic is encapsulated and inspected well. It gives us the flexibility to apply various policies and inspections. All of these come into play and give us peace of mind that this platform is best-in-class in terms of security features and tool integration.

The architecture is essentially a fabric-type SASE-based architecture. From a technical leadership standpoint, we are very pleased and satisfied with how efficient the product is, especially, again, when it comes to security.

One of the features that we really like in Prisma Access is its integration capabilities with Palo Alto's other platforms such as Cortex Data Lake. The best thing about it is that it gives us visibility and clarity. We can say, "This is what our threat metrics framework looks like. Yesterday we had this many potential threats, and out of that, this many have been fended off or mitigated." It gives us a really good single pane of glass that tells us what our attack surface looks like and how things have been mitigated. It gives us data that we can utilize for the benefit of our users and our senior executives.
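
As a purely hypothetical illustration of the kind of daily summary we mean (the records and field names below are invented, not Cortex Data Lake's schema):

```python
# Invented data, just to show the shape of the daily "threats vs. mitigated" view.
from collections import Counter

alerts = [
    {"day": "2022-09-12", "verdict": "blocked"},
    {"day": "2022-09-12", "verdict": "blocked"},
    {"day": "2022-09-12", "verdict": "needs review"},
]

by_verdict = Counter(a["verdict"] for a in alerts)
print(f"{sum(by_verdict.values())} potential threats yesterday; "
      f"{by_verdict['blocked']} mitigated, {by_verdict['needs review']} pending review")
```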

From a user standpoint, it's very easy and very usable. Our users have used F5's products and it's not much different. There can be intricacies in that you have to have your laptops' antivirus protection updated, but that's not a big deal. Those are the types of things that users have to comply with anyway.

Traffic analysis, threat prevention, URL filtering, and segmentation are some of the features that come with Palo Alto itself. On the cloud controller platforms you have the ability to enforce controls, including things like the application layer inspection, granular policy constructs, as well as app-ID-based and application layer inspection. The inspection engines, such as the antivirus, malware, spyware, and vulnerability protection, are integrated into Palo Alto's cloud services platform. These features are quintessential to our entire cloud services security fabric. Users are users. You never know what's going to happen to a user. If somebody goes to Madagascar or to Bali and gets compromised, it is our job to protect that user and the organization. All of these interrelated features come into play for those purposes.

What needs improvement?

The challenges we have faced are not connected with Prisma's core fabric, but more with the end-user. To use the GlobalProtect client and meet all the requirements, your laptop or your end-user system has to be at a point where things are up to date. It's not really Prisma's fault, but when you try to create exceptions you don't really have those abilities. You cannot say, on the management platform, "Hey, for these users I want to create these exceptions." That is one thing that I have gotten some complaints about, and we have faced some challenges there.

It's always a challenge when people at the executive level start complaining because they're using the latest version of the MacBook Pro and it's not playing very well with Prisma.

For how long have I used the solution?

I used the predecessor to Prisma Access, which was GlobalProtect Cloud Services and I have been using Prisma Access for a good two years.

How are customer service and support?

I wouldn't call their technical support a pain point, but they need to improve it. That is one of the biggest drawbacks.

How was the initial setup?

It was pretty straightforward at the PoC level. But the rollout of something like this across an enterprise is never like a one-shot thing. We went through some bumps and bruises and roadblocks along the way, but, overall, it was a pretty straightforward path.

The entire onboarding took around four months for our approximately 20,000 users.

On a day-to-day basis, we have security engineers and SMEs managing the platform. But there are not as many intricacies and challenges as there are in some of the other products that we deal with. From administrative, operational, and management standpoints, the way Prisma has let us do it, things are pretty efficient.

What about the implementation team?

We used Palo Alto's professional services.

What's my experience with pricing, setup cost, and licensing?

It's pricey, it's not cheap. But you get what you pay for.

My most crucial advice to colleagues who are looking to purchase this product would be to look at it from a 50,000-foot point of view, and then narrow it down to 40,000, 30,000, 20,000, and 10,000. The reason I say that is because, at the 50,000-foot view, the executives care about the pricing and the costing model; it's all about budget and how they can save the organization money.

If you are in a high-end organization, this is the product you had better get, hands-down. If you are an executive at a highly visible bank, please get your head out of the sand and see what is best for your organization. If you are a manufacturing company that doesn't need this level of integrative security, go get something else, something cheaper, because you don't need this extensive level of security controls and throughput. But if you want to get the best-of-breed, then Palo Alto's product is what you should definitely get.

Which other solutions did I evaluate?

Our journey with Prisma Access started out with a battlecard comparison of what Prisma Access had to offer versus what ZPA [Zscaler Private Access], Symantec, and F5 had to offer. In doing all of these comparisons, we realized that Palo Alto had built a cloud services fabric that is user-first and security-first.

If I compare Zscaler and Prisma Access, not all of the security controls that are in place with Zscaler are inherent to their own fabric. Zscaler has done a fantastic job with ZPA in terms of putting the components together. But when it comes to security enforcement, they are lagging behind on some things. One of them is the native security control component enforcement on their fabric. We feel that it is not done as efficiently as Prisma Access does it.

In a simple scenario when doing a side-by-side comparison, if we were onboarding and providing access to an end-user using ZPA, they would be able to get on and do their job fine. But when it comes to interoperability, cross-platform integration, and security enforcement, we feel that ZPA lacks some of the next-gen, advanced features that Prisma Access has to offer. Prisma Access provides us with cross-platform integration with things like Palo Alto AutoFocus and Cortex Data Lake, which is great. ZPA does not provide all of these extensive security features that we need. In a side-by-side comparison, this is where Prisma Access outshines its competitors.

With all of that in mind, the big question in our minds was, "Well, can you prove it?" PoCs are just PoCs. Where the rubber meets the road is when you can prove your claims. Palo Alto said, "Okay, sure. Let us show you how you can integrate with your existing antivirus platform, your existing content filtering platform, and your existing DLP platforms." We gave it a try. And then, we did various types of pen testing ourselves to see if it was really working the way they said it would. For example, could you take an encrypted file and try to bypass the DLP features? The answer was no. Prisma Access made sure that all of the compensating controls were not only in place but also being enforced. "In place" means you have a security guard, but you have told him to just keep a watch on things. If you have a robbery going on, just watch and don't do anything. Let the robbers do whatever they want. Don't even call the police. Prisma Access doesn't just watch, it calls the police.

What other advice do I have?

There are some encrypted traffic flows that you're not supposed to decrypt and intercept, but even for those we have constructs that give us at least some level of inspection. Once tunnels are established, we have policies to inspect them to a certain extent. We try to make sure that pretty much everything that needs to be inspected is inspected. All of this comes down to accountability and to protecting our users.

Organizations with a worldwide footprint and distributed-services architecture require best-in-class security. Health organizations and pharmaceutical companies also do, because they are dealing with highly sensitive patient data or customer data. Organizations like these that have public, internet-facing web applications, need top-of-the-line security. Prisma Access, from an interoperability standpoint, addresses the big question of how well their web-facing applications are protected from potential malicious attacks. And the answer is that it is all integrative, all a part of a fabric with interrelated components. It protects the users who are accessing the corporate network and the corporate network from any potential risk from those users. Prisma Access gives us the ability to design architectural artifacts, like zones and segments, that really make for effective protection for web-facing components and internal applications.

In terms of Prisma Access providing all its capabilities in a single, cloud-delivered platform, not everything gets on the cloud. You cannot take a mainframe and put it on the cloud. You have to understand the difference between Prisma Access and Prisma Cloud. Prisma Access is all about user accessibility to enterprise networks in the most secure way possible. Prisma Cloud is the platform to integrate various cloud environments into a unified fabric.

As for Prisma Access providing millions of security updates per day, I don't know if there are millions, but it is important. We take advantage of some of the automated features that Palo Alto has provided us. We try not to get into the granular level too much because it increases the administrative overhead. We don't have the time or the manpower to drill into millions of updates.

Disclosure: I am a real user, and this review is based on my own experience and opinions.
Network & Information Security Expert at Malam-Team
User
Offers excellent anti-malware, URL filtering, and anti-ransomware features
Pros and Cons
  • "It's improved the security of every single OS in the organization as well as the visibility and security capabilities."
  • "More report and alert options would be useful."

What is our primary use case?

We use the EDR solution for servers and endpoints for a lot of customers. The use case is for offering protection at the OS level. 

We wanted a better solution than legacy antivirus to secure each OS in the organization. Harmony Endpoint gives us a complete security package with a lot of security features that would normally require a lot of separate security products and a lot of management overhead.

The environments include on-premise servers - mostly Windows - as well as laptops and desktops with Windows and Mac OS. We also have some cloud services in Azure and AWS.

How has it helped my organization?

It's improved the security of every single OS in the organization as well as the visibility and security capabilities. With Harmony Endpoint, we give each computer advanced anti-malware protection and internet browsing protection (like proxy protection), and advanced phishing protection inside websites. 

It takes care of the concern about ransomware. Today, it's more important to secure each endpoint in the organization at the OS level rather than the organization network level as users are connecting from everywhere. This is why Harmony is so important to us.

What is most valuable?

The solution offers very good features including anti-malware, URL filtering, and anti-ransomware. The product offers a complete solution in one package and it's on every single OS. 

The most valuable part of this product is the complete security package in one single endpoint that includes the legacy anti-virus protection, advanced anti-malware protection, browsing protection, and even firewall capabilities at the OS level. 

In a lot of cases, when we want to give all these security features to every endpoint, we need to implement a lot of separate security products.

What needs improvement?

More report and alert options would be useful. The reports are not good enough and alerts are not usable. 

We need more user-friendly alerts and more options for the alerts. The reports are not capable of giving important information from some parts of the system - like inventory details, etc. 

Also, the logs in the product are not very usable. If a legitimate app gets blocked or you have some other problem, you will have a hard time finding a log about it, and most of the time you will not find any information.

The product doesn't have an automatic shutdown switch. You must uninstall it in order to shut it down.

For how long have I used the solution?

I've used the solution for about one year.

What do I think about the stability of the solution?

It's very stable. However, they still need to fix some bugs and address outstanding feature requests.

What do I think about the scalability of the solution?

It's a cloud solution. We are using the cloud-managed solution which makes it very scalable.

How are customer service and support?

The solution offers the best customer service and support in the market.

How would you rate customer service and support?

Positive

Which solution did I use previously and why did I switch?

I used Symantec and we wanted to move forward to an EDR solution that gives a more complete security solution for today's needs.

How was the initial setup?

The initial setup is very straightforward.

What about the implementation team?

We implemented it in-house. We learned how to do it by ourselves.

What's my experience with pricing, setup cost, and licensing?

There are only two types of licenses. If you don't need the sandbox features, you can take the basic license, which includes everything else.

Which other solutions did I evaluate?

We tried SentinelOne, CrowdStrike, Microsoft, Trend Micro, and McAfee.

What other advice do I have?

It's the perfect solution for endpoint protection and has a lot of features included.

Which deployment model are you using for this solution?

On-premises

If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?

Other
Disclosure: I am a real user, and this review is based on my own experience and opinions.