What is our primary use case?
I'm working for a telecommunications client. We are primarily using BlazeMeter's Mock Services, along with performance testing and API monitoring. Each of these functions is used on a different project.
How has it helped my organization?
One of the projects for this client is completely based on Mock Services and, with the help of that functionality, we are able to complete end-to-end testing with all the dependent components.
There are third-party suppliers who provide components that need to be tested, but most of the time we do not get those components on time. Normally, we would wait until development was completed and only then start automation. But now, with the help of the Mock Services, once we get the source code we upload it directly into the Mock Services and into the API monitoring. Once we get the tracers, we are ready to point to the actual instances and test things immediately. By the time we get to that stage, we are ready for the market. That means that even before getting the complete component, we can start working with something. There is no need to wait. As a result of the Mock Services, the time it takes us to develop API automation is minimized.
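To illustrate what a mocked dependency amounts to, a mock service is essentially a small HTTP endpoint that returns canned responses so the rest of the flow can be tested before the real component arrives. This is only a conceptual Python sketch, not BlazeMeter's own mechanism, and the endpoint path and payload are made up:

```python
# Minimal stand-in for a dependent component that has not been delivered yet.
# The path and payload are hypothetical; a real mock would mirror the
# supplier's API contract.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

CANNED_RESPONSES = {
    "/billing/v1/account/12345": {"accountId": "12345", "status": "ACTIVE"},
}

class MockHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = CANNED_RESPONSES.get(self.path)
        self.send_response(200 if body else 404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(json.dumps(body or {"error": "not mocked"}).encode())

if __name__ == "__main__":
    # Tests point at http://localhost:8080 until the real component is ready.
    HTTPServer(("localhost", 8080), MockHandler).serve_forever()
```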
Also, we had a number of performance testing tools, but we had to integrate third-party applications to generate reports. That was a pain point for us when it came to showcasing things to stakeholders so that they could be aware of what was happening. But now, everything is available once the performance testing is completed. We can immediately see the reports, and when a test is running, I can share the execution page with anyone else as a link. That means they can view exactly what is happening, moment by moment, regarding the response time, request time, and latency. That feature is not available in some of the other applications, and possibly not in any of them.
One of the areas where BlazeMeter benefits us is our test cycle times. In the past, if there was a defect with a component, we would have to wait until the issue was fixed. Even if we were not testing that particular component, we would have to wait for the fix because of the dependency on it. If that went beyond the deadline for the release cycle, we would leave that test case for the next release.
With the help of the Mock Services, we are overcoming all of that. Wherever we face issues, whether long-term or temporary, we can implement Mock Services to bypass the faulty components that are not needed for our particular testing. In that way, we are able to reduce our cycle times. In addition, we have some physical devices and network devices in our testing. It takes a week to create physical devices in a virtual way. Instead, with the Mock Services we can create them in a minute, and that helps our end-to-end testing to be completed on time. The benefit of BlazeMeter's Mock Services is that it lets us get through our testing early in the cycle.
In a single line of business, in a particular call flow, if we have 1,000 test cases per release, 100 to 200 of them are handled with the help of the Mock Services. That saves us time, money, and manpower.
And before we had BlazeMeter's API monitoring, if there were 10 components and anything was down, we would not be aware of it. We couldn't send a heartbeat to all of the components every second to check whether they were up or down. The API monitoring is a real benefit for us because we are able to schedule it for every 30 minutes or every hour, and we can keep on monitoring a component. If there is a failure, we are immediately notified by email, even on the weekend. We can take action and report the situation to the data analyst and to the component people so that they can immediately work on fixing it.
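Conceptually, this kind of scheduled check comes down to polling an endpoint at an interval and alerting on failure. BlazeMeter handles the scheduling and email notifications itself; the following is only a rough Python sketch of the idea, with the URL, interval, and mail settings as hypothetical placeholders:

```python
# Conceptual sketch of a scheduled health check with email alerting.
# URL, interval, and SMTP details are hypothetical placeholders.
import smtplib
import time
import urllib.request
from email.message import EmailMessage

COMPONENT_URL = "https://example.internal/api/health"  # hypothetical endpoint
CHECK_INTERVAL_SECONDS = 30 * 60                        # every 30 minutes

def notify(reason: str) -> None:
    msg = EmailMessage()
    msg["Subject"] = f"Component check failed: {reason}"
    msg["From"] = "monitor@example.com"
    msg["To"] = "oncall@example.com"
    msg.set_content(f"{COMPONENT_URL} failed at {time.ctime()}: {reason}")
    with smtplib.SMTP("mail.example.com") as smtp:      # hypothetical mail host
        smtp.send_message(msg)

while True:
    try:
        with urllib.request.urlopen(COMPONENT_URL, timeout=10) as resp:
            if resp.status != 200:
                notify(f"HTTP {resp.status}")
    except Exception as exc:  # network error, timeout, etc.
        notify(str(exc))
    time.sleep(CHECK_INTERVAL_SECONDS)
```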
The API monitoring is one of the best tools we have come across because of the scheduling and the results. We are able to analyze how stable a component is based on that monitoring.
What is most valuable?
The most valuable features for us are the API monitoring and the Mock Services.
Another good thing is that we can upload JMX files and schedule and monitor performance testing. We are able to share results and see reports that we can't get in JMeter. In that way, the performance testing is good.
In terms of the range of test tools, when there are API calls we can do automation testing, functional testing, and performance testing, and use the Mock Services to simulate a situation in which the APIs are down. We are able to handle everything that has to do with APIs. Whatever we have to test, the functionality or the behavior, we are able to do so with the help of BlazeMeter.
What needs improvement?
One problem is that, while we are executing a test, it takes some time to download data. Let's say I'm running a performance test with a high-end load configuration. It takes a minimum of three minutes or so to start the test itself. That's the bad part of the performance testing.
I don't think they can reduce that time because that's the functionality they have implemented in BlazeMeter's performance testing. But it's a pain point whenever we are running performance testing in a call or a demo, as well as in our live testing when all the business people are there.
The first time I run a given test, if it takes three minutes to download onto my server, that's understandable. But every time I rerun the same test, it is downloaded again, because once the test is completed the files that were downloaded are removed. That means I have to wait for three to four minutes again.
We also had a call last week regarding secret keys. In our JMX files we have some Backend Listeners, such as for Kibana, and there are usernames and passwords for them that we have to enter manually. When we upload the JMX file into BlazeMeter for performance testing, the usernames and passwords are viewable. Anyone who has access to BlazeMeter can download the JMX file, and the usernames and passwords are visible to all of those people. That's an issue with the performance testing.
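A common JMeter-side mitigation, which is an assumption on my part rather than something confirmed with BlazeMeter, is to reference credentials through JMeter's __P() property function and supply the values at run time, so the uploaded JMX contains no literal secrets. Below is a small Python sketch, with the file name and keyword list as examples only, that flags credential-looking literals before a JMX is uploaded:

```python
# Quick pre-upload check: grep a JMX test plan for credential-looking fields
# so they can be swapped for run-time properties (e.g. ${__P(...)} references)
# before the file is uploaded and becomes visible to everyone with access.
import re
import sys

SUSPECT = re.compile(r"(password|passwd|secret|token|apikey)", re.IGNORECASE)

def flag_plaintext_credentials(jmx_path: str):
    with open(jmx_path, encoding="utf-8") as f:
        for lineno, line in enumerate(f, start=1):
            # Lines that already use a property reference are resolved at
            # run time, so only literal values are reported.
            if SUSPECT.search(line) and "${__P(" not in line:
                yield lineno, line.strip()

if __name__ == "__main__":
    # Usage: python flag_secrets.py test_plan.jmx
    for lineno, line in flag_plaintext_credentials(sys.argv[1]):
        print(f"line {lineno}: {line}")
```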
Also, all the competitors have MQ protocol support, which is lacking in BlazeMeter's Mock Services. Having MQ protocol support in the Mock Services would be great for us. JDBC, for database communication, is also lacking. If we had those things, we would be completely satisfied with BlazeMeter's Mock Services.
And for the API monitoring, we are missing a data-driven approach. If, for a single API call, we have 50 to 100 test cases, there should be no need for us to create multiple steps or to duplicate the test steps. Instead, if a data-driven approach were available, we could add the test data directly into an Excel sheet and reference it from a single test step to achieve what we need. We have raised this concern with the Perforce team as well, and they said they are working on it.
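What we are asking for amounts to something like the following: one test step driven by many rows of test data. This is only a conceptual Python sketch, with a hypothetical CSV layout and endpoint, not a feature BlazeMeter offers today:

```python
# Conceptual sketch of a data-driven API check: one test step, many rows of
# test data. The CSV layout and endpoint are hypothetical examples.
import csv
import json
import urllib.error
import urllib.request

ENDPOINT = "https://example.internal/api/v1/customers"  # hypothetical

def run_case(payload: dict, expected_status: int) -> bool:
    req = urllib.request.Request(
        ENDPOINT,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status == expected_status
    except urllib.error.HTTPError as err:
        return err.code == expected_status

with open("test_cases.csv", newline="") as f:
    # Each row supplies the data and expected result for the same single step.
    for row in csv.DictReader(f):
        ok = run_case(json.loads(row["payload"]), int(row["expected_status"]))
        print(f"{row['case_id']}: {'PASS' if ok else 'FAIL'}")
```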
For how long have I used the solution?
I've been using BlazeMeter for two years.
What do I think about the stability of the solution?
It's stable. Sometimes we do face issues, but they are understandable things.
Every month or two months, something will happen in the back end. The UI will say, for example, that performance testing is down due to this or that reason, and that they are fixing it. Sometimes it affects our testing. We will be in a demo or in a call with our stakeholders where we are presenting and something will be down.
We will raise a support ticket and they will say they are analyzing it and fixing it. They won't take much time, but at that time, it's a pain point. But it happens in all tools. Because it is a cloud tool it's expected, but it's not happening very frequently, so we are happy with it.
How are customer service and support?
We have weekly calls with the BlazeMeter support team, and that's a great thing. During those calls they ask if there are any issues and whether we need something resolved. If we raise any concerns, they help us immediately during that call. If not, they ask us to raise a ticket and they follow up on it on both sides, on the support side and with us. They give us updates. I haven't seen any other company do that. I have been amazed by the basic support.
We also get weekly updates on whatever the roadmap contains and the new features they are going to be implementing. If we have any doubts we address them in the call. We are using some other tools, but we haven't seen this much support from any other company. When it comes to support, Perforce is the best company I have ever come across.
How would you rate customer service and support?
Which other solutions did I evaluate?
We haven't had a chance to use the cloud services because of security issues related to our company. We only use the on-prem server. But the cloud services are one of the best things about BlazeMeter when comparing it with its competitors.
We have older tools, like CA DevTest, that we are still using due to dependencies on JMX, MQ, and JDBC steps that are not available with BlazeMeter. With DevTest we are able to handle a lot of the custom extensions. Instead of the Mock Services, we were using the CA DevTest Service Virtualization tool. We want to move completely to BlazeMeter but we can't because of those dependencies.
CA DevTest is the main competitor, but it doesn't have performance testing available. Both solutions have pluses and minuses.
DevTest is hard to use. It has too many features for Service Virtualization. If a beginner is trying to learn something in DevTest, it's hard. It might take a month or two months to get some understanding of what the DevTest tool does. BlazeMeter is very simple. Even for beginners, they give some options in the Mock Services. If you're a beginner, you can create a Mock Service and it gives you a description for each and every step. This way, beginners can easily adopt BlazeMeter.
In addition to the step-by-step demos, there is the BlazeMeter University. When we onboard people into BlazeMeter, we ask them to go through those courses. For example, if we are asking them to work on API monitoring, we have them do the course on API monitoring. Once they get the certification, we have them work on the API monitoring. With the BlazeMeter University, there is no need for us to have a separate KB on how it will work or how it will respond. Onboarding people into BlazeMeter is not a problem for us.
What other advice do I have?
We were using the functional testing for APIs, but it has been disabled in our organization. I asked what the purpose of disabling it was, and they said it was to make sure that everyone is using the API monitoring. Although we requested that they enable it again for our purposes, so far we haven't had much of a chance to explore the API functional testing.
Overall, I would rate the solution at seven out of 10 because I have sent some requirements for API monitoring and for performance testing on Mock Services separately, to separate teams; these are things that should be introduced into BlazeMeter. Until those things are available, I am not able to use some of the components.
Which deployment model are you using for this solution?
On-premises
Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.