My company started using BlazeMeter because we wanted parallel runs, broader adoption across teams, and better reporting with more ease. BlazeMeter doesn't do anything on its own; it uses the same scripts used in JMeter and serves as an orchestration tool. Better test organization, parallel testing, improved reporting, and ease of use for developers were some of the factors that led my company to opt for BlazeMeter.
Director of Quality Engineering at a manufacturing company with 501-1,000 employees
The shareability of resources allows multiple people to access the same scripts across different environments
Pros and Cons
- "The extensibility that the tool offers across environments and teams is valuable."
- "Parameterization needs improvement so that the same script can be run across different environments without changes."
What is most valuable?
The most valuable features of the solution are its workspace and the shareability of resources, which allow multiple people to access the same scripts and use them in different environments. The extensibility that the tool offers across environments and teams is valuable.
What needs improvement?
Parameterization needs improvement so that the same script can be run across different environments without changes. The tool should offer more ease of use across environments.
The solution's scalability is an area of concern where improvements are required.
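As a workaround sketch for the parameterization gap, JMeter itself supports per-run properties, so a single script can target any environment. This assumes a hypothetical `checkout_flow.jmx` whose HTTP sampler host is written as `${__P(target.host,localhost)}`; the file and host names are placeholders, not anything from the review.

```python
# Sketch: run the same JMeter plan against different environments by
# overriding a JMeter property per run. Assumption: a hypothetical
# checkout_flow.jmx whose sampler host reads ${__P(target.host,localhost)}.
import subprocess

def jmeter_cmd(plan, target_host, results):
    # -n: non-GUI mode, -t: test plan file,
    # -J: define a JMeter property, -l: results (JTL) file
    return ["jmeter", "-n", "-t", plan,
            f"-Jtarget.host={target_host}", "-l", results]

def run_against(plan, target_host, results):
    """Launch the plan against one environment (requires jmeter on PATH)."""
    return subprocess.run(jmeter_cmd(plan, target_host, results)).returncode
```

The same pattern works from BlazeMeter's side, since it runs unmodified JMeter scripts: only the property value changes per environment.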
For how long have I used the solution?
BlazeMeter was introduced in my organization a year ago because we had higher demand. My company is a customer of the product.
Buyer's Guide
BlazeMeter
December 2025
Learn what your peers think about BlazeMeter. Get advice and tips from experienced pros sharing their opinions. Updated: December 2025.
879,425 professionals have used our research since 2012.
What do I think about the stability of the solution?
Stability-wise, I rate the solution an eight out of ten since my organization is still streamlining things at our end.
What do I think about the scalability of the solution?
Scalability-wise, I rate the solution a seven or eight out of ten.
How are customer service and support?
Technical support doesn't respond the moment you put up a query, so it takes time to get a response from the customer support team. When the support team does respond, though, it provides enough information.
I rate the technical support an eight out of ten.
How would you rate customer service and support?
Positive
Which solution did I use previously and why did I switch?
I used mostly commercial IT tools in my previous organization, including JMeter.
How was the initial setup?
The product's deployment phase is fine and is not difficult.
I can't comment on the time taken to install the solution since our organization uses a shared installation with our enterprise account. My team didn't need to actually install the product, so we just created our workspace, and that was it.
What's my experience with pricing, setup cost, and licensing?
I rate the product's price two on a scale of one to ten, where one is very cheap, and ten is very expensive. The solution is not expensive.
What other advice do I have?
Maintenance-wise, the product is fine.
Based on my initial perception and initial experiences, I rate the overall tool an eight out of ten.
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
Performance Engineer Manager at a financial services firm with 1,001-5,000 employees
A load and performance testing tool that needs to improve its scalability
Pros and Cons
- "The baseline comparison in BlazeMeter is very easy, especially considering the different tests that users can easily compare."
- "Scalability is an area of concern in BlazeMeter, where improvements are required."
What is our primary use case?
I work in a bank where we use BlazeMeter to conduct our load testing or performance testing. In our company, we utilize multiple machines, and multiple projects are hosted on BlazeMeter.
What is most valuable?
The baseline comparison in BlazeMeter is very easy, especially considering the different tests that users can easily compare. The response time and the calls we make in our company are easy to trace with the help of BlazeMeter.
What needs improvement?
There is a tab in BlazeMeter named Request Stats Report where improvements are required. If I have to find a particular timeline, which might be a 15- or 20-minute window, there is no place or option where I can enter the exact timelines I want. There are pointers to drag from one place to another, but that doesn't give me much freedom in pinpointing a timeline. Basically, the Request Stats Report should have a start time and an end time: users should have an option to select and manually enter the test start and end times to get the stats for a particular period.
Scalability is an area of concern in BlazeMeter, where improvements are required.
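As a stopgap for the missing exact-time-window filter, the raw results behind a run can be sliced directly: JMeter-style JTL/CSV output carries an epoch-millisecond `timeStamp` column per sample. The sketch below assumes access to such a file; the column names follow JMeter's CSV defaults.

```python
# Workaround sketch: compute stats for an exact time window from a
# JMeter-style JTL/CSV results file (columns per JMeter's CSV defaults:
# "timeStamp" in epoch milliseconds, "elapsed" in milliseconds).
import csv

def stats_for_window(jtl_path, start_ms, end_ms):
    """Sample count and average elapsed time inside [start_ms, end_ms)."""
    elapsed = []
    with open(jtl_path, newline="") as fh:
        for row in csv.DictReader(fh):
            ts = int(row["timeStamp"])
            if start_ms <= ts < end_ms:
                elapsed.append(int(row["elapsed"]))
    return {
        "samples": len(elapsed),
        "avg_ms": sum(elapsed) / len(elapsed) if elapsed else None,
    }
```

This gives exactly the "enter a start and end time" behavior the report UI lacks, at the cost of working from the raw file rather than the dashboard.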
For how long have I used the solution?
I have been using BlazeMeter for a year.
What do I think about the stability of the solution?
The time to start a load test in BlazeMeter is too high; in our company, we have seen that it takes four to five minutes for the whole setup process to complete before the tests actually run, which is a big problem for us.
Stability-wise, I rate the solution a five out of ten.
What do I think about the scalability of the solution?
I see certain limitations when it comes to the scalability part of BlazeMeter since, in our company, we have faced multiple interruptions, because of which we had to stop certain testing processes.
Scalability-wise, I rate the solution a five out of ten.
I cannot give you an exact number related to the number of users of BlazeMeter in our company since we are just one of the teams in the company that uses the tool. I believe that around 150 to 200 people in my company use BlazeMeter.
How are customer service and support?
I rate the technical support an eight out of ten.
How would you rate customer service and support?
Positive
Which solution did I use previously and why did I switch?
I only have experience with BlazeMeter.
How was the initial setup?
I rate the product's initial setup around seven on a scale of one to ten, where one is a difficult setup, and ten is an easy setup.
The time taken for the deployment of BlazeMeter varies since we have multiple types of applications in our company. If I have to deploy something on BlazeMeter, the time range for the deployment process can be somewhere between 30 to 120 minutes.
The solution is deployed on the cloud.
What other advice do I have?
BlazeMeter is supportive in the sense that it is easy to migrate from one system to another. BlazeMeter offers an infrastructure that allows for migration, if needed, in a well-organized manner, but it is not an optimized tool yet since I still see a lot of problems in it.
I rate the overall tool a seven out of ten.
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
Quality Assurance Architect at a healthcare company with 51-200 employees
An easy-to-use tool with a great interface and report-generation capabilities
Pros and Cons
- "It is a stable solution. When we compare BlazeMeter with other tools in the market, I can say that the solution's overall performance has also been very good in our company."
- "I don't think I can generate a JMX file unless I run JMeter, which is one of my concerns when it comes to BlazeMeter."
What is most valuable?
BlazeMeter is a very good tool for adding users and ramping up load, which are a few of its very good features.
What needs improvement?
BlazeMeter is a very handy, drag-and-drop tool, but I don't think I can generate a JMX file unless I run JMeter, which is one of my concerns when it comes to BlazeMeter. In our company, we are mostly unable to capture logs or events with BlazeMeter. We want BlazeMeter to simulate a mobile app, especially since our company deals in mobile apps, and we wish to conduct testing using BlazeMeter. The solution has been good so far, but the JMeter dependency is one area that has been tricky for me since I cannot generate events.
I cannot speak about a particular weakness in the tool, but it is a tricky product since those who want to use it need to depend on another tool, JMeter, which is required to produce the scripts and JMX file before anything can be run on BlazeMeter.
In our company, an APK is generated whenever we develop mobile apps, and when I drag and drop it, a JMX file should be generated, which is a feature not included in the solution. This is an area that could be considered for improvement.
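On the JMX-generation concern: a JMX file is plain XML, so a skeleton plan can be produced without opening the JMeter GUI. This is an illustrative sketch; the element and property names mirror what JMeter saves in its own files, but a plan generated this way should be opened in JMeter once to confirm it loads as expected.

```python
# Sketch: generate a minimal JMX (JMeter test plan) skeleton as XML.
# The structure mirrors JMeter's own saved files, but this is
# illustrative; verify the result loads in JMeter before relying on it.
import xml.etree.ElementTree as ET

JMX_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<jmeterTestPlan version="1.2" properties="5.0">
  <hashTree>
    <TestPlan testclass="TestPlan" testname="{name}" enabled="true"/>
    <hashTree>
      <ThreadGroup testclass="ThreadGroup" testname="Users" enabled="true">
        <stringProp name="ThreadGroup.num_threads">{threads}</stringProp>
        <stringProp name="ThreadGroup.ramp_time">{ramp}</stringProp>
      </ThreadGroup>
      <hashTree>
        <HTTPSamplerProxy testclass="HTTPSamplerProxy" testname="request" enabled="true">
          <stringProp name="HTTPSampler.domain">{domain}</stringProp>
          <stringProp name="HTTPSampler.path">{path}</stringProp>
          <stringProp name="HTTPSampler.method">GET</stringProp>
        </HTTPSamplerProxy>
        <hashTree/>
      </hashTree>
    </hashTree>
  </hashTree>
</jmeterTestPlan>
"""

def build_jmx(name, domain, path, threads=10, ramp=60):
    jmx = JMX_TEMPLATE.format(name=name, domain=domain, path=path,
                              threads=threads, ramp=ramp)
    ET.fromstring(jmx)  # sanity check: output is well-formed XML
    return jmx
```

The resulting file can then be uploaded to BlazeMeter like any hand-built JMX, without first recording it through the JMeter GUI.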
For how long have I used the solution?
I have been using BlazeMeter for two months. I am currently an end user of the tool using BlazeMeter's trial version.
What do I think about the stability of the solution?
It is a stable solution. When we compare BlazeMeter with other tools in the market, I can say that the solution's overall performance has also been very good in our company.
What do I think about the scalability of the solution?
It is a scalable solution, but our company currently uses the tool's free version and has not opted for the paid version. Because of that, I can't comment on the solution's scalability, though I have heard from one of my friends that the product's scalability is good.
Around 50 people can use the product in my company.
How are customer service and support?
In my company, we haven't contacted the solution's technical support since we are still exploring the product as we are a startup company. We are conducting a trial of all the tools available to us so that we can choose the ones that suit our company at the end of the process.
How was the initial setup?
The tool's implementation in our company is shaped by the fact that we deal more in mobile apps than web apps.
Which other solutions did I evaluate?
My company is a health app provider, which makes our process and business quite different in the market. We want a product that can test the performance of our company's apps themselves, not just an API, so we consider BlazeMeter to be a good option.
My company is looking for options, like LoadRunner tools, that can be a better choice than BlazeMeter.
My company needs to search for better options since we feel that we will have around a million users once we launch our health app in India. I want a tool that can help me test the app's performance, especially if a million users are using it.
What other advice do I have?
BlazeMeter is a tool that is easy to use.
Its interface and report-generation capabilities make the tool very handy for its users. The only tricky area is that BlazeMeter runs on top of JMeter, an open-source tool, which is a very complex part for me.
There are different technical stacks in the market in which one needs to invest. After the testing phase, one may go for an expensive product. Once there is a stable product in the market and the company can generate revenue, it becomes feasible to go for the paid version; until then, the free route through JMeter is what I can recommend to others. BlazeMeter's paid version can be a bit expensive compared to JMeter.
I rate the overall product a nine out of ten.
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
VP QA Performance Engineer at a financial services firm with 1,001-5,000 employees
User-friendly, comprehensive analysis, and highly scalable
Pros and Cons
- "The most valuable aspect of BlazeMeter is its user-friendly nature, ability to conduct distributed load testing and comprehensive analysis and reporting features. It particularly excels in providing a clear and organized view of load test results."
- "BlazeMeter has room for improvement in terms of its integration with GitLab, particularly in the context of CI/CD processes. While it has multiple integrations available, the level of integration with GitLab may need further enhancements. It is known to work well with Git and Jenkins, although the extent of compatibility with GitLab is uncertain."
What is our primary use case?
The use cases of BlazeMeter encompass a wide range of scenarios, including load testing at the API, web service, or web application level. The primary purpose is to simulate various types of loads. For instance, if the load needs to originate from distributed sources, opting for a dedicated cloud solution is advisable. This allows testing applications from diverse geographic locations and handling traffic from different tiers effectively. Running JMeter through the cloud is particularly recommended for this situation, as it efficiently manages the infrastructure interfaces and resolves the technical intricacies associated with infrastructure maintenance.
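As a hedged illustration of the distributed, multi-geography setup described above, a Taurus (bzt) cloud configuration can spread load engines across regions. The location IDs, engine counts, load figures, and endpoint below are assumptions; the IDs actually available depend on the BlazeMeter account.

```yaml
provisioning: cloud     # run on BlazeMeter's cloud rather than locally

execution:
- scenario: geo-load
  concurrency: 100
  ramp-up: 5m
  hold-for: 15m
  locations:            # engines per cloud region (IDs are assumptions;
    us-east4-a: 2       # check your account for the exact list)
    europe-west2-a: 2

scenarios:
  geo-load:
    requests:
    - https://api.example.com/orders   # placeholder endpoint
```

Running this through the cloud provisioner is what takes the infrastructure maintenance off the team's plate, as the paragraph above describes.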
It simplifies the process by emphasizing the key aspects of writing, uploading, and running scripts for testing purposes.
What is most valuable?
The most valuable aspect of BlazeMeter is its user-friendly nature, ability to conduct distributed load testing and comprehensive analysis and reporting features. It particularly excels in providing a clear and organized view of load test results.
What needs improvement?
BlazeMeter has room for improvement in terms of its integration with GitLab, particularly in the context of CI/CD processes. While it has multiple integrations available, the level of integration with GitLab may need further enhancements. It is known to work well with Git and Jenkins, although the extent of compatibility with GitLab is uncertain.
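Until deeper native integration lands, one workable pattern is to drive the test from a GitLab CI job through the Taurus (bzt) command line. This is a sketch under assumptions: the image tag, file names, and the `modules.blazemeter.token` override path should be verified against current Taurus documentation before use.

```yaml
# Sketch of a .gitlab-ci.yml job; names and paths are assumptions.
performance:
  stage: test
  image: blazemeter/taurus            # Taurus image on Docker Hub
  script:
    # token comes from a masked CI/CD variable; the override path for the
    # BlazeMeter reporting module is an assumption, verify in Taurus docs
    - bzt load-test.yml -o modules.blazemeter.token="$BLAZEMETER_TOKEN" -o settings.artifacts-dir=artifacts
  artifacts:
    paths:
      - artifacts/                    # Taurus logs and result files
```

Because the job is plain shell plus a container image, the same approach ports to Jenkins or any other runner with minimal change.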
For how long have I used the solution?
I have used BlazeMeter within the last 12 months.
What do I think about the scalability of the solution?
BlazeMeter is a highly scalable solution. The solution is SaaS and the cloud vendor controls the scalability.
How are customer service and support?
I have not used the support from the vendor.
How was the initial setup?
The initial setup of BlazeMeter is straightforward.
What other advice do I have?
I rate BlazeMeter an eight out of ten.
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
QA Automation & Perform Lead (C) at a retailer with 10,001+ employees
A highly stable cloud-based tool with an impressive depth and breadth of functionality
Pros and Cons
- "Using cloud-based load generators is highly valuable to us, as we can test from outside our network and increase load generation without having to upscale our hardware as much. The cloud load generator is there when we need it and is the feature we leverage the most."
- "We encountered some minor bugs, and I would like to have the ability to add load generators to workspaces without having to use APIs. We can't do that now, so we're beholden to the APIs."
What is our primary use case?
We use the solution for enterprise performance testing of various technologies including web services, APIs, and web GUIs.
We deployed the solution to increase our performance testing footprint, which we needed to upscale for the maturity of our operation.
We have six on-prem load generators on our network, and the rest of our deployment is in the cloud. It's a very simple architectural design.
How has it helped my organization?
BlazeMeter opened up performance testing for us. Our old solution was a client-based performance testing tool, and for staff to access it, they needed to remotely connect to a Windows VM and book time with that controller. Now our tool is web-based, and we onboarded 12 to 14 teams to BlazeMeter, which would not have happened before. Our CoE team was the go-to for performance testing, but the solution has opened up the practice to the whole enterprise, making teams more self-sufficient, and that's the most significant benefit. Performance testing is no longer segregated to one team.
What is most valuable?
Using cloud-based load generators is highly valuable to us, as we can test from outside our network and increase load generation without having to upscale our hardware as much. The cloud load generator is there when we need it and is the feature we leverage the most.
We have a very high opinion of the range of test tools the solution provides; it has a great deal of potential, and we are just scratching the surface of it currently. As our maturity and skill set with the product increase, we'll be able to leverage that more. For example, we don't really use mock services yet. We know how to, but we're still set in some of our ways.
BlazeMeter being cloud-based and open-source is vital; it was one of our top priorities when choosing a solution. Much like the rest of the world, we're moving away from the old paradigm of the Windows days where we would bring up a server, get Windows licenses, an operating system, and maintain it all. With BlazeMeter, most of that is done for us, and we don't have to worry about infrastructure. We have on-prem load generators for teams needing to run load tests from within our network, and we need to maintain that capacity. However, we don't have to host anything outside of the load generators in the network, so the maintenance effort and cost are much less than they would be as a legacy system.
The solution does bridge Agile and CoE teams. It's a shift-left tool, and testing comes in much earlier than in the past. BlazeMeter is a valuable asset in this regard.
The tool helped us to implement shift-left testing. Many of our teams with the required skillset can include performance testing as part of their build runs. This may not be high-level testing; internally, we refer to it as early performance testing. It allows teams to confirm the software is functioning correctly early, which was not the case before. We would wait until a certain point in the SDLC before running a performance check, and now we're able to implement that much earlier in the process.
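An early performance check of the kind described can be wired into a build as a small Taurus (bzt) configuration. This is a sketch, and the endpoint, load figures, and scenario name are placeholders rather than anything from the review.

```yaml
# Sketch: a light "early performance test" suitable for a build run.
execution:
- scenario: early-check
  concurrency: 5        # light load: confirm basic behavior, not capacity
  ramp-up: 30s
  hold-for: 2m

scenarios:
  early-check:
    requests:
    - https://staging.example.com/health   # placeholder endpoint
```

A build step then runs `bzt early-check.yml` and gets a quick signal long before a full load test, which is the shift-left pattern the paragraph above describes.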
We currently don't have any stats on changes in our test cycle times, but there is no doubt in my mind that BlazeMeter improved our software quality.
We have not faced challenges in getting multiple teams to adopt BlazeMeter. We onboarded around 50 users in three quarters, which is incredible considering we had two performance testers before implementing the solution. Our only challenge is skill sets: our staff want to adopt the tool and understand its importance, but they may not have the resources or skill set to do so. Those with the necessary skill set are onboarded as soon as their project is greenlighted.
What needs improvement?
Our biggest challenge is the skill set required to operate the solution because we used to have a centralized performance testing team. Now we've opened it up to other teams; some needed to onboard new resources. The solution is simple and user-friendly, but we still need the right staff to use it.
We encountered some minor bugs, and I would like to have the ability to add load generators to workspaces without having to use APIs. We can't do that now, so we're beholden to the APIs.
For how long have I used the solution?
We have been using the solution for about nine months.
What do I think about the stability of the solution?
The solution is very stable. We had a few issues with users getting 404 errors recently, but that's the first time we have encountered any problems in three quarters.
What do I think about the scalability of the solution?
The scalability is incredible. We could scale it to as big or small as we want, with our license being the sole limitation. The resources are in Docker containers in Docker images. We could scale within a few minutes if necessary.
How are customer service and support?
The technical support is excellent. When we had hiccups during deployment, they responded quickly with effective solutions for us.
How would you rate customer service and support?
Positive
Which solution did I use previously and why did I switch?
We used other tools and switched because they weren't as user-friendly. BlazeMeter offered us the ability to increase our performance testing footprint without requiring a high level of performance testing expertise from our QA staff. Additionally, our old solutions were client-based, and BlazeMeter is cloud-based, providing all the advantages that come with that.
How was the initial setup?
The deployment is very straightforward. That was one of our criteria, as we didn't want a complex new enterprise solution rollout. There were a few bumps during deployment, but most of that was on our side. BlazeMeter is relatively simple compared to other enterprise solutions we implemented.
Less than ten staff were involved in the deployment. We used Linux Enterprise to house the six on-premise load generators, and there were a couple of employees responsible for Docker, our solutions architect, and myself as the admin.
What was our ROI?
I don't have a concrete figure, but I can say once we sunset our old solution, that will save us a significant amount of money on infrastructure, licensing, and maintenance. I also think there is an ROI associated purely with the increased quality of our software, thanks to BlazeMeter.
What's my experience with pricing, setup cost, and licensing?
The product isn't cheap, but it isn't the most expensive on the market. During our proof of concept, we discovered that you get what you pay for; we found a cheaper solution we tested to be full of bugs. Therefore, we are willing to pay the higher price tag for the quality BlazeMeter offers.
Which other solutions did I evaluate?
We carried out a proof of concept of four tools, which included BlazeMeter. It's more stable and mature, with well-documented APIs. BlazeMeter University was a significant consideration for us due to our requirements; it helped us roll out the solution to multiple teams. It seemed like APIs for the other solutions were an afterthought.
What other advice do I have?
I would rate the solution an eight out of ten.
The solution enables the creation of test data for performance and functional testing, but our use is focused on performance testing. We don't particularly use functional testing, but we are currently talking about using test data management for functional testing. We have our in-house automation framework, so the ability to create both functional and performance test data isn't a high priority for us.
We don't use BlazeMeter's ability to build test data on-the-fly, not because we aren't aware of it, but because we are still at the early stages with the solution. Until fairly recently, just one other person and I were in charge of performance testing for the entire company, so having self-sufficient teams is an immense change for us as an organization.
I would say it's critical to have the appropriate skill sets among the staff; we could deploy just about any solution in an enterprise, but it won't be used to its full capacity without the proper skills. BlazeMeter showed us how little performance testing we were doing before and how vital increasing that footprint is. We've onboarded 50 users; that's 50 users who were not engaged less than a year ago and can all carry out performance testing.
This solution can work very well for enterprise companies with a more advanced skill pool to draw from. For beginners in this area, specific skills such as JMeter scripting are required to use the application. It's easier to use than most solutions but requires a particular skill set to deploy and operate successfully. A good solutions architect and QA leads are essential in evaluating any product.
Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
Technology services specialist at a financial services firm with 1,001-5,000 employees
Brings agility and efficiency, and with real-time data, helps in understanding the behavior of an application at all stages of the test
Pros and Cons
- "For me, the best part is that we can graphically see the test result at runtime. It helps us understand the behavior of the application during all stages of the test."
- "The Timeline Report panel has no customization options. One feature that I missed is a time filter, which I had in ELK; for example, filtering to only the requests that took less than 5 seconds."
What is our primary use case?
Previously, to perform performance tests, we had to connect servers in the cloud, configure them to perform the test, and plot the results on a dashboard. BlazeMeter came to simplify all this work.
In terms of deployment, we are using local Linux servers (RHEL 7), and for the cloud, we are using EC2 servers with Amazon Linux. Our cloud provider is Amazon AWS.
How has it helped my organization?
With BlazeMeter, our main gains were in agility and efficiency in the execution of performance tests and delivery of post-test reports.
It has helped us to implement shift-left testing. It has certainly helped us to speed up the tests, and with that, we gained time to carry out tests in all development cycles.
It has the ability to build test data on-the-fly, and this on-the-fly test data meets compliance standards, which is very important for us. Real-time data helps us understand the behavior at each level of the test. So, we can define numbers that an application needs to achieve in the test to classify it as being OK or not. This data helps a lot in the real-time investigation. By looking at each level, we can identify the exact moment of degradation or “break”.
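The "numbers that an application needs to achieve" can be encoded as pass/fail criteria rather than checked by eye. As one concrete illustration, Taurus's passfail reporting module can stop a run and mark it failed the moment it degrades; the thresholds below are illustrative, not the reviewer's figures.

```yaml
# Sketch: encode pass/fail thresholds in a Taurus (bzt) configuration.
reporting:
- module: passfail
  criteria:
  - avg-rt>500ms for 30s, stop as failed    # sustained response-time degradation
  - failures>5% for 60s, stop as failed     # sustained error rate, i.e. a "break"
```

With criteria like these in place, the real-time view becomes confirmation rather than the primary detection mechanism for the exact moment of degradation.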
It decreased our test cycle times. I believe that we saved at least 50% of the time in preparation for the execution. Using BlazeMeter has greatly simplified our performance testing experience, especially the preparation part.
What is most valuable?
For me, the best part is that we can graphically see the test result at runtime. It helps us understand the behavior of the application during all stages of the test.
BlazeMeter is a cloud-based and open-source testing platform, which is very important for us because we can be sure that we're using a tool that follows market trends and stays up-to-date.
What needs improvement?
The Timeline Report panel has no customization options. One feature that I missed is a time filter, which I had in ELK; for example, filtering to only the requests that took less than 5 seconds.
For how long have I used the solution?
I have been using this solution for approximately 1 year.
What do I think about the stability of the solution?
It is very stable. We haven't seen any instability or unavailability issues so far.
What do I think about the scalability of the solution?
It is scalable as per our needs. In our working model, ours is the only team that uses BlazeMeter; our team's mission is to bring performance tests to projects and squads.
How are customer service and support?
They are very good. In the beginning, they held a workshop with our team, and whenever we ask questions, we are attended to without any problem. I would rate them a ten out of ten.
How would you rate customer service and support?
Positive
Which solution did I use previously and why did I switch?
We didn't use any other solution. We performed the tests manually.
As soon as we got to know this tool, we realized how important it would be and the benefits it would bring to the company. Its main benefits have been gains in agility and efficiency.
For the performance tests that we carry out in the company, we only use BlazeMeter. I don't know any other tools. My view of BlazeMeter is that it is a very mature tool that delivers what it has set out to deliver in an excellent way.
How was the initial setup?
I was not involved in its deployment. In terms of maintenance, the only maintenance is setting up new servers for use. This configuration is usually performed by us in the Performance team.
What was our ROI?
I don't have access to the information about its cost. So, I can't say if we have seen an ROI and if we have reduced our test operating costs.
Which other solutions did I evaluate?
We did not review other products.
What other advice do I have?
BlazeMeter brings agility and efficiency in the preparation and execution of performance tests. With this, we gain time which is used to increase the scope of tests and anticipate possible problems.
BlazeMeter didn't help bridge Agile and CoE teams because we have a specific team. So, there was no involvement of professionals who work with agile. We gained agility and efficiency, but there was no involvement of any external team.
I would rate BlazeMeter a nine out of ten.
Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
Manager at a tech vendor with 10,001+ employees
Robust auto-collision feature but the scanning capability needs improvement
Pros and Cons
- "It has a unique programming dashboard that is very user-friendly."
- "The scanning capability needs improvement."
What is our primary use case?
The solution is used as a performance system.
What is most valuable?
It has a unique programming dashboard that is very user-friendly. The auto-collision feature is also robust.
What needs improvement?
The scanning capability needs improvement.
For how long have I used the solution?
I have been using BlazeMeter for a year.
What do I think about the scalability of the solution?
The solution is highly scalable. Five people are using the solution at present. I rate the scalability an eight out of ten.
How was the initial setup?
The initial setup is straightforward. The deployment takes a few minutes, and a couple of people were involved in the process.
What other advice do I have?
Overall, I would rate the solution a seven out of ten.
Which deployment model are you using for this solution?
Public Cloud
Disclosure: My company does not have a business relationship with this vendor other than being a customer.
Mock Services and API monitoring help us reduce cycle times, but MQ protocol and JDBC are needed
Pros and Cons
- "With the help of the Mock Services, we are overcoming everything. Wherever we are facing issues, whether they will be long term or temporary, by implementing the Mock Services we can bypass the faulty components that are not needed for our particular testing."
- "One problem, while we are executing a test, is that it will take some time to download data. Let's say I'm performance testing with a high-end load configuration. It takes a minimum of three minutes or so to start the test itself. That's the bad part of the performance testing... every time I rerun the same test, it is downloaded again... That means I have to wait for three to four minutes again."
What is our primary use case?
I'm working for a telecommunications client. We are using BlazeMeter's Mock Services as a priority for performance testing, along with API monitoring. These functions are each used on different projects.
How has it helped my organization?
One of the projects for this client is completely based on Mock Services and, with the help of that functionality, we are able to complete end-to-end testing with all the dependent components.
There are third-company suppliers who provide components that need to be tested, but most of the time we do not get those components on time. Normally, we would wait until the development is completed and only then go for automation. But now, with the help of the Mock Services, once we get the source code, we upload it directly into the Mock Services and into the API monitoring. Once we get the tracers, we are ready to point to the actual instances and test things immediately. By the time we get to that stage, we are ready for the market. That means that even before getting the complete component, we can start working with something. There is no need to wait. As a result of the Mock Services, the time it takes us to develop API automation is minimized.
Also, we had a number of performance testing tools, but we had to integrate third-party applications for generating reports. That was a pain point for us when it came to showcasing things to stakeholders so that they could be aware of what was happening. But now, everything is available once the performance testing is completed. We can immediately see the reports, and when a test is running, I can share the execution page with anyone else as a link. That means they can view exactly what is happening, moment by moment, regarding the response time, request time, and latency. That feature is not available in some of the other applications, and possibly not in any of them.
One of the areas where BlazeMeter benefits us is test cycle times. In the past, if there was a defect in a component, we would have to wait until the issue was fixed. Even though we were not testing that particular component, because of the dependency on it, we would have to wait. If the fix ended up going beyond the deadline for the release cycle, we would leave that test case for the next release.
With the help of the Mock Services, we are overcoming everything. Wherever we are facing issues, whether they will be long term or temporary, by implementing the Mock Services we can bypass the faulty components that are not needed for our particular testing. In that way, we are able to reduce our cycle times. In addition, we have some physical devices and network devices in our testing. It takes a week to create physical devices in a virtual way. Instead, with the Mock Services we are creating them in a minute, and that helps our end-to-end testing to be completed on time. The benefit of BlazeMeter's Mock Services is that it takes us through our testing early in the cycle.
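The mock-service idea described above — stand in for an unavailable or faulty component with canned responses — can be sketched generically. This is an illustrative Python stand-in, not BlazeMeter's implementation; the endpoint path and payload are invented:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

# Canned responses standing in for a dependent component that is
# down or not yet delivered (path and payload are hypothetical).
CANNED = {"/inventory/status": {"component": "inventory", "status": "UP"}}

class MockHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = json.dumps(CANNED.get(self.path, {"error": "not mocked"})).encode()
        self.send_response(200 if self.path in CANNED else 404)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep the mock quiet

# Bind to an OS-assigned port and serve in the background.
server = HTTPServer(("127.0.0.1", 0), MockHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# The test under development talks to the mock exactly as it would
# talk to the real component.
url = f"http://127.0.0.1:{server.server_port}/inventory/status"
reply = json.loads(urlopen(url).read())
server.shutdown()
```

The suite is simply pointed at the mock's URL instead of the real component's, so end-to-end runs are not blocked by a broken or undelivered dependency.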
In a single line of business, in a particular call flow, if we have 1,000 test cases per release, 100 to 200 of them are with the help of the Mock Services. That saves us time, money, and manpower.
And before we had BlazeMeter's API monitoring, if there were 10 components and anything was down, we would not be aware. We would not send a heartbeat every second to all of the components to check whether they were down or up. The API monitoring is a real benefit for us because we are able to schedule it for every 30 minutes or every hour, and we can keep on monitoring a component. If there is a failure, we will immediately be notified by email, even on the weekend. We can take action and report the situation to the data analyst and to the component people so that they can immediately work on fixing it.
The API monitoring is one of the most excellent tools we have come across because of the scheduling and the results. We are able to analyze how stable a component is based on that monitoring.
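The polling pattern described above — check every component on a schedule and alert on failure — can be sketched outside BlazeMeter. The endpoints and the alert hook below are hypothetical; in BlazeMeter itself, the schedule and email notifications are configured in the product:

```python
import urllib.error
import urllib.request

def check(url, timeout=5):
    """Return True if the endpoint answers with an HTTP 2xx status."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return 200 <= resp.status < 300
    except (urllib.error.URLError, OSError, ValueError):
        return False

def run_once(components, alert):
    """One polling pass over {name: health_url}; call alert(name)
    for every component that is down (e.g. to send an email)."""
    results = {name: check(url) for name, url in components.items()}
    for name, up in results.items():
        if not up:
            alert(name)
    return results

# Hypothetical component list; a scheduler (cron or similar) would
# invoke run_once every 30 minutes, as described above.
COMPONENTS = {
    "billing": "https://example.invalid/billing/health",
    "inventory": "https://example.invalid/inventory/health",
}
```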
What is most valuable?
The most valuable features for us are the API monitoring and the Mock Services.
Another good thing is that we can upload JMX files and schedule and monitor performance testing. We are able to share results and see reports that we can't get in JMeter. In that way, the performance testing is good.
In terms of the range of test tools, when there are API calls we can do automation testing, functional testing, performance testing, and use the Mock Services to create a situation that the APIs are down. We are able to handle everything that has to do with APIs. Whatever we have to test—the functionality, the behavior—we are able to do so with the help of BlazeMeter.
What needs improvement?
One problem, while we are executing a test, is that it will take some time to download data. Let's say I'm performance testing with a high-end load configuration. It takes a minimum of three minutes or so to start the test itself. That's the bad part of the performance testing.
I don't think they can reduce that time because that is how BlazeMeter's performance testing has been implemented. But it's a pain point whenever we are running performance testing in a call or a demo, as well as in our live testing when all the business people are there.
The first time I run a given test, if it takes three minutes to download onto my server that's understandable. But every time I rerun the same test, it is downloaded again, because once the test is completed the files that were downloaded are removed. That means I have to wait for three to four minutes again.
We also had a call last week regarding secret keys. In JMX we have some Backend Listeners, such as Kibana, and there are usernames and passwords for them that we have to manually enter. When we upload the JMX file into BlazeMeter for performance testing, the usernames and passwords are viewable. Anyone who has access to BlazeMeter can download the JMX file and the usernames and passwords are visible to all those people. That's an issue with the performance testing.
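A standard JMeter-side mitigation (not a BlazeMeter feature) is to reference credentials through JMeter's `__P()` property function instead of hardcoding them, so the uploaded JMX contains no secrets. The parameter names below are illustrative:

```xml
<!-- Backend Listener argument inside the JMX: a property reference
     instead of a literal username/password -->
<elementProp name="esUsername" elementType="Argument">
  <stringProp name="Argument.name">esUsername</stringProp>
  <stringProp name="Argument.value">${__P(kibana_user)}</stringProp>
</elementProp>
```

The real values are then supplied at run time, e.g. `jmeter -n -t plan.jmx -Jkibana_user=...`, or through whatever mechanism the execution platform offers for setting JMeter properties, rather than living in the shared JMX file.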
Also, all the competitors have MQ protocol support, which is lacking in BlazeMeter's Mock Services. Having MQ protocol support in the Mock Services would be great for us. JDBC, the database communication, is also lacking. If we had those things, we would be completely satisfied with BlazeMeter's Mock Services.
And for the API monitoring, we are missing a data-driven approach. If, for a single API call, we have 50 to 100 test cases, there should be no need for us to create multiple steps or to duplicate the test steps. Instead, if we had a data-driven approach available, we could directly add the test data into an Excel sheet and call it into the single test steps and achieve what we need to do. We have raised this concern to the Perforce team as well, and they said they are working on it.
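The data-driven approach being requested can be illustrated generically: one reusable test step driven by many rows of data, instead of duplicated steps. The CSV columns, inputs, and validation rule below are hypothetical:

```python
import csv
import io

# Test data as it might come from a spreadsheet export:
# one row per case, all consumed by a single parameterized step.
TEST_DATA = """case_id,msisdn,expected_status
TC-001,15551230001,200
TC-002,15551230002,200
TC-003,badnumber,400
"""

def run_step(row, call_api):
    """The single test step: call the API with the row's input and
    compare the result against the row's expected status."""
    status = call_api(row["msisdn"])
    return row["case_id"], status == int(row["expected_status"])

def run_suite(data, call_api):
    """Drive the one step with every row of data."""
    rows = csv.DictReader(io.StringIO(data))
    return dict(run_step(row, call_api) for row in rows)

# Stand-in for the real API call (hypothetical validation rule).
def fake_api(msisdn):
    return 200 if msisdn.isdigit() else 400

results = run_suite(TEST_DATA, fake_api)
```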
For how long have I used the solution?
I've been using BlazeMeter for two years.
What do I think about the stability of the solution?
It's stable. Sometimes we do face issues, but they are understandable things.
Every month or two months, something will happen in the back end. The UI will say, for example, that performance testing is down due to this or that reason, and that they are fixing it. Sometimes it affects our testing. We will be in a demo or in a call with our stakeholders where we are presenting and something will be down.
We will raise a support ticket and they will say they are analyzing it and fixing it. They won't take much time, but at that time, it's a pain point. But it happens in all tools. Because it is a cloud tool it's expected, but it's not happening very frequently, so we are happy with it.
How are customer service and support?
We have weekly calls with the BlazeMeter support team, and that's a great thing. During those calls they will ask if there are any issues and whether we need something resolved. If we raise any concerns, they immediately help us during that call. If not, they will ask us to raise a ticket and they follow up on it on both sides—on the support side and with us. They will give us updates. I haven't seen any other companies do that. I have been amazed by the basic support.
We also get weekly updates on whatever the roadmap contains and the new features they are going to be implementing. If we have any doubts we address them in the call. We are using some other tools, but we haven't seen this much support from any other company. When it comes to support, Perforce is the best company I have ever come across.
How would you rate customer service and support?
Positive
Which other solutions did I evaluate?
We haven't had a chance to use the cloud services because of security issues related to our company. We only use the on-prem server. But the cloud services are one of the best things about BlazeMeter when comparing it with its competitors.
We have older tools, like CA DevTest, that we are still using due to dependencies on JMX, MQ, and JDBC steps that are not available with BlazeMeter. With DevTest we are able to handle a lot of the custom extensions. Instead of the Mock Services, we were using the CA DevTest Service Virtualization tool. We want to move completely to BlazeMeter but we can't because of those dependencies.
CA DevTest is the main competitor, but it doesn't have performance testing available. Both solutions have pluses and minuses.
DevTest is hard to use. It has too many features for Service Virtualization. If a beginner is trying to learn something in DevTest, it's hard. It might take a month or two months to get some understanding of what the DevTest tool does. BlazeMeter is very simple. Even for beginners, they give some options in the Mock Services. If you're a beginner, you can create a Mock Service and it gives you a description for each and every step. This way, beginners can easily adopt BlazeMeter.
In addition to the step-by-step demos, there is the BlazeMeter University. When we onboard people into BlazeMeter, we ask them to go through those courses. For example, if we are asking them to work on API monitoring, we have them do the course on API monitoring. Once they get the certification, we have them work on the API monitoring. With the BlazeMeter University, there is no need for us to have a separate KB on how it will work or how it will respond. Onboarding people into BlazeMeter is not a problem for us.
What other advice do I have?
We were using the functional testing for APIs, but it has been disabled in our organization. I asked what was the purpose of disabling it and they said it was to make sure that everyone is using the API monitoring. Although we requested that they enable it again for our purposes, so far we haven't had much chance to explore the API functional testing.
Overall, I would rate the solution at seven out of 10 because I have sent some requirements for API monitoring and for performance testing on Mock Services to separate teams; these are things that should be introduced into BlazeMeter. Until they are available, I am not able to use some of the components.
Which deployment model are you using for this solution?
On-premises
Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.