Ryan Mohan - PeerSpot reviewer
Quality Assurance Manager at a financial services firm with 10,001+ employees
Real User
Enterprise performance testing platform that gives us a centralized place to execute load tests, do reporting, and have different levels of user access control
Pros and Cons
  • "The orchestration feature is the most valuable. It's the Taurus backend component of BlazeMeter. It allows me to essentially give BlazeMeter multiple JMeter scripts and a YAML file, and it will orchestrate and execute that load test and all those scripts as I define them."
  • "BlazeMeter needs more granular access control. Currently, BlazeMeter controls everything at a workspace level, so a user can view or modify anything inside that workspace depending on their role. It would be nice if there was a more granular control where you could say, "This person can only do A, B, and C," or, "This user only has access to functional testing. This user only has access to mock services." That feature set doesn't currently exist."

What is our primary use case?

Our primary use case for BlazeMeter is performance testing. We leverage BlazeMeter as our enterprise performance testing platform. Multiple teams have access to it, and we execute all of our load tests with BlazeMeter and do all the reporting through it. We also use it for mock services.

We have a hybrid deployment model. The solution is hosted and maintained by BlazeMeter. We also have on-premise locations within our network that allow us to load test applications behind our corporate firewalls. That's for test environments and non-production applications that are not externally available. It's a hybrid model that is mostly SaaS, but the on-premises component allows us to execute those load tests and report the results back to the BlazeMeter SaaS solution.

The cloud provider is GCP. BlazeMeter also grants access to Azure and AWS locations from which you can execute load tests. They work with all three of the major cloud providers.

How has it helped my organization?

BlazeMeter gives us a centralized place to execute load tests, do reporting, and have different levels of user access control. BlazeMeter has a full API, which is the feature that's given us a lot of value. It allows us to integrate with BlazeMeter in our CI/CD pipelines, or any other fashion, using their APIs. It helps increase our speed of testing, our reporting, and our reporting consistency, and gives us a central repository for all of our tests, execution artifacts, and results.

BlazeMeter added a mock services portion. We used to leverage a different product for mock services, and now that's all done within BlazeMeter. Mock services help us tremendously with testing efforts and being able to mock out vendor calls or other downstream API calls that might impact our load testing efforts. We can very easily mock them out within the same platform that hosts our load tests. That's been a huge time saver and a great value add.

BlazeMeter absolutely helps bridge Agile and CoE teams. It gives us both options. BlazeMeter is designed so that we can grant access to whoever needs it. We can grant access to developers and anyone else on an Agile team. It allows us to shift left even farther than a traditional center of excellence approach would allow us.

It absolutely helps us implement shift-left testing. One of the biggest enablers of shifting left is BlazeMeter's full, open API. Regardless of the tools we're leveraging to build and deploy our applications, we can integrate them with BlazeMeter, whether that's Jenkins or some other pipeline technology. Because BlazeMeter has a full API, it lets us start tests, end tests, and edit tests. If we can name it, it can be done via the API. It tremendously helps us shift left, run tests on demand, and run tests as part of our code builds.
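
To give a rough idea of what that kind of pipeline integration can look like, here is a minimal Python sketch that starts a test over the REST API and waits for the run to finish. The base URL and endpoint paths follow BlazeMeter's publicly documented v4 API, but the test ID, the API key pair, and the response field names are placeholders and assumptions that should be verified against the current API documentation.

```python
# Sketch: trigger a BlazeMeter test from a CI/CD step and poll until the run ends.
# Endpoint paths follow the documented v4 REST API; TEST_ID and the API key pair are
# placeholders, and the response field names are assumptions to verify against the docs.
import time
import requests

BASE = "https://a.blazemeter.com/api/v4"
AUTH = ("API_KEY_ID", "API_KEY_SECRET")  # BlazeMeter API key id/secret (placeholders)
TEST_ID = 1234567                        # placeholder test id

# Kick off the test; the response is expected to contain a "master" (test run) id.
run = requests.post(f"{BASE}/tests/{TEST_ID}/start", auth=AUTH, timeout=30)
run.raise_for_status()
master_id = run.json()["result"]["id"]

# Poll the run status until it reports that it has ended.
while True:
    status = requests.get(f"{BASE}/masters/{master_id}/status", auth=AUTH, timeout=30)
    status.raise_for_status()
    if status.json()["result"].get("status") == "ENDED":  # assumed status value
        break
    time.sleep(30)

print(f"Test run {master_id} finished; see the BlazeMeter report for detailed results.")
```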

Overall, using BlazeMeter decreased our test cycle times, particularly because of the mock service availability and the ease with which we can stand up mock services, or in the case of an Agile approach, our development teams can stand up mock services to aid them in their testing.

It's fast, and the ability to integrate with pipelines increases our velocity and allows us to test faster and get results back to the stakeholders even quicker than before.

The overall product is less costly than our past solutions, so we've absolutely saved money.

What is most valuable?

The orchestration feature is the most valuable. It's the Taurus backend component of BlazeMeter. It allows me to essentially give BlazeMeter multiple JMeter scripts and a YAML file, and it will orchestrate and execute that load test and all those scripts as I define them.
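
As an illustration of that input, here is a small Python sketch that assembles a Taurus-style YAML file pointing at two JMeter scripts. The scenario names, script paths, and load figures are invented for the example, and the exact keys should be checked against the Taurus documentation.

```python
# Sketch: compose a Taurus-style YAML file that points the orchestration engine at
# several JMeter scripts. Scenario names, file paths, and load figures are illustrative.
import yaml  # PyYAML

config = {
    "execution": [
        {"scenario": "login-flow", "concurrency": 100, "ramp-up": "5m", "hold-for": "30m"},
        {"scenario": "search-flow", "concurrency": 50, "ramp-up": "5m", "hold-for": "30m"},
    ],
    "scenarios": {
        "login-flow": {"script": "jmeter/login.jmx"},    # existing JMeter scripts
        "search-flow": {"script": "jmeter/search.jmx"},
    },
}

with open("load-test.yml", "w") as f:
    yaml.safe_dump(config, f, sort_keys=False)

print("Wrote load-test.yml; hand this plus the .jmx files to the orchestration engine.")
```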

The reporting feature runs parallel with orchestration. BlazeMeter gives me aggregated reports, automates them, and allows me to execute scheduled tests easily on my on-premise infrastructure.

BlazeMeter's range of test tools is fantastic. BlazeMeter supports all sorts of different open-source tools, like JMeter and Gatling, along with WebDriver-based scripts and formats like Python and YAML. If it's open-source, BlazeMeter supports it for the most part.

It's very important to me that BlazeMeter is a cloud-based and open-source testing platform because, from a consumer perspective, I don't have to host that infrastructure myself. Everything my end users interact with in the front-end UI is SaaS and cloud-based. We don't have to manage and deploy all of that, which takes a lot of burden off of my company.

The open-source testing platform is fantastic. They support all of the open-source tools, which gives us the latest and greatest that's out there. We don't have to deal with proprietary formats. A secondary bonus of being open-source and so widely used is that there is a tremendous amount of help and support for the tools that BlazeMeter supports.

What needs improvement?

BlazeMeter needs more granular access control. Currently, BlazeMeter controls everything at a workspace level, so a user can view or modify anything inside that workspace depending on their role. It would be nice if there was a more granular control where you could say, "This person can only do A, B, and C," or, "This user only has access to functional testing. This user only has access to mock services." That feature set doesn't currently exist.


For how long have I used the solution?

I have used this solution for almost five years.

What do I think about the stability of the solution?

The stability has absolutely gotten better over the years. They had some challenges when they initially migrated the platform to GCP, but most of those were resolved. Overall, they have very high availability for their platform. If there's an issue, they have a status page where they publish updates to keep customers in the loop. 

If you email their support team or open a ticket through the application, they're always very quick to respond when there's a more global uptime issue or something like that. Overall, they have very high availability.

How are customer service and support?

Technical support is absolutely phenomenal. I've worked with them very closely on many occasions. Whether it's because we found a bug on their side, or an issue we're having with our on-premises infrastructure, they're always there, always willing to support, and are very knowledgeable.

I would rate technical support as nine out of ten.

How would you rate customer service and support?

Positive

Which solution did I use previously and why did I switch?

We previously used HP Performance Center. We used HP Virtual User Generator, the predecessor to JMeter in our toolchain, for our scripting needs.

We switched because it's a very outdated tool and toolset. BlazeMeter is a more modern solution. It supports many more tools, and it allows us to solve problems that were blocked by the old solution. 

The BlazeMeter platform is designed for CI/CD: it is continuous integration and continuous delivery-friendly, Agile-friendly, and fits all of the modern software development methodologies.

Our old solution didn't really cooperate with that. It didn't have the API or any of the test data functionality that we've talked about with generating or pulling test data. It didn't have any of the mock services. BlazeMeter gave us the kind of one-stop-shop option that allows us to accelerate our development and velocity within our Agile space.

How was the initial setup?

From my company's side, I'm the "owner" of BlazeMeter. I worked with a support team to set up the on-premises infrastructure. I still work with them.

Deployment was straightforward and simple. We pulled some Docker images and deployed them. The whole on-premise deployment methodology is containerized, whether it's standalone servers running Docker or a Kubernetes deployment, which allows you to deploy on-premise BlazeMeter agents through a Kubernetes cluster in your own GCP environment or on-premises Kubernetes environment.
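
For a rough picture of what that containerized agent deployment can look like when scripted, here is a sketch using the Docker SDK for Python. The image reference and environment variables are placeholders; the real values come from the private-location setup in the BlazeMeter UI.

```python
# Rough sketch of standing up an on-premise agent container with the Docker SDK for
# Python. The image name and environment variables below are placeholders; use the
# exact values the BlazeMeter private-location setup provides for your account.
import docker

client = docker.from_env()

AGENT_IMAGE = "example/blazemeter-agent:latest"   # placeholder image reference
AGENT_ENV = {
    "HARBOR_ID": "<from-blazemeter-ui>",          # placeholder identifiers
    "SHIP_ID": "<from-blazemeter-ui>",
    "AUTH_TOKEN": "<from-blazemeter-ui>",
}

client.images.pull(AGENT_IMAGE)
container = client.containers.run(
    AGENT_IMAGE,
    environment=AGENT_ENV,
    detach=True,
    restart_policy={"Name": "unless-stopped"},
)
print(f"Agent container started: {container.short_id}")
```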

What about the implementation team?

We worked directly with BlazeMeter.

Which other solutions did I evaluate?

We evaluated Load.io and a couple of other solutions. When we brought on BlazeMeter five years ago, they were absolutely the leader in the pack, and I believe they still are. They have a much more mature solution and an enterprise feel. The whole platform is much more developed and user-friendly than some of the other options we evaluated. 

I don't know if there are any features in other platforms that BlazeMeter didn't have; it was mostly the other way around. There were things BlazeMeter had that other platforms didn't have, and existing relationships with the company that used to own BlazeMeter, Broadcom.

What other advice do I have?

I would rate this solution an eight out of ten. 

It's a fantastic solution and can do so many things. But unless you have a team that's already very experienced with JMeter and BlazeMeter, there will be some ramp-up time to get people used to the new platform. Once you're there, the features and functionality of BlazeMeter will let you do things that were absolutely not feasible on your previous platforms.

We don't really leverage the actual test data integration and creation functionality, but we leverage some of the synthetic data creation. BlazeMeter will let you synthetically generate data for load tests, API, or mock services. We have leveraged that, but we have not leveraged some of the more advanced functionality that ties in with test data management.

The ability to create both performance and functional testing data is not very important to us. A lot of the applications we test are very data-dependent and depend on multiple downstream systems. We don't leverage the synthetic data creation as much as some other organizations might.

We don't extensively use BlazeMeter's ability to build test data on-the-fly. We use it to synthetically generate some test data, but a majority of our applications rely on existing data. We mine that in the traditional sense. We don't generate a lot of synthetic test data or fresh test data for each execution.

BlazeMeter hasn't directly affected our ability to address test data challenges. We don't directly leverage a lot of the test data functionality built into BlazeMeter, but we're trying to move in that direction. We have a lot of other limitations on the consumer side that don't really let us leverage that as much as we could. It certainly seems like a great feature set that would be very valuable for a lot of customers, but so much of our testing is done with existing data.

We haven't had any significant challenges with getting our teams to adopt BlazeMeter. There were just typical obstacles when trying to get people to adopt anything that's new and foreign to them. Once most of our users actually spent time using the platform, they really enjoyed it and continued to use it. 

There were no significant hurdles. Their UI is very well-designed and user-friendly. Perforce puts a lot of effort into designing its features and functionalities to be user-friendly. I've participated in a few sessions with them for upcoming features and wireframes of new functionalities.

Which deployment model are you using for this solution?

Hybrid Cloud

If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?

Google
Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
Vikram Vallabhineni - PeerSpot reviewer
Senior Performance tester at CS
Real User
Top 5
A tool with an easy initial setup that needs to offer more plug-ins and extensions
Pros and Cons
  • "The product's initial setup phase was simple."
  • "From a performance perspective, BlazeMeter needs to be improved... BlazeMeter does not have extensions for WebSockets or Java Applets."

What is our primary use case?

Whenever I am not able to record a script with JMeter, I use the BlazeMeter extension to record the scripts. Whenever there is a need to execute something in the cloud, my company uses BlazeMeter.

What is most valuable?

There are some advantageous features available in BlazeMeter. The HTML reports that can be downloaded from BlazeMeter can be shown to clients to give them a clear picture in a clean way, so that even a layperson can understand the metrics our company presents with the help of BlazeMeter.

What needs improvement?

Whenever we use BlazeMeter for ramp-up and for designing scenarios, we also use JMeter or other load testing tools because they provide the convenience of maintaining granularity in seconds. In BlazeMeter, the ramp-up and ramp-down granularity is limited to minutes, which is an area where improvement is required so that granularity can be set in seconds. From a performance perspective, BlazeMeter needs to be improved.

On the development side, JMeter has plug-ins and other extensions for WebSockets, and LoadRunner provides similar extensions. BlazeMeter does not have extensions for WebSockets or Java Applets. Decoding scripts for applications that use Java Applets is not possible with BlazeMeter, or even with JMeter, and that includes some Oracle and desktop applications, too.

For how long have I used the solution?

I have experience with BlazeMeter.

What do I think about the stability of the solution?

I didn't get much opportunity to work on the tool, but in my experience, BlazeMeter serves as an alternate tool whenever my company faces hurdles or challenges with JMeter and k6. To record the scripts, I use the BlazeMeter extension. With BlazeMeter, it is very easy to identify whether a request belongs to an application or not, and in the initial phase itself, we can discard the requests that don't belong to the application. In BlazeMeter, depending upon the requests, users can select a particular domain.

What do I think about the scalability of the solution?

Though the product is scalable, I found the tool to be a bit tricky when setting up scenarios for stress tests and for step-up and step-down load patterns.

Which solution did I use previously and why did I switch?

I have experience with Grafana and JMeter.

When comparing JMeter and BlazeMeter, there are a lot of differences to observe. JMeter is an open-source tool that works with Java, so our company has to concentrate on the setup: the systems need a fairly high-end configuration, and the heap size depends on the load we use. JMeter can be used in GUI mode or in non-GUI mode. For a smoke test and the preparation of scripts, the GUI mode can be used, but for the actual execution of load testing, we have to go with JMeter's non-GUI mode. In non-GUI mode, I can see the error percentage while the test runs, but I can't see what kind of errors the application produced; I have to wait until the entire test completes.

With the BlazeMeter cloud as a licensed tool, we don't have to deal with any of that configuration setup. Whenever our company executes a load test, we can monitor in parallel what kind of errors we are getting, and if possible, we can talk to the development team in parallel and resolve the issues, so we don't need to wait for tests that may run for thirty minutes to an hour to complete. These days everything runs in the cloud, and every request and every click counts toward the cost, so with BlazeMeter there is no need to wait an hour or until the end of the testing phase. BlazeMeter provides better reporting, but it takes much longer to generate it, which is an area of concern where improvements are required. It is not always sufficient to use only BlazeMeter.
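
To make the non-GUI part of that comparison concrete, here is a small Python sketch that runs a JMeter test plan in non-GUI mode and only computes the error percentage from the results file after the run has finished, which is exactly the gap a live dashboard closes. The file paths are made up, and the column handling assumes JMeter's default CSV results format.

```python
# Sketch: run JMeter in non-GUI mode from a script, then work out the error rate
# from the results file afterwards. Paths are illustrative; the "success" column
# is part of JMeter's default CSV output but should be confirmed for your setup.
import csv
import subprocess

subprocess.run(
    ["jmeter", "-n", "-t", "load-test.jmx", "-l", "results.jtl"],
    check=True,
)

total = errors = 0
with open("results.jtl", newline="") as f:
    for row in csv.DictReader(f):
        total += 1
        if row.get("success", "true").lower() != "true":
            errors += 1

print(f"{total} samples, {errors} errors ({100.0 * errors / max(total, 1):.2f}% error rate)")
```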

How was the initial setup?

The product's initial setup phase was simple.

The solution can be deployed in less than five minutes.

What's my experience with pricing, setup cost, and licensing?

When compared with the cost of the licenses of other tools, BlazeMeter's license price is good.

What other advice do I have?

Currently, I am looking for support for Java Applet and Oracle applications. I want something that will support load testing for applications built on Java Applets and Oracle technologies.

I rate the overall tool between six and seven out of ten.

Disclosure: My company does not have a business relationship with this vendor other than being a customer.
reviewer1511478 - PeerSpot reviewer
QA Automation & Perform Lead (C) at Canadian Tire
Real User
A highly stable cloud-based tool with an impressive depth and breadth of functionality
Pros and Cons
  • "Using cloud-based load generators is highly valuable to us, as we can test from outside our network and increase load generation without having to upscale our hardware as much. The cloud load generator is there when we need it and is the feature we leverage the most."
  • "We encountered some minor bugs, and I would like to have the ability to add load generators to workspaces without having to use APIs. We can't do that now, so we're beholden to the APIs."

What is our primary use case?

We use the solution for enterprise performance testing of various technologies including web services, APIs, and web GUIs.

We deployed the solution to increase our performance testing footprint, which we needed to upscale for the maturity of our operation. 

We have six on-prem load generators on our network, and the rest of our deployment is in the cloud. It's a very simple architectural design.

How has it helped my organization?

BlazeMeter opened up performance testing for us. Our old solution was a client-based performance testing tool, and for staff to access it, they needed to remotely connect to a Windows VM and book time with that controller. Now our tool is web-based, and we onboarded 12 to 14 teams to BlazeMeter, which would not have happened before. Our CoE team was the go-to for performance testing, but the solution has opened up the practice to the whole enterprise, making teams more self-sufficient, and that's the most significant benefit. Performance testing is no longer segregated to one team.

What is most valuable?

Using cloud-based load generators is highly valuable to us, as we can test from outside our network and increase load generation without having to upscale our hardware as much. The cloud load generator is there when we need it and is the feature we leverage the most.

We have a very high opinion of the range of test tools the solution provides; it has a great deal of potential, and we are just scratching the surface of it currently. As our maturity and skillset with the product increase, we'll be able to leverage that more. For example, we don't really use mock services yet. We know how to, but we're still set in some of our ways.

BlazeMeter being cloud-based and open-source is vital; it was one of our top priorities when choosing a solution. Much like the rest of the world, we're moving away from the old paradigm of the Windows days where we would bring up a server, get Windows licenses, an operating system, and maintain it all. With BlazeMeter, most of that is done for us, and we don't have to worry about infrastructure. We have on-prem load generators for teams needing to run load tests from within our network, and we need to maintain that capacity. However, we don't have to host anything outside of the load generators in the network, so the maintenance effort and cost are much less than they would be as a legacy system.  

The solution does bridge Agile and CoE teams. It's a shift-left tool, and testing comes in much earlier than in the past. BlazeMeter is a valuable asset in this regard. 

The tool helped us to implement shift-left testing. Many of our teams with the required skillset can include performance testing as part of their build runs. This may not be high-level testing; internally, we refer to it as early performance testing. It allows teams to confirm the software is functioning correctly early, which was not the case before. We would wait until a certain point in the SDLC before running a performance check, and now we're able to implement that much earlier in the process. 

We currently don't have any stats on changes in our test cycle times, but there is no doubt in my mind that BlazeMeter improved our software quality.

We have not faced challenges in getting multiple teams to adopt BlazeMeter. We onboarded around 50 users in three quarters, which is incredible considering we had two performance testers before implementing the solution. Our only challenge is skill sets: our staff want to adopt the tool and understand its importance, but they may not have the resources or skill set to do so. Those with the necessary skill set are onboarded as soon as their project is greenlighted.

What needs improvement?

Our biggest challenge is the skill set required to operate the solution because we used to have a centralized performance testing team. Now we've opened it up to other teams; some needed to onboard new resources. The solution is simple and user-friendly, but we still need the right staff to use it.

We encountered some minor bugs, and I would like to have the ability to add load generators to workspaces without having to use APIs. We can't do that now, so we're beholden to the APIs.

For how long have I used the solution?

We have been using the solution for about nine months.

What do I think about the stability of the solution?

The solution is very stable. We had a few issues with users getting 404 errors recently, but that's the first time we have encountered any problems in three quarters. 

What do I think about the scalability of the solution?

The scalability is incredible. We could scale it to be as big or small as we want, with our license being the sole limitation. The resources run as Docker containers built from Docker images. We could scale within a few minutes if necessary.

How are customer service and support?

The technical support is excellent. When we had hiccups during deployment, they responded quickly with effective solutions for us.

How would you rate customer service and support?

Positive

Which solution did I use previously and why did I switch?

We used other tools and switched because they weren't as user-friendly. BlazeMeter offered us the ability to increase our performance testing footprint without requiring a high level of performance testing expertise from our QA staff. Additionally, our old solutions were client-based, and BlazeMeter is cloud-based, providing all the advantages that come with that.

How was the initial setup?

The deployment is very straightforward. That was one of our criteria, as we didn't want a complex new enterprise solution rollout. There were a few bumps during deployment, but most of that was on our side. BlazeMeter is relatively simple compared to other enterprise solutions we implemented.

Less than ten staff were involved in the deployment. We used Linux Enterprise to house the six on-premise load generators, and there were a couple of employees responsible for Docker, our solutions architect, and myself as the admin.

What was our ROI?

I don't have a concrete figure, but I can say once we sunset our old solution, that will save us a significant amount of money on infrastructure, licensing, and maintenance. I also think there is an ROI associated purely with the increased quality of our software, thanks to BlazeMeter.

What's my experience with pricing, setup cost, and licensing?

The product isn't cheap, but it isn't the most expensive on the market. During our proof of concept, we discovered that you get what you pay for; we found a cheaper solution we tested to be full of bugs. Therefore, we are willing to pay the higher price tag for the quality BlazeMeter offers.

Which other solutions did I evaluate?

We carried out a proof of concept of four tools, which included BlazeMeter. It's more stable and mature, with well-documented APIs. BlazeMeter University was a significant consideration for us due to our requirements; it helped us roll out the solution to multiple teams. It seemed like APIs for the other solutions were an afterthought.

What other advice do I have?

I would rate the solution an eight out of ten. 

The solution enables the creation of test data for performance and functional testing, but our use is focused on performance testing. We don't particularly use functional testing, but we are currently talking about using test data management for functional testing. We have our in-house automation framework, so the ability to create both functional and performance test data isn't a high priority for us.  

We don't use BlazeMeter's ability to build test data on-the-fly, not because we aren't aware of it, but because we are still at the early stages with the solution. Until fairly recently, just one other person and I were in charge of performance testing for the entire company, so having self-sufficient teams is an immense change for us as an organization.

I would say it's critical to have the appropriate skill sets among the staff, as we could deploy just about any solution in an enterprise, but it won't be used to its total capacity without the proper skills. BlazeMeter showed us how little performance testing we were doing before and how vital increasing that footprint is. We've onboarded 50 users; that's 50 users who were not engaged less than a year ago and can all carry out performance testing.

This solution can work very well for enterprise companies with a more advanced skill pool to draw from. For beginners in this area, specific skills such as JMeter scripting are required to use the application. It's easier to use than most solutions but requires a particular skill set to deploy and operate successfully. A good solutions architect and QA leads are essential in evaluating any product.

Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
QA Automation Engineer with 201-500 employees
Real User
The action groups allow us to reuse portions of our test and update multiple tests at once
Pros and Cons
  • "The feature that stands out the most is their action groups. They act like functions or methods in code, allowing us to reuse portions of our tests. That also means we have a single point for maintenance when updates are required. Instead of updating a hundred different test cases, we update one action group, and the test cases using that action group will update."
  • "The performance could be better. When reviewing finished cases, it sometimes takes a while for BlazeMeter to load. That has improved recently, but it's still a problem with unusually large test cases. The same goes for editing test cases. When editing test cases, it starts to take a long time to open those action groups and stuff."

What is our primary use case?

We have a couple of use cases for BlazeMeter. One is performance testing. It allows us to aggregate the execution and reporting of our performance tests. We can also create automated functional tests relatively quickly compared to writing tests in a coded platform like Java.

Around 20 people in the QA department are using BlazeMeter to test Mendix-based applications. We're doing regression testing on 22 applications, and we have at least two environments that we interact with regularly: a development environment and a pre-production environment.

How has it helped my organization?

Before BlazeMeter, we didn't have a performance test aggregator. We were running one-off JMeter tests that weren't stored in a repository. JMeter can generate some reporting, but it's nowhere near as nice as what BlazeMeter provides, and BlazeMeter's reporting is more readily understood by the development teams that we work with and by management. That part is great.

We initially purchased the tool for performance testing, but we discovered that we had access to functional testing, so we started using that. That's been great for a lot of the same reasons. It increases visibility and gets everybody on the same page about which tests can run and the status of our regression and functional tests.

BlazeMeter can create test data for performance and functional testing. We don't have much use for that currently, but I could see that being useful for individual functional tests in the future. It's nice to have automatic data generation for test cases.

We haven't used BlazeMeter for shift-left testing. The functional testers embedded with the sprint teams don't do automation. That's all kicked down the road, and the automation is done outside of the sprint. While there is a desire to start attacking things that way, it never really got any traction.

I believe BlazeMeter has also reduced our test times, but I can't quantify that.

It's helped us with our test data challenges. They have a lot of great implementation there, and I don't want to detract from that, but we have some problems with our applications and some custom components. We work on a different platform than many other people do, so it probably hasn't been as beneficial to us as it would be for many others.

What is most valuable?

The feature that stands out the most is their action groups. They act like functions or methods in code, allowing us to reuse portions of our tests. That also means we have a single point for maintenance when updates are required. Instead of updating a hundred different test cases, we update one action group, and the test cases using that action group will update.

The process is pretty straightforward. You can enter data into spreadsheets or use their test data generation feature. You can create thousands of data points if you want. We aren't currently using it to create that much data, but it could easily be used to scale to that. The solution includes a broad range of test tools, including functional tests, performance tests, API testing, etc. They're continuously expanding their features. 
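
As a plain-Python illustration of that kind of volume (this is not BlazeMeter's built-in generator), here is a sketch that writes a few thousand rows of synthetic test data to a CSV file that a data-driven test could consume. The column names and value ranges are invented for the example.

```python
# Plain-Python illustration (not BlazeMeter's built-in generator) of producing a few
# thousand rows of synthetic test data for a data-driven test to consume as CSV.
import csv
import random
import uuid

ROWS = 5000  # "thousands of data points"

with open("test-data.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["username", "email", "account_type", "amount"])
    for _ in range(ROWS):
        user = f"user_{uuid.uuid4().hex[:8]}"
        writer.writerow([
            user,
            f"{user}@example.com",
            random.choice(["basic", "premium", "business"]),
            round(random.uniform(1, 1000), 2),
        ])

print(f"Wrote {ROWS} rows to test-data.csv")
```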

I also like that it's a cloud-based solution, which gives me a single point of execution and reporting. That's great because we can take links to executed test cases and send those to developers. If they have questions, the developers can follow that link to the test and duplicate it or run the test for themselves.

A cloud solution can be a little bit slower than an on-premises client or maintaining test cases locally on our machine. However, we've also run into issues with local maintenance. Sometimes people mess up and forget to push the latest changes to the repository. That's not a problem with BlazeMeter because we're doing all the work in the cloud.

Out of all the functional tests, scriptless testing has been the standout piece for my team because it's cloud-based. It's easy for everybody to get into the navigation, and it's pretty intuitive. There's a recorder that's already built into it. It's easy to get started writing test cases with scriptless testing.

BlazeMeter's object repository provides a single point of update for us with regard to locators or selectors for our web elements. It's the same with the action groups. It's incredibly valuable to have reusable action groups that give us a single point for maintenance. It saves a ton of maintenance time.

What needs improvement?

The performance could be better. When reviewing finished cases, it sometimes takes a while for BlazeMeter to load. That has improved recently, but it's still a problem with unusually large test cases. The same goes for editing test cases. When editing test cases, it starts to take a long time to open those action groups. 

For how long have I used the solution?

We've been using BlazeMeter for a little more than a year now.

What do I think about the stability of the solution?

BlazeMeter is pretty solid. The only complaint is performance. When we get massive tests, we run into some issues.

What do I think about the scalability of the solution?

We've never had issues with scalability. We've got hundreds of tests in BlazeMeter now, and we haven't had a problem aside from some performance problems with reporting. 

How are customer service and support?

I rate BlazeMeter support ten out of ten. The BlazeMeter team has been fantastic. Anytime we need something, they're always on it fast. We have regular meetings with the team where we have an opportunity to raise issues, so they help us find solutions in real-time. That's been great.

How would you rate customer service and support?

Positive

Which solution did I use previously and why did I switch?

We were previously using Java and Selenium. We implemented BlazeMeter for the performance testing. When we discovered the functional test features, it was easy to pick up and start using. It was an accident that we stumbled into. Our use grew out of an initial curiosity of, "Let's see if we can create this test." And, "Oh, wow. That was really quick and easy." And it grew from there into a bunch more tests.

How was the initial setup?

Our DevOps team did all the setup, so I wasn't involved. We have faced challenges getting our functional test teams to engage with BlazeMeter. They don't have automation experience, so they're hesitant to pick it up and start using it. We've made a couple of attempts to show them how to get started with scriptless, but the incentive has not been good enough. Generally, it's still the regression team that handles the automation with BlazeMeter, as well as whatever else we're using.

After deployment, we don't need to do much maintenance. Sometimes, we have to update test cases because they break, but BlazeMeter itself is low-maintenance.

What was our ROI?

We've seen a return. I don't know exactly how many test cases are in BlazeMeter now, but we've added quite a few functional test cases in there. It's the tool our performance testers use right now in conjunction with JMeter.

What's my experience with pricing, setup cost, and licensing?

I can't speak about pricing. My general evaluation isn't from that standpoint. I make the pitch to the leadership, saying, "I think we should get this," and somebody above me makes a decision about whether we can afford it.

Which other solutions did I evaluate?

We looked at other solutions for performance testing, not functional testing.

A few points about BlazeMeter stood out. One was BlazeMeter's onboarding team. They seemed more helpful and engaged. We had a better rapport with them initially, and their toolset integrated well with JMeter, the solution we were already using. It's also a much more cost-effective solution than the other options.

What other advice do I have?

I rate BlazeMeter nine out of ten. There's still some room to grow, but it's a pretty solid product. If you're comparing this to other tools and you're thinking about using BlazeMeter for functional testing, take a look at the action groups, object library, and test data generation features. Those three things make your day-to-day work a lot easier. It simplifies creating and maintaining your tests. 

Which deployment model are you using for this solution?

Public Cloud

If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?

Other
Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
Mahesh Bontha - PeerSpot reviewer
Quality Assurance Architect at Healthonus
Real User
An easy-to-use tool with a great interface and report-generation capabilities
Pros and Cons
  • "It is a stable solution. When we compare BlazeMeter with other tools in the market, I can say that the solution's overall performance has also been very good in our company."
  • "I don't think I can generate a JMX file unless I run JMeter, which is one of my concerns when it comes to BlazeMeter."

What is most valuable?

BlazeMeter is a very good tool for adding users and ramping things up; these are a few of its very good features.

What needs improvement?

BlazeMeter is a very handy tool that operates through drag and drop, but I don't think I can generate a JMX file unless I run JMeter, which is one of my concerns when it comes to BlazeMeter. In our company, we are mostly unable to capture logs or events with BlazeMeter. We want BlazeMeter to simulate a mobile app, especially since our company deals in mobile apps, and we wish to conduct that testing using BlazeMeter. The solution has been good so far, but the JMeter dependency is one area that has been tricky for me since I cannot generate events.

I cannot speak about a particular weakness in the tool, but it is a tricky product since those who want to use it need to depend on another tool called JMeter. JMeter is required to get the scripts and JMX file before being able to run on BlazeMeter.

In our company, an APK is generated whenever we develop a mobile app, and when I drag and drop it, a JMX script should be generated, but that feature is not included in the solution. This is the area where the solution is lacking and could be improved.

For how long have I used the solution?

I have been using BlazeMeter for two months. I am currently an end user of the tool using BlazeMeter's trial version.

What do I think about the stability of the solution?

It is a stable solution. When we compare BlazeMeter with other tools in the market, I can say that the solution's overall performance has also been very good in our company.

What do I think about the scalability of the solution?

It is a scalable solution, but our company currently uses the tool's free version, and we have not opted for its paid version. Considering that, I can't comment much on the solution's scalability, though I have heard from one of my friends that the product's scalability is good.

Around 50 people can use the product in my company.

How are customer service and support?

In my company, we haven't contacted the solution's technical support since we are still exploring the product as we are a startup company. We are conducting a trial of all the tools available to us so that we can choose the ones that suit our company at the end of the process.

How was the initial setup?

The tool's implementation is done since my company deals more in mobile apps than web apps.

Which other solutions did I evaluate?

My company is a health app provider, which makes our process and business quite different from others in the market. We want a product, not just an API, to test the performance of our company's apps, so we consider BlazeMeter to be a good option.

My company is looking for options, like LoadRunner tools, that can be a better choice than BlazeMeter.

My company needs to search for better options since we feel that we will have around a million users once we launch our health app in India. I want a tool that can help me test the app's performance, especially if a million users are using it.

What other advice do I have?

BlazeMeter is a tool that is easy to use.

Its interface and report-generation capabilities make the tool very handy for its users. The only tricky area is that BlazeMeter runs on JMeter, an open-source tool, which makes that part very complex for me.

There are different technical stacks in the market in which one needs to invest, and after the testing phase, one may go for an expensive product. Once there is a stable product in the market and the company can generate revenue, it is feasible to go for the paid version; until then, JMeter is the option I can recommend to others. BlazeMeter's paid version can be a bit expensive compared to JMeter.

I rate the overall product a nine out of ten.

Disclosure: My company does not have a business relationship with this vendor other than being a customer.
reviewer2122104 - PeerSpot reviewer
VP QA Performance Engineer at a financial services firm with 1,001-5,000 employees
Real User
User-friendly, comprehensive analysis, and highly scalable
Pros and Cons
  • "The most valuable aspect of BlazeMeter is its user-friendly nature, its ability to conduct distributed load testing, and its comprehensive analysis and reporting features. It particularly excels in providing a clear and organized view of load test results."
  • "BlazeMeter has room for improvement in terms of its integration with GitLab, particularly in the context of CI/CD processes. While it has multiple integrations available, the level of integration with GitLab may need further enhancements. It is known to work well with Git and Jenkins, although the extent of compatibility with GitLab is uncertain."

What is our primary use case?

The use cases of BlazeMeter encompass a wide range of scenarios, including load testing at the API, web service, or web application level. The primary purpose is to simulate various types of load. For instance, if the load needs to originate from distributed locations, opting for a dedicated cloud solution is advisable. This allows testing applications from diverse geographic locations and handling traffic from different tiers effectively. The BlazeMeter cloud is particularly recommended for this situation, as it efficiently manages the load-generation infrastructure and resolves the technical intricacies associated with infrastructure maintenance.

It simplifies the process by emphasizing the key aspects of writing, uploading, and running scripts for testing purposes.

What is most valuable?

The most valuable aspect of BlazeMeter is its user-friendly nature, its ability to conduct distributed load testing, and its comprehensive analysis and reporting features. It particularly excels in providing a clear and organized view of load test results.

What needs improvement?

BlazeMeter has room for improvement in terms of its integration with GitLab, particularly in the context of CI/CD processes. While it has multiple integrations available, the level of integration with GitLab may need further enhancements. It is known to work well with Git and Jenkins, although the extent of compatibility with GitLab is uncertain.

For how long have I used the solution?

I have used BlazeMeter within the last 12 months.

What do I think about the scalability of the solution?

BlazeMeter is a highly scalable solution. The solution is SaaS and the cloud vendor controls the scalability.

How are customer service and support?

I have not used the support from the vendor.

How was the initial setup?

The initial setup of BlazeMeter is straightforward.

What other advice do I have?

I rate BlazeMeter an eight out of ten.

Disclosure: My company does not have a business relationship with this vendor other than being a customer.
reviewer1934670 - PeerSpot reviewer
Technology services specialist at a financial services firm with 1,001-5,000 employees
Real User
Brings agility and efficiency, and with real-time data, helps in understanding the behavior of an application at all stages of the test
Pros and Cons
  • "For me, the best part is that we can graphically see the test result at runtime. It helps us understand the behavior of the application during all stages of the test."
  • "The Timeline Report panel has no customization options. One feature that I miss is a time filter, which I had in ELK; for example, filtering only the requests that took less than 5 seconds."

What is our primary use case?

Previously, to perform performance tests, we had to connect servers in the cloud, configure them to perform the test, and plot the results on a dashboard. BlazeMeter came to simplify all this work.

In terms of deployment, we are using local Linux servers (RHEL 7), and for the cloud, we are using EC2 servers with Amazon Linux. Our cloud provider is Amazon AWS.

How has it helped my organization?

With BlazeMeter, our main gains were in agility and efficiency in the execution of performance tests and delivery of post-test reports.

It has helped us to implement shift-left testing. It has certainly helped us to speed up the tests, and with that, we gained time to carry out tests in all development cycles.

It has the ability to build test data on-the-fly, and this on-the-fly test data meets compliance standards, which is very important for us. Real-time data helps us understand the behavior at each level of the test. So, we can define numbers that an application needs to achieve in the test to classify it as being OK or not. This data helps a lot in the real-time investigation. By looking at each level, we can identify the exact moment of degradation or “break”.
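
As a simple illustration of the "numbers an application needs to achieve" idea, here is a small Python sketch that compares aggregate results against agreed targets to classify a run as OK or not. The metric names and figures are invented; in practice they would come from the test report.

```python
# Illustrative sketch: compare aggregate results against agreed targets to classify a
# run as OK or not. The stats here are hard-coded example values; in practice they
# would come from the test report or its API.
THRESHOLDS = {"p90_ms": 800, "avg_ms": 400, "error_rate_pct": 1.0}

run_stats = {"p90_ms": 742, "avg_ms": 315, "error_rate_pct": 0.4}  # example values

failures = [
    f"{metric}: {run_stats[metric]} > {limit}"
    for metric, limit in THRESHOLDS.items()
    if run_stats[metric] > limit
]

if failures:
    print("Run degraded / NOT OK:")
    for failure in failures:
        print("  -", failure)
else:
    print("Run OK: all metrics within agreed thresholds")
```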

It decreased our test cycle times. I believe that we saved at least 50% of the time in preparation for the execution. Using BlazeMeter has greatly simplified our performance testing experience, especially the preparation part.

What is most valuable?

For me, the best part is that we can graphically see the test result at runtime. It helps us understand the behavior of the application during all stages of the test.

BlazeMeter is a cloud-based and open-source testing platform, which is very important for us because we can be sure that we're using a tool that follows market trends and stays up-to-date.

What needs improvement?

The Timeline Report panel has no customization options. One feature that I miss is a time filter, which I had in ELK; for example, filtering only the requests that took less than 5 seconds.

For how long have I used the solution?

I have been using this solution for approximately 1 year.

What do I think about the stability of the solution?

It is very stable. We haven't seen any instability or unavailability issues so far.

What do I think about the scalability of the solution?

It is scalable as per our needs. In our working model, the only team that uses BlazeMeter is ours, the Performance team, whose mission is to bring performance tests to projects and squads.

How are customer service and support?

They are very good. In the beginning, they held a workshop with our team, and whenever we ask questions, we are attended to without any problem. I would rate them a ten out of ten.

How would you rate customer service and support?

Positive

Which solution did I use previously and why did I switch?

We didn't use any other solution. We performed the tests manually.

As soon as we got to know this tool, we realized how important it would be and the benefits it would bring to the company. Its main benefits have been gains in agility and efficiency. 

For the performance tests that we carry out in the company, we only use BlazeMeter. I don't know any other tools. My view of BlazeMeter is that it is a very mature tool that delivers what it has set out to deliver in an excellent way.

How was the initial setup?

I was not involved in its deployment. In terms of maintenance, the only maintenance is setting up new servers for use. This configuration is usually performed by us in the Performance team.

What was our ROI?

I don't have access to the information about its cost. So, I can't say if we have seen an ROI and if we have reduced our test operating costs.

Which other solutions did I evaluate?

We did not review other products.

What other advice do I have?

BlazeMeter brings agility and efficiency in the preparation and execution of performance tests. With this, we gain time which is used to increase the scope of tests and anticipate possible problems.

BlazeMeter didn't help bridge Agile and CoE teams because we have a specific team. So, there was no involvement of professionals who work with agile. We gained agility and efficiency, but there was no involvement of any external team.

I would rate BlazeMeter a nine out of ten.

Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
Manager at Vodafone
Real User
Top 20
Robust auto-correlation feature but the scanning capability needs improvement
Pros and Cons
  • "It has a unique programming dashboard that is very user-friendly."
  • "The scanning capability needs improvement."

What is our primary use case?

The solution is used as a performance testing system.

What is most valuable?

It has a unique programming dashboard that is very user-friendly. The auto-correlation feature is also robust.

What needs improvement?

The scanning capability needs improvement. 

For how long have I used the solution?

I have been using BlazeMeter for a year. 

What do I think about the scalability of the solution?

The solution is highly scalable. Five people are using the solution at present. I rate the scalability an eight out of ten. 

How was the initial setup?

The initial setup is straightforward. The deployment takes a few minutes, and a couple of people were involved in the process.

What other advice do I have?

Overall, I would rate the solution a seven out of ten.

Which deployment model are you using for this solution?

Public Cloud
Disclosure: My company does not have a business relationship with this vendor other than being a customer.