We use Apache JMeter for performance testing, along with Locust, a Python-based performance testing framework. We primarily do load testing.
Apache JMeter has its own pros and cons when compared to other tools. It is easy to use, and because it is open source, we can build our custom scripts and execute them. It provides other capabilities, such as integrating with a database and connecting to other application servers for monitoring and related functions.
We use the dynamic HTML reporting, which helps our test analysis by pinpointing bottlenecks based on the reports. We can identify the specific areas that need attention, troubleshoot them, and report to the development team.
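As a rough sketch of how that HTML dashboard is produced (the file and directory names here are assumptions, not the reviewer's actual setup), a non-GUI JMeter run can generate the report directly:

```shell
# Sketch: generate JMeter's HTML dashboard from a non-GUI run.
# loadtest.jmx, results.jtl, and report/ are placeholder names.
TEST_PLAN="loadtest.jmx"
RESULTS="results.jtl"
REPORT_DIR="report"

# -n non-GUI mode, -t test plan, -l results log, -e generate dashboard, -o output dir
REPORT_CMD="jmeter -n -t ${TEST_PLAN} -l ${RESULTS} -e -o ${REPORT_DIR}"

if command -v jmeter >/dev/null 2>&1; then
  ${REPORT_CMD}
else
  echo "jmeter not on PATH; would run: ${REPORT_CMD}"
fi
```

The generated dashboard includes response-time percentiles and error breakdowns, which is where bottleneck analysis like the above usually starts.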
The user-friendly GUI for creating and managing tests makes it very easy to drag and drop samplers. For example, if you want the HTTP sampler, you can drag and drop it and use it. There are other elements for configuration, and for viewing results there are listeners, such as View Results Tree, that we can also drag and drop. The UI is good in comparison with other tools.
Regarding integration with CI/CD pipelines, we can create Apache JMeter scripts and use the Docker image, running whatever scripting we have done from that image. We can build CI/CD pipelines that connect with Jenkins and GitHub, and then automate the end-to-end flow.
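A minimal sketch of such a pipeline step, assuming a community JMeter Docker image and placeholder file names (not the reviewer's actual pipeline):

```shell
# Sketch of a CI step (e.g., a Jenkins 'sh' step) that runs a JMeter test plan
# from a Docker image. The image name and paths are assumptions.
IMAGE="justb4/jmeter"    # a community JMeter image; substitute your own
MOUNT="$(pwd):/work"     # mount the checked-out workspace into the container

DOCKER_CMD="docker run --rm -v ${MOUNT} -w /work ${IMAGE} -n -t loadtest.jmx -l results.jtl"

# In the pipeline this command would actually be executed; here it is only shown.
echo "${DOCKER_CMD}"
```

Wiring this one command into a Jenkins stage is what makes the end-to-end flow automatable: the same script runs identically on every build agent.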
For connecting Jenkins to Apache JMeter, plugins are available, and we have used them. Apache JMeter also has third-party plugins, which are not native samplers. If we want custom test executions, we definitely use all the different plugins available for Apache JMeter.
The capability to simulate users has impacted our testing resources and outcomes. Apache JMeter is based on Java, which limits the number of users a single load generator can drive. However, it provides distributed load testing, where you can connect multiple PCs in a master/slave setup and generate load with any number of users. In the past, I have done load testing with 10,000 users by connecting an Apache JMeter distributed network in BlazeMeter. There is an updated cloud version, BlazeMeter, and I used that. It is very easy to launch load generators in BlazeMeter, run the test, and scale up beyond 10,000 users.
There are areas for improvement in Apache JMeter, as it mainly supports the web protocol now. It could support other protocols as well. With AI becoming more prominent, it could also implement features that generate code by itself based on the results, or provide suggestions. Based on current trends, AI needs to be integrated into Apache JMeter.
I have been using Apache JMeter for more than 10 years, specifically 10 to 12 years.
Regarding stability, previous versions of Apache JMeter were a little unstable, but the new versions are very stable. We use the N-1 version, which is the stable version. I have not seen any issues, but occasionally there might be some system hangs depending on your configuration. Other than that, there are no issues.
As for scalability, if you want to run a minimal number of users, one PC suffices. However, if you want to scale up to any number of virtual users, you have to set up the distributed network with a master and slave configuration. Then, you can scale based on your infrastructure.
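A minimal sketch of that master/slave setup, with placeholder IPs and file names (assumptions for illustration only):

```shell
# Sketch of JMeter distributed testing. The slave IPs and file names are placeholders.
SLAVES="10.0.0.11,10.0.0.12"

# On each slave (load generator), start the JMeter server first:
#   jmeter-server -Djava.rmi.server.hostname=<this-slave-ip>

# On the master, run the plan against all remote hosts in non-GUI mode:
MASTER_CMD="jmeter -n -t loadtest.jmx -R ${SLAVES} -l results.jtl"

if command -v jmeter >/dev/null 2>&1; then
  ${MASTER_CMD}
else
  echo "would run: ${MASTER_CMD}"
fi
```

Each slave generates its share of the load, so the total user count scales with the number of machines listed behind `-R`.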
The technical support for Apache JMeter comes from the open-source community. There are many channels available to get support. We can find all the information even in ChatGPT: just paste whatever error you are experiencing, and it will provide you with solutions in numerous ways. With AI models such as ChatGPT, troubleshooting issues has become very easy for us.
Positive
The deployment of Apache JMeter can be done on-premises or in the cloud, and it is very simple. Java is the prerequisite; ensure it is installed, then download the latest version of Apache JMeter. It is platform-independent, so you can run it on UNIX, Mac, or Windows operating systems. Thus, you can deploy it in the cloud and run it.
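As a sketch of that setup (the version number below is an assumption; check jmeter.apache.org for the current release):

```shell
# Sketch: JMeter deployment is essentially download-and-run; Java is the prerequisite.
# The version is an assumption; use the current release from jmeter.apache.org.
JMETER_VERSION="5.6.3"
JMETER_HOME="apache-jmeter-${JMETER_VERSION}"

# 1. Verify Java is available (a recent JDK/JRE):
if command -v java >/dev/null 2>&1; then
  echo "Java found"
else
  echo "Install Java first"
fi

# 2. Extract the downloaded archive and launch (GUI: bin/jmeter, non-GUI: bin/jmeter -n):
echo "tar -xzf ${JMETER_HOME}.tgz && ${JMETER_HOME}/bin/jmeter"
```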
Deployment takes just a matter of minutes, with a couple of hours needed to set up the distributed network. As long as the virtual machines are in the same subnet, connectivity between the master and the slaves can be established. If any issues arise, it might take some time to troubleshoot them.
Apache JMeter is open-source, so there is no pricing. The only concern is setting up your infrastructure to scale up the number of users, since the testing is distributed. You can set up your own virtual PCs, or use a paid offering such as BlazeMeter or OctoPerf; each vendor has its own pricing based on users and negotiations. With Apache JMeter itself, everything, including scripting and executions, is free, and you can set up your own network and scale up to any number of users without limit.
In my organization, there are two users, me and my teammate, who are working with Apache JMeter.
There is no maintenance required for the solution.
I definitely recommend Apache JMeter to other users because it is an open-source tool. Compared to LoadRunner, you save a significant amount of money. Installation is hassle-free when compared with LoadRunner, and there are no additional components compared to other tools. It is easy to use, and most organizations, around 70 to 80%, use it for performance testing.
On a scale of 1-10, I rate Apache JMeter an 8.
I work only with the performance testing tools and not with OpenText Professional Performance Engineering (LoadRunner Professional) for mobile and web applications.
With performance testing, I am the user of OpenText Professional Performance Engineering (LoadRunner Professional). I have procured it and am in charge of conducting performance testing across my organization. I have a team of architects and performance test engineers who are working for me.
The main use case is performance testing, and that's the main reason I'm using this product.
I have been working with this solution for almost five years.
We look at the scalability because we are using the SaaS model now. The license can be consumed on an as-needed basis. The scripting language is quite comfortable for us since we work with C and C++.
I see it is stable, though there are some glitches or latency sometimes. If we want to drill down further on what's happening, I need to check with my engineer. Overall, the installation was quite easy since OpenText Professional Performance Engineering (LoadRunner Professional) is SaaS-based.
I have mentioned many advantages about this product, but to discuss disadvantages or areas that could be improved, I would need to consult with my engineers who are working on it. So far I have not heard of any significant issues.
I can gather specific improvement suggestions from my architect and provide that information later.
The technical support is really excellent. That is something I completely agree with.
Positive
The pricing is always higher when compared to competitive tools such as NeoLoad.
When we compare OpenText Professional Performance Engineering (LoadRunner Professional) with Tricentis and other vendors, we see differences not only in pricing but also from a technical perspective. We don't use similar tools from different vendors. From OpenText, we have taken only LoadRunner Professional, which is mainly used to conduct load testing. Similarly, from Tricentis, we have only taken Tosca, which is for our functional testing tools across our applications. That's how we choose our tools.
I can provide additional advice after consulting with my team.
I am open to being contacted via email for any clarifications about my review.
I rate OpenText Professional Performance Engineering (LoadRunner Professional) an 8 out of 10.