We performed a comparison between OpenText LoadRunner Enterprise and OpenText UFT Digital Lab based on real PeerSpot user reviews.
Find out in this report how the two Performance Testing Tools solutions compare in terms of features, pricing, service and support, ease of deployment, and ROI.

"The initial setup was straightforward. I was able to download everything myself without any IT support."
"It offers easy integration with third-party tools like Dynatrace, Splunk, etc."
"With Performance Center, the version upgrade is easy. You just have to roll out the new patch or the new version."
"This product is better suited to large, enterprise-oriented organizations."
"The product is very user-friendly."
"It is also good for reporting purposes, which would be most familiar for QC and UFT users."
"The most valuable feature is the Vuser protocols."
"LoadRunner Enterprise's most valuable features are load simulation and creating correlation for parameters."
"There are numerous valuable features, such as automation and the capabilities that facilitate importing and synchronization between our platform, Jira, and Azure DevOps."
"For automation testing, the tool provides a record-and-playback option, which makes object detection easy."
"The solution is easy to use. There are features to orchestrate mobile testing, including mobile testing automation. You can test different devices at the same time."
"It is a complete solution for mobile application testing."
"The most valuable feature of this solution is virtualization."
"The fact that it allows users to test on real mobile devices instead of emulators is something that projects have told us is beyond compare."
"The product is easy to use."
"The installation has not been straightforward, and we have had many problems. We have had to re-install and try installing on a different machine, and we still have not been able to launch the LRE server itself."
"I know there are integrations with continuous testing. It's got tie-ins to some of the newer tools to allow continuous testing. I'd love to see us not have to customize it, but for it to be out of the box."
"LoadRunner Enterprise's reporting should be quicker, easier, and more flexible."
"Micro Focus's technical support could be more responsive."
"The price of this solution could be less expensive. However, this category of solutions is expensive."
"A room for improvement in Micro Focus LoadRunner Enterprise is automatic trending across multiple executions of a particular scenario. For example, if I run one scenario five times in a month, there is currently no way to see how it trended across those five executions. It would be great if Performance Center offered a view of all five executions within its dashboard, transaction by transaction; showing the time trends and how each execution performed would be an immense feature, and it should be visible to every user. Reporting should also be simpler: if a scenario is executed multiple times, there should be an option for a single view that shows all the transactions, how they performed, and the trend graph over time."
"We'd like the product to include protocol identifiers whenever a tester wants to test a new application."
"The product's scalability must be improved."
"We need to scale devices easily. Some customers would like to loop in AWS or other cloud providers to check whether their devices are cloud-capable. OpenText UFT Digital Lab needs to improve this."
"The product's object detection method needs to be improved, since better detection would help testers test more accurately."
"The documentation and user interface both need improvement."
"I would like to see more integration with automation tools."
"We like to host the tools centrally. We would need them to be multi-tenants, so different projects could log on and have their own set of devices and their own set of apps, and they wouldn't see data from other projects that are using it."
"For the most part, the key challenge is ensuring that customers fully utilize the product as intended and adopt the appropriate frameworks to implement the solutions effectively."
"They should introduce a pay-per-use subscription model."
OpenText LoadRunner Enterprise is ranked 5th in Performance Testing Tools with 81 reviews while OpenText UFT Digital Lab is ranked 6th in Mobile App Testing Tools with 16 reviews. OpenText LoadRunner Enterprise is rated 8.4, while OpenText UFT Digital Lab is rated 7.4. The top reviewer of OpenText LoadRunner Enterprise writes "Saves time and effort, and makes it easy to set up scenarios and execute tests". On the other hand, the top reviewer of OpenText UFT Digital Lab writes "Robust solution for application lifecycle management with numerous valuable features". OpenText LoadRunner Enterprise is most compared with OpenText LoadRunner Cloud, OpenText LoadRunner Professional, OpenText Silk Performer, Tricentis NeoLoad and Apache JMeter, whereas OpenText UFT Digital Lab is most compared with OpenText UFT One, Appium, Perfecto, AWS Device Farm and Sauce Labs. See our OpenText LoadRunner Enterprise vs. OpenText UFT Digital Lab report.
We monitor all Performance Testing Tools reviews to prevent fraudulent reviews and keep review quality high. We do not post reviews by company employees or direct competitors. We validate each review for authenticity via cross-reference with LinkedIn, and personal follow-up with the reviewer when necessary.