Micro Focus ALM/Quality Center serves as the single pane of glass to govern software quality and implement rigorous, auditable lifecycle processes. Designed for complex multi-application environments, organizations can achieve high efficiency in their testing and measure quality with a requirements-driven and risk-based testing approach. Advanced reporting provides a complete view across all releases to gain new insights and make informed decisions. With numerous deployment options, open integrations with common tools and strong data controls, ALM/Quality Center is a perfect choice for enterprises that need to enforce standards, ensure compliance and adapt to changing tools.
Micro Focus ALM Quality Center was previously known as HPE ALM, Quality Center, Quality Center, Micro Focus ALM.
Airbus Defense and Space, Vodafone, JTI, Xellia, and Banco de Crédito e Inversiones (Bci)
We are from a Vodafone department that manages testing and quality. We brought this tool in to assist us. We are constantly using it. 95 percent of projects are running on it.
We mostly use this solution on our laptop devices.
It is helping with our delivery, testing, and quality processes. It links all our test cases with defects. Users from across the globe can comment on a defect or attach artifacts to the defect cycle. ALM adds control with its integration.
We use it for visibility on multiple projects. We categorize all our deliveries into different domains and projects. Recently, we had a call with the technical team, and they suggested splitting our project into multiple domains and projects, since this account is not that big. We only have six to seven projects running in parallel, so we manage with one domain and one project, and all other projects are archived. We decided the way forward would be to split the one project into multiple domains. This way, if something goes wrong in the future, other projects will not be impacted by a problem with one project.
The solution’s ability to connect all related entities to reflect project status and progress is good. Right now, individual users are logging in with Single Sign-On and uploading their test cases. Performance usability is fine.
We have never experienced any security issues.
We can get an entire project into a single repository where we can view all the data in detail. This is where we keep all our test cases where everyone can reference them. This provides everyone access to the test cases and artifacts via the cloud. There is no need to contact anyone. It is the same with defects. It uses a common forum for tracking the defects and centralizing discussions.
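To make that single-repository idea concrete, here is a minimal Python sketch of pulling the open defects out of a project through ALM's REST interface. It is illustrative only: the authentication handshake and endpoint paths vary by ALM version, and the server URL, domain, project, and credentials are hypothetical placeholders, not our real values.

```python
# Illustrative sketch only: query open defects from an ALM project via its
# REST API. Paths and the auth flow vary by ALM version; all names below
# (server, domain, project, credentials) are hypothetical placeholders.
import requests
import xml.etree.ElementTree as ET

BASE = "https://alm.example.com/qcbin"       # hypothetical server URL
DOMAIN, PROJECT = "DEFAULT", "MY_PROJECT"    # hypothetical domain/project

session = requests.Session()
# Basic-auth handshake; ALM 12.x also expects a site-session afterwards.
session.post(f"{BASE}/authentication-point/authenticate",
             auth=("alm_user", "alm_password"))
session.post(f"{BASE}/rest/site-session")

resp = session.get(
    f"{BASE}/rest/domains/{DOMAIN}/projects/{PROJECT}/defects",
    params={"query": "{status[Open]}", "fields": "id,severity,name"},
    headers={"Accept": "application/xml"},
)
resp.raise_for_status()

# ALM returns an XML collection: <Entities><Entity Type="defect">...</Entity>
for entity in ET.fromstring(resp.content).iter("Entity"):
    fields = {f.get("Name"): (f.findtext("Value") or "")
              for f in entity.iter("Field")}
    print(fields.get("id"), fields.get("severity"), fields.get("name"))
```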
Test Lab: This is where we keep all the test cases and the mapping of all the defects. It's also for storing all the artifacts.
Defect management: This is a good feature and fulfills all our requirements. We use it for user and role management. Only the admins can see all the users' details.
We use the application's Single Sign-On feature. The usability is good. There are no access performance issues. It is easily understood, even for new users.
It takes time because it has a 360-degree view of all the processes when talking about test cases, design, and defects. There are so many things to track. Therefore, if I try to inject Micro Focus ALM into a small, agile delivery project, there is resistance. When there is resistance, is there flexibility for customization based on project scale? I don't know if this is possible.
Also, it adds time when I upload and execute all my test cases to Micro Focus ALM. For example, when I prepare test cases, I need to run them individually, then upload them to my sheet. After 10 days, I might have finished all my testing after tracking everything in Excel. Moving to ALM at this point adds time and overhead. It increases my testing timeline, e.g., if my testing takes eight days, when I add on time for ALM, the testing time is now 10 days.
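If the Excel-then-ALM double entry can't be avoided, scripting the upload can claw back some of that overhead. Below is a sketch under assumptions: it reuses the authenticated `session`, `BASE`, `DOMAIN`, and `PROJECT` from the earlier defect-query sketch, the workbook name and column order are made up, and the mandatory fields for a new test depend on each project's customization, so a real project may require more.

```python
# Hypothetical sketch: push Excel-tracked test cases into ALM in one pass.
# Reuses `session`, `BASE`, `DOMAIN`, `PROJECT` from the previous sketch.
# Workbook name and columns are assumptions; mandatory test fields depend
# on each project's customization, so extra fields may be needed.
import openpyxl
from xml.sax.saxutils import escape

def test_entity(name: str, description: str) -> str:
    """Build the XML entity body ALM's REST API expects for a manual test."""
    return (
        '<Entity Type="test"><Fields>'
        f'<Field Name="name"><Value>{escape(name)}</Value></Field>'
        f'<Field Name="description"><Value>{escape(description)}</Value></Field>'
        '<Field Name="subtype-id"><Value>MANUAL</Value></Field>'
        '</Fields></Entity>'
    )

wb = openpyxl.load_workbook("test_cases.xlsx")   # hypothetical workbook
for name, description in wb.active.iter_rows(min_row=2, max_col=2,
                                             values_only=True):
    if not name:                                 # skip blank rows
        continue
    resp = session.post(
        f"{BASE}/rest/domains/{DOMAIN}/projects/{PROJECT}/tests",
        data=test_entity(name, description or ""),
        headers={"Content-Type": "application/xml",
                 "Accept": "application/xml"},
    )
    resp.raise_for_status()
    print("uploaded:", name)
```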
The version of Micro Focus ALM that we use only works through Internet Explorer (IE). We have to communicate to everyone that they can only use IE with the solution. This is a big limitation. We should be free to use any type of browser or operating system. We have customers and partners who are unable to log into the system and enter their defects because they work on a different operating system.
Since 2011, we have been using Test Director, which became HP ALM, and finally Micro Focus ALM Quality Center.
We haven't seen any issues when working with multiple projects. Maybe once a year, we have an issue with stability.
When this solution was upgraded to version 12.55, we saw some performance issues. We raised this as an incident, and the team has worked on it and provided us with results. We have also seen performance issues that may not be related to ALM, such as data latency or remote working conditions. We are raising these issues with the Micro Focus team, though.
We have experienced some scalability issues. I would rate the scalability as an eight to nine out of 10.
We have about 50 to 60 users logging into the solution.
I would rate our Qatar technical support as 10 out of 10. Our technical support person is always available to help us. We are very thankful for the service and support. The communication is excellent. However, when we have an issue, e.g., an application is not working or throwing errors, and we raise a ticket, it takes time to resolve. This should be improved.
When we were installing for the first time, it was not simple. We could not just go to the URL and install. There were some initial installation problems with IE, where we had to add the URL as a trusted site. This had to be done by an admin, which takes time. I would like to see this improved.
After the installation, we didn't have any problems with deployment or integration into our environment.
We can open this solution by URL and access the application, which runs on the server. We do have a restriction when installing infrastructure applications: we have to ask our IT team to have our admin install them.
Admins should not need to install objects directly into the application. This should be done directly on the server or in the cloud.
We don't do any maintenance. The solution is SaaS and managed by the Micro Focus team.
The solution has saved time with background activities and helped my delivery move forward. However, this application is a support function for our delivery.
Compared to the market, the price is high.
We just renewed our licenses, which took time to do. I think we have 30 concurrent licenses.
The world is changing to open source code and free applications. This may be an issue in the future.
In one of our agile projects that was going into a sprint, we started using Jira about a year ago. This was for a small delivery project whose team felt more comfortable using it. For example, if a tester raises a defect in ALM, there are many fields, including those that we have customized. It takes time to raise a defect, then close it. Since it takes time, the project team decided Jira is quicker, and it is also open source. On the other hand, they agree that Micro Focus ALM is better overall, e.g., in the way it keeps information and provides reports. Because the team didn't need a lot of information as part of their delivery, they went with Jira.
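A rough illustration of the "Jira is quicker" point: Jira's REST API will accept a new bug with just a project key, a summary, and an issue type, whereas a customized ALM defect form can demand many more mandatory fields. The server URL, project key, and credentials below are placeholders, not real values.

```python
# Hypothetical sketch: the minimal payload Jira accepts for a new bug.
# Server URL, project key, and credentials are placeholders.
import requests

resp = requests.post(
    "https://jira.example.com/rest/api/2/issue",
    json={"fields": {
        "project":   {"key": "PROJ"},        # hypothetical project key
        "summary":   "Checkout page times out under load",
        "issuetype": {"name": "Bug"},
    }},
    auth=("jira_user", "jira_password"),
)
resp.raise_for_status()
print("Created issue", resp.json()["key"])
```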
We are happy. It is a good product. We have benefited from the tool and recommend it. We have received very good feedback regarding its use. From a user perspective, the ability to create test cases and manage defects is excellent.
We are planning to integrate automation with Micro Focus ALM. This is in development.
We are doing risk-based testing using manual generation of the script, then uploading it.
To use the flexibility of going from a requirement to my test cases, and get the benefit of traceability per the SDLC process, I would need to keep and map all my requirements. It is up to the user whether to use this feature. While I know this feature is there, we are currently not using it. We are managing traceability manually: we prepare and keep all our test cases in Excel, and as the test cases build up, we manually map them to our requirements.
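A small script can take over that manual cross-check. The Python sketch below, with assumed file and column names, joins a requirements sheet to a test-case sheet and flags requirements that no test covers, which is essentially the coverage view ALM's requirements module would give us if we used it.

```python
# Sketch of our manual Excel traceability, automated: join requirements to
# test cases on a shared requirement ID and flag uncovered requirements.
# File names and column names are assumptions.
import pandas as pd

reqs = pd.read_excel("requirements.xlsx")    # columns: req_id, title
tests = pd.read_excel("test_cases.xlsx")     # columns: test_id, req_id

matrix = reqs.merge(tests, on="req_id", how="left")
uncovered = matrix[matrix["test_id"].isna()]

print(f"{len(uncovered)} of {len(reqs)} requirements have no test case")
print(uncovered[["req_id", "title"]].to_string(index=False))
```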
We are not currently using test case mapping. This is a feature of ALM that would allow us to map our requirements, solutions, and everything the tests miss. We had a call with the Micro Focus technical team about this and about how we can use other features.
Consolidated testing process, centralised reporting, ease of analytics on metrics, easier bug management, consistent flow of requirements and test cases, reusable test cases, testing history, and bug history.
It has improved our organization as a result of several factors: all test assets are in one central location; it is easier to track the progress of QA activities; reporting is easier; and it is easier to assess quality.
Requirements Management, Test Plan, Test Lab, Defect Management, Sprinter, Access control, Versioning and audit.
The project tracking is a bit complex. It takes some time to maneuver around it. It would also help if you could export some of the reports generated from it, e.g., the Master Plan.
More than five years.
No
Can be used for really large organisations, multiple test projects
Customer Service:
Very responsive, though we haven't needed a lot of support.
Technical Support:
Technical Support has been helpful.
Initial setup was straightforward.
Vendor. The team was very qualified, both technically and from a user perspective.
We haven't yet computed the ROI.
Original cost was $158,000. Our day-to-day cost is difficult to compute, but it’s very low.
Yes, IBM - CLM.
It’s a great product for managing an end-to-end lifecycle process. It’s easy to use once you get the hang of it. One of the biggest pluses is having all your test assets in one place – requirements, models, test cases, test results, bugs, reporting, tracking (it’s unbeatable in my opinion).
It's also great that HP has now lowered the SaaS cost for ALM - it was too high in my view.
We use this solution for an Avionic System to test for integration and verification with real and simulated hardware.
By using QC we broke down silos (of teams), improved the organization of our tests, have a much better view of the testing status, and became much quicker in providing test results with document generation.
The automated document generation provides the ability to perform tests within one day of our flight test readiness reviews. In the past, the timespan was several weeks.
We would like to have support for agile development. As we do not have this capability, we are now investigating the use of Octane.
Good test management tool.
Defect tracking.
Licensing model is awful.
Quality assurance, requirements, and testing.
ALM helps focus on requirements, test, and the execution, track your defects, etc.
Release management and integration with other tools.
Test cycles.
We use it for manual and automatic testing along with defect and requirements management. We can check everything, know who is the sponsor for it, and make a test plan. Everything is very visible.
ALM uses a waterfall approach. We have some hybrid approaches in the company and need a more agile approach. We have also installed ALM Octane and are trying to see if it fits the approach that we are looking for in our company.
It is very stable. We have not had any problems.
I have a good impression of the scalability. I have been very satisfied.
I used tech support once. It took a while to solve the issue, but it was solved.
Before, we used Excel for complex testing. Using this solution has been a huge step for us. From reporting to team management, everything is better now.
We have divided our licenses between Micro Focus ALM and ALM Octane. It works for us.
I use 80 to 90 percent of the product's features.
Certain features are lousy. Those features can drag the whole server down. There are times that the complex SQL queries are not easy to do within this solution.
Micro Focus ALM needs to bring the features of this ALM into the newer version of Octane.
ALM can scale and is very impressive. It can support thousands of users with a very low amount of resources. It can easily manage very big projects with thousands of people at a time, scaling up and down to support front-end operations and task management at different levels.
The initial setup is quite easy, if you know what you are doing.
It allows us to keep our costs low. I do not want to pay beyond a certain point for this solution.
My prior organization used the test execution and defect modules of QC. As a manager, I was able to set up reports that allowed me to find areas of improvement for my team. We used the import functionality to import test cases for reusability and execution.
Prior to using Quality Center, my organization used spreadsheets and emails to track testing efforts. Therefore, QC helped my team become more efficient by tracking all testing activities with the tool.
I like the customizable report functionality. I was able to set up reports that allowed me to accurately give a real-time status of all testing projects that were in progress.
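The rollup behind that kind of status report is simple to reproduce outside the tool, which is also a handy way to sanity-check it. A sketch, assuming a CSV export of run data with made-up column names:

```python
# Sketch: rebuild a per-project pass/fail status rollup from an exported
# run list. The CSV file and its columns (project, test_id, status) are
# assumptions about the export format.
import pandas as pd

runs = pd.read_csv("test_runs.csv")
status = runs.groupby(["project", "status"]).size().unstack(fill_value=0)
status["pass_rate"] = status.get("Passed", 0) / status.sum(axis=1)
print(status.sort_values("pass_rate"))
```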
We use this solution for model testing and as a central location for the test case responses and some test automation.
Central test locations are a benefit. The ability to integrate this solution with other applications is helpful. If there is automation, it comes with improved quality and speed.
This solution is open and very easy to integrate. The interface is good too.
There needs to be improvement in the requirement samples. At the moment, they are very basic.
They could also improve the usability.
I find the system very stable. There is very little downtime.
This solution is scalable. There is the ability to draw on the different platforms, especially with ALM Octane. However, I am more interested in the new hard platforms, so more of a container platform or solution. This is on their roadmap in the next three years, so at least there is a plan for it.
It is very good. We are located in Germany, and they have a service partner here. The fix given to us depends on the complexity of the problem, but usually we get answers within a day.
If someone is researching solutions, they should know that this solution is stable, centralized, and scalable. If they need integration, then this is the tool to use.
When selecting a vendor, some important criteria are availability, knowledge, price, and the site where they are getting the product. For example, if we have people doing a project as a team, then it is best if the solution can work in different languages, like German and English.
It is a complete AQM suite: single repository for tests, requirements, defects, etc.
Our primary use case is for regression testing.
We use Micro Focus products together to improve organizational SLAs.
Lab Management is a valuable feature, because you have a 360 view.
The QA needs improvement.
Our current environment is ALM QC 12.53 and for performance testing ALM PC 12.53, Vugen 12.53, and UFT 14.
Micro Focus Quality Center helps with end-to-end traceability, from releases to requirements to test cases and defects. The enhanced dashboard capabilities are useful for senior management to view the progress of all releases in a portfolio in one go and also drill down into the graphs.
The Project Templates and Enhanced Reporting features are the most useful. We have created domains as per the business units, and for each business unit there is one template. It becomes easy to manage the template at the business unit level. By standardizing our templates, we publish reports at the business unit level.
I feel that the licenses are expensive.
The ability to show end-to-end traceability between requirements, tests, and defects, and also reporting.
Allows us to have a single source of truth for our test efforts.
The reporting could be a little more robust.
I have used this version since June 2015. However, I have used this product since 1999, when it was Test Director 6.0.
Very infrequently.
Not so far, no.
A seven out of 10.
No. We have used Quality Center/ALM since I joined this organization.
A little complex as it pertains to migrating databases, then manually linking them back up to the file repositories.
In my current role, I don't advise anyone about pricing and licensing for this product.
This product was already in-house when I joined my company.
If you go the on-premise route, make sure your system architects and DBAs thoroughly review the installation/upgrade guide. I would also advise establishing a "center of excellence" department which can help build template projects and enforce standards so the users are all using similarly configured projects.
As a system administrator, HPE ALM can be flexibly configured so that it can accommodate a variety of defined project lifecycles and test methodologies. As a project user, HPE ALM can provide a logical approach in conducting comprehensive test planning, execution, and defect management.
My organization uses HPE ALM to track the progress of testing and quality assurance efforts across projects that we are formally engaged in. The product has provided my team with metrics that provide various insights into the management and delivery of projects with respect to documented business needs.
HPE ALM’s out-of-the-box reporting can be perceived as rigid and limited, to an extent. Knowledge of HPE ALM’s data model is important when setting up certain reports, and can be challenging depending on reporting requirements. Even so, these reports may not translate into appropriate insights that will provide value to a project or management team. The performance of the product may also be a concern, depending on the amount of active connections and data processing that it has to conduct at any given moment.
Four years.
HPE ALM is relatively stable, especially with more recent versions (12.5X). However, it is important to consider utilization frequency of HPE ALM at any given time and ensure that hosted application/database servers are configured to handle resource-intensive transactions to minimize performance/availability/data integrity issues. Micro Focus has published certified/supported configurations for running HPE ALM servers and client computing devices.
In general, HPE ALM has the potential to be very scalable from both a feature and usage perspective. HPE ALM has the capability to create project templates which may then be linked and applied to different projects. The solution also allows for customizations to be applied by individual project. However, an organization must exercise discipline in applying consistent processes to manage and govern any projects which use HPE ALM, to avoid data/information management problems.
HPE’s technical support services are fair but leave a lot to be desired. An alternative to the direct HPE offering would be to pursue outside, well-known, third-party professional support services that have extensive knowledge of HPE ALM and associated tools.
No, I have not used a different solution in the past.
Initial setup of HPE ALM is relatively easy. However, it does become more complex when the product must be configured to meet company needs and compliance policies. This includes site configuration parameters, migrating existing projects from a previous version, securing access, and implementing integration to other HPE products such as HPE Performance Center. Many of these considerations are documented within the HPE ALM Installation and Upgrade Guide.
It appears that most companies choose HPE Quality Center Enterprise or HPE ALM. HPE ALM contains everything included with Quality Center Enterprise, and further adds features focused on cross-project customization, planning, and tracking.
Seat and concurrent licensing models exist; the latter is recommended if a large number of different users will be utilizing the product. There is also the option of utilizing HPE ALM/Quality Center Enterprise as hosted on HPE’s SaaS platform.
This product was already chosen from a historical perspective. Although some high-level research around alternative solutions (Helix/TestTrack, Microsoft) was performed, none of them seem to be as comprehensive or as well suited towards satisfying existing needs.
HPE ALM is a relevant product that assists with test delivery, execution, and management within a project-driven environment. I would recommend others to check out the HPE ALM Help Center and product pages for additional information before making a decision.
We mostly use the Defect module, and then Test Plan and Test Lab. But if you ask about the most valuable feature, it is the customization of any action in the ALM client. We can customize any action, window, or workflow, not only for the defect workflow but for any other entity. There is no other tool that can do this in such a way.
The testing methodology we use means we do not need any other tool for requirements, testing and so on.
We used it for more than 10 years (from version 8 as Test Director from Mercury).
The clients, of course, are not stable, but it is acceptable.
Scalability is not a problem at this time because the hardware is better than the software needs.
Technical support is good.
Yes, we used Rational ClearQuest. It was very customizable too, but it was old and tough, and we needed a better and more elegant solution.
Initial setup is very simple, but the upgrade on Linux is impossible so we moved to Windows.
The price is very high, so it's good to negotiate with the vendor. The solution is not so important that it should cost so much.
Yes, we evaluated three other options but it was about 10 years ago and it is not relevant now to specify them here. The other options also have very good solutions.
Do a simple implementation; do not customize it, because you will have more problems.
It gives me the ease of putting together the requirements, test cases, the release test schedules, and executing the test. It can generate the reports for each and every release that we need, and it's quick and easy enough to generate reports.
We have multiple teams across the globe where we have multiple projects set up in ALM. One project is used by our team in Israel and our other project is used by a team in Atlanta. We have a centralized control or multiple projects going across the globe. So that's a good benefit for us.
We actually use Performance Center, too. When we run any test through Performance Center, the results are stored in ALM, but only as a summary report in HTML or a .zip file. If, after running a Performance Center test, the results could be published in the ALM Test Lab itself, showing all the response times and transactional data we have, instead of our having to open each individual result, that would be beneficial for us.
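Until something like that exists, one stopgap is to script the retrieval: pull the run's result archive out of ALM over REST and unpack the summary locally instead of opening each .zip by hand. This is only a sketch under assumptions: the attachment paths follow ALM's usual REST entity layout, which varies by version, and the server, domain, project, run ID, and credentials are all placeholders.

```python
# Hypothetical sketch: fetch a Performance Center run's result .zip from
# ALM over REST and extract it locally. All names and IDs are placeholders;
# endpoint paths vary by ALM version.
import io
import zipfile
import requests
import xml.etree.ElementTree as ET

BASE = "https://alm.example.com/qcbin"               # hypothetical server
DOMAIN, PROJECT, RUN_ID = "DEFAULT", "PERF", 1042    # hypothetical values

session = requests.Session()
session.post(f"{BASE}/authentication-point/authenticate",
             auth=("alm_user", "alm_password"))
session.post(f"{BASE}/rest/site-session")

run_url = f"{BASE}/rest/domains/{DOMAIN}/projects/{PROJECT}/runs/{RUN_ID}"
resp = session.get(f"{run_url}/attachments",
                   headers={"Accept": "application/xml"})
resp.raise_for_status()

for entity in ET.fromstring(resp.content).iter("Entity"):
    fields = {f.get("Name"): (f.findtext("Value") or "")
              for f in entity.iter("Field")}
    name = fields.get("name", "")
    if name.endswith(".zip"):
        # Individual attachments are addressed by name under the collection.
        data = session.get(f"{run_url}/attachments/{name}",
                           headers={"Accept": "application/octet-stream"})
        data.raise_for_status()
        zipfile.ZipFile(io.BytesIO(data.content)).extractall(f"run_{RUN_ID}")
        print("extracted", name)
```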
Also, on and off, we have had some issues with the operation itself, where we are not able to run a test or something similar. We have to go back and forth with the vendor, HPE (now Micro Focus), to get these resolved.
For the last 10 years.
For ALM, we didn't have any specific downtime crashes, but we have had some issues with the database connection. The internal database where we put in the ALM data might crash or the database connection is lost. That's where we noticed some issues.
Whatever version of ALM that we have, it is more than enough for what we have right now. In terms of scaling, I can say it will go beyond four to five years from now.
For any of the operators, the support is extremely useful. It's great, actually. They are always available and on time.
It's straightforward to operate, but you need to get involved with the relevant vendors when you need their support.
We cannot just download anything online and install it. We need some support from the relevant team to make sure everything is right.
I am the QA Manager, so we use it to store all our test cases, results, defects, and reporting, which is very important. We're able to produce a number of reports and graphs. This helps us a lot when working with our clients.
We're doing a lot of agile work and using a number of different agile tools. Agile integration could be better: right now it does integrate with VersionOne, but I'm not sure about its integration with some of the other agile tools.
We've been using it for about five years.
It is stable. We've really had no downtime experiences. We've had good experiences with it.
It is scalable. We're actually moving onto the SaaS product. We're looking at that right now.
Tech support has been very helpful when we've had some questions or issues. They've been very responsive.
I'll find out over the next couple of months as we are looking to do the first upgrade since I have been using it.
It's a very good tool. We use it throughout the company. There are just some integration points which could be a little better. But if they're out there, I don't know about them. Maybe having the knowledge and knowing about them would help as well.
Definitely the testing: the test case organization, being able to organize it, and standardizing a quality program.
Definitely ease the complexity of the tool, particularly the upgrade process. It needs to be easier.
Also, it needs easier integrations. I know one of the big reasons we upgraded the ALM license was because you could use Octane, which Tasktop is giving free for a year. That helps integrate with some of our other tools. As an organization, one of our biggest challenges is that we have all of these different tools and need to get them to talk to each other. To really have a whole encompassing pipeline, that is our challenge.
Five years.
We don't really have downtime, but it does crash here and there. So, stability is not great, but okay.
It does not meet our needs. The product is very geared towards waterfall. Very stable, standard things, and as an organization, we want to be innovative. We want to try new things, and it just doesn't seem to do that easily.
It's okay. We've had to escalate things a few times to get answers, but they have provided them in the end.
Setup and upgrades are complex.
Probably one of my biggest issues with the product is that it's so complex and hard to do. We even paid $30,000 for a consultant to come in. One year in, then we wanted this upgrade again, and they wanted to have a consultant come in again. I'm like, "We just did!"
So, we decided, "We're going to try it without it," and so far it's going well, but the complexity of it seems to be daunting to engineers, not like other tools that they implement and upgrade.
The most valuable thing is the flexibility of the customized options. That makes it more powerful than any other tool. We can customize based on the project and on how we want to control the testing.
We used to have 10 different Excel spreadsheets for one project. Then, we switched everything: paper, Excel, etc. to be done in ALM. There is no outside noise and everything is done under one umbrella.
The canned reports could be improved. You can get your report, but you have to do some extra work. If a project doesn't have a good, strong user, they don't get these reports. If we had more canned reports from the ALM side, it would solve some issues.
We have been using ALM Quality Center since the Mercury times, so for the last 10 to 12 years.
It's very stable. In the last 12 years, we've probably had two or three downtimes, but nothing concerning the application itself.
Yes, it is scalable. Ten years back, we started with five users. Now, we have 38 confirmed licenses. Over the years, we have grown from having just a few projects to more than 25 large projects.
Our experience with HPE support was not great; we have not used Micro Focus support yet. We were not getting direct answers from HPE support, so we switched to a consulting firm, Melillo, for support. Now, we get better service. Hopefully, with Micro Focus, it will be better.
If someone is doing the setup for the first time, it might be a little complex for them. However, if you are continuously upgrading, it should be fine; we have done all of our upgrades in-house. We never went to a company to get that done. If you plan it right, the upgrade can be done very smoothly, so users aren't affected.
Most important criteria when selecting a vendor: support and stability of the product. These are the two most important things to us. We want to have continuous improvement, because there are places to improve; we also don't want rapid changes, because they do affect the user, so that balance is important.
The overall task management. Managing all the assets and metrics.
I'm not familiar with all the changes, but they definitely have to be more DevOps friendly. They certainly have to be more open source friendly. That's the world we live in, where we can cut costs away from large-scale vendor contracts and service contracts. The ability to seamlessly integrate, and to provide more capability for those managing those infrastructures and solutions, is going to be critical going forward.
A lot of the vendor products - not just HPE or, in this case, Micro Focus, or whomever that I've dealt with over the years - were much more proprietary, much more exclusive. And what we're finding now is that the world doesn't work like that. Particularly as you move left and shift towards DevOps, application teams now don't consume from a central resource, they consume based upon decisions made internally to that application team.
Ultimately, what they need is flexibility. So any vendor product needs to have that intrinsic in its fiber, to be able to adopt open source, and integrate basically into almost anything, to expand out the choices available to an application; to make the decisions that need to be made independently at the time that they need to make them.
Not having looked at the latest, ALM Octane, and just coming from the old world: at the time, it was necessary to implement a test management system to gather more information and metrics across different teams and platforms, and it served that purpose.
Things change constantly these days. There's a lot more going on, and a lot more integrations are available. If we're looking at the legacy product, I think it's kind of come and gone as far as its ability to do what you need to do in a DevOps world. Any solution in the future - I know ALM Octane is the heir apparent to the old infrastructure - is going to have to be more DevOps friendly. It will need to enable the consumers, the application's users who ultimately become the developers, to see the value in a more organized test management practice, versus more of a kind of hidden, under-the-sheets unit testing.
It's actually a whole trajectory of different solutions and different tests that need to follow the pipeline for those folks. Anything that's not DevOps friendly, that isn't easily consumable in a DevOps world, is really going to end up by the wayside when it comes to making the case for a more formal test management practice.
11 years.
My experience with the solution is that it has been fairly stable. What lies underneath is what creates the instability at the end of the day, the architecture that you are providing the solution on top of. I think once you figure out a viable, scalable approach to it, then the software itself, at least in my experience, has been very stable in running a test management operation.
It has met our needs. Just as long as you have the right architecture from the old days of physical server hardware to more of the newer stuff, which is VMware within datacenters - more virtualized.
And certainly the next rage for everybody is moving into Cloud infrastructure. So things are becoming much more self-service. You're getting model scaling. You're getting the things that are making the system more maintainable. But from a scalability standpoint, you want to be able to scale to the needs at the time that you need them. The Cloud certainly provides that capability.
I think like every company, they're changing the landscape. Support, in my experience, has been pretty good. There are always challenges based upon the routing/tier structure of who gets the issue first, how it gets routed, how it gets filtered down to the specific expertise that you need. That depends on your acumen as far as knowing your tooling, knowing your approach, what that's going to be.
Somebody who is very savvy, will obviously have frustrations coming into a tier-one support desk. Who they really need to go talk to at the end of the day may be somebody, and it will vary by company, like a tier-three, real low-level, very experienced resource support tech who fixes those issues. So it's going to vary based upon the customer's competency versus how they are routed through a support desk.
Testing is going to be testing. The same challenges that you have in any of the different industries are going to be the challenges that you have in ours, the insurance and financial industry, as well.
You know, from DevOps to agile, to shift-left to cloud, to managing your test assets efficiently and effectively, the industry really isn't going to make a difference.
I've been in a number of different sectors over the years. I've been in QA for about 25 years, having been in the natural gas industry, financials, insurance, and HR systems. They all have pretty much the same challenges around testing. So I don't see a discrepancy in terms of the application you're testing; the challenges are almost agnostic, innate to trying to test within any type of development environment. Now, it just happens to be a more self-service DevOps model, where application teams make those decisions. But there are still always going to be those QA challenges.
Test management and reporting. Those are the two most important things, and they are what I tell my customers are the two main reasons they have ALM.
When I was a customer, it improved my organization because I was able to enforce standards on building and executing tests, and to manage centralized reporting.
Now, I translate that over to my customers at various levels of the spectrum, from "We have no idea what to do," to "We're doing stuff but we know we need to change," to "We've got some stuff and we just want to tweak what we're doing now."
I'd like to see the idea of users being fleshed out more, so it's not just, "This defect is now assigned to a particular person," or "This person is assigned to execute a test."
I want to see users expanded out to teams, where you have five users and a sixth user who is the manager, so the manager can roll up the idea of somebody being responsible and accountable: the idea of things being assigned to a team of users, and of users belonging to that team. There are ways of getting around this in the tool because it's very customizable, but I'd like to see it separate from the idea of using security groups, which is one way of getting around it.
I'd like to see the concept of teams put into it.
ALM has gotten more stable over the years. It's a stable app. Like any other large, complex application, you run into things every now and again. We have a system to report things and get them taken care of.
I have customers that are small and customers that are enterprise-wide. So I'm able to deploy it in both kinds of environments and customize the tool, depending on size and level of maturity, for any kind of customer. Also within any vertical as well.
I have used tech support. Mostly because I'm with a consulting company and we also support ALM. We have our own internal support organization that people can get into.
In terms of Micro Focus support, because I'm a more advanced user - I've been using this tool since version 7 - I typically don't get a whole lot from first-level support. I tend to want to go right up to second, third, or even directly with the development organization. So I'm more the outlier, edge-case kind of person compared to most customers out there.
Once I get to the people that are at the level that I know I need to deal with, they're good. I'm also dealing with the people on the other side of the ocean, working directly with people who may have actually coded ALM to begin with.
When I became a customer in 2000/2001, when I first started, I was involved in the decision to purchase the solution. Now, as a professional services consultant, that decision has been made and I'm going in there to either deploy, upgrade, or help them use ALM to best suit their needs. In some cases I help them figure out what it is they need to have ALM do for them or how to customize it best.
When I was a customer, we were not using another solution. We were completely manual and I was a department of one. I was the QA organization for a small development company and the two company owners said to me, "We want to invest in this, go look and see what's out there and show us what our options are and what you think the best option is."
What caused us to switch to this solution was the customizability. The fact that we could make it give us the information that we needed to get out of it. The support organization seemed very top-notch. I actually learned a lot from the support organization when I was getting started in it. And I found it more intuitive then the Rational solution.
I've deployed it in many organizations because I'm a consultant. I've deployed it, upgraded it, customized it, in various ways for different customers.
In terms of complexity, it really depends on the needs of the customer.
When I was a customer in a small development organization that only had 20 people in the entire company, I deployed it, I did the customization - that was way back in the day.
Now, I have customers along the entire spectrum from small to large enterprise. Some customers are okay with near vanilla, out of the box. And some customers have very complex sets of business logic that they feel, for whatever reason, need to be enforced as far as how their defect management lifecycle is going to go. How their test construction, test execution lifecycle is going to go, how they want to manage requirements, and that can require significant customization.
Some of my customers have compliance concerns: they have digital signatures and FDA 21 CFR Part 11 compliance. They have all of these rules that they have to follow, and some of them are subject to interpretation. With one particular rule, I have one customer who says, "This is how we interpret the rule," and they have me customize it one way; and I have another customer who says, "No, we're not going to interpret it that way, we interpret it this way," and it's a completely different set of customizations.
Back then it was Mercury Test Director, which is now ALM. We were also looking at the Silk products, and we were looking at the Rational, now IBM, products.
When selecting a vendor to work with, I want to see that the technical people are really knowledgeable of what they're talking about. I want to know that the tool can give me what I need, not just, this is a standard proof of concept. I want to see what I need to see, and I want to know that, down the road, I'll be able to either get out of it what I need or be able to learn or have somebody come in to help me get out of it what I need. Because if I'm not getting out of it what I need, then I've wasted my money.
I give it a nine because nothing is perfect, there's always room for improvement, especially when you're talking about an app system as large as ALM is. I've been using it for so long it's kind of second nature for me to think about where its strengths are, and know that if I can't get something done one way there's always another way around it. Or I can integrate something into it or build work flow to make the UI behave the way I want it to.
Regarding advice to a colleague about ALM: remember that your process and your methodology should be driving what you need out of the tool, and not the other way around. Tools can do some really cool stuff. You may look at it and say, "Okay, maybe we could get some value out of this feature that we're not using today." But don't make that the driving force. It really needs to support what you're doing and enforce the things that you want to get out of it. Because there's a truism in reporting: if you don't capture the data, you can't build a report that's meaningful. So make sure it can get you what you need.
It's a centralized test management solution. The fact that you have a place where you can go and find all the stuff that you need to find, and keep track of all of the results long term. That is extremely valuable.
In terms of functionality, it really provides all of the stuff that you need for managing test cases and test execution and keeping track of all of these different items. Now, in terms of keeping up with the trends, there's obviously a lot of challenges.
The new offering, Octane, has all of the essential features that we need in order to move forward to the next mode of operation. I tried to use it and, unfortunately, we had all sorts of trouble due to some limitations on what kind of URL you can use. That was a pretty sad issue to run into. Had that not been the case, I would right now be planning to move to Octane.
The key pieces of functionality are in place. The reason why I wouldn't rate it higher than seven out of 10 is because you're still using really obsolete technology like ActiveX. You have to physically install the product on your desktop. That's a big no-no. Other than that, it is not far away from being much better than what it is.
17 years.
It is very stable. It has some issues here and there but not significant.
We have struggled to grow with the tool, because the original model was to have just a handful of ALM projects, whereas, we have more than 150 projects. Whenever you pass some threshold, it becomes a challenge.
Even upgrading, it's a massive effort. I'd say at least a six month effort for us just to upgrade it.
The tech support is not so great so far. At least as far as the HPE tech support is concerned. Before, when it was Mercury, it was the best tech support of all time. Right now it's okay. It's doable. It could be better though.
When looking at a company to work with, it's as simple as knowing that the products are mature. We know that if there are going to be issues, we're going to be able to find solutions or some work around for them. It's as simple as that. There's a lot of competition out there. Especially in the open source space, but for you to get support on open source, that's probably a whole different ball game.
I like the traceability, especially between requirements, testing, and defects. Being able to build up a traceability matrix, being able to go through and show what's been covered, where your defects are, etc.
It's allowed us to be a little more consistent across the board. We have probably 80% of our QA teams using Quality Center. It is a system of record.
It really does allow our testers to work in a single application. It's not as good if you don't set things up in advance to work with other applications. But we're working on that part.
I'd like to see an easier way to upgrade and install. I'd like to see it less required to have a client. I know that Octane doesn't require a client, but Octane is not mature enough for our organization. I'd like to see some of the good points from that integrated into it.
I would rate it a 10 if it had the template functionality on the web side, had better interfaces between other applications, so that we didn't have dual data entry or have to set up our own migrations.
It's been around a really long time. It is very stable. It does require a little more work to upgrade, add patches, because you have to take it down. But then again, while it's running, we've had very little down time, very few issues from a system perspective.
When we do have to take it down, we usually take a full weekend, because we're a very large instance. But usually the install and upgrade goes through and takes three or four hours, and then it's just going through and running repair/realign or upgrade on the existing projects.
Quality Center is very scalable. We have over 700 active projects on our instance. That's projects, not users.
I've seen a lot of improvement over the years, from tech support. We are premier customers, or whatever the newest term is. We do meet biweekly with them and when we have an issue, we can escalate it and we get very fast response times.
We're a company that has gone through a lot of mergers and consolidations, and we've gone through and actually consolidated a lot of instances into ALM and, with that, the complexity is more with the users than it is with the application.
Getting it installed, getting it set up, that's the easy part. Getting people trained to use it, that's a little bit harder. But once people start using it, they find that they're not sure how they did their job before.
The most important criteria when selecting a vendor to work with are:
I would advise a colleague considering this solution to start with a plan. Make sure you know what it is that you want to accomplish with Quality Center, and only add fields that will meet that. Use your current documentation, your current processes, to help design the fields and the projects for it, rather than just adding things one at a time. Don't allow a "wild west," which is where anybody can add fields, add workflow. You want to manage that from the top down.
It gives us a solution where we can keep everything centralized like our test scripts, test data, and our projects. It doesn't matter who is creating the project, everybody can access and execute it. Both our onsite and offshore teams working from different locations are able to benefit from this solution. That's the beauty of it.
When we implemented this solution, we chose to virtualize, so we didn't implement any physical hardware. We're able to scale very quickly for very large projects when we need to run 5,000 user simulations. Afterwards, we can also scale down quickly. This gives us a lot of flexibility in our project executions.
The web client doesn't match the quality of the rest of the features of this solution. HP needs to improve it.
There were some challenges we faced during the deployment, but we've had no major issues.
We've used versions 11, 12, 12.2, and now, 12.53. They've been very stable in our environment.
We're able to scale up and down as needed. It has great flexibility when it comes to scaling.
There are challenges related to network security during the setup. But once you figure it out, the solution is relatively easy.
We have done the implementation in-house.
Compared to the previous solution, this solution gave us as much as 60% cost savings.
Before you start implementing, make a solid plan and try to figure out the challenges beforehand.
You can do your development from start to finish: starting with requirements, ending with defects, and testing in between, with everything in one tool and the ability to validate everything up and down the SDLC.
Over the years, we've used this tool extensively in quality assurance, so testers can record the test cases, do whatever they need to do to execute them, then report on what they've done and how many defects have appeared before things go out to production. They've actually been able to shorten the time for QA, shorten the time for development, and lessen the number of defects that actually get out to production.
The downside is that Quality Center has only been available on Windows for years, not on Mac. It's not available for every platform or browser. That's been a problem, and people have been going to other tools because of it.
I've been using this product for about 18 years.
To a certain extent. We've had some projects that have gotten very large, to the point where one of them has become unsustainable, and we had to split it up to upgrade it to the next version.
The install itself involves a lot of DLLs, and a lot of the time, updates such as Microsoft updates cause issues with the tool. I think it can be improved; if it's going to be a Windows program, it should be built on something other than .NET, something more future-proof and more in line with the technology of today.
We have a SaaS solution right now. At one point, we were on-premise. When we were on-premise, we had premier support, which was phenomenal because we got immediate attention. The support that's not premier support, it is not so good.
People don't take advantage of a lot of the functionality that the tool has. I think overall it's a very good tool for what it does.
Most important criteria when selecting a vendor: They know me, and I know them, so having a very good relationship and a very good rapport is very important. If I need help, I can go to certain people, and I can get help.
You can plan ahead with all the requirements, set up the Test Lab as a library, then run the tests multiple times, recording the defects in the system. Later, go back to check what coverage you are missing, so you can plan ahead and maybe reuse the same set next time; it's sort of like creating templates and reusing them over and over.
We use the quality engineering testing tool plus the defect tracking to make our reports, projects, and quality better. Once we had the evidence to approve all the testing and all the coverage, the reporting got better. Usually, the product makes it much easier to identify the issues we have.
It is nice, but it does have some weaknesses. It's a bit hard to go back and change requirements after the initial setup.
It's not flexible enough. The formatting is also an issue. For example, project managers don't like to use it, even for requirements, because it's not easy for them to change things. If they make a mistake and go back, it is hard to fix the formatting. So they have to write the requirements elsewhere and upload them. But after the upload, you cannot change them, because the IDs are fixed. It's hard for them to insert something in the middle and keep the rest of the records properly linked.
It's difficult to change. Say you set up a requirement: adding something at the bottom won't cause an issue, but if I want to add something in the middle, you mess up all the linkage for the Test Plan and Test Lab.
This requirements piece is, I think, the biggest disadvantage of Quality Center. I know Micro Focus has a bunch of new tools, but it depends on whether a customer wants to change, use a new tool, or stay on an older one.
Reporting is a bit complicated. They have a standard report, but if I don't want to use that, I have to use the Excel reporter.
I have used it since the start of the implementation at our organization, through Quality Center versions 8, 9, and 10; now we're on 11.5.
We host it in-house, so basically we don't have any bad downtime. It runs mostly 24/7, so Quality Center is pretty good with stability.
So far, it hasn't been an issue.
I would give them a high score as they do a pretty good job.
We started with QuickTest Pro, a tool that works with Quality Center. From there, we evaluated Quality Center and introduced it. It looked like what our situation needed.
However, we wanted tracking. We started with QuickTest Pro, but now we use Quality Center, which covers a lot of different areas: it handles the workflow, supports agile, and involves all the necessary departments.
I was involved in the initial setup. I installed and configured it, managed it, applied patches, and provided user access, though now we have a team for that.
The setup is straightforward; it's not hard to set up. We even used the more complicated multi-server option because we didn't want to have the database alone.
To someone looking at Quality Center, I would say: it's a good tool to use, and the support is good. However, if you want a fancy, modernized tool with a fancy look, then Quality Center is not for you.
Most important criteria when choosing a vendor: personal style. I want to know who I am dealing with and that they will be continually knowledgeable.
If someone talks to me, and I try a few times, but I cannot get clear information from them, I may disqualify this vendor completely.
The most valuable features of ALM are in the new upgraded version of 12.53. We're able to more accurately document our test results, our actual versus our expected results, with the new screenshot functionality. That is the most useful part of the tool for me right now. Of course, we use it as our testing repository, and it's basically the way to show the work that we do as QA testers, and to have a historical view of those executions.
Because we can trend repeatable results, we can look at trends of things that are continuously working well, and things that continuously get broken within the software development process. So it helps us improve our testing quality.
Sprinter, I think, is a good part of the ALM tool, but it has some limitations for us. Based on the type of software we use - we have some web-based applications and also some PowerBuilder applications - it is not able to capture all the objects, given the way that we develop our software. We're not able to use it as much as we would like to. So Sprinter is something I would like to see better integrated with the different types of technologies used by software companies.
I'd like to be able to improve how our QA department uses the tool by getting better educational resources and documentation to help with competencies for my testers, to make sure they understand how to use the tool. Do they really understand how they're using it, and why they're using it? For me, that would be useful.
For us, so far, it's been pretty stable. Because we have such a ginormous amount of historical data, we've had a little bit of an issue with performance. We were working on copying and creating a new database for that because we have products that we use, FDA regulated products, and we can't get rid of those testing results. So we have to keep them for the life of the product.
So of all of the things that we've experienced, or had issues with, it would be the amount of data we're able to store, because we have to keep everything.
That would go to what I just mentioned above. We're looking at ways to improve being able to capture more results without impacting our products.
I haven't used tech support because we have a couple of different layers within the business unit. So I have people that I can go to, and then those people go to tech support. So it is utilized on a different level, just not by me.
When selecting a vendor to work with, the most important criteria are flexibility, availability, and scalability.
I would say it's a good tool. You have to invest the time into learning the different ins and outs of the tool, and become educated on it. I think it can scale as much as you allow it to, but you have to put the time into learning what it has to offer.
Of all the modules that we have in ALM, one of them is the test module. There is no better tool on the market than ALM, because the foundation is what you see; it's been on the market for so long. I really like the test module.
But it's not limited to the test module; it is an entire application management tool. So we use it for requirements as well. And the links between your requirements and your tests and test plans are all in there. So it's a pretty good tool.
If you don't use any tool to manage your application lifecycle, people will improvise, like some teams we have who use Microsoft Word documents for their requirements and Excel sheets to plan their test cases, write them, execute them, and store them. In the long run, that is not going to be helpful. ALM is a structured way of exhibiting your development; that is what had been missing.
So when we started using ALM in our organization - we'd been using QC for so long - when we finally tied the requirements module to the testing module, that definitely benefited us. We can show a lot of data in there, and now we can link to some 15 years of back data. Most of the applications have been around for so long that we still need to run the core functionality tests, but we don't need to redesign them and we don't need to search for Excel sheets. We know exactly who ran a test, when they ran it, and how the execution happened.
We do have some suggestions on reporting. Most of the time we need to download data and then we create reports ourselves. If there was a little bit better reporting available that would be great. The reporting is the one thing that we definitely want them to do more on.
Yes and no. Once in a while we'll have some bugs and they will fix them, but other than that it's pretty stable.
We have assessed ALM right now to be pretty stable. I don't see too many things that are missing in ALM right now.
Scalability is good. So far we have not had any issues with scalability. For the last three years we have been using it as SaaS. Before that, for a while, we had it on-prem, but after moving to SaaS we have never had any problems. We run around 300 projects, about 100 of which are light. We've got, at most, 100 users at any given time.
We've used it multiple times. One of the reasons is the SAP tab. There is this plug-in that connects with SAP, and whenever we do an upgrade or something we need to test with the SAP tab, and the software has been very helpful in doing that.
I already know the response that we get from support. We have a dedicated CSO who engages whenever we need something; when we say we need this report or that data, he will definitely give it to us immediately.
No. It's been there about three years. I wasn't part of the team at that time.
If you are using ALM, you had best educate your users to use the entire solution, not only the testing module or the requirements module, because you will get much more benefit from using the entire tool. It is designed to supplement the entire lifecycle and will definitely improve your productivity and traceability. If you use bits and pieces of the tool, then the whole intention behind developing the tool is not fully realized. So use all the modules in ALM.
The most valuable user feature that we use right now is the camera. It allows us to take the test evidence, and it makes it easier for the user, especially the business user, to capture evidence as they test.
For a test management department - and we are highly audited, by the way - it allows us to have a single repository for all the projects where we do tests, as one go-to place for our test evidence, for generating reporting, retaining results, and sharing them.
I have to say I'm not a huge fan of ALM. I think it's the best of some not-so-wonderful tools out there. The example that I usually give people is: if you're an IT person and you use a tool, you know that right-click always does similar things. You know that there are functions that, from one application to another, mean the same thing or have the same features and functionalities. ALM doesn't work that way. I don't think it's a difficult tool to use, but it does need someone to first be trained on it, and then you have to use it a few times before it kind of sticks.
If you use it once, but then you go away and come back, say, a few months later, you have to get a refresher course. With most computer applications there are certain conventions: F1 is Help, Control does something else, and so on.
It's not intuitive in that way, which has always been a problem, especially with business users.
We have had a few unplanned downtimes. There's been situations where we're not able to access the tools.
Scalability, again going back, we're limited by the number of licenses that we have. If we want to have more projects, from our understanding, we just have to purchase additional licenses or purchase additional access for projects.
We have definitely used tech support, and the skill level varies (this was before the Micro Focus integration). When we were trying to figure out whether QC SaaS could work with Windows 10, it took three weeks before tech support finally realized that they had missed a patch on their end, and it cost us three weeks of wasted time. The IT support even said, "We're waiting for you guys to get synced up on your side," before we could do anything.
Tech support has the knowledge and skill, but it's not consistent.
I had previously used QC at a different company, so I knew I needed a test management tool. When I joined the company, we already had this one, but I wanted to move to SaaS, because I needed something that was not on-premise based.
I was involved in moving us from the client to the SaaS and it was painful. We were on QC 10, and we had to move to QC SaaS. Because we're a bank, we have a ridiculous amount of firewalls.
So, we could not install QC SaaS and our tech support team didn't understand how to get it installed. Therefore, one of my team members had to figure out all of the nuts and bolts, then the HPE tech support was also slow in helping us. It actually took us many months to finally go from QC 10 to QC SaaS. I'm actually close to the end of my three-year license, and I'm seriously like, "Do I stay? Do I move?"
I did look at other solutions, and I didn't accept those only because the camera feature was very important to us. The other solutions that I looked at really didn't have the camera feature yet.
It was Zephyr, SmartBear, and ALM. I have some business users who are also very conservative, and for me to move them away from something they're very familiar with, I have to have some very compelling additions and functionalities to give them in order to say it was worth the effort to retrain them on something else.
I had a demo recently that was actually for Octane, but in that demo, I found out about a couple of tools that I actually have access to now that I didn't know about before. One of them was a JIRA integration and the other was a way to create manual task steps, actually just stepping through the application, which could be automated.
I was like, "Wait, I'm near the end of my three-year license, and I'm just now about this?" I was like, "I could have been using this?"
So, those are the new tools I'm looking at, and it actually came up because, as I said, we're renewing our license, and when my rep was talking to me to find out what was my interest, part of it is, "Well, I need your integration." He's like, "Oh, we think we have that." I was like, "Really?"
For anyone looking at this product, I would definitely have them look at other tools, too, and make some comparisons. I would say to them, "Hey, here's how we had to deal with it, and here's what works for us and what doesn't." For the other tools, since we don't have firsthand experience, I could only suggest that they actually do some research.
Most important criteria when selecting a vendor: response and customer service. Support is very important. Then obviously, still getting a good value for what I'm spending. The product at least needs to be comparable to the other tools that are available on the market.
I have to say that I definitely was looking to move away from HPE initially when I took over the department, because we were getting no support from HPE at all. However, HPE, because we're small in comparison to their other customers, shunted us off to a third party, their reseller, which may ultimately have been a good thing for HPE (now Micro Focus) as well as for us, because we finally got some attention.
Requirements, testing, defect management, all integrated in one solution.
In my previous organization, various tools like Excel and other open source test management tools were used. This was causing issues in sharing Test Management data. Once HPE ALM was brought in as a single solution, and data from all other tools were migrated to it, there was a single tool for all needs.
Agile, DevOps.
Seven to eight years.
No.
Agile, DevOps.
No.
Good.
The need for a single solution for all needs.
Not straightforward, it was a complex architecture.
Expensive tool which could become an overload on the budget in the future.
Bugzilla.
Be aware of the cost aspect, it is very expensive.
Cross project customization through template really helps to maintain standards with respect to fields, workflows throughout the available projects.
Traceability feature really allows you to maintain linkage between all the test artifacts, starting from Releases>Requirements>Test Coverage>Test Execution>Defects. ALM allows you to maintain complete end-to-end process.
Business Views has really come in handy for all users, as different kinds of reports can be created very easily and published to all the stakeholders.
The Synchronizer add-in has allowed us to integrate Micro Focus ALM with other third-party tools like JIRA, ClearCase, and ClearQuest, and helps to eliminate the isolation between these tools.
ALM has brought great collaboration among the team members.
Helps to maintain all the test artifacts in one place as a central repository where all teams can contribute and collaborate with each other.
Dashboard
Management - Libraries
Test Lab
Test Plan
Defects
I have been using the solution for 10 years now.
ALM is quite a stable application. We had some issues during the initial setup, but it's been stable since, thanks to the right level of competency/expertise we have in the organization to maintain the ALM setup.
No scalability issues.
Technical support is not that great. We really need to push HPE support to provide a resolution for technical issues. Tech support needs to improve now, as it has deteriorated badly.
HPE ALM was our first choice.
Initial setup was not that complex.
HPE has always been flexible in terms of pricing and licensing, but we are a bit concerned with the fact that it is now in the hands of Micro Focus and things may change.
No.
Think about the points above before you start implementing this product.
The Business Process Testing module and approach to testing in QC is its most valuable feature.
For manual test cases, we need to write each test case individually, and if any update or CR comes, we need to go to each test case and update it, which is very time consuming. With BPT, we only need to update the two or three affected business components. After a refresh, it will automatically update the whole test set, which is over 100 test cases.
I would like to see a bit of improvement in its look and feel.
We've used it for seven years.
There were issues with the deployment.
We had some issues with the stability.
We had some issues with the scalability.
8/10
Technical Support: 8/10
Earlier I used Mantis, but it was not user friendly and had no functionality apart from defect tracking. HP QC provides defect tracking by default, plus test case execution tracking and reporting functionality, which serves all purposes for testing processes.
The initial setup was not complex.
We used a vendor team with in-house machines for the implementation.
For testing processes and improvements, I would suggest you use this product. But, if you're looking at cost, then that might be a concern, but no doubt it is the best tool for testing.
Our organization has developed a product for automating test cases and we track the test cases' automation status with HPE ALM. It includes a column which made it easy to track them.
It needs a feature for scheduling of automation scripts to run automatically. This feature would be very useful.
Five years.
No.
No.
Quick response.
We checked out other tools like JIRA and Rally, which also have good features, but HPE ALM is more user friendly.
Straightforward.
No suggestions.
No.
As long as you can afford it, go for this product. Otherwise, there are less expensive or open source products with almost the same features.
Because of global test documentation, testing can be resourced most effectively each test cycle (including outsourcing).
Integration with test automation never worked properly. We were never able to run test automation on the SaaS platform. We ended up having to use QC 10 for test automation. We went through several "fixes", upgrades, etc., and were never able to fix the problem.
My guess would be 18 years at least.
Use of test automation with ALM is inherently unstable.
The parts that work seem to scale very well. But our use of test automation appears to have exceeded what this tool can do with that functionality.
Tech support is very, very poor. There is no coordination between product groups, you have to repeat the same information multiple times, features that are "fixed" end up breaking in the next release, and releases and upgrades do not fix problems as promised.
No.
We have installed multiple versions and upgrades over the years and it can be very cumbersome.
It can be expensive to own.
No.
Can be beneficial for large companies, but check out alternatives. Some of them might fit the bill for less money.
REST API. It lets me do what I need to do, instead of what HPE Quality Center does on its own.
By using the REST API, I have automated QA Reporting, and integrated QA information into the development build process.
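For context, here is a minimal sketch of the kind of call sequence that enables this, assuming the classic ALM 11-style REST endpoints; the server URL, domain, project, and credentials are placeholders, and your ALM version may require additional session headers.

' Minimal sketch: pulling defect data over the Quality Center REST API (run with cscript).
' Server URL, domain, project, and the base64 credentials are placeholders.
Dim http, cookie
Set http = CreateObject("MSXML2.ServerXMLHTTP.6.0")

' Authenticate against the authentication point; the server answers with a session cookie.
http.Open "POST", "http://almserver:8080/qcbin/authentication-point/authenticate", False
http.setRequestHeader "Authorization", "Basic dXNlcjpwYXNzd29yZA=="  ' base64 of user:password
http.Send
cookie = http.getResponseHeader("Set-Cookie")

' Fetch the defect entities for one project as XML, ready for a report or the build process.
http.Open "GET", "http://almserver:8080/qcbin/rest/domains/DEFAULT/projects/MyProject/defects", False
http.setRequestHeader "Cookie", cookie
http.Send
WScript.Echo http.responseText  ' XML list of defects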
Its performance is horrible, and it's unnecessarily complex, which means the local site administrators set it up to be used in very unproductive ways.
10 years (including earlier versions).
Yes. While most of it is introduced by our poor local setup, that is a direct outcome of my complaint mentioned in the need for improvement.
Yes, but again this mostly has to do with how we implemented it locally. Again, it is an outcome of the issue that the local site administrators set it up to be used in very unproductive ways.
On the lower end.
I have a lot of trouble getting to useful information – on the HP site, and with their technical support. Though I’m far removed from interacting with HP support directly now (at one point I was on the local support team for HPE QC, but now I’m just a user within my company).
Yes – HPE QC is much better than anything else I have seen.
Extremely complex, and unnecessarily so. The main reason is that HPE QC doesn't do a good job of explaining how you can keep it simple and still get the same job done. The tool is ready to do a great job; it's how it gets implemented that is the real problem.
I understand that it’s still extremely expensive.
Yes – PVCS Tracker, Compuware's Track Record, SmartBear, and JIRA. Some groups use JIRA for defect management (in addition to its development usage), but local JIRA usage is just as messed up in its setup, so it just recreated the problems which we have with HPE QC instead of solving them.
Same advice as for any Test Resources Management product: KISS – "Keep it simple, stupid."
You can plan your test execution very thoroughly with configurations and setups in the test lab. This is very unique in comparison with other tools.
To add test cases from the test plan into the test lab, the filtering function is not very user-friendly. Compared with the querying/filtering functionality in TFS, which is much more user-friendly, clearer, and has an almost full-sized default screen, ALM falls short.
Two years.
No.
No.
I do not know.
I have worked at other companies with other solutions; I cannot choose between them.
No, I was not the admin of this application, and the setup itself is not complex, nor very straightforward (but somewhere in the middle).
I would not recommend this product if you have a small organization which would not use this tool very often.
The pricing is very expensive, but it covers a full solution, which means, you will use a lot of functionalities and gain in time and efficiency.
No.
Defect management, because it has allowed me to manage the defects throughout its lifecycle (from being opened to its resolution – closed); who is assigned to it and working on it, what are the issues, and why it is being held up. It gave valuable metrics.
It forced different parties to collaborate better. It gave a lot of information to team members, also additional information to stakeholders in the defect summary, and items to highlight, when needed.
At the time, the dashboard never really worked.
Three to five years.
Yes, there was a problem in stability when the number of test cases grew at a very fast pace. Adding more memory, backup, and restore remedied the instability.
I can’t remember any.
On a scale of one to 10: five.
Yes, it was forced on the company. Change was to facilitate management of defects between multiple parties.
It was straightforward because the test manager was consulted on how to go about setting it up.
It is very pricey. To be fair, it is geared for enterprise use. Unless the company and the decision maker (who would say “yes” to paying for it) is convinced on benefits of the product, it is not a go. Medium businesses would not buy into it.
Do the cost benefit analysis, and understand how the product/tool is a solution rather than a nice-to-have because it is flashy. It should fit the organization’s size and needs.
Test Planning, Test Execution, and Defect Tracking, as these are the core artifacts/deliverables for testing.
A centralized and unified repository of requirements and testing artifacts with access across geographies in real time, which improved efficiency and efficacy of application lifecycle management, including integration of Test Automation tool (HPE UFT tool).
Two years.
No.
No.
Nine on a scale of one to 10.
Had used Quality Center and Test Director in the past and HPE ALM is much more enhanced and useful.
The initial setup is more secure and procedural.
Pricing may be a factor prompting a look at competitor products and differentiators, or open source options with limited product features. Licensing could leverage combo packs with HPE ALM, such as HPE UFT, HPE LoadRunner, etc., as well as continued usage of the product post-expiration of the license, without upgrades and technical support.
No.
Get input regarding the evolving core customer's unmet needs to help choose the right solution.
The Defect Module helps the project team to track defects in a structured and manageable way until defect closure.
Helps in terms of managing defects easily.
The technical support from the vendor is weak, as I believe the back end script support for the tool is not easy to access and modify by the customer.
12 years.
So far fine.
Not good.
No.
It depends on which version. The upgrade from version 9 to version 11 was not a straightforward upgrade because the whole platform changed. So the upgrade was a manual project and it took the organization a few months to upgrade all thousand projects.
Depends on company direction.
No.
Compare with other test management tools to see their benefits. We did compare with JIRA. Every tool has its own pros and cons. It depends on the organization's needs.
The biggest benefit is that it's a seamless way of demonstrating the validity of the testing effort, from requirements through test planning, execution, and, ultimately, reporting.
When I first arrived, everything was manual, no single process, etc. At this point, I set up a standard testing practice utilizing this tool for all testing. It allows for test management to be seen by senior leadership.
Reporting.
Over 15 years.
No.
No.
A seven out of 10.
Not relevant.
It could be complex depending on the setup. However, I have done this for a number of years and do not have any issues with it.
It is expensive.
Yes. Micro Focus, IBM Rational, and Spectra.
It is the standard upon which all products are gauged.
The test generation ability, coverage reporting, and risk prioritization. From a test and defect management perspective, there are few that can compare for delivery in a traditional/waterfall project.
For our clients it provides a centralized means of accessing data, monitoring progress, and creating a singular view on the test status within the business. A number have used it to perform comparative analysis to determine the resultant cost saving and benefits achieved as a result of using the application, and determining process improvement.
The product could do with more native integration for agile projects, a greatly reduced cost model and closer integration with products that are non-HP.
On and off for various customers for 10 years (i.e. going back to when it was Mercury Quality Center).
Not normally.
No. This is an enterprise application and scales accordingly.
Seven out of 10.
We use multiple solutions depending on the needs of our client at the time.
Straightforward.
HPE is constantly updating its licensing to be competitive, but as a whole the pricing for ALM is very high for the local market and, as such, we see a larger adoption overseas.
Yes.
Perform the recommended due diligence when adopting any new tool and ensure that the tool adoption correctly addresses the problem being experienced. This is a good tool and if correctly implemented will provide a solution to a number of delivery focused issues within a business.
What I love about it is that it can be used for the entire lifecycle (from requirements to release) and the dashboard is great for reporting.
The organization I am currently with does not use ALM. I recently moved. At AMC, we began using it during my tenure. When I got there, it was shelfware and they were using spreadsheets. ALM improved our ability to share reports which were understandable to all levels of business. Also having all the bugs for all the projects in one place instead of in a spreadsheet allowed more visibility of issues to all teams. Having test cases in ALM also helped us with spending less time doing rework. Being able to reuse the tests instead of creating new tests saved time.
The product could be more user-friendly. It's pretty easy when you know what to do, but it's not intuitive learning that all the features are available. For example, it’s not obvious that you can update a set of test cases with multi-select cut/paste. This was not available in its previous versions.
A lot of people also think all you get is what you see out-of-the-box. A lot of inexperienced users don’t know almost everything is configurable.
16 years.
No.
No.
I rarely needed any as I am an expert administrator.
Yes, spreadsheets. We switched because we grew from three people to 10 and needed a tool. ALM was one most of the team had used at other companies.
Complex. There are many steps to follow to set everything up for an upgrade, and if you make a mistake, it's catastrophic.
It is definitely overpriced. Most of the cost is for support that you rarely need if you have an onsite admin.
No.
It's better suited as an enterprise tool that can support the licensing cost, instead of for a small shop.
I have used HPE ALM before and the feature which I loved the most is the ability to create Excel reports and inject macros to format your exported file/report.
Tests and test flows are organized properly.
Notifications to developers/testers is also a plus!
Probably more fields to customize.
Almost nine years, and I've been working with it for 10 years now.
I have no experience setting up the server. I have only been an admin.
The product is pretty stable.
No.
Customer service is good. They respond on time to inquiries and requests.
Technical Support: The technical guys at HPE are good and knowledgeable on the solutions provided.
I have no experience using other test management software.
Initial setup is complex when you want to customize the product.
In-house.
We didn't look into the ROI of using this product. We looked more at the ROI of automated tests.
Pricing is high compared to other solutions.
We're looking into using JIRA, but skills-wise we have more resources which are knowledgeable in using HPE ALM.
ALM: You cannot just say one feature is most important. You get the most value using all modules, from Management to Defects. When you use the tool end-to-end, you can pull efficient project reports (especially scorecards) from the Dashboard. Everything is integrated, and only then can you evaluate the tool fairly. ALM is very flexible and each module can be used independently, but when you do that you are only using the tool as storage, not as a test management tool.
UFT: It became a much more stable tool in terms of object recognition over the years. It is easy to use as long as the user has basic software development knowledge and understands that the software automation process is not just record/playback.
ALM: We currently manage all testing projects successfully thanks to ALM’s invaluable capabilities.
UFT: We save time executing smoke and regression tests. We also use UFT to create test data.
I would like to see better reporting functionality, especially more sophisticated graphs, for example Actual vs. Planned or high-level progress graphs using indicators like traffic lights. I would also like to see more sophisticated and flexible Dashboard views, with editing and resizing. I use scorecards and pull them into the Project Reports using customized templates, but scorecards can only be refreshed from the Site Admin, so test leads have to depend on the ALM admin to refresh the reports if needed after the scheduled auto run. There should be the ability to refresh scorecards (execute KPIs) from the project itself, or a more frequent auto-refresh option, even every 5 minutes. This is a real burden on the team.
I would like to see requirements mapped to test steps, so we can combine the validation of multiple requirements into one test case but map the verification steps to the associated requirements; if a step fails, it fails only one requirement, not all of them. When we are operating in an Agile world, we do not have time to write test cases for one-to-one coverage. I know ALM allows many-to-many mapping, but we cannot get a true requirement pass/fail status if we use the many-to-many option. The test configuration option is kind of on the right path, but it can only be used for data-driven test cases; I cannot add design steps. If we could add design steps to a subset of a main test using the test configuration option, we might be able to fail an individual requirement without failing all the requirements mapped to the main test case.
We have used this solution for 17 years.
I did not encounter any issues with stability.
I did not encounter any issues with scalability.
In terms of technical support, I usually get solutions to my issues. I have not had an issue requiring a call to technical support for a long time.
If you follow the instructions, the setup is straightforward. It definitely requires an experienced user to do the installations and setups, especially for upgrades.
I always used ALM and UFT. However, I had training and evaluated IBM JAZZ tools.
We identified an object that was supposed to have a width of 30 characters, but instead had 100,000. No manual tester would have found it, forcing developers to take a second look at all objects which uncovered similar size issues.
While my experience tells me successful automation projects are at 70% coverage of manual test cases, we have been able to hit well into the 90% range of .Net automation with this tool.
So the first impression that hits me about HP UFT 14.0 (formerly QTP) is that it seems to be a whole lot faster! But that could be subjective, as I'm running it on a high end gaming system.
And my second impression was "Oh man, why does it still do THAT?"
Let's review the good stuff:
VBScript language - Easy to learn, surprisingly powerful and extendable.
What I will call the "PDM.DLL feature" provides a list view of any object property and methods at run-time from the code as well as the two other windows.
Built-in Excel datatables for data-driven design (see the sketch after this list)
Revamped beautiful HTML results report with screen and movie capture
Terminal Emulator automation.
Modular design (through functions, ALM components and Flows)
Launch through Jenkins brings CI to the test automation development team.
Can leverage Windows API calls as well as custom AutoItScript for enhanced features.
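Since the datatable is called out above, here is a minimal sketch of what data-driven design looks like in practice inside UFT; the "LoginData" sheet and its column names are hypothetical examples, not part of the reviewer's framework.

' Minimal sketch of data-driven design with the built-in datatable (runs inside UFT).
' The "LoginData" sheet and its UserName/Password columns are hypothetical.
Dim iRow, sUser, sPass
For iRow = 1 To DataTable.GetSheet("LoginData").GetRowCount
    DataTable.GetSheet("LoginData").SetCurrentRow iRow
    sUser = DataTable("UserName", "LoginData")   ' read this row's parameters
    sPass = DataTable("Password", "LoginData")
    ' ... drive the application under test with sUser/sPass and report the result ...
Next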
The wide range of supported current and legacy web technologies, desktop apps, and WebServices testing is by far the most valuable feature.
Even in the case where technology is only partially supported, being able to customize out-of-the-box object methods is another time saver.
For example, we recently started to investigate automation of an AngularJS application. The problem was that record/playback (UFT 12.54) did not work on it. However, the Object Spy correctly adds objects to the Object Repository. In addition, Descriptive Programming worked from our custom framework. We had a basic login/navigate/verify proof-of-concept test operational with AngularJS Buttons, Links, and Images quickly. Minor custom coding was required to override the .Set methods of WebEdit objects, and more will be needed to support its Angular WebTable objects. Totally doable for an experienced team or user.
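A minimal sketch of what that can look like; the property values are placeholders, and the body of the .Set override is illustrative rather than the reviewer's production code.

' Descriptive Programming: identify objects by properties, no Object Repository needed.
' The title/name property values here are placeholders for the real application.
Browser("title:=Login").Page("title:=Login").WebEdit("name:=username").Set "testuser"
Browser("title:=Login").Page("title:=Login").WebButton("name:=Sign In").Click

' One way to override WebEdit.Set so Angular's change detection sees the edit:
Function AngularSet(oObj, sValue)
    oObj.Object.value = sValue     ' write straight to the DOM element
    oObj.FireEvent "onchange"      ' raise the event Angular listens for
End Function
RegisterUserFunc "WebEdit", "Set", "AngularSet"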
There are always new features and more support for new and legacy technology architectures with each release. But the bad news is a growing list of long-standing issues with the product rarely gets addressed. While I have a larger list of issues that make day to day work harder than it needs to be, these are the Top Five that I do wish would capture someone's attention in upcoming releases. All hit the tool's ROI pretty hard.
#1) Jump To Source - The Silent Code Killer: In older QTP versions a double-click on any function in the Toolbox window would take the developer to the function's source code, while a drag from the Toolbox would add it to the code window. Since 12.0 a double-click on a function in UFT's Toolbox window now ADDS the function (same as drag) to the Code window - to whatever random location the cursor happens to be at - even if it is off screen, and it will replace sections of code if it is highlighted. We are not sure what the intention was, but our Best Practice is to avoid the Toolbox window entirely to avoid the real danger of losing days of work and needless bug hunts.
Now, Jump to Source is not all bad. A right-click on any function called from a Script takes us to the code source, which is great! But it only half works: in a Library, it works only for functions declared within the same library. Our advanced designs have well over twelve libs, so a whole lot of extra time is spent searching the entire project for a function's source on a daily basis.
Lastly, while we can add custom methods to objects, a Jump To Source from these methods is long overdue. So again, our only option is to search the entire project.
#2) Object Spy: It needs to have multiple instances so that you can compare multiple object properties side-by-side. It lacks a Refresh button, so that automation engineers can quickly identify the property changes of visible and invisible objects.
Or HP could skip to option #3...
#3) Add RegEx integer support for .Height or .Width object properties when retrieving object collections. If this were possible, our framework could return collections that contain only visible objects that have a .Height property greater than zero. (Side note: the .Visible property has not returned a False value for us in nearly five years - a recent developer decision, not a product issue.) Eliminating the need to separate the non-visible objects from visible ones would decrease execution time dramatically. (Another side note: our experiments with RegEx on integer-based .Height properties found that we could get a collection of just the invisible objects. Exactly the opposite of what we needed.)
#4) The shortcut to a treasure trove of sample code has been inexplicably removed in the latest release, 14.0. This impedes new users from having an easy time learning the tool's advanced capability. In fact, the only users daring enough to go find it now will be those of you reading this review.
#5) Forced Return to Script Code. This again is a no-brainer design flaw. Let's say we run a script and throw an error somewhere deep in our function library. Hey, it happens. In prior QTP versions, when the Stop button was clicked, the tool would leave you right there at the point where the error occurred, ready to fix it. Now, in recent releases, UFT always takes us back to the main Script, far from the code area that needed immediate attention.
The product is surprisingly stable. For the flaws that I mention, if stability was an issue we would not have been using it for the better part of two decades.
The only time that the product gets unstable is when you try to do wicked advanced coding. For example, when you are trying to execute dynamic code strings with the Execute command that might not have been generated correctly. My years of experience tells me that if something is going flaky, then it's the developer's fault (me) and not the tool's fault.
Scalability is quickly and easily implemented through a framework. Let's say we write a custom function
VerifyValue (oObj, sExpectedValue)
that can compare the expected value, string or integer, to an object's actual Value property. By adding the function as a .VerifyValue method to all WebEdit class objects, the functionality is available to all current and future edit field objects, regardless of whether they are in the repository or programmatic descriptions. And it is done with just a single line of code: a call to RegisterUserFunc().
Now let's say we want to extend the verification to check whether a value falls within an expected range. Add the code to the VerifyValue() function and all fields support the new functionality.
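A minimal sketch of that pattern; the reporting details inside the function are plausible assumptions rather than the reviewer's exact code.

' The custom verification function described above.
Function VerifyValue(oObj, sExpectedValue)
    Dim sActual
    sActual = oObj.GetROProperty("value")  ' run-time value of the field
    If CStr(sActual) = CStr(sExpectedValue) Then
        Reporter.ReportEvent micPass, "VerifyValue", "Found expected value: " & sActual
    Else
        Reporter.ReportEvent micFail, "VerifyValue", "Expected '" & sExpectedValue & "', found '" & sActual & "'"
    End If
End Function

' The single registration line that exposes it as a .VerifyValue method on every
' current and future WebEdit object:
RegisterUserFunc "WebEdit", "VerifyValue", "VerifyValue"

' Usage, whether the object comes from the repository or a programmatic description:
' Browser("App").Page("Form").WebEdit("name:=amount").VerifyValue "100.00"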
Scaling to new pages with new objects is not an issue either. The tool allows advanced users to design frameworks that can identify objects on the fly from "Plain English" descriptions ("OK LINK") without using the Object Repository. This may remind you of a Gherkin/Cucumber approach and the tool is certainly flexible enough to design just such a framework.
Orasi's technical support is an A+.
HPE's direct technical support is okay.
Having an issue getting UFT to work with your technology stack? Some versions of QTP used to include an oft-overlooked document called the Product Availability Matrix (aka the PAM), telling users exactly which technology versions worked with which HP UFT tool versions. Unfortunately, due to an inexplicable "horse-and-cart" decision, HP has chosen to remove this document from the UFT 14.0 install and provide access to it only AFTER users have purchased the product. So I have to buy it to learn whether it works with my technology stack? Wait, what?
The install/uninstall of the updates are fast and easy. Many of the no-brainer configuration settings are set up correctly out-of-the-box. To be fair, we do have a 20-page document that walks through all the settings to check.
Sure, HP UFT is not free. But consider what you get for that cost: a stable product that is easy to use; the kitchen sink of technology stack support; decades of code (which in many cases actually is free); a version that is a stepping stone to an easier Selenium design; and a support base that is more than just the kindness of strangers.
Want to take UFT to another level for free? Add AutoItScript calls.
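As an illustration, a minimal sketch of driving AutoIt from a UFT script through its COM interface; it assumes AutoIt is installed with the AutoItX component registered, and the window and control names are placeholders.

' Calling AutoIt from UFT via the AutoItX COM object.
Dim autoIt
Set autoIt = CreateObject("AutoItX3.Control")
autoIt.WinActivate "Print"                    ' focus a native dialog UFT struggles with
autoIt.ControlClick "Print", "", "Button1"    ' click a control by its ClassNameNN (placeholder)
autoIt.Send "{ENTER}"                         ' send raw keystrokes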
If Selenium is your thing then make it easy on yourself with LeanFT, aka UFT Pro. This gets you the easy object recognition of UFT, in Java, plus execution of concurrent test cases in multiple browsers which is a trick standard UFT does not do.
I am always downloading and evaluating other products. SmartBear TestComplete is the next closest option.
Take tool training by someone with years of experience.
HP, Orasi and RTTS all offer the level of training that gets you to the advanced state quickly.
And if you might be longing for the IDE toolset that Microsoft offers in Visual Studio, then look at Test Design Studio from Patterson Consulting to enhance your UFT toolbelt.
Defect Management: This feature allows us to track the defect status for our project and send notifications to users via email, and all the details about defects can be maintained for future reference, as a knowledge center.
Graphs and Dashboard: This is one of the top features, by which we can track the status of the project with ease, for project management and executive reporting.
The live graphs can be exported via public URLs and can be integrated with SharePoint and other tools as required.
More collaborative, ease of work, and better documentation of all project activities.
Certainly on the UI part, it has to be improved to make it user-friendly and more presentable.
More than three years.
No issues with stability.
No issues with scalability.
Technical Support is great and always responsive to solve the issues.
Nope, the architecture is simple to implement and scale.
The price is a bit high.
Go ahead with this tool. It is for the project management and test execution.
Do consult a few of the other folks using this tool to understand the tricks.
Features that support documentation and test results from Requirements through Test Cases to Scripts to Execution results (pass or fail), then to Defects.
Life was made easy by shifting the MS Office documentation into the product. We deal with our test cases and execution, and the mapping is taken care of by the product. You pass/fail a test case and the product tells you which requirements/features of the product are good to go into production.
Auto-generation of automation scripts. Integration with UFT (earlier QTP) has a little more scope to improve.
I've used this solution for four years.
No.
No.
A nine out of 10.
No.
Straightforward.
Very expensive.
Some Open Source tools, but did not choose them as they lack support.
Have a skilled person do the administration. You will love the ease of reporting that results.
All the options are designed so if you are using this tool for Test Management, then you must have them all enabled for efficient use.
It has basic and advanced facilities that allow you to arrange all of your test data in a very well-arranged, systematic manner.
Graphs can be further improved to manage more requirements at a time.
We have used this solution for seven years.
We never encountered any issues with stability.
I did not encounter any issues with scalability. It depends on the number of licenses. If you have them all active at the time, it still responds the same way, which is fast.
I would give technical support a rating of 10/10.
We used Excel, but it was so difficult to manage the various Excel files and sheets just like maintaining a register book for your records. ALM allows you to handle them all with just one click.
The initial setup was straightforward, as it required little knowledge of the server to do a simple server-side installation.
If you have huge amounts of data and a large number of users managing that data, then ALM is the best option for your organization. It also applies if your team is distributed, outsourced, or insourced.
We did not evaluate other options. I was aware of their earlier version, which had a very good market and feedback, so I didn't have to waste my time doing a PoC on any other product.
You must implement at least the minimum system requirements and configuration for the server for optimum and efficient use.
Customization is the most valuable feature. I don’t see other products with such a huge and detailed customization feature.
The Web UI and the Administration Page need to improve. These are not issues, but areas to improve.
Web UI: In some version (I don’t remember exactly the version number, unfortunately), Web UI was introduced with limited functionalities but it was cool in my opinion. Web UI has a lot of advantages and the main advantage is that the customer doesn’t have to install a big thick client on his machine. Web UI is faster, lightweight and easy to configure compared to the on-premise installation. But for some reason, this functionality (all Web UI) was removed in the next release.
Administration Page: It doesn’t have any issues. The main idea here is that it was written using old technologies and it would be great to improve/rewrite it using modern technologies. It doesn’t have an impact on the customer, it is more for developers who should support this area.
I have used this solution for five and a half years.
There were no stability issues.
There were no scalability issues.
Technical support is very good.
Previously, I have used Redmine and JIRA. Switching to HPE ALM was not my decision, I changed my job.
The setup was straightforward.
Just want to advise to use ALM SaaS in case you don’t want to use ALM all your life. It is cheaper and provides you with all the ALM features but for some period of time. Also, you don’t need to install it manually, the entire environment is already installed.
As advice for developers who will develop such applications, I would like to say: always try to keep the product up to date, i.e., by using modern ideas/technologies if possible. Also, listen to the customer's needs, have flexible customization tools, and do not forget about performance.
HP ALM is a good tool for a centralized and coordinated view of requirements, tests, defects, and iterations.
Being able to have one place to review defects and testing progress was very useful.
Merging 40 different streams, just for defects, into one solution that had good search and reporting capabilities saved a significant amount of time in coordination, defect management, and by consequence, there was better control of the quality of delivered software.
The main barriers to entry are cost and implementation, especially if an enterprise implementation is the best solution.
I have used this solution for eight years in a variety of versions and companies.
The biggest challenge was finding the appropriate resource balancing for the enterprise release. It was not very clear how that was going to be implemented, given the documentation available in 2010.
If there is a need for third-party integration, the documentation is not very good. We were able to integrate with FIT, but it took a very capable programmer to figure out how to do it. Again, this was in 2010. Hopefully, the documentation has improved.
I did not encounter any issues with stability.
I did not encounter any issues with scalability.
Customer service was adequate.
Technical Support: Technical support was adequate.
We used a lot of home grown 'tools' and spreadsheets in one location and Lotus Notes in another.
In one instance, it was straightforward because anything was better than spreadsheets.
In the location that used Lotus Notes, there was a significant amount of resistance because of loss of control.
Neither instance was due to the tool, but it was due to cultural issues.
The implementation was done in-house.
At a Fortune 100 company, we achieved a reduction of 30% of defects in the first year and decreasing percentages the subsequent years.
The dollar figures were proprietary, but were significant even for an $11 billion revenue company.
The main concern is that there is a significant dollar investment, so do good research to make sure the tool will meet your needs.
We evaluated IBM tools, as well as a couple of open source tools.
It has improved collaboration between our test teams.
We have been using the service for 18 years.
The solution is very stable, if you have the right person to manage it.
The solution is very scalable, if you have the right person to manage it.
The technical support has been deteriorating since the Mercury days. I would give technical support a rating of six out of 10.
We were looking at other solutions, such as JIRA, due to all the issues I have raised.
The setup was straightforward.
HPE ALM has been sold to Micro Focus. I am not sure if Micro Focus will be flexible.
If no flexibility is provided, you can easily move out in weeks, if you have the right resources.
We evaluated Zephyr, QASymphony, XQual, Perforce, and Rational Quality Manager.
For integration purposes, we evaluated Tasktop, Orasi, HP Synchronizer, and ConnectALL.
For automation purposes, we evaluated UFT, Selenium, and Robot.
Below is a checklist for others considering a test management solution:
Playback, test tracking, and defect management are the most valuable features.
Playback made it easy to create automated functional tests that could be reused.
Test tracking eliminated the need to track test executions on a separate application, such as Excel.
It provides ease of use, reporting status, and tracking of defects.
I no longer use the product as its price does not fit into my QA budget.
Its pricing does need to improve. As I recall, when I was working at my previous company, we paid over $100,000 a year plus maintenance. At that time, I could have purchased a RadView solution for that much and reduced the annual maintenance to around $15,000.
I have used this solution for about 10 years.
There were no stability issues.
There were no scalability issues.
Earlier, we used a home-grown, Lotus Notes-based system. Later on, I moved to a position where HPE QC was already being used.
The setup is neither too easy nor too difficult; I'd put it somewhere in the middle.
This is an excellent tool for larger companies. It has been my experience that it is not cost-effective in medium or small-scaled companies.
While at my previous employer, I evaluated other options and recommended moving to a more cost-effective tool. At that time, I recommended RadView.
Make sure it fits into your cost structure and consider the annual maintenance cost in your evaluation.
Defects and Test management were earlier conducted with the help of Excel sheets. Now, they are tracked in the Quality Center leading to accountability, dashboards, and being tracked in a single place.
Licensing model: HPE has one of the most rigid, inflexible, and super expensive license models. It is an extremely heavy system application. The UI is very dated. Most applications these days have a light UI that can be accessed by pretty much any browser; QC still uses a UI that has looked almost the same for the past 20 years! I am guessing they are doing this to maintain the same look and feel so that they do not have to get their customers familiar with a new UI. When you compare this system's heavy UI with JIRA or TFS, the difference is evident!
I have been using this solution from the time it was part of Mercury Corp as a Test Director. That makes it around 17 years.
Past versions were a pain to use with frequent crashes. The current version also has its own set of problems with HPE deciding to do away with its HTML/Lite version leading to a lot of confusion.
While getting additional licenses is straightforward, HPE's licensing model makes life difficult with customers having to submit a "non-usage" agreement if they do not want active support for part of the licenses. For example, you have 100 licenses and decide to get an additional 50; later on you want to downscale to 50, you will need to sign a document that says that you will not use those licenses even though you OWN the licenses! We found this extremely irritating and impossible to explain to end customers who were (and still are) irate. Support should not have anything to do with usage!
Technical support is pretty good. However, their SLAs are based on locale, timing, etc.
We did not previously use a different solution.
HPE has one of the most complicated installations. Upgrades are a nightmare. Even HPE recommends doing a fresh installation and a cut-over.
HPE's licensing model is inflexible, rigid, and not customer-centric.
We did not evaluate other solutions.
I would recommend going with another solution unless you have an entire HPE shop. Other similar products offer more features, are lighter, and are very light on the pocketbook, too. We are also moving away from this product, primarily due to licensing costs.
The most valuable thing about the solution is it handles requirements, tests, and defects in one tool.
Most departments and some of our third-party vendors have access, so HPE ALM can be the single source of truth for what we are doing and how things are going.
How they organize content could be improved greatly in an out-of-box way, at least as a possibility for the users. The simplistic folder capability for reqs/tests doesn’t lead the users to a very productive method of test management.
It would be better to have suggested methods such as storing by subject/feature/functional area and to lead users into organizing this way. Then you wouldn’t run into the need to move things around in folders when release schedules/versions change.
Also, documenting your regression tests becomes more automatic, since you stop copying tests to a new folder for each release.
We have used this solution for two years.
There were no issues with stability.
There were no issues with scalability.
The level of technical support is very good.
I’ve been around a while and designed a few test management and automation solutions while I was with Motorola. I think our solutions were better, but of course, we had to spend a lot of resources on their creation.
The initial setup was straightforward, as I’m new to US Cellular and the tool/processes were in place when I got here. We are making some changes to drive improvement, but we are also analyzing how to go agile which isn’t easy.
I think HPE tools are too expensive, but dumping them for shareware tools, like JIRA, Selenium, etc., is also very dangerous and is not a silver bullet.
It’s too expensive for most organizations compared to some other tools on the market. I’d look at QASymphony, Borland, and of course IBM, before committing to any of them.
The report feature has helped me to generate a quality index report which is a critical report for management decisions. The report is generated monthly.
There are many reports that are generated monthly from the tools which assist us in making key decisions concerning the quality of software and products.
The product creates a database per project and this results in poor disk space management, as well as frequent backup and restore. This should be improved upon.
We have been using this solution for four years.
There were no issues with stability.
There were no issues with scalability.
Technical support is good.
We did not use a different solution previously.
The initial setup was straightforward.
I suggest organizations try cloud services.
We did not evaluate other options before this one.
It is a good product, but it still requires customization.
QC has been invaluable in the past for documenting our testing process, especially when needed for audits.
The ActiveX technology requires client-side installations that are difficult to manage in environments where the testers' PCs are locked down to prevent installs. Test management is too rigidly dedicated to older ways of testing with scripted test cases. More support for newer approaches, such as exploratory testing or behavior-driven testing, would make QC more relevant to the way testing is done in many current contexts.
ALM requires that you install client side components. If your organization does not allow admin rights on your local machine, this means you will need someone to run the installation for you with admin rights. This client side install is also limited to Internet Explorer and does not support any other browsers.
As far as the test structure goes, you are limited to a step-by-step test case with description, expected result, and actual result for each step by default. This makes it difficult to support an exploratory testing approach with ALM. Of course, much of this part of the tool can be customized, but it still pales in comparison to something like the Test and Feedback tool that Microsoft provides for exploratory testing.
My understanding is that the newer Agile Manager product is more friendly to exploratory approaches, but I have not used this product yet.
We have been using this solution for 16 years.
In terms of stability, the QC client crashes often when attempting to expand a node on a tree. Upgrades are a nightmare and documentation is difficult to understand.
There were no issues with scalability, but I have never managed a large user base.
Technical support has gotten better than it was a few years ago, but Tier-1 seems to just go through the motions of asking questions I've already answered.
I have used other solutions, but many do not have the requirements traceability that ALM does.
The initial setup was highly complex, mostly because of the database setup. Upgrades are even worse, especially if you need to migrate to a new server, since the repository needs to be copied over.
Make sure you get the correct license for your needs. The full ALM license lets you use the requirements tab, along with test automation and the Performance Center. You can also just buy the Quality Center edition (Manual testing only), or the Performance Center version (Performance Testing only). I have no idea where they get their pricing numbers from, but they seem to always be negotiable.
Be sure to have a DBA available when you install. There have almost always been changes needed to the DB when I have installed the application.
I personally found the defect tracking feature very useful in my ongoing project. Our complete Agile process of software development and QA is dependent on HPE Quality Center.
It has streamlined communication with business users and technical users whereby business requirements are now tracked on HPE Quality Center.
Thus, providing clarity with simplicity.
I've faced a couple of bugs in the product whereby we were not able to open attachments on a particular ticket.
The session timeout time also needs to be longer in my opinion.
I've been using this solution for close to two years on a daily basis, and it's a wonderful product.
Not in particular, only certain defects that I've mentioned for the question above regarding areas for improvement.
Apart from certain minor issues, such as errors opening attachments linked to a ticket and session timeouts, the product is stable.
The product is scalable and consistently delivers the required purpose.
HPE has dedicated support with this licensed product and they were available for any queries.
Technical Support: Technical issues were resolved via SRs with the HP team, and the response was good.
We have used HPE QC from the onset, as the scale of our operation and the throughput of requirements were great enough that using other solutions, such as Kanban, was not feasible in our project.
Initial setup was straightforward with installation and HPE provides tutorials for advanced modification.
It was implemented by our client team and we helped them with the offshore installation.
The value created was high in our project, as we were dealing with an application with high public exposure, so the smooth tracking of requirements and defects definitely gave a good ROI.
For pricing, I recommend to buy a bundled package. Check the HPE site for more details.
Other teams had used Kanban and its visual style of quality control, but it was not a good fit in our project.
I highly recommend HPE Quality Center for its simplicity and ease of use whereby Business and Technical teams can see each other's progress and help make better decisions.
Being able to quickly and easily compile executables on multiple platforms.
Just over 10 years ago, our organization joined the Australian bill payment scheme; part of the technical requirements involved having to build a COBOL interface to their rules validation program. So we needed to source a COBOL development environment which would allow us to quickly develop and deploy a COBOL interface.
None that I can think of from my experience, MicroFocus COBOL does everything we need it to do.
More than 10 years
No, we found the support staff at MicroFocus in Australia to be highly knowledgeable and very much of assistance in helping us get our products setup and deployed.
None, the product has been absolutely rock-solid.
No, the solution has coped with tens of millions of dollars worth of transactions over the years.
Very high.
Technical Support: Extremely good.
No.
Yes, setup was fairly straightforward. It needed a little help with regards to the license server on the development platform, but once that was sorted out everything was smooth sailing.
We developed our solution in-house.
Substantial, the payment platform is an essential customer service.
We found the pricing to be very reasonable.
We did; however, Micro Focus stood out with their excellent technical support.
Definitely, support of the agile methodology is the most valuable feature. We have received a lot of feedback from our agile teams that ALM.NET does not support their work, and it was really great to see that ALM Octane fulfills those needs, as our development and testing teams are looking for the new agile and DevOps deliveries.
We see a double benefit. Part of our business is still very legacy: we are running mainframes and other older solutions, and for at least the next one to three years we will continue using ALM.NET there, perhaps even for functional testing with UFT. But another team is quickly adopting the agile methodology, and there we have seen the value of Octane for over a year now. We started to implement it in the first real project in NND Center and see good results from that.
The new capabilities that are coming soon for ALM Octane, such as predictive capabilities and requirements management, are very welcome. Those have been missing from the existing solution. So far, we have been able to manage with alternative solutions and integrations, but I am really looking forward to them.
So far, we are very happy with stability, even knowing that there is quite a lot of new development, especially for ALM Octane. But so far, so good. I have nothing bad to say.
We are at a very early stage in implementing this solution, but at the moment it looks promising. It is difficult to say how far it will go, but at least we have started.
So far, technical support is very good, because we have been using HPE products, or the earlier Mercury products, for a long time and have quite good collaboration with them. With that background, and their knowledge of our working environment and solutions, their technical support has helped us implement these tools in the right way the first time, without reinventing the wheel on our side.
Setup was pretty straightforward. Obviously, we had a bit of discussion internally, because we didn't take a traditional migration path from the earlier product; we really started from scratch. That is still somewhat of an issue for some of the deliveries that don't want to use the agile method. But we have highly recommended this approach, because they are two different worlds, and it is better to plan carefully rather than carry over all the cruft from history.
Our development teams use a lot of open-source solutions and other tools, such as JIRA. But for our business needs and purposes, we have seen that HPE solutions are still valid. We need backwards traceability: the capability to show what has been done and what has been going on. In some cases there have been discussions along the lines of, "Yes, we have all this information, but you have to go to Jenkins, or this or that log, and it's there." But that's not what the business wants to see. They want high-level visibility of their business. That is why we are still keeping the HPE products, and probably will in the future as well.
Allows us to manage defects from logging through resolution. It gives us an overview of all the defects and prioritization for resolution.
The dashboard was not working at first. Later on, it was difficult to customize it.
We have used it on and off for over four years.
In ALM, the most valuable features are the overview, the primary requirements, test cases, defects, and traceability. Many of our applications are subject to regulations, so we must have the tracking capabilities. Some of the core systems are not even allowed to go down. It's very important that we know what we have tested and what is and is not working. We can find that out from ALM.
Stability is no problem.
The first time we installed it was a long, long time ago. We bought a small, five-license version of Test Director from Mercury in 2007, and it has grown continuously since then. Today we have 600 users and 130 active projects. The environment gets bigger and bigger all the time.
It's complicated to upgrade. For ALM, we have roughly 600 users. In ALM, we have roughly 130 active projects. So it takes a long time to upgrade. Some of the big projects are 5 GB of data. To migrate that to a new version takes maybe two or three hours, even if we have huge hardware.
It's very complicated. We would gladly upgrade to newer versions. We plan to use Octane, but we do not want to end up in a situation where we have two tools. We would like to move, but we must find a smarter way to do some kind of migration. Several of the applications have regulations that we follow, and we must be able to track 10 years back. We can't just throw away the data we have in there.
If upgrading ALM is not an option, people will probably search for and find something else. They really need to find a smart way to migrate at least part of it. Of course, Octane is a totally different tool.
We have looked at many alternatives. We have compared ALM to almost everything. We even have JIRA for smaller projects now. ALM and JIRA are two totally different products that are for two totally different needs.
For example, we have an on-premises installation of ALM. You have to log into the active directory, so it's not so easy to give to someone outside the company. It also struggles with different browsers. It doesn't work very well on a Mac, for example, so the Mac developers and Mac teams don't like ALM. It now works much better in Chrome, but we're struggling there as well. They haven't kept up with the world in browser support; it's problematic to use ALM in Edge, for example.
But with JIRA, on the other hand, you don't have any requirements. It's easy to set up. It's easy to start up and have your backlog there. But after a while, you figure out what is going on. For maintenance and for testing, you need a plugin for this, you need a plugin for that, and you need a plugin for something else. It's not so easy to get the overview or the helicopter view of it, if you compare that with ALM. But I understand why some like it and it has some kind of need. I hope we can mine that capital when we upgrade to Octane.
In our company, the most interesting thing is that ALM can be used for manual testing. The testers can define, by themselves, how they structure a test and then execute it. All the results, both positive and negative, are collected, and defect creation is easy.
On the other side, if you look at it as a project manager, you have to see the results, i.e., the current status of the project.
Afterwards, if you get an outage, it is important that you can show the regulators that you did a good job, you executed everything, and you went in production with a concrete status, with no big issues or critical errors.
Our biggest problem with ALM is the version upgrade and especially the migration.
We have 1400 projects which are active. With the next version upgrade, we expect more than 3000 projects that have to be migrated.
The migration itself takes months. Here is something that can be improved. It is very important for us, otherwise each migration would kill us.
I’ve been using ALM since 2004.
If you find the right patch, then it is stable. You can stay with that for years. In our situation, it takes a very long time to roll out a patch and even more time to bring a new release.
ALM is for sure scalable. We are running 1400 active projects with 15,000 users. Concurrently, we have around 1000 users. If there is a performance issue, we have to find out what the reason is. It is true, in most cases, that we need an additional database server. The application servers, if they have enough power, scale a lot.
For a team as experienced as mine, who have been working with the product for more than ten years, it is not that easy dealing with technical support. They often do not have the knowledge that we have. It takes a while to train them so they understand what our issues are, and we have to connect to second- or third-level support.
The collaboration between HPE and us, especially over the past ten years, has been very good. For that reason, I try to bring in more HPE products, if needed.
This is managed by Tieto, our managed testing service partner. We use ALM as a repository for our automated test scripts. This is only the very beginning of our testing and managed testing service journey. The reason we use ALM is its ability to schedule tests and nightly runs. It creates reports and statistics.
We are only starting off now. I'm able to present the progress on our work with the test-information initiative. I can keep a close eye on what's going on to monitor the progress and to schedule the test runs.
I used Quality Center 5-10 years ago, and I had no issues with it. It is also the de facto industry standard of test management tools. I don’t have enough insight at this point in time. If you ask me in half a year's time, I'm sure I'll have loads more information.
We have had no stability issues so far.
I know for a fact that it's possible to scale it up. We might add another test management tool that's been in-house for a long time, called Rec Test, a Swedish solution.
It's a very simple test management and requirements tool. But in the long run, ALM will probably support us better, so it is on my radar to keep track of and see how we can implement them better. This will take a lot of training and convincing of end users.
I have not used technical support myself, but Tieto is actually doing that in an effort to improve their own framework and initiate a closer relationship with HPE.
Setup went very smoothly, spot-on.
Not formally, but informally through my own experience. As with our use of Rec Test, it certainly requires more training and a more structured way of working. You really need to set up a good structure and make sure everyone is following it; otherwise you'll have a mess in no time.
What's valuable for us is having all the data in one place and having support for a testing process; that professionalizes testing, more or less. You need input, and you need a little bit of training, as there are various approaches you can take when you use ALM.
They can improve on the deployment side, of course; whether that will be in ALM or Octane, I don't know. ALM Octane, with its lower footprint, is of course easier to roll out than ALM with its old Java client; from a rollout point of view, HPE is probably stuck with a bit of a beast there. We haven't got many areas where we think it can improve now, if we use it in combination with other tools.
A particular problem area for us is really the support for requirements. We come from an environment where everything had procedures for requirements, from a business analyst point of view, and having to switch to ALM is a bit of a disappointment. You lose lots of functionality in comparison, for example VOV and baseline reporting, as well as being able to actually write requirements in a granular way. There are system constraints from very old times in ALM which make it an outdated tool for requirements, to be frank.
We have been using the product since 2012. We use both ALM and UFT together, as a team.
ALM is very stable. I'd rather not state it as hard fact, but it's stable. We never had a crash, for example, if that's what you mean.
It is highly scalable.
The technical support is pretty good. It probably depends on the support contract type you have. Our contract works pretty well as we have dedicated support engineers for our product.
They are knowledgeable and responsive. Sometimes you need a little bit more, but then HPE helps us find it, as they're knowledgeable troubleshooters. We never had a problem getting issues fixed once we found the right person. It was very effective, I guess.
The setup of ALM is, well, mixed. The old Java clients give some headaches. That's for ALM.
When considering vendors we look for stability, support and reliability. And that's probably it. So we probably are not going for small vendors.
We use ALM especially in business process testing, because this solution helps us connect the business mindset of our business analysts with the IT perspective. When we ran classical tests several years ago, the business did not understand the quality of our applications. When we began to talk in terms of business processes, it improved communication between business and IT.
The business value from Octane has benefited us. We see that it can combine third parties and free software, integrate all the tools together, and create a single pipeline for development.
The previous version of ALM had greater functionality regarding test cases, with automation, BBT, and so on; in ALM Octane we have only manual testing and integration with other tools.
I think they need to increase the functionality that helps us run testing. Once we have run the ALM project, I think I can give it a better score, after we assess our experience.
Stability is OK; we installed the newer version and are now in our testing environment, investigating how best to use it. I heard that we can download a new version, and I think after this conference we can begin our pilot project with a real team, because we have several areas for implementing developments and several comments to address.
We have 10,000 employees. There is a huge IT staff so we need tools that can help us to collaborate with each other.
We have used HP technical support for resolving cases, and we also asked for help from presales at HP Russia. These guys resolved our problems and helped us install ALM Octane in our environment.
When we installed it, we had a problem because it runs on Linux, and we needed to create our infrastructure for installing Linux, which was a big problem; the previous version of ALM ran on Windows. But we resolved those problems.
It gave us control over the development of requirements and tests needed for the bank's transition from bespoke back-end systems to an Oracle banking system.
The user interface is still dated. Writing test scripts in HPE ALM is generally avoided as the interface is too awkward to use. At the software development company where I worked, the test scripts were routinely created in Excel and uploaded into Quality Center. This process was seen as much more productive than using the HPE QC interface.
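The Excel-to-Quality-Center round trip described above is usually driven by the official Excel add-in, but the same bulk upload can also be scripted against ALM's REST API. The sketch below is a minimal, hypothetical illustration rather than the process used at the company in this review: it assumes an ALM 11.52+/12.x-style REST endpoint, a worksheet laid out as one test per row (name, description), and default system field names. The server URL, credentials, domain, and project are placeholders, and real projects typically require extra fields (such as a parent test-folder id) dictated by their customization.

```python
# Hypothetical sketch: bulk-create ALM test cases from an Excel sheet via REST.
# Endpoint shapes follow the documented ALM REST API; the field names and the
# two-column worksheet layout are assumptions to adapt to your project.
import requests
from openpyxl import load_workbook
from xml.sax.saxutils import escape

BASE = "https://alm.example.com/qcbin"      # placeholder server URL
DOMAIN, PROJECT = "DEFAULT", "MyProject"    # placeholder domain/project

session = requests.Session()
# 1) Authenticate with Basic auth, then open a site session (ALM 11.52+).
session.post(f"{BASE}/authentication-point/authenticate",
             auth=("alm_user", "alm_password")).raise_for_status()
session.post(f"{BASE}/rest/site-session").raise_for_status()

# 2) POST one <Entity> of type "test" per spreadsheet row.
TEST_XML = ('<Entity Type="test"><Fields>'
            '<Field Name="name"><Value>{name}</Value></Field>'
            '<Field Name="description"><Value>{desc}</Value></Field>'
            '</Fields></Entity>')
for name, desc in load_workbook("test_cases.xlsx").active.iter_rows(
        min_row=2, max_col=2, values_only=True):
    session.post(f"{BASE}/rest/domains/{DOMAIN}/projects/{PROJECT}/tests",
                 data=TEST_XML.format(name=escape(str(name)),
                                      desc=escape(str(desc or ""))),
                 headers={"Content-Type": "application/xml"}
                 ).raise_for_status()

session.get(f"{BASE}/authentication-point/logout")  # release the license
```

A script like this keeps the spreadsheet as the authoring surface the reviewers prefer while still landing every test in the central repository.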
At the bank where I worked, I was responsible for training and supporting the end-user testers. I constantly found myself defending the way HPE ALM worked. When executing tests, the users would get lost, would expect an action to have taken place when it had not, had lots of problems logging defects and getting screenshots into a defect, and had trouble understanding and using favourites.
The interface they have developed is quite good at the top level, i.e., grouping into Requirements, Test Cases, Test Execution, and Defect Management. However, once you get into each area, the complexity of the application kicks in. There is no 'flow' of the basic functionality. For example, in Requirements, the basic function is to create requirements and link them. There should be a wizard that guides the user through that process, which includes suggestions about grouping and structuring, etc. Instead, you are just left dangling.
Another example: Test cases are to be written or modified, linked, selected for execution, and executed. Test execution in particular is a prime candidate for 'wizard-like' guidance for the tester. A much clearer indication of where the test execution is up to, by test set and test case as well as test steps, would be most helpful.
What do I mean by 'flow'? Based on my own application development experience, the basic function(s) of the application should be obvious to the user and the easiest to perform. Extra functionality steps are seen as and executed as digressions from the standard flow. I realize that this is not easy in a product as complex as HPE ALM that has been hacked together by several companies and many developers over the years, which I guess is the reason it is the way it is.
We imported a large number of tests and requirements and found a few 'gotchas' on the way, but generally it filled our needs at the bank that I was working at the time.
I used HPE ALM at the bank for 9-10 months.
We had a number of issues with importing the requirements and test cases we had created in Excel. Mind you, we were using a very complicated setup in Excel.
We only encountered stability issues in the importing from Excel. Things would not go quite as smoothly as expected; quite a few emails back and forth to the local supplier.
I did not encounter any scalability issues.
The local supplier, Assurity, was good.
Technical Support: The local suppliers were very good, but if something had to go back to HPE, forget it.
We were using Excel spreadsheets to try and test a major system change in a bank. Need I say more?
At the bank, the setup took the local supplier over two days to complete. The issues were mostly around meshing into the bank's security and local IT systems.
The vendor was okay. The person sent in to do the installation was very knowledgeable on the product.
It is totally over-priced.
I did not evaluate other options. The system was picked by the QA manager at the time, based on his experience with the product.
I think it is over-priced and would recommend looking carefully at other options.
In the case of the software development company, using HPE QC helped us become compliant with GAMP and ISO requirements. The software development process had to meet the regulatory requirements of FDA 21 CFR Part 11.
I used HPQC at a software development company for about six years. We upgraded it once from 9 to 10. As far as I know, they are still using that version, because the modifications they have made over the years would make it too difficult to upgrade.
I don’t remember any deployment issues. The R&D department at the software development company handled all the install and setup.
I don’t remember any stability issues (it was over five years ago that I was at that company).
I did not encounter any scalability issues. We regularly ran test sets with over 60,000 test steps.
The response from HPE is almost non-existent. Most of our issues were fixed by the company's R&D section.
The switch to HPE QC was made because the existing system was not considered GAMP-compliant.
The software development company spent considerable resources in getting the system set up correctly using modifications developed in-house, as well as developing user guides that we could use for training users of the system.
An in-house team implemented it.
It is totally over-priced.
We had to modify the product considerably to get it to do what we wanted, especially reporting. It never did fully meet our needs for traceability back to user-readable requirements documentation.
Having a system of record that maintains traceability ensures that reporting and audit items are managed in the same system. This has simplified the need for additional documentation to meet audit requirements.
I have used this product since 2003.
We have a large ALM instance. The biggest issue with stability is related to reporting. To offset this issue, we are working on an alternative reporting solution that would use data warehousing and not affect the application directly.
There are scalability issues. HPE does not support clustering of database servers.
In addition, HPE has not supplied a best-practice maximum per node for number of users, concurrent usage, or databases. To mitigate this risk, we are looking at leveraging three load-balanced servers and one standalone application server.
The standalone server would be used for third party integrations, reporting, etc. End-users and automated tests would leverage a single vanity URL with load balancing spread across three servers.
We have HPE FlexCare. This provides for single points of contact, which is a must with a large organization.
Training is becoming an issue again with HPE technicians. The glib answer that issues will be 'fixed' in a later release is given instead of true research into the issue. This is an ongoing problem we have seen working with them over the past ten years.
We still use a variety of SDLC tools within my organization. However, HPE ALM has been determined to be the best all around solution for testing of software across the enterprise.
We are doing a number of activities to reach a common goal, including leveraging the ALM template functionality and defining fields and list values across all testing applications.
HPE ALM Quality Center, (formerly HP Quality Center, and before that, Mercury Test Director), has been in use for over 10 years.
It is easy enough to set up an ‘instance’ of HPE ALM.
However, it is recommended to understand the business and process it will be supporting. This will ensure that standards, additional fields, etc. are incorporated early in the design.
If decisions on how the application will be used are not defined early on, then a later project to standardize it may be required.
Without standards, data cannot be shared easily between ALM projects, databases, and third party tools.
If you have more than five users, a concurrent licensing model should be considered. With concurrent licenses, there is no need to search for machines with unused licenses.
The most valuable feature is the Quality Center itself. It's a test management tool, so it enables us to manage and track tests, record the defects, and gives us full traceability. It enables us to record what's happened throughout the testing process.
It provides us with a central place for all our tests and all our test results.
The stability is very good.
The scalability is very good.
Technical support can be a bit hit and miss. Sometimes the support has been very good. At other times, the wait has been a bit long to get a response.
We were previously just recording all our test evidence in Word documents. So we needed a test center.
Make sure that you set it up correctly, and make sure you use the full range of tools in it. It will provide you with enough information for you to produce reports and get a full understanding from what you have done.
Our most important criteria when we choose a solution are reliability and scalability.
From a testing perspective, creating our test cases, executing test cases and then raising defects, it gives us good visibility back to our functional requirements.
It’s one product that allows us to do everything from a system test perspective.
We've seen a couple of demos of Octane and Agile Manager. I think the direction it's going will deliver what we want. At this time, ALM delivers what we need, but the direction that Octane is going is where I would like to go.
We’ve had the product online for about 7 years.
We have not really had any problems with it. It delivers everything that we need.
The way things are going, it’s going in the direction that we would want; with ALM, Octane, and all the types of products that we want.
I don’t use technical support personally. We have another guy that we report issues to and he looks after those.
I was not involved in the initial setup.
From a testing perspective, it is a great product for system test and delivers everything that you need. It gives you the complete package and full visibility from your requirements to creating tests, test execution and defect tracking. Definitely a great tool.
When looking at vendors, we look for reliability, trust. If you do have an issue, you want to know that somebody is going to take an interest in it.
It's robust and leading edge. They are always ahead of the pack. They spend a lot on research and development and that’s the reason they make good products.
The testing tools and everything that comes with it is very beneficial; especially the REST API environment that is best-of-breed.
I am not an end user so I can't really say. But, I would like to see improvement in the price.
The solution is very stable.
You can use it any way you like, so I suppose it is very scalable.
I am not personally responsible for technical support, but the other team would have used them.
We have always been using HPE products, so we have just grown with it.
Another team was responsible for the setup.
Specifically for this solution, they need to be in a close relationship to co-develop functionality. They also need to know the vendor is there. That is the biggest plus from an HPE perspective.
ALM makes functional testing much easier for our customers. We tell them that if they use ALM, they will have a productivity gain of at least 40% compared to using traditional spreadsheets, Word documents, and so on. They also need it because their departments are getting larger and larger. They're not sitting in the same place, so they need a tool to combine their teams’ efforts. This is difficult if you are using Excel spreadsheets because you need to send them by email and make sure they have the latest version.
We see the advantage of ALM over Quality Center: you can have shared templates instead of a specific template for each project. Once we define the workflow and setup for customer X, we include all of that in the template. If we want to make a change, we change it in the template and then cascade it down through all of the various projects so that each and every one gets updated. It means that things are administratively much easier with ALM compared to QC.
With Octane, HPE is finally trying to combine the agile world with the functional testing world. It also has integrated ALI, which means that with Octane you have one view of your whole testing process. I see this as very valuable, because we're also competing with JIRA and the like, which already has the facilities we are trying to achieve with Agile Manager.
JIRA is fancied by developers, so if a war starts between developers and testers, usually what we see, in Denmark at least, is that the testers are on the losing side. But if we can get Agile Manager on our side, then we can start competing with products like JIRA.
We should consider not being just a testing tool as such. I know that with ALI we integrated with the customer's IDEs - Eclipse, SAP, and Visual Studio - but we'll need to do more of that. We need to keep moving in that direction as well.
ALM is a very stable product. The latest version we install at customer sites is 12.5.3. It's a very stable platform. We have no complaints whatsoever.
Our customers do not generally use HPE support very often, and therefore it's a bit awkward for them to get started with it. They find it pretty difficult. That's not the worst part. The worst part is when you finally get someone to talk to and they're not qualified. So we instruct our customers: if that happens to you, say immediately that you want to escalate to a duty manager, who can then take charge. It's not as bad as the time all the support functions were moved to India. That was terrible. Thank God they were brought back, but I don't think we are there yet. We need more qualified people taking first-line support.
The initial setup was pretty straightforward. We are the experts in Denmark. That means it's a bit easier for us because we know exactly what to do. Various customers use our services to do that for them because it can be very complex if you only do it on a rare occasion. If there is a customer who needs to upgrade from 11.5.2, for example, and they haven’t touched the administrator module for a couple of years, it is easier for us to do the upgrade because we do it all the time.
HPE has failed desperately to offer a competitive entry point for new customers in this market. Three years ago, I could sell a new customer ALM site licenses. Now I need to sell them ALM global licenses, which is a hard sell because it is double the price. That just doesn't appeal to new customers. I understand why they take JIRA or stick to Excel spreadsheets: HPE has priced themselves out of that market.
With Octane, you get more functionality; but it's like having Microsoft Word. How much of that functionality do you actually use? Probably 20% of it. So I don't think that adding more functionality solves the problem. HPE desperately needs a low entry point for new customers.
I believe that my largest competitor is our customers who are using Excel, followed by those who use JIRA.
LeanFT:
LeanFT is a new solution, but in general, it's opened us up to a wider audience such as the developers, so they can actually do their unit testing. We couldn’t do that with HPE UFT. This is the big advantage of this tool.
The second thing is that you can use more technologies than with UFT, including different languages like Java.
The third part is that we can use the Cucumber test framework, which is something that you can use easily with LeanFT.
Quality Center:
Quality Center is our testing management tool. When you're running a global team with more than 120 QA staff around the world, you need one repository to write, run, monitor, and share your test cases between teams.
This is the most valuable feature of this solution and you can do it very easily. The UI is very user-friendly. With one click, you can see the status of each project that you're executing. Quality Center is the Rolls Royce of solutions and I would give it the highest rating.
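The one-click project status view this reviewer praises is part of the Quality Center UI, but when a distributed team needs that rollup outside the tool (for a status mail or a wiki page, say), the same numbers can be pulled over the REST API. Below is a minimal sketch under stated assumptions: it reuses an authenticated `requests.Session` like the one in the earlier Excel-upload sketch, and it relies on the standard `test-instances` collection and default `status` field, which a customized project may rename.

```python
# Sketch: roll up test-instance statuses (Passed/Failed/No Run/...) per project.
# Assumes an authenticated session (see the earlier login sketch) and default
# ALM field names; page-size and start-index are standard REST paging params.
from collections import Counter
import xml.etree.ElementTree as ET

PAGE = 100

def status_rollup(session, base, domain, project):
    counts, start = Counter(), 1
    while True:
        r = session.get(
            f"{base}/rest/domains/{domain}/projects/{project}/test-instances",
            params={"fields": "status", "page-size": PAGE,
                    "start-index": start})
        r.raise_for_status()
        entities = ET.fromstring(r.text).findall("Entity")
        for entity in entities:
            for field in entity.iter("Field"):
                if field.get("Name") == "status":
                    counts[field.findtext("Value") or "No Run"] += 1
        if len(entities) < PAGE:   # last page reached
            return counts
        start += PAGE

# e.g. print(status_rollup(session, BASE, "DEFAULT", "MyProject"))
```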
LeanFT:
LeanFT should include more technologies. For example, I would like it to include the Scala programming language. That is one of the main languages that we use.
Quality Center:
There is a new product, HPE ALM Octane, which might be the solution for the gaps. I would like to see connections to more products and processes, and the inclusion of DevOps in Quality Center.
Stability for both is okay.
Scalability is becoming more complex. Scaling involves more experienced people because it’s not easy to scale.
LeanFT:
Because this is a new product, we're not at a stage where we need to scale anything, so I don't know how it's going to scale. Based on what I've seen so far, it should be okay.
Quality Center:
Since Quality Center is a web application, it's easy to add more users and products.
Technical support is good. We are located in Israel, with an Israeli team, so it's easy to contact them. We have the right phone numbers and from that standpoint, it's great.
There's always room for improvement and additional customization that would be nice. In general, both solutions are quite easy to install.
LeanFT:
There is a debate between this solution and Selenium, and we use both of them. Your choice of tool depends on the technology and the gaps in each of them. We are not an “all-HPE shop”.
Quality Center:
We looked at IBM’s RTC for an ALM solution. We use it now for implementing SAFe (Scaled Agile Framework), which is designed for feature tasks, user stories, and program board elements.
We chose HPE because I had good experience with them back when I used Test Director. In terms of overall experience, HPE provides a good experience for users and a lot of benefits which you cannot find with other vendors.
All types of data can be seen in one place.
It's doing what it should do, but it's a little bit costly. I would like to see some kind of testing analytics.
Stability is good; no problems.
So far, we have not had any scalability problems.
I was involved in the setup. It is an out-of-the-box, simple installation. Now we're doing data migration to the database. We are just reading the manual, but more people are involved. This is a normal process.
Start slowly.
When selecting a vendor, the most important criteria for me are that they are trustworthy and nearby.
From a testing point of view, it takes us a step closer to automation. Our testing is quite manual at the moment. We can use it from start to finish; from testing business requirements right through user acceptance testing, load testing, and performance testing. That's a positive feature.
It has dramatically reduced the number of defects that go into production. There have been no serious outages, nor serious problems where we had to do a rollback or anything like that. The transition into production has been very smooth.
From a tool point of view, I would like to see some integration with release management. That is the biggest pain point at the moment.
We bought this solution three years ago.
From a performance point of view, stability is very good.
ALM has not directly assisted scalability. I wouldn't say ALM assists with scalability at all.
I haven’t used technical support, but my team has. I don't know about their experience with them, but if they were not satisfied, I would have found out about it.
Don’t just focus on the technology and buying it, but rather focus on the processes behind the support of the technology. That was our biggest lesson.
It's a very good tool to use for referencing all testing components in the lifecycle of the application end to end. For example, you can include a requirement test case, feasibility of execution, and dashboard reporting for web or mobile applications.
The benefit is to track coverage of functionality, and to have a stronger application without bugs in production.
Integration with other tools would be good, for example, with open-source tools. In the meantime, we do something with JIRA, with Selenium, and so on, and it's good; but we can increase this connectivity with other tools.
Over the past month, the solution has been more stable than it has been over the past 10 years. Mobile Center, for example, which we started using this year, is not very stable at the moment. ALM, however, is stable.
For the moment, we use it for our projects; but our testing centre is only in one location, and not for offshore. We haven’t had to scale it much.
Technical support varies; frequently it is not very responsive in resolving our problems, and engineers address them too late.
In the past, we used a Compuware solution and an open-source solution. We switched to ALM because tracking all activities is better when all your monitoring is on products from the same vendor.
The initial setup was okay.
The IBM solution is very hard to set up and to use. The Silk solution, which is now Micro Focus, is very strong.
With ALM, it’s simple.
Use ALM because it's simple; it has all the information you need to communicate with everyone involved in a project, whether they are in IT or not. This is the aim of the testing lifecycle.
The most important thing when choosing a vendor is that the product is user friendly and can integrate with all your old modules. It helps to have one application rather than multiple applications to connect with all the different companies.
Most valuable to us is the ability to have the system organized into distinct roles and sections. That way, we can grant different users access to the specific section they need to access. We have business users that only need to run tests, so they only need that small section of the application. We have the BA's, product trainers, who only care about the requirements.
It has made our development process more professional. The whole interim process is a lot more professional. You can align it with the development life cycles, get the developers to buy in, and get it all linked in to TFS and Visual Studio.
Integration is also important to us. You've got Sprinter, which is quite nice for those that aren't familiar with what they've got to do. It's a nice little guide. Also, you can link it in with performance and automation tools, and kick things off with the push of a button.
New development methodologies, such as continuous integration and kanban boards, are being implemented by Microsoft and others trying to get their tools into the testing profession. ALM has got to push back and think more about the overall end-to-end development process. It's very much still a testing tool, and we have a few awkward links rather than a full solution. HPE ALM lacks a few of these features, but as a testing-focused tool, helping to ensure quality, I think it's really good. It's good at its core necessities.
It's very stable at the moment. We're not on the most recent version; we have been using version 12.01 for two and a half years. I did the upgrade, and I found it easy to do because I'd done the previous upgrade as well. The documentation from HPE isn't that great if you don't already know what it means. It's been stable, but I say that because I did the install.
Scalability is good.
Technical support is good. We've had to get quite deep down in some incidents, so we've actually managed to get through to third level support and speak to the developers. At that point, you're both talking the same language. They can understand your issues and you get good resolution if it gets to that level.
I was not involved in the initial setup back in those days. A couple contractors did it. It was called TestDirector in those days. I'm going to have a look at the new HPE ALM Octane later.
Their licensing model is expensive. We could scale it up and use it everywhere, but then, you look at how much it would cost for the licenses and you really think, "Is it worth it?"
I was not involved with the decision process, but I did put a case together to continue using it. Our parent company was trying to push us to use Microsoft TFS. I was basically showing how much better ALM is over TFS. For what we were using it for, it's just much better than TFS. It was the testing tool of choice.
Try and have a play with it and don't be afraid to customize. We've got this big workflow in ours, so you can control the rules a lot better as to who can do what, who has access, and what they can see. Out of the box, it's a bit vanilla and there's the risk that someone could be given wrong permissions and accidentally do something they shouldn't.
It provides a central repository for all our testing artifacts and documentation. We use it not only to keep everything centrally housed, but it is also great for answering audits. That is our biggest use of this product.
Centralization of our testing artifacts is probably the biggest benefit. We have a disjointed arena with a lot of different legacy applications and new applications that are being built. We need a central house to store all our procedures, documents etc. and ALM is the tool for doing all that.
It provides a streamlined and consistent approach. One that is repeatable. In today's fast paced IT world, these things are definitely necessary.
We're starting to move more into a microservices-enablement world. Using other products and being able to integrate with Docker, etc., is going to be key for us. That's one of the reasons I attended this conference: to learn a little bit more about how HPE can help us with the integration of those tools.
We have had no stability issues. It is very reliable.
It's handling everything we've asked it to do so I don't have any issues with scalability. It could probably do 10 times more than what it's doing for us.
Other than professional services, we haven't used any technical support.
Initially we were using other products, but HPE acquired a couple of those companies. Now, with the recent move of their software to Micro Focus, the relationship we have with HPE may change a little. That's another reason I attended the conference: to understand a bit more about how that relationship will evolve.
It was initially set up within my organization, but I didn't really have any hands-on involvement; our direct teams were involved in the process. Based on the staff that we have today, it was very straightforward and very easy to do. Then again, we've got people who had experience with the tool, so they've done it before.
HPE has, or had, a great suite in their software department, and everything integrates very well. For those who are looking at HPE, or now Micro Focus, in terms of their software, I would advise them to consider the interoperability of all the capabilities. That is the key for speed of implementation, as opposed to feature functions. One of the things that we've found with the HPE suite is that the interoperability is hands-down second to none.
It's 100% reliable to us. It provides us everything we need. It's scalable, flexible, centralized and also integrates well. What more could you ask for?
The most important criteria while selecting a vendor are partnership, value, capability and flexibility. We've partnered up with HPE for years and we enjoy all those different aspects with them.
Defect Management, Test Management, MS Excel Reporting, Analysis Graphs.
Allowed us to centralize our test efforts from end to end so that we have a single source of truth for all of our test artifacts and data.
As an administrator, the ability to add users to their appropriate user groups from inside of the Site Administrator tool instead of having to log into the ALM project itself to make that user group assignment would be HUGE!
Since 1999 when the product was known as Test Director.
Just when folks forgot to check in their assets before migrating / upgrading existing projects.
Not so far, no.
Not so far, no.
Good.
Technical Support: Good.
Have been using a version of this solution since 1999. Have not used competing products day to day as of yet.
No.
In house.
I don't have exact figures, but we are saving time and money using this solution.
ALM is a giant library, and Performance Center and LoadRunner require it to run.
We use it to support Performance Center and it runs underneath it as one big system. The advantage is that we can test applications before they go to production, and as long as we're testing in a production-sized environment, we have a pretty good idea how an application will perform in production.
It's like the overall software framework, and Performance Center is just leveraging that framework for storing things such as tests, scripts and test results. ALM works together with LoadRunner and Performance Center as one big system. As newer protocols are developed and newer technologies come along, it's nice to see HPE be ahead of that as much as possible so that by the time that it's really needed, they're already ahead of the curve and they've got most of their performance issues resolved as far as how the software's going to run.
The stability on the old versions is good. On the newer versions, the bleeding edge is still being worked on.
It's very scalable. No issues with scalability.
Premium support is great, but before that when we just had general support, it was not all that great. We had issues with trying to get support to call us back on tickets and turnaround time on resolution.
We previously used IBM Rational.
It's not exactly straightforward. Their instructions were not all they could have been, but we still got it installed.
As far as we know, it's the best tool on the market right now. They're considered the Cadillacs of the testing tools right now. Don't necessarily go with their most recent version code release right now. It kind of depends on what your needs are and the size of computer shop that you've got.
I would say the most valuable thing is that we can get people started really quickly on solutions, because we've been partners with HPE for a long time and it helps us tailor the product to our needs. When we have issues with something, we can get support directly from HPE, since we paid for it.
The fact that it works with a vast number of technologies works for us because our internal customers use the tool for testing a lot of different applications. That's probably the best feature that it has for us.
There's a lot of centralized testing from some perspectives. Our main goal is to provide for a bunch of different groups at a lower cost, so we centralize licensing and distribute it to various people. The biggest benefit is that it empowers the people who need the solutions instead of having them manually develop solutions on their own.
We've seen a lot of new things in Octane, including things we have wished for. One of the hardest things we're noticing is that it might be hard to migrate from ALM to Octane, which has the features we need. What we really like is the ability to track different types of tests to our requirements, whether you're playing with Selenium tests, LeanFT, UFT tests, or any other framework you can think of. Being able to capture those results in a common area is the biggest thing we would be looking for, because we have so many different groups; some have their own solutions for testing, but ALM is sort of the central repository for our results, so that would be a huge benefit for us.
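Capturing results from Selenium, LeanFT, UFT, or any other framework "in a common area" usually comes down to writing run entities into ALM over the REST API, whatever tool produced them. The sketch below is a hypothetical illustration, not this reviewer's setup: it assumes an authenticated session as in the earlier sketches, and the field names (`testcycl-id` for the test instance, `cycle-id` for the test set, `hp.qc.run.MANUAL` as the run subtype) are common ALM defaults that a customized project may override.

```python
# Hypothetical sketch: push an external (e.g. Selenium) result into ALM as a
# "run" entity. Every field name below is an assumption to verify against
# your project's customization before use.
RUN_XML = """<Entity Type="run"><Fields>
  <Field Name="name"><Value>{name}</Value></Field>
  <Field Name="owner"><Value>{owner}</Value></Field>
  <Field Name="subtype-id"><Value>hp.qc.run.MANUAL</Value></Field>
  <Field Name="test-id"><Value>{test_id}</Value></Field>
  <Field Name="testcycl-id"><Value>{instance_id}</Value></Field>
  <Field Name="cycle-id"><Value>{testset_id}</Value></Field>
  <Field Name="status"><Value>{status}</Value></Field>
</Fields></Entity>"""

def report_run(session, base, domain, project, **fields):
    """POST one run entity; the response includes the new run's id."""
    r = session.post(f"{base}/rest/domains/{domain}/projects/{project}/runs",
                     data=RUN_XML.format(**fields),
                     headers={"Content-Type": "application/xml"})
    r.raise_for_status()
    return r

# e.g. report_run(session, BASE, "DEFAULT", "MyProject",
#                 name="nightly-login-check", owner="ci_user", test_id=101,
#                 instance_id=2001, testset_id=301, status="Passed")
```

Hooked into a CI job, a reporter like this lets each group keep its own framework while the results still land in the shared repository.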
In the past three years it's become a lot more stable. Prior to that, we saw a lot of issues with stability and a lot of patching and concern from our internal customers that they couldn't rely on the tool to always be there when they needed it. We spent a ton of time upgrading to the latest version so we don't have as much experience with the stability of it yet.
ALM has really scaled out for us. We have hundreds of projects in ALM and it's always done well with that.
A+
Technical Support: Our biggest issue was in the switch-over from HP Inc. to HPE. We had some trouble getting in touch with higher-level support, so we spent a lot of time going through basic support, even though our people who work with the tools have a lot of experience with them. We think it would be better if we could bypass the lowest levels of support on some issues. I can understand the process that we usually have to go through, but more recently our reps have been helpful in getting us to the people that we need quicker, so we can get a resolution.
We don't have time to develop a lot of reporting, and our customers want a lot of reporting. It's hard to have the expertise to write the scripts in the version that we have now. That's a major pain point for us, something that's missing. Another thing we always hear about is performance. We have a huge load-balanced environment to try to speed up performance, but there are always some things that are slow in ALM; basic navigation and running automated tests are the big things we hear about. People want to run tests as fast as possible, but they feel like they're limited by ALM sometimes.
We have a pretty strong emphasis on quality, so ALM is our gold source repository for quality. That's where we store everything, from requirements, test cases, defects, and all of the artifacts around certifying that quality is evident in every release, in every STLC product we produce.
The UI is terrible, in the sense that we actually use automation scripts to avoid being in the UI, which is just fascinating; and then there's the data model.
I would say it's stable in the fact that it's up and it works. We have challenges with the data architecture. We're excited about Octane, which has some interesting capabilities, but ALM is our standard and we're used to using it, so for the things we want to enhance in it, we're just working through that process. Relatively speaking, it works.
We're already at enterprise scale, so it's used across the enterprise. I would say that we're at that point.
We invest very heavily in having strong domain and subject matter expertise, so we use support less. One of the things I would love to have is a pay-per-ticket or pay-per-patch model. When we call support, it's either a defect or an enhancement; it's not just, "Hey, I need customer support," because we're not novice users. We're on the high end of maturity, so we're pushing the products to their limits, and it really ends up with their solutions and enhancements team.
From that standpoint, we get good interaction. There's a really long cycle time, though. My only disappointment around the support is that tickets tend to age, because they're enhancements. Enhancements have a longer cycle: you have to develop them, get them into a backlog, etc.
I have an entire team, so I'm a director and I have an entire tools team that does that. I did get involved in the planning and the strategy of how we're going to do it. My team said that first installation is relatively easy. When we go to upgrade and migrate, that's where there's pain.
Almost every customer will say the upgrades and migrations are very painful. They could be way easier. A lot of it has to do with having to upgrade the data, the in-place database or stand up in entirety, it's just cumbersome. It's very cumbersome and it takes a long time, longer than it probably should. That's a pain point that I think everyone has. Fortunately in our case, we've never had to call professional services to do it. I have a lot of customers say they couldn't get through the upgrade without it. Now, on the support side, it was really helpful, they were on the phone our first major migration for 72 hours.
It was great to literally be in it together: "Hey, we're going through it," and they were there the whole time, which was really awesome. We didn't have to involve professional services, but that was a good story: "Hey, they're on the phone with us. They're grinding it out." For the whole 72-hour period, we had someone from support cycled in. They did the hand-offs and all that while our team was grinding away. So that was a good story there.
I think it's a great platform. It does a lot for us, but the fact that our users don't want to be in the application is weird. They'd rather work in a spreadsheet and then upload their results to the actual server. Now it could just be their behaviour pattern, but I think if it was a little easier for them to kind of work in, they would have an easier time with it.
Although on the plus side, the fact that it's open like that and you can just connect, maybe that's a positive too. So it's kind of a plus/minus. The UI they said, "Hey, I don't really like UI," but the fact that you can just upload your stuff from your work space, which could be a spreadsheet, it could be Eclipse, it could be a script that you're working in and it just directly uploads, they love that.
When you talk about ease of use from an integration standpoint, definitely high marks there, but the UI is just something they really do not like. For me personally, as the person who has to get all the data and metrics out of it, the data model is horrible. It's a constant complaint that I have. The new Octane platform kind of solves that. I just wish they had put some of that into ALM, because the product marketing strategy is that you have to have both.
Have a well-defined process and a strong reporting structure, meaning that in your process you want a lot of measurability. If you define your outputs, the reports and the questions you need to answer from what you're doing, then your process should be managing those for you. In our company, we are very specific about what our executives and stakeholders want.
We have a very well-defined set of measurements that we have to take. We then put a process in place designed to ensure those measurements are always taken. That allows you to deal with your outputs and your reporting structure, which then allows you to properly architect your tooling. The technology is very flexible. You have to decide, as a client area, how you really want to use it, and that starts with what your business needs are and the value you're trying to get out of it.
That's the biggest advice that I have, it's not even on the technology. The technology will do great things for you if you have a plan and a structure and you know what you want it to do for you. Half the time they don't know, they want the tool to do it for them and it's the other way around. So that's what I advise people to do.
Think about it, have a vision, have a plan, tie that to outcomes, and measure those outcomes. If you're answering the right questions and asking the right questions, your technology will really enable you. You've got to look at it from that standpoint.
ALM helps us keep track of all the functional testing that we do for projects before deployment and even after it goes live. We also use it for tracking future enhancements, and all the functional defects. Test requirements are maintained in ALM.
It saves time and definitely mitigates the risk of going live with products which are not very well built; instead, we go live with a product which will perform and function well.
I work in a bio-pharmaceutical company, so we have lot of validated applications, and when we do functional testing for these validation applications, tracking the e-signatures is very important.
I know there are plugins to track the e-signatures, but the problem is that it's very hard to get them implemented. There's no out-of-the-box way, as far as I know, to track changes continuously; that comes with add-ons, and those add-ons are operated by third parties as well. They are not very mature, and there is a huge learning cycle in adopting them. For these reasons, the effectiveness of ALM for an industry like ours is less than what we see with LoadRunner.
It has been challenging in the past, specifically when a new version is released and we have to upgrade. We haven't been upgrading that often, which may mask some issues we would otherwise encounter, because by the time we upgrade, the new version has already gone through several patch fixes. We wait for a couple of years and then apply the fixes; by that time, most of the big bugs are fixed.
It scales for our requirements, but we have been finding LoadRunner more and more expensive. They're introducing new protocols, and they are quite scalable.
We haven't used technical support directly from HPE. We go through Avnet for all the technical support. They're a value added reseller partner of HPE.
We acquired HPE products a long time ago before I was around.
We have been using ALM and LoadRunner throughout. I can't recall having used any other solution before that. But one thing I have noticed is that there's less and less emphasis on load, scalability or performance testing, and the emphasis seems to be shifting away completely. This is feedback based on the fact that there's less emphasis on performance and load testing in these seminars as opposed to the last few years.
Being able to manage tests, as this is something very difficult to find in other products. There are a few open-source ones that handle test management, but right now HPE ALM is still the best solution for handling tests.
It helps us keep track of everything happening. When you test software, you get results, and results can be OK or not OK. If you just keep the results in Excel or the like, you cannot work as a team, because only one person at a time can access them. With ALM, we can have several people working on the same product at the same time. We also use it a lot for traceability, so we can trace tests back to requirements. It's very useful for verifying everything that happens.
As soon as it's available on-premises we want to move to ALM Octane as it's mainly web based, has the capability to work with major tests, and integrates with Jenkins for continuous integration. This is lacking in the standard ALM which was great a few years ago but it did not evolve enough, and that's why we are waiting for Octane.
We've used it more than 15 years, so it's very stable. There is a new version coming, ALM Octane. Octane is new so we don't use it yet.
We have plenty of projects with the current ALM, so it scales well. I'm not afraid of an issue with Octane and believe it will be the same.
I'm disappointed with the support as they're not reactive enough. They don't know the product very well, and to have things changed we need to access R&D directly and then things move. Otherwise, it's kind of difficult.
Beforehand, we were using just paper, Excel, and things like that. As soon as ALM was tested, we began to use it, and its presence in the company grew; now every tester is using it.
For ALM, setup was complex because it's not fully web-based: you need to install a client on your desktop, and with all the Windows security constraints you need to be an admin on your desktop, so it's a very complex setup. On the server side it's also somewhat complex, but we have technically experienced people to do that and to set up the database and everything, so it's OK. With Octane, it should be much simpler for the user, because it's just a web application, so there is nothing to install.
I'd rate the pricing as 3/10 as it's very expensive.
The first criterion we look at is functionality. We have plenty of different projects, so we need a full spectrum of functionality. The problem we have today is the price. It's a very good solution, but it's expensive, so we are challenged by our finance department about the price.
If you have the money, then you can go with ALM, as it's a very good product; you won't have any surprises with it. Otherwise, there are some open-source solutions that are a little less functional, but you can play with them and get them to work, products like Squash TM and the like.
We use ALM with our QA department, and it gives us a way to show our work as a repository for our test cases. We're able to show what we do on a daily basis. It's very easy to use, and it's worked well for us.
Over the past year, we've been able to reduce our defects by executing tests and making sure we have test coverage in all areas. I don't know the exact percentage, but we've been able to reduce the defects.
We've kept our source code in another application for so long that having developers come over to help QA integrate it would probably be an impossible effort.
It's been okay. I think the problem for us is that it can be used for the entire SDLC process, covering requirements, development, and that type of thing, but we only use it for QA.
What happens is, we're having to use multiple products to reach one goal. That's kind of frustrating for the teams. I think if we could get to the point where we could use it as one solution, it would be so much more beneficial.
We haven't scaled it yet, but we're looking to do that. We're working with a company now to look at some different solutions, or at least the ART tool. It looks very impressive to us. We're in conversations about the ART tool, because we really need something like that for our analysts.
It's an educational tool, so you're able to link your education content with Rally. If you have a module that you want to teach, you can teach it through that. It's a direct connection to HP ALM.
I've never used tech support. We have an individual who works closely with HP. Any technical issues that come across the team, he tends to work directly with HP to handle.
We use ALM for all of our applications and didn't use anything before. We're a maturing software company, so we're really getting into these distinct processes, like ALM. We're currently going through a transformation into Agile, so we're really just ramping up to get to that mature stage as a software company.
I was there, but I wasn't involved. I was an independent contributor.
I wouldn't rate it a 10 because it doesn't have the ability to do all the things the developers use today, like TFS. Overall I would recommend it, because of its ease of use. It doesn't take much to get up to date on it and to learn the process of using it for your test-case execution in ALM.
You don't have much time to spend on education. You don't have two weeks for them to learn an application. So, because it's easy to use, I would definitely recommend it for that reason.
It allows us to track test cases that we create, so for all of our applications that we test we build our test cases, load them into Quality Center, and then we also track our defects inside of Quality Center. It allows us to be able to gather metrics based on the applications that we test.
Specific to our business solutions department, we can absolutely take a look at the individual applications we are testing. We can make decisions about applications being turned over and how defect-prone they are. If unit testing is occurring beforehand, it at least helps us say, "Here is what we received; here is how many defects we found." It's been helpful with that.
What I am hoping with the latest version of Quality Center is a better interface for loading Excel spreadsheets. A lot of times the QA analysts write our test cases in a spreadsheet, and then we load it up. I would like to see the interface improved, as it's not very user-friendly in the release we have, so I am hoping it is better in the latest version.
It's been pretty stable for everything we've been doing.
I would say that at this point I really cannot speak to that.
We haven't used it. I know we are going to upgrade Quality Center this year, so maybe there will be more opportunities for us to interact with support.
Quality Center was around well before I got to the company.
It was very easy. LeanFT came with UFT 12.5 and greater. Just deploying the UFT package which we're very comfortable with, we were able to deploy LeanFT as well.
I know there are some other tools out there if you are looking to manage requirements, such as JIRA and a couple of others. Some are really geared more towards agile development, but a lot of them are used for requirements, and they do have the ability to store test cases. We as an organisation use Quality Center.
It works for us in terms of being able to track our test cases, store results, enter defects, and build metrics. It is a pretty decent tool.
With HP ALM, I think it's the fact that it's a self-contained application, so we can do everything inside the application. We only need to use this one tool.
The availability and the fact that HPE people want to help is something that I appreciate because they are with us, they try to help, they try to understand what we need and they act accordingly.
I think it sells because it's HP ALM, and because it's a collaboration tool. It helps everybody collaborate within a project, and because of that I think we save time and have less difficulty making sure that everybody is aligned.
The tech support is sometimes not clear when you speak to them.
We had some issues before, but they've been fixed. It's because we migrated from an old version to a new one; that created a couple of issues, but they're solved now. We need to go to another version, so it will be another challenge, but we're working with HPE to understand the best way to do it.
We've had no issues scaling it for our needs.
It's good, but we need to manage exactly what we need from them. Sometimes, on the business side, it's not clear enough, and when it's not clear, we don't get the results we need. Next time, we need to make sure we define our needs correctly and involve them accordingly.
We didn't use any other solution previously.
It wasn't straightforward, because for us it was an upgrade of the infrastructure as well: at the same time we changed the server, we also changed the infrastructure. It was not because of the product itself; it was more linked to what we needed to do at that time.
HP ALM helps us consolidate our efforts. All of our projects are in there. We are also in the life science domain so we have many more compliance requirements which we have to adhere to. It's pretty good so far.
As users, we see one version of the requirements for the application, we keep all our assets together, and it gives us huge traceability. It provides all the classic benefits of using an application lifecycle management tool.
We look at service packs, what bugs they have, and the fixes. From an end-user perspective, when you have invested heavily in these tools for the last four, five, six years or more, organizations have been there since it was Mercury. We just want to keep pace with where the industry is going and where the shift is in terms of quality assurance and requirements management. HP is very strong on the testing side, but in the last few years it has lagged behind on the agile methodology. It's slowly catching up and will eventually get there, but we love the ecosystem we're in and will continue to move forward.
It's stable.
It's very scalable, a very robust solution, and we recommend it to anyone who's looking for an application lifecycle management tool.
We use an HPE partner for our support needs, but tickets do go to HPE eventually, level two, level three. We have never had an issue.
Our organization is very new in this area; we are a pretty young company. We didn't have any formal task-management or testing tool per se. When we were looking for a solution, one of our implementation partners on one of the projects recommended it, and we looked at it and its capabilities. Many of the folks on the team had used it at other companies, so for the current organization it was a no-brainer to pick this tool.
It's very straightforward.
Since we are in a regulated industry, we have to use the workflow that was built for this, so for us it was a straightforward choice. For large and small companies, there are a lot of choices for task-management tools: IBM Rational tools are there, then there's JIRA, and there's also TFS. They can pick any one that they want.
First of all, the product works. ALM is traditionally more of a waterfall application, but it does allow you to collect your requirements and test cases, and you can even execute test cases automatically from ALM, which is great. Everyone's trying to do DevOps or Agile these days, so it's a good product.
It allows us to do things more efficiently. There's nothing like spending millions of dollars upgrading an application while trying to manage your requirements, test cases, and defects in a spreadsheet. Who has access to that? Control over that is what this product gives you.
We need to move to Agile or DevOps. We have other products that do that, but I'm trying to standardize on a platform. I'm very interested in HP Agile Manager.
We've had no issues with the stability.
It's really easy to scale and add more licenses.
Not directly through HPE. We go through HPE's vendor partner, which is Checkpoint Technologies, and they provide excellent technical support.
When I took over Quality Assurance, we had Quality Center. ALM is the new Quality Center, and we upgraded to version 12 of ALM. 550 projects with no problems.
I came in and decided that we needed to upgrade to v12. We reached out to our vendor partner, Checkpoint Technologies, and they came in, assessed what it would take to upgrade it, and they did the upgrade for us.
It doesn't do Agile very well. We can make it do it, but it wasn't designed for it, so that's not a fair criticism of the product. It's a waterfall-based product. For Agile, you should go straight to HPE Agile Manager.
The tool provides invaluable bi-directional traceability from requirement -> test case -> test execution -> defect.
The ability to have visibility of manual and automated test results within one product certainly cuts down on the management overhead and eases the creation of project health reports.
The UI is becoming somewhat dated but that shouldn't be a deal breaker.
Out of the box, the tool is very flexible in what it allows the user to do. This can go against data integrity in a regulated world but the tool can be customised to improve data integrity. For example, you could customise the tool's workflow to ensure tests cannot be re-executed after a set of executed tests have been peer reviewed.
The Central Repository is key for us.
It has sped up our regression testing cycle to almost three times faster than doing it manually.
Tighter integration between ALM and UFT, especially from a reporting perspective, for automation reporting. We currently run into reporting issues.
We've been using it for around a year.
ALM's been pretty rock solid for us. Getting it to interact with UFT nicely has been a challenge for us sometimes. There's good integration in my opinion, but it just needs to be a little more rock solid.
It's been able to scale to our needs.
Good, sometimes a little slow, but overall pretty good.
We didn't have any other solution in place, and needed to have a much better solution than doing testing with Excel files.
It's straightforward.
HPE was one of the very few that we actually had on the list. We went with HPE because my boss was very familiar with the product, and felt it fit our organization's needs extremely well.
Give it a shot, if you take the time to invest in it, it works.
We are actually not utilizing the full capability of ALM as a complete application lifecycle management solution, but we use it for quality assurance, repositories, and our defect management. For that, it is pretty good.
If you have to run a manual test, it's very helpful. It has the option to perform manual tests, so we have results, defects, and linkages. We come from the QA perspective, put our own requirements in, and it's like a one-stop shop. It's very easy for QA people to pull their metrics and share them with senior management.
The only thing I would add is that I was really looking forward to the new-generation features that were coming. It seems that in order to get their full capability, we have to purchase AGM, but we don't use AGM right now. It would have been really nice if the whole feature set were embedded into ALM. Otherwise, everything comes back to licensing, and there's a cost associated with it; then you have to go through a cost-benefit analysis with management and share a projected ROI. It adds a level of controversy, and right now all the folks are using JIRA. They will just say, "Oh, for your QA, just connect it to JIRA and let's go." That is where I feel that if you have to use so many features within ALM, you have to buy everything.
I think the stability has been fine.
Scalability is good. There's also scope for improvement here, so I would say it's pretty decent.
I don't use technical support because I have a tool administrator. He's the one who deals with the technical support. For him, I act as a user of an ALM, and if I have any issues, I go to him and he'll talk to technical support.
I've been at my current job for the last 11 years, and we have been using it since its Test Director and QC days. So far, we haven't tried anything else and have stuck with it.
I wasn't involved in the initial setup.
It was done by our tool administrator.
It was already in place when I started, but five years ago there was a process shift, and we thought we could read results in from other tools. We decided to stick with the readouts we were getting; because of the way we used the test capabilities, we didn't want to change. We were then able to convince our management that even if they didn't want to use it to its full capabilities, the testing capabilities alone were worth it, and they finally decided to keep it.
It's a big solution, and I'm just using one part of it. For the other parts, there are a lot of improvements that need to happen, so just looking at my little piece isn't enough.
It all depends on what your needs are. If you are very modernized and have short cycles, you should evaluate other tools as well. Each organization is very different: maybe some organizations have lots of money and want to go for the big solution, and they can do that.
The most valuable feature is definitely its scalability. It covers the life cycle end-to-end, from requirements to test execution and defect management. There are some features we haven't yet explored, such as project planning and libraries, but these also add to the end-to-end view of the life cycle.
It provides our team with a well-defined structure for the way each person should work, giving us a standardized process. For example, with defect management, we can find a particular issue and know exactly who's working on it and where exactly they are in the workflow. Previously, people tended to work on their own thing without coordinating with others.
We found some difficulty in working with it, as we're a large organization. Once we got to 10,000 users, the idea of an individual user lost its value, and there isn't the ability to create teams within ALM. We wanted to assign particular work to a team, but there's no function for that. This is something that should be built into ALM.
We have no issues deploying ALM.
The stability is very good, especially when compared with some of the other products from large software vendors.
Scalability has been excellent, going from a user base of several hundred to around 10,000.
I've not personally used technical support, but others in my company have. Some of the queries are responded to very quickly.
I wasn't involved in the overall decision, just on which version to select.
We have a lot of internal processes which elongates the process, but as far as the actual installation and configuration is concerned, it was reasonably straightforward.
We considered other options.
For us, the most valuable features are the task management function and the requirements traceability through to test execution. It's a good mapping tool and a good central repository for all prior testing. We can keep all that data in one centralized place.
I think the best improvement it's made to our organization is that, because it's so stable, we haven't needed to change. We're able to continue with our business because it's a solution that holds steady.
Sometimes there are small customizations we'd like that are not available. If I need to contact technical support about a small issue, it sometimes takes time to get a resolution. And if there's a feature that's not available, we need to wait for another release, because we can't simply add the feature we need.
We've had no issues with deployment.
It has enterprise-grade stability. We never have any issues with it.
It's definitely scalable.
Absolutely fine. We have contacts where, if you just send them an e-mail, they will ring you within 15 minutes. It's been a good relationship with the guys. Really helpful.
It's very straightforward and the pre-sales guys are really good.
They come in and set it up.
The most valuable features are ALM's flexibility and performance. We're also able to customize it to do what we want it to do.
Using it properly is our biggest challenge, as some people use it at the most basic level and others rely on other tools.
I think the biggest challenge with ALM is getting useful data in one place; it's scattered in different parts of the solution right now.
It should also allow us to get quicker access to data from the things we're working on.
The overall usability of it could be improved as some things are a bit slow to get used to.
It has deployed just fine for us without issues.
It's close to being perfectly stable. We have no problem with instability.
It has scaled easily for us and our needs.
Customer service is generally pretty good.
Technical Support:Technical support has been pretty good so far. They tend to be fairly quick to respond to us.
We're using JIRA alongside ALM, but there wasn't a prior solution to ALM.
Setting it up is not too difficult.
We implemented it ourselves.
We constantly keep an eye on competitors, but there's not been anything that we've considered moving to.
It's our main hub for everything around testing, especially when it comes to defect management.
It gives us a uniform way of working and reporting on defects as a project progresses. We've been using it for so long, it's become a standard part of our working protocols.
ALM bridges our development gap, but it's not quite full-scale yet. We'd like to see more functionality so that it seamlessly helps us bring applications from development into production.
We never have issues with deployment.
I'd say it's stable 100% of the time. We never have outages.
I don't know if we're really a large scale user. We have around 230 users, and scalability has never been an issue. I think it's because we've been using it for so long that when we have an issue, we can solve it ourselves.
It depends on the case: first-line support is usually okay, but in many cases we have to go to second-line support.
We work with many partners and contractors, and they always come in and say "we would like to use another tool." I always challenge them and say, "If you can give me clear advantages on paper of going with another tool, then I'll go." So far, no one's been able to convince me.
It supports the full test management life cycle. We have other test management tools in place, such as JIRA and a couple others, but ALM provides the broadest coverage from project creation to death.
We have developers, project managers, stakeholders, everybody referring to one single point-of-truth for everything that is related to a project, from requirements, test cases, coverage, defect tracking, and reporting.
The client installation is sometimes quite painful. You need to register some components on the client that need administration rights, which is really tough on the organization. For every upgrade of the software, even minor upgrades, you need to reinstall the client, which basically means somebody needs to travel around and do the upgrade on each client. What we would really like to see is a proper web client with good coverage. There is a web client, but it only covers a very small part of the product, so you can't use it for the full lifecycle; we therefore decided not to use it.
It deploys without issue.
For us, it's stable. We're happy with the stability.
We have no issues with scalability.
We have had a bit of trouble at times and, in all fairness, sometimes we felt quite left alone. We've approached technical support with real problems, and they referred us to "check some of the Internet-based forums" or "look at the FAQ", or something like that. In the end, it turned out we were better off sorting things ourselves on some forum instead of contacting support and opening a ticket. We're in a quite agile environment, and if a support call is stuck for six or eight weeks, it doesn't help us.
We were using JIRA before and still are using JIRA. But that is only a section of coverage, so we needed something that has a broader coverage of the process, and the ALM was the choice.
ALM setup was pretty straightforward. We had standard problems like connecting to the Active Directory and making sure that the permissions were set correctly, and so on.
Make sure you have full acceptance from all involved or possibly involved groups. Make sure that your management supports it and everybody is happy to use the tool and to share a good level of information across the development life cycle. This is where, for us, the most benefit came from: if you have a defect, you can easily get the full information with a mouse click.
The most valuable feature for us is probably the full Oracle component of ALM. It allows our users to be connected to other products.
We're able to use it with UFT/QTP for defect management. When it records a test, ALM will produce analyses for cross-project reporting. This becomes a large repository of data and information that's valuable for us in making necessary improvements.
I'd like to see them move away from a desktop-type client and towards a web-based client, although we've also had ActiveX issues with web clients.
ALM has probably been in use across the group for 10-plus years.
Once installed, no issues with deployment.
The application itself tends to be very stable; when issues do occur, they're usually in the surrounding stack rather than the core application. For example, you may find that it's an Oracle issue and not an ALM issue. Really, there have been very few occurrences, even after all these years, of a serious application fault.
Scalability is fine. We have in the region of 15,000 registered users and up to 2,200 concurrent users of ALM. We don't really have any scalability issues.
Any issues would have to do with what a certain server application is up to. You just need to keep an eye on it.
We have the higher level, premium support. Technical support tends to be quick and reactive to issues and we don't have any major issues with it.
As large as it is, it's pretty straightforward to put in and you can configure it in probably less than an hour.
My advice would be to research the full system requirements you need for the initial install. In corporate environments, once you've got it up and running, it's more difficult to get off of it. Also, plan to scale up based on projected CPU and space that you'll need to get.
ALM allows everyone to work at the same time, doing defects and reporting the same way.
ALM is quite heavy to use because of the process: everybody needs to follow the same process at the same time. We get a lot of complaints from customers that they're forced to use such a heavy process.
It's not always very stable, but that depends on how you implement it in your organization. We put it on a separate server hosted in Singapore and managed by our guys in Bangalore, so they make sure it's always available.
We have a lot of people using it and it works fine.
Technical support is quite good, but sometimes it depends on who you’re dealing with. Sometimes you have bad luck and get a guy who doesn't know much about it, is new, or is in training, but most of the time it’s fine.
ALM is a bit complex since it's scalable and everybody needs to use it. With everyone needing it, you have to open the firewalls to everyone, which makes it complex.
I administer it, and from what I've seen, you can create defects and follow up on their resolution based on the scenarios that you have created in the program.
Other valuable features I've seen: it centralizes all the defects in our organization, and it allows us to keep test scenarios in only one place, one database, so it's easier to manage. All the methodology around the testing scenarios is gathered into the same product, so communication between the business side and the testing side is easy, because they all know where to find the information.
I’m not an expert in ALM, but if I have to look into some issues or other occurrences, I can easily find my way around. It’s quite user friendly, I would say.
I supervise the team using and providing support for the product, and there are a couple of things my team feels could be improved:
1) We need to move test cases manually from the Test Plan module to the Test Lab module, which involves a lot of manual interaction. If this could happen with built-in functionality, it would reduce the manual work and time involved (a scripted workaround is sketched after this list).
2) Email notification list. Emails are not sent to any member of the email notification group if just one of the email addresses is incorrect. So, if one email address is wrong in a group of ten, no one receives the email notification they're all supposed to receive.
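On point 1), the moves can be scripted today through ALM's OTA COM interface rather than waiting for built-in functionality. Below is a minimal sketch, not production code: it assumes Windows with the ALM client (OTA) components registered and the pywin32 package installed, and every URL, credential, folder path, and test set name in it is a placeholder invented for illustration. FindTests and the CY_CYCLE filter are from the OTA reference as far as I recall, so verify them against your client version.

    # Rough sketch: copy planned tests from Test Plan into a Test Lab test set
    # via the OTA COM API. Windows only; needs the ALM client and pywin32.
    import win32com.client

    td = win32com.client.Dispatch("TDApiOle80.TDConnection")
    td.InitConnectionEx("https://alm.example.com/qcbin")   # placeholder server
    td.Login("alm_user", "alm_password")                   # placeholder login
    td.Connect("DEFAULT", "MyProject")                     # placeholder names

    # Collect the planned tests from a Test Plan folder.
    folder = td.TreeManager.NodeByPath("Subject\\Regression")  # placeholder path
    tests = folder.FindTests("")      # empty pattern: every test in the folder

    # Find the target test set in Test Lab by name.
    ts_filter = td.TestSetFactory.Filter
    ts_filter["CY_CYCLE"] = '"Nightly Regression"'         # placeholder set name
    test_set = ts_filter.NewList().Item(1)                 # OTA lists are 1-based

    # Add each planned test as a test instance in the test set.
    ts_test_factory = test_set.TSTestFactory
    for i in range(1, tests.Count + 1):
        ts_test_factory.AddItem(tests.Item(i))

    td.Disconnect()
    td.Logout()
    td.ReleaseConnection()

Whether a scripted copy like this is acceptable depends on each site's customization and audit rules; the point is only that the repetitive Test Plan to Test Lab moves do not have to be done by hand.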
It has deployed just fine for us.
Stability is fine, except for a few pop ups sometimes. We don't understand why we're getting them, but it's generally quite stable.
Scalability hasn't been an issue.
QA Test Management is good. The menus have changed over the years, which is nice, and now it is also integrated with other defect tracking systems; before, it was only compatible with QTP.
We used to use Excel spreadsheets. Using Quality Center helped us track everything in one tool from beginning to end.
Report summaries help me to figure out where a project stands and how much work is left for the QA team to complete.
I've used it since 2005.
No issues encountered.
Sometimes it runs fast, and sometimes it runs slow.
It has scaled for our use.
They used to respond in 24 to 48 hours; now it's longer, and compared to SpiraTest, it's not that great. This applies to both customer service and technical support.
I used the software version of Quality Center, and the initial set-up was straightforward. After changing it to the web version, it was much better.
It was implemented before I joined the company.
6/10.
It's priced high, and they should look into it to make it more competitive.
We also looked at SpiraTest, and it is more affordable than Quality Center.
Check the price, compare it to other available tools in the market, and select the one that best fits your needs.
It's helped in formalizing and adding structure to what was up to that point in time a series of checklists that were not version controlled. Also, test results were not rigorously reported back to the internal customer(s).
Report generation needs to be simplified.
I used it for roughly a year on a contract.
I did not choose this product. It was onsite before I started the contract with this particular client. I have seen and used this product at a few client sites.
I was already used to setting up the directory-like structure in Test Plan.
An in-house team performed implementation as needed.
I've never done a deployment.
Our organisation did not have any issues with the stability of this tool.
We were able to have a lot of users logged in at the same time with no lag time or any scalability related issues.
You should invest in Quality Center if you are looking for the following:
Workflow management is a feature we find valuable.
It provides us with common development and test workflow for defect management.
Linking between modules, with actual field values like those between defects and releases.
I've used it for over 10 years.
The desktop deployment causes issues when the enterprise has locked-down PCs. The application itself is hosted by HP (SaaS).
Very good, 99.9%
It scales to our needs.
7/10 - it could be better, but usually it's good.
Technical Support:8/10 - the TAM and team are very good.
No previous solution was used.
Straightforward; the only issue is doing patch updates, as they touch the desktop client, which makes updating painful.
HP hosts the application with no issues, and a vendor does the desktop update. This desktop vendor is expensive and inflexible.
The tool has been in use for more than 10 years; the evaluation was done back then, and the details aren't known now.
Make sure your desktop team have the skills and expertise to handle Quality Center’s client components.
The overarching lifecycle view, from requirements gathering through to testing and defect resolution. Additionally the ability to customize the user permissions so they can only see and do what their job role permits.
It added structure to the test process and enabled the developers to better understand the QA process. This in turn led to an improvement in the code developed in-house.
As a standalone QA tool it meets the needs adequately, but it really needs combining with other solutions, such as Agile Manager, to get the best full lifecycle solution.
Around 14 years from when it was originally called Test Director.
There are still some issues when deploying to a few end user machines but the install and upgrade process is very easy. Some of these issues will be resolved in later releases.
Very stable with no reported issues in years.
There are options to increase the scale of use and extra modules that can be obtained with the full ALM license.
On par with other big companies, sometimes you need to get past the front line support to get to the real answers.
Technical Support:Good, quick turnaround with ideas and solutions to try.
QA was driven by spreadsheets before the deployment of Test Director.
The documentation is not always easy to follow but the answers can be found on the support forum and help desk.
The initial deployment was with a vendor over 14 years ago, but all subsequent updates have been done in-house.
It's unknown, but I suspect it to be quite significant.
More recently it was reevaluated against Microsoft Test Manager to see if it was still the best QA tool for our needs.
It is still the best QA tool on the market that integrates with most of other tools we use. It allows everyone who wants to be able to see the current quality of the project and control the QA process.
Fully understand the different options out there and the license types. Other tools may offer similar features, and you will probably want to customize some of the options to get the best out of it. We have not tried the cloud option, which would take away any implementation and upgrade issues.
Managing the test cases and defects tracking are the most valuable features, which we use daily.
I like all the features and can't think of anything that needs improving.
I've been using it for eight years.
No issues encountered.
No issues encountered.
No issues encountered.
Nowadays, all companies are looking for free software, and for this reason, many companies are developing their own tools similar to Quality Center and using those instead. License costs have a direct effect on the ROI of the company.
It's an excellent tool.
There are several features of ALM I found to be extremely valuable.
This question should actually be divided up. Several companies receive a different value add from different components of ALM. Some use it only for managing tests and defects and leave out requirements.
But just for the sake of overall added value to me, the Test Planning and Lab portions are extremely valuable, especially pertaining to a BPT license. Creating your core BPT components and mapping corner and edge test cases from them makes it easy to create regression test beds, as well as facilitating Agile development. Also, if you are talking automation, the BPT component is critical in supporting the BPT test-driven framework. Of course, the Defect module along with the Test Runners is key for execution and defect reporting. I love the ability to customize different attributes on defects in order to facilitate a specific release type.
ALM has driven some of the projects in my past organizations.
The defects section is the most valuable.
It has improved our testing designs and test reporting.
I've used it for two years, although not in my current role.
My department doesn't do deployments, but I believe that the team doing it does have problems.
We have a specific department that deals with customer service.
Technical Support:We have a specific department that deals with technical support.
Test Planning and Test Lab modules are the most valuable to capture test cases and track execution. Defect module for tracking defects in testing and to capture production incidents.
The primary HP QC modules (requirements, test plan, test lab, and defect management) have become, over time, foundation stones in our project teams' development methodology. In each area, the modules provide the fundamental processes to record scope, capture test cases, and track execution for each phase of testing (functional unit, string/business process, integration, user acceptance, etc.), and our project management team are all HP QC "savvy" from the standpoint of using the tools to manage the project team and the component releases and change requests that flow through our team.
The product continues to evolve and improve, and we are now on v12.01. The defect module, while fundamental and more or less consistent over numerous versions, is an area we would like to see improved regarding how response time is measured in the standard application. Reporting is another area that could stand improvement; many times the data is simply exported out to Excel for analysis.
We have used HP ALM/Quality Center going back to its days as a Mercury Software product, in 2006-2007, and have evolved up through 12.01 at present.
At Verizon we are 'clients' on a supported application base. Application project teams are supported with domains and projects within a central installation. We didn't deploy the application, per se.
As a client, no, we have not have any major issues with stability. The application is pretty much available during business hours with the exception of routinely scheduled maintenance windows.
No issues to date. We're just a client (one of many project teams supported thru a central HP ALM/QC test tools support team) but the number of project teams that are supported via our central team would seem to imply that the application can scale to support large organizations split amongst multiple project teams.
As a customer/client of a central VZ QA/ALM installation, the few times we have needed to be in direct contact with HP, they have been responsive. We had a better relationship, overall, with Mercury Software before their acquisition by HP, but that was several years ago now.
Technical Support:Most of our technical support questions are fielded by our own in-house QC ALM support team. I can't directly speak as to their relationship with HP regarding direct technical support questions. Where we've had issues with specific installations, etc., they have been quickly resolved, so the assumption, always dangerous, would be that technical support is responsive with the primary vendor.
We have used this application for a number of years now. There have been explorations of a variety of open source, "DevOps-inspired" applications, as a potential replacement. To date, there has been no determination to move away from this application as our standard.
From a project team standpoint, the setup was very straightforward. All the tools are accessible and installable via browser.
We have one in-house team supporting several portfolios within our IT organization. I would say their level of expertise is good to excellent.
I hate to say we haven't done an independent project-level analysis of ROI -- at this point, it's more an integral part of our application support model and a focal point for project-level activities. Overall, even if informally measured, it's very high, if by no other measure than how deeply ingrained it has become in our project methodology and project tracking metrics.
Licenses are a major factor -- they are not inexpensive but with concurrent licensing our global IT groups are able to share licenses around the clock.
At the time we first utilized Mercury Quality Center, they were pretty well established as the industry leader in this space. When HP acquired them in 2006, they were the 800-pound gorilla in the test tools field.
For most large companies/installations, you will need to establish a core testing tools support group. This group can handle the care and maintenance of the application itself, the plug-in tools, user management, and deployment to various project teams. Taking this on within an isolated project team would be asking for headaches. Many organizations have turbulent histories with centralized testing; it typically depends on what is business critical, not only externally but internally (HR payroll, for instance: most companies can't tolerate defects around payroll).
Previous to the utilization of Quality Center our requirements were created and stored in Word documents, and all Test Plans were facilitated through Excel, and there was little coordination or consistency to testing standards. Quality Center has allowed us to better track our testing coverage and plan our releases.
I think there are still some changes needed to integrate better with agile processes without having to use a separate product. Since Quality Center has had functionality added over such a long period of time, certain modules and other HP tools could be better integrated.
Initially when we started using Quality Center we had some issues with scaling the solution throughout and updating across multiple teams but these issues have since been resolved.
I would rate the level of customer service very high.
Technical Support:It's very high, and once you establish the proper channels and key contacts to work with, it is pretty seamless.
We did not previously utilize a different solution for managing our requirements and testing efforts.
Our implementation and setup were somewhat complex due to our enterprise architecture. We have multiple divisions across two companies that share the same servers and architecture but have different needs with regard to setting up and managing projects.
Initially we implemented Quality Center ourselves but then went with an outside vendor later due to some complications. Depending on the complexity of your organization I recommend working with an approved vendor or service partner to setup your installation.
We haven’t really calculated ROI on our testing efforts as of yet.
There are many valuable features HP Quality Center has to offer, but if I had to narrow it down I would say the following for me are the most valuable:
What makes this product very useful in improving the quality of an organization is the fact that it has the ability to create a test script and write it out in detailed steps. For all test case executions, we are able to generate customizable reports and charts, which is very useful for sending reports to higher management. With these features, QC has made communication between upper management and the QA team much easier, which gives better insight into our defect tracking and management. This reporting is then also used for tracking the finances of the team.
There were a few issues I faced while using Quality Center, but I'm sure they have been fixed in the new ALM version. One issue I faced was that while I was importing test cases from Excel into Quality Center, it would not ask to check out the test cases; instead, it would overwrite the existing test cases and create a new version. This was not a consistent issue, but it did happen a few times.
I have used HP Quality Center for about 4 years. I have knowledge on the current ALM version but personally have not used it in any projects yet.
Quality Center has a Starter Edition, which is usually for entry-level quality assurance organizations, and the Enterprise edition (originally called Mercury TestDirector), which is for medium to larger releases. The new release of the software, HP ALM (Application Lifecycle Management) 11, has integrated the capabilities of Quality Center Enterprise with Project Planning and Tracking, Enterprise Release Management, and Asset Sharing, for requirements management through application delivery. HP ALM's intended use is more for large and global organizations.
I’ve personally never had deployment issues.
Quality Center for the most part is pretty stable besides some common issues.
Scalability-wise, Quality Center is an awesome tool. Quality Center itself doesn't actually place a limit on creating projects or folders. Most of it will depend on the users, servers, and hardware, not the Quality Center client itself, although the more data a user has in a specific module, the slower it is to load on the client. Most of all, it will depend on the implementation.
HP Quality Center is not a new product and has been out in the market for quite some time, so there is plenty of online support and help to be found. Quality Center forums can be found for almost any issue that comes up.
Quality Center has a very intuitive GUI, which makes it fairly easy to use and follow. Even if you are a beginner, picking up how to use this product will not take much time. But it can be difficult to implement, depending on the size of the organization and the number of teams. Identifying current methods of communication is critical to implementing HP Quality Center.
Quality Center can be a bit costly, but the ROI is great for all the great features you get.
Quality Center is a very powerful tool. It is not only a defect tracking tool but also a management tool. It can be used for everything from creating requirements and test plans to test creation, execution and defect reporting.
The processes of both test execution and test tracking have become more transparent.
We use it for big SAP implementations, which provides ROI after the first project.
More flexible reporting would be good.
I've used it for one-and-a-half years.
No issues encountered.
No issues encountered.
No issues encountered.
The support is good, however their response time could be improved.
No solution was previously in place.
It was straightforward.
A vendor was used for support during the implementation, but we had internal knowledge as well.
The implementation ROI highly depends on the size of the project the tool will be used for in the future. For big SAP implementations, ROI will be gained after the first project.
SAP Solution Manager TAO was considered as we are using that tool for Application Lifecycle Management.
But due to the integration possibilities with HP ALM and SAP Solution Manager, HP ALM was the choice we made.
The Open Test Architecture (OTA) and development of the REST API. The OTA is a published set of functions that administrators and users can use to interact with HP ALM programmatically. The most common example HP ALM users would recognize is the Microsoft Excel upload template, which allows users to upload test scripts to HP ALM projects directly from an Excel worksheet.
The REST API sneaked into HP ALM with little fanfare. The REST API has no application overhead and is fast. HP extended the API through patches in v11.0; check your current version and patch level to see which functionality is now included in the REST API.
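To make that concrete, here is a minimal sketch of authenticating and pulling defects over the REST API. The server URL, domain, project, and credentials are placeholders, and the JSON handling assumes a patch level that can return JSON (older levels return XML only), so treat it as a starting point to check against your own version, not a recipe.

    # Minimal sketch: query defects via the HP ALM REST API.
    # All URLs, names, and credentials below are placeholders.
    import requests

    BASE = "https://alm.example.com/qcbin"        # placeholder server
    DOMAIN, PROJECT = "DEFAULT", "MyProject"      # placeholder domain/project

    session = requests.Session()

    # Authenticate; the LWSSO cookie is kept on the session automatically.
    session.get(f"{BASE}/authentication-point/authenticate",
                auth=("alm_user", "alm_password")).raise_for_status()

    # Newer versions (ALM 12.x) also expect a site session before API calls.
    session.post(f"{BASE}/rest/site-session")

    # Fetch the first page of open defects.
    resp = session.get(
        f"{BASE}/rest/domains/{DOMAIN}/projects/{PROJECT}/defects",
        params={"query": "{status[Open]}", "page-size": 10},
        headers={"Accept": "application/json"},
    )
    resp.raise_for_status()
    for entity in resp.json().get("entities", []):
        fields = {f["Name"]: f["values"][0].get("value")
                  for f in entity["Fields"] if f["values"]}
        print(fields.get("id"), fields.get("name"))

    session.get(f"{BASE}/authentication-point/logout")

Run something like this against a sandbox project first; the query works on system fields, but field names shift with site customization.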
On a recent multi-year project, the average defect resolution time for all defects was over twenty-two days. My goal was to reduce this number by 20%. It was an easy goal to reach because no one realized that 80% of 22 days was still a number out of bounds for defect resolution. I used custom fields, defect workflow and custom reports to move defects through their lifecycle. Within thirty days the defect resolution time was reduced to 3.1 days and averaged 1.1 days over the next eighteen (18) months.
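For anyone wondering how a figure like 3.1 days is produced: it is simply the mean of closing date minus detection date across resolved defects. Here is a minimal sketch, assuming a hypothetical CSV export with "Detected on Date" and "Closing Date" columns in ISO format; actual field labels and date formats vary per project, so adjust before use.

    # Illustrative only: average defect resolution time from a defect export.
    # Column names and date format are assumptions, not ALM guarantees.
    import csv
    from datetime import datetime

    def mean_resolution_days(path):
        durations = []
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                closed = row.get("Closing Date", "").strip()
                if not closed:                     # still open: not counted
                    continue
                opened_dt = datetime.strptime(row["Detected on Date"], "%Y-%m-%d")
                closed_dt = datetime.strptime(closed, "%Y-%m-%d")
                durations.append((closed_dt - opened_dt).days)
        return sum(durations) / len(durations) if durations else 0.0

    print(f"{mean_resolution_days('defects.csv'):.1f} days")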
The graphical user interface has the most room for improvement. Not all screens within the integrated suite refresh the same, some screens or activities are self-refreshing and some are not.
I would also like to see the “Disable Quick Runs” added back as a site parameter or built as an internal function within a project.
I've used it for seventeen years (1998 to present). This product was initially developed by Mercury-Interactive and released as Test Director. My first enterprise installation and administration experience with Test Director 2.0 was in 1998.
Over a seventeen year period, yes. The key to maintaining a site today is in patch management. Keep the patches up to date.
When the patches lag too much, it may be safer to build a new site and port the data than to try and patch an existing site in place.
Customer support for this product is with the vendor, not necessarily HP. After a site has been up and running a few years and all the original implementers are gone, it can take some time to even determine who the vendor is. My satisfaction level with vendors ranges from acceptable to excellent.
Technical Support:My personal satisfaction level with the HP service and support website is low. I get the majority of my technical information from colleagues or third-party discussion forums.
In the Application Lifecycle Management space, HP ALM and IBM Rational are the two big players. I recently participated in an evaluation of the IBM Rational Jazz Platform. The client had been using IBM’s ClearCase and ClearQuest for many years. During the evaluation, an unrelated IBM audit detected a long dormant pack of five ClearCase licenses on an active server. The cost associated with this incident ended our evaluation of the IBM solution.
Historically, most people considered this to be a defect-tracking-only tool. In that domain, tools are plentiful. Over the years, I've used the VI editor on UNIX, Microsoft Excel worksheets, Microsoft Access databases, Bugzilla, and Notepad for defect management.
The installation and configuration of an HP ALM site is straightforward for those with enterprise software deployment experience. An installation requires, at minimum, a dedicated server with an operating system and database connection. The most typical hardware configuration I have encountered over the years, physical or virtual, was a single Microsoft Windows server running the web, license, and application software servers. Both Oracle and MS SQL Server databases respond adequately, and when given a choice, I now select a schema based on DBA agreeability.
The installation of an HP ALM site establishes a service endpoint for communicating with other applications via Web Services. I believe the configuration and management of these services is the most complex part of a site installation and requires substantial planning to map fields and permissions across multiple applications.
HP ALM resellers typically perform the initial set up and configuration of the HP ALM site and user projects. In some cases, larger testing firms are also resellers and provide the tool as part of the project. I fundamentally disagree with buying a tool from its eventual user.
I advise clients to do the upfront planning and limit users with access to the site administration console to three or less. The planning required for a successful implementation requires much more time and effort than the deployment itself. Deployments are typically scripted while planning requires humans. Access to the HP ALM site console is separate from project access. I have seen sites with twenty or more registered site administrators. I believe this occurs more as a symptom of long term neglect than an implementation issue.
I encourage clients to use the built in service accounts and APIs where practical.
A comprehensive solution aiming for total test management. It has bindings with HP QTP, which is a test automation tool. With the addition of BPT (Business Process Testing), QC along with QTP has been the choice for test automation in most IT businesses.
Maintenance of test artifacts becomes messy when there are many scripts and supporting test artifacts. HP QC makes it easier to maintain the scripts and also organize them into various business areas.
A backup strategy is lacking in QC. If the QC server crashes, all your automation artifacts are unrecoverable. Though we can create QC project files as a backup, there is a size limitation when you restore the backup.
I've used it for 4 years and 6 months.
Configuration of QC involves a number of steps that could be made simpler. Many of these settings could be defaults, with an option for the user to change them.
Getting to the root cause of any issue with QC is difficult. In many cases, restoring the QC projects is the way to resolve issues. Though HP support is available, it is not immediate.
As with the scalability, there are no issues with QC. It handles multiple connections very easily, and if there is a crash on one machine where a test artifact is locked, the lock is automatically released after some time. But it would be great if there were recovery options built in after a crash.
The technical support is prompt and understands business urgency.
I have worked on TFS. The features of TFS are limited when compared to QC.
The initial setup was pretty straightforward.
I've been using it since 2004, when it was known as Mercury Test Director.
No issues encountered.
No issues encountered.
No issues encountered.
7/10.
Technical Support:8/10.
No previous solution used.
It's simple, but customizing it for a regulated environment is complex, as it required 15,000 lines of code.
It was done in-house.
Review whether all the functionality to be licensed is needed, as certain modules go unused and introduce needless complexity. You should aim for concurrent licensing if global use is needed, as slack periods in one time zone can be picked up by another.
No other options were evaluated; we just upgraded from Test Director > Quality Center > ALM, and we are planning to upgrade from v11 to v12.
ALM/Quality Center is expensive, but it has its value. In certain cases, the Enterprise edition is way too much, but it is very stable and reliable. You should review the v12 web client solution for requirements management.
We now know more about how to test.
Import Library - synchronization problems often occur because the ALM interconnection with SAP Solution Manager is too fragile.
I've been using it for six years.
We had a problem with the consultant.
Not really. Our big problem is that our in-house support of Solution Manager is not compatible. It's not a fault with the product.
For the migration from v10 to v11, we needed to convert each project manually because the HP consultant and our project chief couldn't use ROBOT in Apache.
6/10.
Technical Support:5/10.
MS Excel was our first solution.
We paid for an HP consultant who wasn't experienced. They called HP's central development team every day about everything.
It's not so good, because some users don't see its real potential; they are afraid they will lose their jobs.
Keep it simple for an easy beginning, and don't waste time with incompetent people.
The company needed an option to integrate all our other tools, like JIRA, Jenkins, LFT, UFT, etc., and Quality Center does this.
It should have more options to integrate open-source and third-party tools. I'd also like more collaboration options. They could make it more lightweight and improve its performance.
I've used it for eight years.
There were some data migration issues when we upgraded to the next version. This was with the user-defined tables/columns, as the extra tables were complex to convert to the newer version.
No issues encountered.
No issues encountered.
It's good, 8/10.
Technical Support:It's good, 8/10.
We had other open-source products that did not meet all our requirements.
It was complex.
We did it in-house.
It’s good to use for a big organization.
Easier Test Lab management and defect life-cycle tracking, which is useful during triage calls.
Quality Center has definitely added value as it has easy accessibility for all our employees and is very useful for defect triage and scrum meetings.
I've used it for six years.
No issues encountered.
No issues encountered.
No issues encountered.
9/10
Technical Support:8/10
We previously used Bugzilla and switched because Quality Center is easier to use, has good reporting, and has better accessibility.
Initial setup was straightforward.
We implemented it in-house.
100%
There are many freeware tools available, which you can try.
The reports we can generate are better.
The performance needs work; over the past couple of years, working with large companies, I have noticed slower response times.
I have used HP ALM, Quality Center, and TestDirector for 18 years.
There were some issues.
It is stable in its functionality, but there is some slowness with its performance.
It seems to function with very large companies, but sometimes there seems to be a slower response time, and it could be an internal network issue within the company, but I'm not sure.
I have not had to work with the customer service.
The Test Plan and Test Lab modules.
My clients were able to implement an end-to-end process for software testing. From release management and requirement gathering, to testing and defect management, Quality Center provides a centralized location for managing all aspects of your testing data and results.
The reporting features could be improved.
I've used it for nine years.
There are a lot of nuances with integrations and implementation of Quality Center and third-party applications. With proper planning and expert guidance, issues with deployment can be resolved. I have not encountered any issues with deployment that could not be resolved.
Stability depends on the hardware/infrastructure where the web application resides. Availability, access, memory, and speed are dependent on the environment that is set up and used for the Quality Center deployment.
No issues encountered.
7/10 for customer service.
Technical Support:Being involved in the HP partner program for many years and involved in a certified support team, I had access to higher levels of support team members which alleviated the hurdles that may be present at the bottom level support team members.
No previous solution used.
It does require a level of expertise to install/set up Quality Center. Doing it without any prior experience could cause a delay in deployment, as well as unintended issues.
I was a consultant for a Fortune 500 company, where we implemented this product not only in our own environment but for our clients as well. I have implemented this product for over 50 companies since 2006, including top pharmaceutical and medical device companies within the health and life sciences industry.
Understand what your company's needs are and how many users will need access. There are different licenses based on local, regional, and global access, as well as the total number of users. There is also licensing at a modular level. Work with your HP sales representative to get a pricing/licensing plan for your specific needs.
No other options were evaluated.
Do your research and talk to an expert regarding the product. Having demonstrations and trial access to the product helps with decision making. Understanding the requirements and your current environment helps guide the discussions with an expert. Understanding the limitations will also help.
All features have their own value, but the most valuable ones are:
Change Management integration - The ability to create change documents in Solution Manager linked to an event, and to change their status according to the ALM status, or to customize this behavior. This is new and I've only used it on one project so far.
Business Process Change Analyzer (BPCA) - It can analyze objects in SAP transport requests to create a test set according to the scenarios created. Also, because ALM is integrated with blueprints that generate requirements, which are converted into test scenarios to validate the changes, it checks whether those changes will impact the selected business process.
Manage Regression Testing and Integrated Test - It's the most important and most popular feature for all the projects I have worked on.
I've been using it for five years, and currently use it alongside HP ALM v11.52.
We had problems with the Solution Manager/SAP integration and its use through customized RFC calls.
Not with the tool. Usually, problems happen because of network delay or instability.
An HP expert team was lined up for the implementation if needed, but there was no need for them.
Customer Service: 10/10
Technical Support: 10/10
I had already used IBM Rational, which is good too, but the HP tool is more complete.
It was done in-house. The team that works here has experience with HP Quality Center and ALM on other projects. The team expertise is high.
It depends on how much you pay for the product and the size of the project. For a big project, it's a great tool that will help a lot.
Use everything this product can offer, as there is no need to buy other tools that do the same tasks HP Quality Center does. It's a complete tool that you can customize according to business/IT/user needs.
It needs compatibility with browsers other than IE.
I've used it for more than eight years.
Sometimes it has tricky errors, but rarely.
Sometimes, it has performance issues at some points, but this all depends on a million different things.
It is a very close community, and luckily there are a lot of posts.
Technical Support: I've not had any direct contact with them.
No previous solution was used.
We've never calculated it.
This is for big software houses, so the costs, and especially the yearly support renewal, are very, very expensive.
No other options were evaluated.
Be organised, as the tool has the capabilities to support this.
Accessibility! The reason I gave it a 9, not a 10, is that it doesn't support Apple machines or any browsers other than IE, and even then, nothing later than IE 10! This is a big problem if the development team who should be working on defects are using Apple machines, which is very common. It is also a big problem for us when higher management wants to take a look at defects for one reason or another; they are usually on the move and can't access it using their iPads, for example.
This is a problem that JIRA solved: it's practically accessible from any browser on any operating system, and can be opened in a cellphone/tablet browser or through mobile applications. It's perfect when it comes to accessibility, and this is what Quality Center desperately needs.
I have been using it for more than six years.
Performance issues are very common. Performance degradation and the failures that follow happen continuously. “Failed to Login” errors are common as well, along with some random issues, such as an issue being created twice where deleting one copy deletes the other.
No issues encountered.
It's the perfect tool for testing purposes, but you need to consider other options if development teams do not use the environments supported by Quality Center.
I've used it for six years.
When we were installing v10, the installation became corrupted, so upgrading to v11 was very expensive and at our own cost, regardless of our maintenance contract with HP.
No issues encountered.
Customer Service: Our vendor is 10/10, but HP is 7/10.
Technical Support: Our vendor is 10/10, but HP is 8/10.
No previous solution was used.
The installation tools are not the best; even for experienced IT admins, the installation can corrupt itself, and there is no good tech support for install issues unless you pay to fly in a team.
We did it in-house.
Our ROI is 300%.
It depends on your vendor. SkyIt was the best, as they were able to get the initial cost low enough so a small startup could afford it.
For small companies where audits/lawsuits, etc., are not a factor, it's not worth the investment; you should use open-source or lower-cost alternatives (JIRA for project/defect tracking, or TestLink, an open-source QC-like test tool). However, any company that wants a mature, highly developed platform that is constantly improving, or needs to survive audits, etc., must consider HP ALM solutions such as HP Quality Center.
It is useful in test-case maintenance as it helps with traceability.
I used it for two years.
Not that I am aware of.
The stability of the product is good.
No issues encountered.
I had little to no interaction with customer service.
Technical Support: It's good.
We figured out HP ALM is good and switched from using Excel spreadsheets.
It's straightforward. The user guide provides detailed steps on installation.
We used an in-house team to implement.
We also looked at MTM – Microsoft Test Manager.
If the company plans to use QTP/UFT, then HP QC/ALM is a great option. UFT integrates with ALM, and we can run test cases remotely.
The ability to keep test cases and defects centrally located, accessible to multiple people instead of in document format, is the most valuable feature.
We no longer need to use documents for test cases, which are brittle and difficult to keep updated.
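A side note on that central repository: newer ALM/Quality Center versions also expose defects, tests, and other entities over a REST API, so the same data can be pulled programmatically rather than only through the client UI. The following is a minimal Python sketch based on the documented qcbin REST endpoints; the server URL, domain ("DEFAULT"), project ("MyProject"), and credentials are placeholders, and whether JSON (versus XML) is returned depends on the ALM version.

    # Minimal sketch: authenticate against ALM and list open defects.
    # The base URL, domain, project, and credentials below are placeholders.
    import requests

    BASE = "https://alm.example.com/qcbin"
    session = requests.Session()

    # Basic-auth login; ALM sets an LWSSO cookie on the session.
    session.get(f"{BASE}/authentication-point/authenticate",
                auth=("alm_user", "alm_password")).raise_for_status()

    # ALM 12.x and later also expect a site session before entity calls.
    session.post(f"{BASE}/rest/site-session").raise_for_status()

    # Query the defects collection, filtered to status "Open", as JSON.
    resp = session.get(
        f"{BASE}/rest/domains/DEFAULT/projects/MyProject/defects",
        params={"query": "{status[Open]}"},
        headers={"Accept": "application/json"},
    )
    resp.raise_for_status()

    # Each entity carries a list of {"Name": ..., "values": [...]} fields.
    for entity in resp.json().get("entities", []):
        fields = {f["Name"]: f["values"][0].get("value")
                  for f in entity["Fields"] if f["values"]}
        print(fields.get("id"), fields.get("name"), fields.get("status"))

Treat the Accept header and the JSON parsing as version-specific; older servers may only return XML.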
I've used it for a few months, but I also have experience with Quality Center, the predecessor.
I wasn't involved in the deployment.
The site has to be reloaded every time there is a change in the background.
No issues encountered.
It was complex; for example, I was not given permission to delete items. Trying to create test cases via copy-and-paste can be cumbersome, and it was easy to misread the directory structure or put cases in the wrong location.
I believe that the tool is probably not worth great expense, although it is better than remaining tied to documents and spreadsheets.
Try MS Test Manager first, particularly if your code is .Net and/or your developers use Visual Studio.
It has a very large footprint and takes an inordinate amount of time to load its components, which it seems to need to do quite frequently. Also, as with many of these tools, there is always room for improvement in the UI in terms of intuitiveness and functionality.
I've used it for five years.
Just the amount of time to install as well as the reinstall frequency.
No issues encountered.
No issues; scalability is another key reason for using this tool, as mentioned above.
As with any big company, HP support is good but costly.
Technical Support: I have not had to deal with them.
We really started using ALM about five years ago when our testing automation efforts kicked into high gear. Up until then, we were tracking testing using various other smaller tools.
Setup was straightforward once the needed hardware was defined and in place.
It was done in-house.
We have other tools with HP, so it's bundled in with these and hard to measure ROI specifically for one tool.
You need to negotiate with HP.
It's a great tool for a company looking to establish a scalable solution that will give you flexibility as you grow, BUT I would highly suggest you have individual(s) with the expertise to care for and feed it.
The performance of this product is really poor, and there is no dashboard (reports) for the groups or users on the home page. Also, instead of one database, there are separate ones for each project.
I've used it for three years.
No issues encountered.
No issues encountered.
If you add any custom workflow scripting, application performance will degrade over the long run.
Customer Service: It's excellent.
Technical Support: It's excellent, but they don't support scripting unless it's in your PO.
We used JIRA. It is also a good tool for defect tracking, but due to the lack of test and requirements management in JIRA, we moved to ALM.
Straightforward. I went through the installation admin guide, followed the same steps, and it was fine.
We did it in-house.
It's a bit costlier when compared to other tools on the market, and you have to haggle with HP for this product.
We also looked at TFS.
It's a good tool for a software development company where release/project rollouts happen on a planned basis. It's not a supportive tool for an agile environment.
The most useful are the requirements repository, the ability to link requirements to test script steps, and the use of traceability matrix reporting.
It houses requirements and testing, with approvals, all in one place. The signature approval capability was very useful.
It could be improved by incorporating more spell-checking and word-processing functionality in test steps. Also, "calling" other scripts could be more user-friendly.
I have used the latest version of ALM for about four years, but I have used all the previous versions, going back to 2001.
No issues encountered.
No issues encountered.
Mostly with users being blocked when the license limit was reached.
Customer Service: It's 5/10, as our own resources were more useful at solving issues.
Technical Support: It's 5/10; our own resources were more useful at solving issues, given the slow turnaround on support responses.
We did not have a previous solution. This was our first solution.
Make sure it is easy to use for your roles; if you have technical people (e.g., resources with VB expertise), other solutions may be better. Make sure you implement it organized by system functionality, not by project.
My current company is just starting to use it, and they keep copying test plans for each project instead of reusing original test plans, which is a waste of time and resources.
Quality Center has helped my organization monitor the testing process and improve productivity. Its test execution and creation monitoring is one of the most advanced features available in the industry. You can track the testing process at a level as granular as the test cases executed on a particular system.
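To put that granularity in concrete terms: run-level data, including the machine a test executed on, can also be read over ALM's REST API, in the same way as the defect sketch earlier in this document. This is only a hedged sketch reusing that example's authenticated session and BASE URL; the "host" and "status" field names are common ALM run fields but are assumptions that may vary by version.

    # Minimal sketch: list recent test runs with their status and the
    # host they executed on. Reuses `session` and BASE from the earlier
    # defect sketch; field names are version-dependent assumptions.
    resp = session.get(
        f"{BASE}/rest/domains/DEFAULT/projects/MyProject/runs",
        params={"fields": "id,name,status,host", "page-size": "50"},
        headers={"Accept": "application/json"},
    )
    resp.raise_for_status()
    for entity in resp.json().get("entities", []):
        fields = {f["Name"]: f["values"][0].get("value")
                  for f in entity["Fields"] if f["values"]}
        print(fields.get("id"), fields.get("status"), fields.get("host"))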
Almost all of the areas are very advanced, but one module that needs improvement is report extraction, and a billing module is missing.
I've used it for eight years.
No issues encountered.
Too many users logged in at the same time dramatically affects Quality Center's response time.
No issues encountered.
Customer Service: It's excellent.
Technical Support: It's excellent.
There was no previous solution in place.
It was very easy and straightforward.
We used a mix of an in-house team and a vendor team.
We've seen ROI, but I can't share any specifics.
Pricing is high, but with new tools available in the market at a lower price, it is worth doing a decision analysis before committing.
There are so many other tools available in the market, so before investing a huge amount in Quality Center, you should analyze other tools as well.
We are using the solution for multiple purposes: test management, defect tracking, traceability, requirement tracking, and test execution.
We're regulated by the FDA, so the number of manual, paper-based signatures was reduced. We have had less waste, have been able to collaborate more, and have saved time.
In future releases, I'd like to see better reporting, a more simplified UI, and improved metrics. It would also be good if they removed some features to simplify the solution. It only supports Internet Explorer right now, so it would be good if it could support other browsers. Browser costs are also fairly high.
Pretty stable.
It has been fairly scalable.
HP support is good and they are very responsive. They're not world class, but they're good.
We used Word documents, which was not efficient.
Though we purchased ALM less than a year ago, we believe the ROI is good.
Prospective buyers should know that it doesn’t support all browsers and that browser costs are high.