What is our primary use case?
We use it for defect management and for test cases. We synchronize it with JIRA for the requirements and the defects side of things.
We're also using it for our UFT script repositories, but that is more than likely going to change in the next couple of months as we move across to GitLab. It's just simpler to have all the artifacts for a particular iteration in one place.
Quality Center is cloud-based with a local client.
How has it helped my organization?
The way Quality Center improves our organization is with the traceability and through standardization. It's about having the test cases all in one place. That's very important for us. It will be even more important once we revive the regression suite in the coming months. It's extremely important to have one source of truth.
It definitely helps in standardizing our testing process and, if utilized properly, it will streamline it because everyone is using the same standards and capabilities. It has helped with that in the past and will in the future as well.
Quality Center also assists with risk-based testing. You can put risk ratings on test cases as you go, and if you do that you know which ones need to be run, for sure. It doesn't have much intelligence around it, though; it's just a field that we fill out. It doesn't utilize AI, which some of the tools in the market claim they can use to determine which test cases need to be run. But I think it's still very early days for that, and I'm exceptionally skeptical about it.
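To illustrate how basic that selection is, here is a minimal sketch, with a hypothetical data shape rather than ALM's actual API, of the kind of field-driven filtering involved: the risk rating is simply a value a tester fills in on each test case, and the "smarts" amount to a filter over that field.

```python
# Minimal sketch of field-based risk filtering: choose which test cases
# to run purely from a risk-rating field. The TestCase shape and rating
# values are hypothetical; in ALM the rating is just a field testers fill in.
from dataclasses import dataclass

@dataclass
class TestCase:
    test_id: int
    name: str
    risk: str  # "High", "Medium", or "Low", set by the tester

def select_for_run(tests: list[TestCase], min_risk: str) -> list[TestCase]:
    """Return the tests rated at or above the given risk level."""
    order = {"Low": 0, "Medium": 1, "High": 2}
    return [t for t in tests if order[t.risk] >= order[min_risk]]

tests = [
    TestCase(1, "Login regression", "High"),
    TestCase(2, "Report formatting", "Low"),
    TestCase(3, "Payment flow", "High"),
]
for t in select_for_run(tests, "High"):
    print(t.test_id, t.name)  # the "must run" set: tests 1 and 3
```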
What is most valuable?
It gives us management control over our automated scripts.
Defects are widely used within our organization.
We've had a little bit of a hiatus on the test-case side of things, because we decentralized the testing team, but that's about to be re-centralized. The test-case repository and linkage through to regression requirements will absolutely be a key component for us. We haven't got it yet, but when we've got an enterprise regression suite, that will be a key deliverable for them. We will be able to have all of the regression suite in one place, linked to the right requirements.
Also, its traceability and visibility features are good when it comes to managing multiple projects, which is how we've got it set up. The reporting was a little bit clunky to start with, but we've built some reporting out of it now as well, to give us a cross-portfolio view of those projects that are using ALM. Each project can do its own thing, to a certain degree. There are some standard fields that we don't bend on, so that we can get the correct reporting out.
There's no problem at all with its ability to handle a large number of projects and users in an enterprise environment. We only ever have up to 60 concurrent users, but the number of users we've got in the database is in excess of 250. We manage it reasonably well that way. Project-wise, we've got about 40 to 50 projects in there.
The security features are good. They will be better once we get the single sign-on capability with ADFS on ALM 15. We're very keen to get that capability up. We're looking at the implementation process for single sign-on right now. It should be okay. It makes things a lot more convenient for us, particularly as we have a number of contract users come in. When they go, we've got to manually remove them from ALM at the moment, because it's got its own authentication. Because it's in the cloud, anyone can get to it directly from anywhere. They don't have to come through our network to get to it. That is good in some regards. But it does give me some concerns when people have departed, or when organizations that we've been working with have finished up with it, because we have a separate sweep that we've got to do to remove any users who are no longer working with us.
What needs improvement?
There's room for improvement on the reporting side of things and the scheduling, in general, is a bit clunky.
They can also improve on its interoperability with other tools. All tool sets need to evolve in that regard. They need to understand that you don't buy all one color of tool set these days and that some tools do a job better than others, depending on what it is. If I've got an industrial-strength configuration management tool and repository, like GitLab, I'll pull my stuff out of ALM and I'll interface with GitLab from ALM. That interoperability with other tool sets, the standardizing of interfaces, is an area to work on. All of the tools in the industry are the same: you get a new version of JIRA and it no longer works with the likes of ALM, or you get a new version of IBM UrbanCode Deploy and it doesn't work properly, and you've got to redo a configuration with GitHub or Artifactory or even ALM, for that matter.
The other thing ALM would do well to do is move away from Internet Explorer. I believe they're doing that with version 15.
For how long have I used the solution?
I go back to Test Director days, Test Director 8. That was around 20 years ago.
What do I think about the stability of the solution?
The stability has been fine. If ever we do have problems we're straight on the phone to our customer success manager and he gets onto any issue that we've got, immediately. But it very rarely goes down.
What do I think about the scalability of the solution?
We only use whatever our concurrent license count is. We run very lean at the bank, very lean. That goes for all of our tooling. We have a concurrent licensing model that is well under the maximum number of users. If we find that we haven't got enough licenses, we adjust the time-out so that people are not holding onto licenses unduly.
With Quality Center, for user scalability all we do is get extra licenses. We've never hit any sort of limit on the size of the project.
We've got a number of admin users, a few site admin users; there's one per domain in our model at the moment. They are the super-users who look after everybody within their domain. Within projects, it's up to the different projects or squads to work out whether they need what we call TD admin users in there. There are also defect-owner users. We also have some analyst users and some tester users.
We'll be increasing usage because we've just kicked off our transformation program with a third party. As part of the agreement they are using it, so we'll be upping the number of users that we have. And by reestablishing the centralized testing function, we'll also be ensuring that Quality Center, or ALM, is used as our tool of choice. We will reestablish the standards that somehow were dropped when we went to Agile.
How are customer service and technical support?
Our customer success manager coordinates support for us, but I do have direct access to the tech support guys. Typically, if there's an issue, I'll get on the phone and notify him. Either we will already have raised a ticket or he'll raise one for us. Then we'll work through anything that we need to do to get things fixed so we're up and running as quickly as possible.
There have been some issues around getting any major problem that we've had resolved, although we've had very few major issues. It's just a matter of keeping at it until it's fixed. Having that CSM in place allows that to happen.
Which solution did I use previously and why did I switch?
Before Quality Center the only thing we were using was JIRA. We interface with JIRA. Some teams want to use it for defect tracking. We keep JIRA and ALM in sync using the synchronizer tool that comes standard with it.
JIRA and ALM have different strengths. JIRA and Confluence do Agile planning and management well, and ALM does defect management and test case management and reporting well.
How was the initial setup?
The fact that we've got it in the cloud at the moment, as software as a service, enables us to keep up to date. If it's a back-end or a server-only change, it just gets done. That's the beauty of the arrangement we have with a SaaS or cloud-based version.
We started using the cloud-based version about four years ago. The setup was very easy and very quick. I did the migration. We had to unload the databases on-premise and FTP them across to the cloud overnight. We did it project-by-project or by groups of projects. Each one of them had its own backup/transmit/reload. They then went through a series of validations and were up and running the next day.
I did it on a project-by-project basis because there was a lot of data that had to go across and be uploaded to the cloud. Once it was up there, I logged on, checked it, and then got the SMEs from the different projects to validate that everything they needed was there.
Having to package up and coordinate clients is, occasionally, difficult, but that's just a project management issue: scheduling things at the right time. Sometimes we have problems and we have to go in and individually blow away components of the product on the client. That's more because of our setup, our configuration on our network, than it is the tool set. We do that with most tools. Occasionally we have to rebuild when we've had version upgrades, but not for everybody.
For maintenance there are only two of us: myself and one of the guys who works for me.
What's my experience with pricing, setup cost, and licensing?
As an end-user, of course I'm going to say that it's too expensive and I want things cheaper, but don't we all?
Aside from the standard licensing fee there are no additional costs. It's set up under a good agreement that runs on a three-year cycle.
What other advice do I have?
Do your homework on it to really understand how it works. I've worked at a number of different organizations that have had Quality Center, Test Director, and ALM. They have all been set up differently. I'm also guilty of having gone in as an external contractor and setting it up the way that I wanted it to run. But if the time is taken to set it up properly, you will get strong value from it.
The biggest lesson I've learned from using Quality Center is that, when it's used well, it's an exceptionally powerful tool. When you use all the features of it, when you have things that are standardized and locked, it's a really handy tool in governance around testing and projects. But in an environment where you've got multiple external contractors or vendors coming in, where they all tend to bring their own way of doing things, it's good that it's flexible enough to accommodate that, but at the same time it leaves you with a bit of a mess to clean up afterwards.
It's really about making sure when you do implement it that you understand your process, you understand your workflows, you understand the standards and the reporting that you want out of it, and you set it up accordingly. If somebody comes in and says, "Oh, I want to know what my defect aging is," you can say, "Well, here's the report that does that," if everything's filled out properly.
I've seen it set up really well in a couple of places, and it was really good to have it set up well because we could get the information out of it when we needed it and we could ensure that things were tested properly.
When it comes to connecting all related entities to reflect project status and progress, we have to do a little bit of tweaking, but we can customize it. We can always do better with the cross-project reporting. But the biggest issue we have is that we need to re-centralize testing to get the standards enforced. At the moment, since we've moved out and become very Agile, we've also become very lax about keeping test cases, in particular regression suites, up to date. That is one of our reasons for reestablishing a centralized testing team. It's nothing to do with the product. It's just that everybody decided, "Hey, Agile's the way to go," and a lot of people thought that with Agile, "Oh, we don't need the formality and the structure and standards around testing," which was not good.
At the moment we're in a bit of a state of flux because we've had the whole Agile movement start to hit us. Unfortunately, that meant there was a decision to decentralize testing and put it out into the different Agile squads, which in turn meant that there was no standard way of doing things. Now that we're engaging in a transformation program, we need to re-establish that standard way of doing things, because we're working with third-party vendors. We're centralizing, ensuring that things are handed over in the format that we want, and ensuring that the third parties are utilizing ALM as the tool set for their test case repositories and as the defect management tool as well. Being an industry-wide, well-understood standard tool, it's very easy for us to go to our partners and say, "You've got to use ALM because that's what we're using." We are still going to be Agile, but we'll be doing centralized testing.
I wouldn't say Quality Center has reduced the time required for testing. It's a tool. It supports our testing process. It gives the governance and standards around the testing that's done, but as a tool it doesn't reduce the time for testing. Something like automated testing will reduce the time for testing. However, by association, I suppose it might reduce testing time because it's where we execute our automated scripts from.
We haven't found that Micro Focus is still investing much in Quality Center or releasing valuable features. They did do a big push toward Octane and we trialed it. Because we have multiple best-of-breed tools in the organization, Octane could plug-and-play with a lot of them, but it then became an overhead to manage and maintain.
With ALM, in Australia at least, there's enough support and development going on. I know the APIs into ALM have improved, and they needed to, because aspects of them were pretty clunky. Now that we've got a REST API that we can use, that's a lot better. So they're sort of keeping up.
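To give a feel for that REST API, here is a minimal sketch in Python of authenticating and listing defects. The server URL, credentials, domain, and project names are all hypothetical, and the exact endpoint layout varies between ALM versions, so treat it as an illustration rather than a recipe.

```python
# Minimal sketch: sign in to the ALM REST API and list some defects.
# BASE, DOMAIN, PROJECT, and the credentials are hypothetical; the paths
# follow the commonly documented /qcbin routes but can differ by version.
import requests

BASE = "https://alm.example.com/qcbin"  # hypothetical SaaS instance
DOMAIN, PROJECT = "DEFAULT", "SAMPLE"   # hypothetical domain/project

session = requests.Session()

# Basic-auth sign-in, then open a site session so that subsequent calls
# carry the required session cookies (newer ALM versions expect this).
session.get(f"{BASE}/authentication-point/authenticate",
            auth=("alm_user", "alm_password")).raise_for_status()
session.post(f"{BASE}/rest/site-session").raise_for_status()

# Fetch the first page of defects as JSON.
resp = session.get(
    f"{BASE}/rest/domains/{DOMAIN}/projects/{PROJECT}/defects",
    headers={"Accept": "application/json"},
    params={"page-size": 10},
)
resp.raise_for_status()

# Each entity arrives as a list of {"Name": ..., "values": [...]} fields.
for entity in resp.json().get("entities", []):
    fields = {f["Name"]: (f["values"][0].get("value") if f["values"] else None)
              for f in entity["Fields"]}
    print(fields.get("id"), fields.get("name"))

# Log out explicitly so the concurrent license seat is released promptly,
# rather than waiting for the session time-out.
session.get(f"{BASE}/authentication-point/logout")
```

Logging out matters under a concurrent licensing model like ours, since it frees the seat immediately instead of holding it until the time-out.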
I would rate Quality Center at about eight out of 10, but I have a testing background. I'm very stingy when it comes to rating things. I don't think I've ever rated anything a 10.
Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.