Share your experience using SAS Data Governance

Examples of the 83,000+ reviews on PeerSpot:

Senior Director at a retailer with 10,001+ employees
Real User
Gives a visual representation of how the data flows from tables to the metrics and has a big impact in terms of the transparency and understanding of data
Pros and Cons
  • "Being able to capture different business metrics and organize them in different catalogs is most valuable. We can organize these metrics into sales-related metrics, customer-related metrics, supply chain-related metrics, etc."
  • "There may be some opportunities for improvement in terms of the user interface to make it a little bit more intuitive. They have made some good progress. Originally, when we started, we were on version 9 or 10. Over the last couple of releases, I've seen some improvements that they have made, but there might be a few other additional areas in UI where they can make some enhancements."

What is our primary use case?

Our primary use case is to enable self-service so that different business teams can find the data they need. We are using the erwin Data Intelligence platform to enable data literacy and to let different users find data through the data catalog.

It can be hosted on-premises or in the cloud. We chose to run it in the cloud because the rest of our analytics infrastructure runs in the cloud, so it made natural sense to host it there.

How has it helped my organization?

I represent IT teams, and a lot of times different business teams want to do data analysis. Before using erwin Data Intelligence Suite, they used to constantly come to the IT teams to understand things like how the data is organized and what type of queries or tables they should use. It used to take a lot of my team's time to answer those questions, and some of them were pretty repetitive. With erwin Data Intelligence Suite, they can now do self-service. There is a business user portal through which they can search for different tables, and they can search in different ways. If they already know the table name, they can directly search for it, find the definition of each column, and understand how to use that table. In some cases, they may not know the exact table name, but they may know, for example, a business metric. In that case, they can search by the business metric, and because the tool links business metrics to the underlying tables from which they are calculated, they can get to the table definitions through that route as well.

This is helping all of our business analysts do self-service analytics, and, at the same time, we can enforce some governance around it. Because we enabled self-service for different business analysts, it has improved the speed. It has easily reduced at least 20% of the time that my IT team had to spend answering questions from different business teams. The benefit is probably even more for business teams; I think they are at least 30% faster in getting the data they need and performing their analysis based on it. Overall, I would expect at least 25% savings in time.
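
To make those two search routes concrete, here is a minimal Python sketch with toy table names, a toy glossary entry, and an invented find_tables helper (none of this is erwin's actual API); it shows how a catalog in which business metrics are linked to their underlying tables can be searched either by table name or by business metric:

    # Hypothetical toy catalog and glossary, invented for illustration only.
    catalog = {
        "sales.orders": {
            "columns": {
                "order_id": "Unique order identifier",
                "amount": "Order amount in USD",
            },
        },
    }
    glossary = {
        "net sales": {
            "definition": "Sum of order amounts, net of returns",
            "tables": ["sales.orders"],
        },
    }

    def find_tables(term):
        """Return matching tables whether the search term is a table name or a business metric."""
        term = term.lower()
        hits = [name for name in catalog if term in name.lower()]
        for metric, entry in glossary.items():
            if term in metric:
                hits.extend(entry["tables"])
        return sorted(set(hits))

    print(find_tables("orders"))     # search by table name      -> ['sales.orders']
    print(find_tables("net sales"))  # search by business metric -> ['sales.orders']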

It has a big impact in terms of the transparency of the data. Everybody is able to find the data by using the catalog, and they can also see how the data gets loaded through different tables. Based on this transparency, we were able to have a good discussion with the business about how the data should be organized and collaborate better on data organization. We were also able to change some of our table structures, data models, etc.

By using the data catalog, we have definitely improved in terms of maturity as a data-driven, decision-making organization. We are now getting to a level where everybody understands the data, how it is organized, and how they can use it for different business decisions. The next level for us would be to use some of the advanced features such as AI Data Match.

In terms of the effect of the data pipeline on our speed of analysis, understanding the data pipeline and the data flow is helpful in identifying a problem and resolving it quickly. A lot of times there is some level of ambiguity, and businesses don't understand how the data flows. Understanding the data pipeline helps them quickly identify problems and bottlenecks in the data flow and solve them. For example, they can identify the data set that is required for a specific analysis and then bring in the data from another system.

In terms of the money and time that the real-time data pipeline has saved us, it is hard to quantify the amount in dollars. In terms of time, it has saved us about 25% of the time on the analysis side.

It has allowed us to automate a lot of stuff for data governance. By using Smart Data Connectors, we are automatically able to pull metric definitions from our reporting solution. We are then able to put an overall governance and approval process on top of that. Whenever a new business metric needs to be created, the data stewards who have Write access to the tool can go ahead and create those definitions. Other data stewards can peer-review their definitions. Our automated workflow then takes that metric to an approved state, and it can be used across the company. We have definitely built a lot of good automation with the help of this tool.
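
As a rough illustration of the approval flow described above, the Python sketch below models a metric definition moving from draft through peer review to an approved state. The states, the transition rules, and the rule that an author cannot approve their own metric are assumptions made for the example, not erwin's actual workflow engine:

    from enum import Enum

    class MetricState(Enum):
        DRAFT = "draft"
        IN_REVIEW = "in_review"
        APPROVED = "approved"

    # Assumed workflow: draft -> in review -> approved (a reviewer may also send it back to draft).
    ALLOWED_TRANSITIONS = {
        MetricState.DRAFT: {MetricState.IN_REVIEW},
        MetricState.IN_REVIEW: {MetricState.APPROVED, MetricState.DRAFT},
        MetricState.APPROVED: set(),
    }

    class MetricDefinition:
        def __init__(self, name, definition, author):
            self.name = name
            self.definition = definition
            self.author = author
            self.state = MetricState.DRAFT

        def transition(self, new_state, reviewer):
            # Peer review rule: the author cannot approve their own metric.
            if new_state is MetricState.APPROVED and reviewer == self.author:
                raise PermissionError("Peer review required: author cannot approve their own metric.")
            if new_state not in ALLOWED_TRANSITIONS[self.state]:
                raise ValueError(f"Cannot move from {self.state.value} to {new_state.value}.")
            self.state = new_state

    # One data steward drafts a metric; a second steward reviews and approves it.
    metric = MetricDefinition("net_sales", "Sum of order amounts, net of returns", author="steward_a")
    metric.transition(MetricState.IN_REVIEW, reviewer="steward_a")
    metric.transition(MetricState.APPROVED, reviewer="steward_b")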

It definitely affects the quality of data. A lot of times, different teams have different levels of understanding and different definitions of a particular metric. A good example is customer lifetime value. This metric is used by multiple departments, but each department can have its own definition. In that case, they will get different values for the metric and won't make consistent decisions across departments. If they have a common definition, governed by this tool, then everybody can reference it, and when they do the analysis, they will get the same result, which leads to better quality of decision-making across the company.
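
As a purely hypothetical illustration of why one governed definition matters, the sketch below computes customer lifetime value with a single shared function; the simplified formula (average order value x purchases per year x expected lifespan in years) is an assumption for the example, not our actual governed definition. Any department that calls the same governed function gets the same number for the same inputs, which is the consistency the glossary enforces:

    def customer_lifetime_value(avg_order_value, purchases_per_year, expected_lifespan_years):
        """Single governed definition of CLV (simplified formula, for illustration only)."""
        return avg_order_value * purchases_per_year * expected_lifespan_years

    # Marketing and finance both reference the same governed definition,
    # so they get identical results for the same customer segment.
    marketing_clv = customer_lifetime_value(80.0, 4.0, 3.0)  # 960.0
    finance_clv = customer_lifetime_value(80.0, 4.0, 3.0)    # 960.0
    assert marketing_clv == finance_clv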

It affects data delivery in terms of making correct decisions. Ultimately, we are using all of this data to get insights and then make decisions based on them. It is not so much the cost as the risk that it affects.

What is most valuable?

Being able to capture different business metrics and organize them in different catalogs is most valuable. We can organize these metrics into sales-related metrics, customer-related metrics, supply chain-related metrics, etc. 

The data catalog and data literacy are really the great capabilities of this solution that we leverage. The data catalog is coupled with the business glossary, which enables data literacy. In the business glossary, we can maintain definitions of different business terms and metrics, and the data catalog can then be searched with them.

Data lineage is very important to us. It relates to the origin of the data. For example, if a particular metric gets calculated from certain data, how did this data originate? From which source system or table did it come? After the data lineage is populated, some advanced users, such as data scientists, can use it to get to the details. Sometimes they are interested in more raw data, so they can get the details of the raw tables from which these metrics are calculated. They can use that raw data for their machine learning or AI use cases.

I used a couple of different Smart Data Connectors, and they were great. One of the Smart Data Connectors that we used was for our MicroStrategy BI solution, so it was a MicroStrategy Smart Data Connector. MicroStrategy is our enterprise reporting tool, and a lot of the metrics were already built into different reports in MicroStrategy, so it made sense to use this connector to extract all the metrics that were already being used. By using this connector, we could connect to MicroStrategy, pull all the metrics and reports from it, and populate our business glossary with those details. This was a big advantage of using the MicroStrategy Smart Data Connector. Another Smart Data Connector that we used was the Python Connector. It enabled us to build the data lineage. We already have a lot of ETL-type processes built using Python and SQL, and this connector can reverse engineer them and graphically show how the data flows from the source. This work was done by our data engineers and IT teams, but the business teams didn't understand how it was built. Giving them a visual representation of it made them more data literate. They understood how the data flows from different tables and ultimately lands in the final tables.
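
The Python Smart Data Connector itself is proprietary, but as a rough sketch of what reverse engineering SQL into lineage conceptually involves, the script below uses a naive regular-expression approach (an assumption that only handles simple "INSERT INTO ... SELECT ... FROM/JOIN ..." statements, not how erwin actually parses code) to extract source-to-target table edges that could then be drawn as a lineage graph:

    import re

    def lineage_edges(sql):
        """Extract (source_table, target_table) pairs from simple INSERT ... SELECT statements."""
        edges = []
        for stmt in sql.split(";"):
            target_match = re.search(r"insert\s+into\s+([\w.]+)", stmt, re.IGNORECASE)
            if not target_match:
                continue
            target = target_match.group(1)
            sources = re.findall(r"(?:from|join)\s+([\w.]+)", stmt, re.IGNORECASE)
            edges.extend((source, target) for source in sources if source != target)
        return edges

    etl_sql = """
    INSERT INTO analytics.daily_sales
    SELECT o.order_date, SUM(o.amount)
    FROM raw.orders o
    JOIN raw.order_items i ON o.order_id = i.order_id
    GROUP BY o.order_date;
    """

    print(lineage_edges(etl_sql))
    # [('raw.orders', 'analytics.daily_sales'), ('raw.order_items', 'analytics.daily_sales')]

A real connector would use a proper SQL parser and handle subqueries, CTEs, and views, but the output, a set of source-to-target edges, is essentially what drives the visual lineage that made our business teams more data literate.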

It can be stood up quickly with minimal professional services. Its installation and configuration are not that complicated, and we were able to stand it up easily and quickly. We wanted to get faster time to value, so we did a small professional services engagement to come up to speed on how to use the product. The installation and configuration were pretty quick; afterward, we wanted to make sure that we had the right processes established within the tool.

What needs improvement?

There may be some opportunities for improvement in terms of the user interface to make it a little bit more intuitive. They have made some good progress. Originally, when we started, we were on version 9 or 10. Over the last couple of releases, I've seen some improvements that they have made, but there might be a few other additional areas in UI where they can make some enhancements.

For how long have I used the solution?

We have been using erwin Data Modeler for quite a while. We have been using erwin Data Intelligence Suite for about one and a half years.

What do I think about the scalability of the solution?

It seems to be easily scalable. I haven't seen any problems so far from the scalability aspect. We have strong support, so whenever I have some issues or there is something for which I need technical support, their support is always there to answer the questions. Their support has been great.

We have a few key users, such as data domain experts, and we have different business areas, such as marketing, sales, finance, and supply chain. Each of them has a domain expert who also has an account in erwin to maintain definitions. The rest of the organization gets a read-only view into that.

We have about 30 people who can maintain those definitions, and the rest of the organization can find the data or the definitions of that data. These 30 people include data stewards or data domain specialists, and they maintain the definitions of different business terms, glossary terms, and business metrics. There are about five different IT users who actually configure data lineages and data catalog definitions. These are the core teams that basically make sure that the catalog and Data Intelligence Suite are populated with the data. There are more than 200 corporate business users who then find this data after it is populated in the catalog. 

I would expect its usage to grow from 200 people to 2,000 people within the next year. When we become more mature at using this data and analytics, we will use the advanced features within the tool.

How are customer service and support?

In terms of the support that I'm getting, I'm able to get all my requests fulfilled. The only thing that happened was that Erwin was acquired by Quest, but so far, I haven't seen any issues because of the acquisition.

When we upgraded the version, we had some issues with a Smart Data Connector not working properly, so we had to log a ticket with them, and they were responsive. They set up meetings with us to go through the problem and helped us resolve it. Their support has been pretty responsive; when we submitted tickets, we got immediate attention and resolution.

How was the initial setup?

The initial setup was straightforward. We only had to work with the Erwin team to get some of the Smart Data Connectors configured properly.

Its deployment was completed within three months. Installing, configuring, and getting it up to speed wasn't that much of a pain. Getting business users to use the tool and making sure that they are leveraging it for their day-to-day tasks is what takes more time; it is more of a change management exercise.

In terms of the implementation strategy, we worked with Erwin's team. In fact, I hired their professional services as well because I wanted to make sure we got up to speed pretty quickly. The installation, configuration, and some of the cleaning were done with Erwin's professional services team. After my team was up to speed, we retained some of the key data stewards and included them as part of the planning, and they are now driving the adoption and use of the tool across the company.

What about the implementation team?

We didn't use any company other than Erwin's team. 

You don't need many resources for its deployment and maintenance. You just need one person, and this person doesn't have to be full-time. Only in the initial stages do you have to spend time adjusting or populating the definitions.

What was our ROI?

We are in the early stage of ROI. The ROI is more in terms of the time that we have saved on analysis. If the analysis is much faster, let's say by 30%, we can get some of the insights faster and accordingly make business decisions, which gives business value. So, right now, the ROI is in terms of being faster to market with some of these business insights.

What's my experience with pricing, setup cost, and licensing?

Smart Data Connectors have some costs, and then there are user-based licenses. We spend roughly $150,000 per year on the solution.

It is a yearly subscription license that includes the cost of the Smart Data Connectors and the user-based licenses. We have around 30 data stewards who maintain definitions, and five IT users who maintain the overall solution. It is not a SaaS offering, so there is an infrastructure cost to host the solution, which is our regular AWS hosting cost.

Which other solutions did I evaluate?

When we were looking for a data catalog solution, we evaluated two or three other solutions, including the data catalogs from Alation and Collibra. We chose Erwin because we liked the overall solution that Erwin offered compared to the others.

One of the great features that Erwin provided was the mind map feature, which I did not see in any of the other tools that we used. A mind map gives a visual representation of how the data flows from tables to the metrics. Another great feature was being able to pull the metric definitions automatically from our reporting system. These were the two great positives for us, which I did not see in the other solutions when we did the proof of concept.

What other advice do I have?

We are not using erwin's AI Match feature to automatically discover and suggest relationships and associations between business terms and physical metadata. We are still trying to get all of our data completely mapped in there. After that, we will get to the next level of maturity, which would basically be leveraging some of the additional features such as AI Match.

Similarly, we have not used the feature for generating the production code through automated code engineering. Currently, we are primarily doing the automation by using Smart Data Connectors to build some data lineages, which is helping with the overall understanding of the data flow. Over the next few months, as it gets more and more updated, we might see some benefits in this area. I would expect at least 25% savings in time.

It provides a real-time understandable data pipeline to some level. Advanced users can completely understand its real-time data pipeline. Average users may not be able to understand it.

Any organization that is looking into implementing this type of solution should look at its maturity in terms of data literacy. This is where I really see the big challenge. It is almost like a change management exercise to make sure people understand how to use the data and to build some of the processes around data governance. The maturity of the organization is really critical, and you should plan accordingly when implementing it.

The biggest lesson that I have learned from using this solution is probably around how to maintain the data dictionary, which is really critical for enabling data literacy. A lot of times, companies don't have these data dictionaries built. Building the data dictionary and getting it populated into the data catalog is where we spent some of the time. A development process needs to be established to create this data dictionary and maintain it going forward. You just have to make sure that it is not a one-time exercise; it is a continuous process that should be included as part of the development process.

I would rate erwin Data Intelligence for Data Governance an eight out of 10. If they can improve its user interface, it will be a great product.

Which deployment model are you using for this solution?

Public Cloud

If public cloud, private cloud, or hybrid cloud, which cloud provider do you use?

Amazon Web Services (AWS)
Disclosure: PeerSpot contacted the reviewer to collect the review and to validate authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.
Data Analyst at Tenaga Nasional Berhad
Real User
Valuable data flow features and good technical support services
Pros and Cons
  • "The product's most valuable feature is the ability to select the data flow and lineage."
  • "Informatica Axon's response times and certain aspects of the admin panel could be enhanced for better usability."

What is our primary use case?

We use the product to store data. We have established a structured system where we store all relevant information, including donor details, images, and series, ensuring easy accessibility for users.

What is most valuable?

The product's most valuable feature is the ability to select the data flow and lineage.

What needs improvement?

Informatica Axon's response times and certain aspects of the admin panel could be enhanced for better usability.

For how long have I used the solution?

We have been using Informatica Axon for more than one year.

What do I think about the stability of the solution?

I rate the platform's stability as seven out of ten.

What do I think about the scalability of the solution?

We have around 400 Informatica Axon users in our organization. I rate the scalability a nine out of ten.

How are customer service and support?

The technical support services at the moment are impeccable. The support team has answers for everything. They have maintained this level of quality as we progress. There aren't many issues currently, although we're awaiting the migration to the cloud.

How would you rate customer service and support?

Positive

How was the initial setup?

The implementation process could be simplified.

What's my experience with pricing, setup cost, and licensing?

The platform has a premium cost. I rate the pricing as seven out of ten.

What other advice do I have?

The product needs to be adequately supported in the current setup. I advise others to conduct a proof of concept (POC) and assess how it progresses. The learning curve can be steep, but with sufficient time investment, they can also impart knowledge to the users, which is crucial for effective change management.

I recommend it to others and rate it an eight out of ten.

Which deployment model are you using for this solution?

On-premises
Disclosure: I am a real user, and this review is based on my own experience and opinions.