For me, the main purpose of HPE Ezmeral Data Fabric is that it acts as a database. In my company, we store our data with HPE Ezmeral Data Fabric. It is possible to use the Spark engine with it, and it is also useful for data processing and data ingestion.
Regional Head of Data and Application Platform at a financial services firm with 10,001+ employees
Real User
Top 10
Jan 18, 2023
We've got a variety of use cases. We use Ezmeral for large-file data storage and distributed processing with Spark and other utilities. We also use it as a NoSQL database: the HPE Ezmeral Data Fabric database lets you store tons of data in NoSQL format. You can also use it as a messaging layer. It's a combination of five or six use cases, but our primary use case is to store big data and make it easily accessible across locations and namespaces. More than a hundred people use it across multiple teams.
We use HPE Ezmeral Data Fabric to consolidate data into our data lake; that's our primary use case.
Our primary use case is for creating preview models.