We use this solution for finding anomalies and applying the rules to the streaming data.
There are around 50 people using this solution in my organization, including data scientists.
The ability to stream data and the windowing feature are valuable. There are a number of targeted integration points, and that is a difference between Stream Analytics and Databricks: the input and output integrations are better in Databricks. It is easy to use Python or even Java, and I can bring in a third-party library, deploy it, and use it.
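As a rough illustration of the windowing feature on streaming data, here is a minimal PySpark Structured Streaming sketch; the source, column names, and window sizes are hypothetical (the built-in rate source stands in for a real device feed).

```python
# A minimal sketch of a windowed streaming aggregation; the source and columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("windowed-anomalies").getOrCreate()

# Hypothetical streaming source; replace with your real input (Kafka, Event Hubs, files).
events = (
    spark.readStream.format("rate").load()          # built-in test source: columns `timestamp`, `value`
         .withColumnRenamed("timestamp", "event_time")
         .withColumn("device_id", F.col("value") % 10)
)

# Five-minute tumbling windows per device, with a watermark to handle late-arriving data.
windowed = (
    events.withWatermark("event_time", "10 minutes")
          .groupBy(F.window("event_time", "5 minutes"), "device_id")
          .agg(F.avg("value").alias("avg_value"))
)

# Write the aggregates downstream; the console sink is used here only for illustration.
query = (
    windowed.writeStream.outputMode("update")
            .format("console")
            .start()
)
```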
Support for Microsoft technology and compatibility with the .NET Framework are somewhat missing, and there should be better interoperability between the two. Databricks is based on open-source components, so if it were more in sync with Microsoft technology and its programming languages, it would be better. Python is well supported, but stronger .NET compatibility would be a great help.
I would like to have better support for Microsoft technology and better language components.
With Azure Cosmos DB, I can store other data, such as time-series tables. That would be a great help for real-time analytics.
I have been using Databricks for eight months.
The scalability is fine. We had thousands of devices that sent data infrequently, so that worked for us. If the volume increases, the windowing function and job scheduling may not perform as expected.
I would rate technical support 4 out of 5. We had some issues with setup; they were eventually solved, but only after following up a few times.
Azure Stream Analytics is easy to use and easy to deploy; in that respect it's a little bit better, and Databricks still has some stability issues. Azure Stream Analytics has a few input and output sources, and it scales to all types of third-party interfaces.
Setup was complex. There were some issues with setting up the database and installing the third-party component on top of the services. I would rate the setup 3 out of 5.
Implementation was done in-house.
The cost is around $600,000 for 50 users.
I would rate the price 2 out of 5.
I would rate this solution 8 out of 10.
We use the solution for reliability engineering, where we apply machine learning and deep learning models to identify failure patterns across different geographies and products.
Databricks is hosted on the cloud, so it is very easy to collaborate with other team members working on it. The code is production-ready, and scheduling jobs is easy.
Databricks could have more collaborative features than it has, and it should offer more customization for jobs. Also, its dashboarding tool is average. If they added more advanced features, we wouldn't depend on other BI tools to build dashboards; today we use Tableau for that. With more advanced features, we could rely on Databricks entirely.
I have been using Databricks for one year.
The product is stable. It has been giving consistent outputs without any major issues.
The solution is hosted on the cloud. It supports high scalability features.
10-20 users are using this solution.
There was a training session from Databricks where they explained how to use it. We never had to contact them because they had already given us proper training on the platform.
I have used Alteryx before. We switched to Databricks because it can run your code and turn it into production-ready code in a few seconds. Also, the stability is relatively high.
The initial setup is easy.
We have a dedicated team for the deployment.
Delta Lake is a free system. In practice, we work on the data that we get from Snowflake, and the model outputs from Databricks are written back to Delta Lake. It is easy for us to collaborate using Delta Lake, and its computation speed is also quite high.
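As a rough illustration of that workflow, here is a minimal PySpark sketch of writing model outputs back to a Delta table that teammates can then read; the schema and table names are hypothetical.

```python
# A minimal sketch, assuming a Spark DataFrame of model scores already exists;
# the schema and table names below are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# `model_outputs` stands in for the DataFrame produced by the model-scoring job.
model_outputs = spark.createDataFrame(
    [("unit-001", 0.92), ("unit-002", 0.13)],
    ["unit_id", "failure_score"],
)

# Append the scores to a Delta table so other team members can query them.
(
    model_outputs.write.format("delta")
    .mode("append")
    .saveAsTable("analytics.failure_scores")  # hypothetical schema/table name
)

# Collaborators can then read the same Delta table directly.
scores = spark.read.table("analytics.failure_scores")
```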
The learning curve for Databricks is not very steep. It's pretty easy, and you will find a lot of materials online. So, if you are comfortable coding in Python, it's very straightforward. There is nothing to worry about when using Databricks.
Overall, I rate the solution a ten out of ten.
We are using Databricks for machine learning workloads specifically.
Databricks aligns well with our skillset and overall approach. We sought out their solution specifically for a big data application we are currently working on, as we needed a platform capable of handling large amounts of data and building models. Additionally, the fact that they use open-source software and can integrate data warehouse and data lake systems was particularly appealing, as we have encountered such issues in the past. We determined that Databricks would be an effective solution for our needs.
The most valuable feature of Databricks is the integration of the data warehouse and data lake, and the development of the lake house. Additionally, it integrates well with Spark for processing data in production.
The solution could be improved by adding a simple feature that would make it more user-friendly for our team. Currently, our team is more familiar with R, but Databricks requires the use of Jupyter notebooks, which primarily support Python. We have tried using RStudio, but it is not a fully integrated solution, so to fully utilize Databricks we have to use the Jupyter interface. One feature that would make the Jupyter interface easier for our team to adopt is the ability to select a specific variable or line of code within a cell and execute just that selection. This is available in other Jupyter notebooks outside of Databricks and in our own IDE, but it is not currently available within Databricks. If this feature were added, the transition to Databricks would be much smoother for our team.
The most important feature other than the Jupyter interface would be to have the RStudio interface inside Databricks. This would be perfect.
We have been using Databricks for approximately one year.
The stability of Databricks is good.
I rate the stability of Databricks a nine out of ten.
Databricks is scalable.
I rate the scalability of Databricks a nine out of ten.
I have been receiving responsive answers from Databricks's support. I have been pleased with the support.
I rate the support from Databricks a ten out of ten.
Positive
The initial setup of Databricks is simple. I did not experience any challenges. The time it takes for the deployment is approximately four hours.
I rate the initial setup of Databricks.
We did the deployment of the solution in-house. Three people were involved in the deployment: a data engineer, a data analyst, and a machine learning engineer.
We have only incurred the cost of our AWS cloud services so far, because Databricks provided us with an extended evaluation period and we have not spent much money yet. We are just starting to incur costs this month, so I will know more about the full cost later.
We only pay standard fees for the solution.
We use a data engineer, data analyst, and machine learning engineer for the maintenance of the solution.
I rate Databricks a nine out of ten.
I would like to see the integration between Databricks and MLflow improved. It is quite hard to train multiple models in parallel in a distributed fashion; you hit rate limits on the client very quickly.
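A minimal sketch of the pattern that tends to run into those limits: several models trained concurrently, each logging parameters and metrics through the MLflow client. The dataset, parameter grid, and experiment ID are hypothetical.

```python
# A minimal sketch of parallel training with MLflow client logging; every run issues its own
# tracking-API calls, which is where rate limits tend to appear. Data and IDs are hypothetical.
from concurrent.futures import ThreadPoolExecutor

from mlflow.tracking import MlflowClient
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=1_000, n_features=20, random_state=0)

client = MlflowClient()
experiment_id = "0"  # hypothetical experiment ID

def train_one(n_estimators: int) -> None:
    """Train a single model and log its parameters/metrics to MLflow."""
    run = client.create_run(experiment_id)
    model = RandomForestClassifier(n_estimators=n_estimators, random_state=0).fit(X, y)
    client.log_param(run.info.run_id, "n_estimators", n_estimators)
    client.log_metric(run.info.run_id, "train_accuracy", model.score(X, y))
    client.set_terminated(run.info.run_id)

# Many concurrent runs multiply the tracking calls hitting the MLflow server.
with ThreadPoolExecutor(max_workers=8) as pool:
    list(pool.map(train_one, [50, 100, 200, 400, 800]))
```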
I have been using Databricks for three years.
I would rate the stability of this solution a nine out of 10, with one being not stable and 10 being very stable.
I would rate the scalability of this solution an eight out of 10, with one being not scalable and 10 being very scalable.
There are three people using this solution in our organization.
I would rate the available customer service a three. It's worth mentioning that this is Microsoft support and not Databricks itself. I haven't spoken to Databricks people directly, but I know people who have, and they have been a lot more pleased.
Negative
I would rate their pricing plan a six (on a scale of one to 10, with one being cheap and 10 being expensive). I think the prices could be lowered a little bit.
Overall, I would rate this solution an eight out of 10, with one being quite poor and 10 being excellent. It is fast, it's scalable, and it does the job it needs to do.
We use this solution for a Customer Data Platform (CDP). My company works in the MarTech space, and we usually implement custom CDPs.
The Delta Lake format has been the most useful part of this solution. Delta Lake is an open-source storage format that was invented and implemented by Databricks, and it is the most important element of the solution. Databricks also offers exceptional performance and scalability.
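For context on why the format matters in a CDP scenario, here is a minimal sketch of writing a Delta table and reading an earlier version of it (time travel); the path and columns are hypothetical.

```python
# A minimal sketch of working with Delta Lake tables; the path and columns are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

profiles = spark.createDataFrame(
    [("cust-1", "us"), ("cust-2", "de")], ["customer_id", "country"]
)

# Delta provides ACID writes on top of cloud object storage.
profiles.write.format("delta").mode("overwrite").save("/mnt/cdp/customer_profiles")

# Time travel: read an earlier version of the same table.
v0 = (
    spark.read.format("delta")
    .option("versionAsOf", 0)
    .load("/mnt/cdp/customer_profiles")
)
```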
The data visualization for this solution could be improved. They have started to roll out a data visualization tool inside Databricks but it is in the early stages. It's not comparable to a solution like Power BI, Luca, or Tableau.
In a future release, we would like to have a better ETL designer tool to assist in the way we move data from one place to another.
We have been using this solution for four years.
This is a stable solution.
This is a scalable solution.
The initial setup is very easy. It is a managed solution inside Azure so you just need to search for Databricks. There are a couple of pages to follow in the setup wizard and Databricks is up and running.
We implement this solution on behalf of our customers who have their own Azure subscription and they pay for Databricks themselves. The pricing is more expensive if you have large volumes of data.
When we first started using Databricks in 2018, there were not many comparable solutions to consider. Right now there are many solutions to consider, including Snowflake, Azure Synapse, Redshift, and BigQuery.
Databricks continues to be our solution of choice, but Snowflake does have a better user interface and is easier to work with for data pipelines and the overall UI.
I would advise others to first define a strong data strategy and then choose which data platform suits your needs.
I would rate this solution a nine out of ten.
We build data solutions for the banking industry. Previously, we worked with AWS, but now we are on Azure. My role is to assess the current legacy applications and provide cloud alternatives based on the customers' requirements and expectations.
Databricks is a unified platform that provides features like streaming and batch processing. All the data scientists, analysts, and engineers can collaborate on a single platform. It has all the features you need, so you don't need to go for any other tool.
I like that Databricks is a unified platform that lets you do streaming and batch processing in the same place. You can do analytics, too. They have added something called Databricks SQL Analytics, allowing users to connect to the data lake to perform analytics. Databricks also will enable you to share your data securely. It integrates with your reporting system as well.
Unity Catalog provides data lineage and metadata capabilities. These are some of the unique features that fulfill the requirements of the banking domain.
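As a rough sketch of how Unity Catalog-governed data is addressed and secured: tables use a three-level namespace and access is granted declaratively. The catalog, schema, table, and group names below are hypothetical.

```python
# A minimal sketch of Unity Catalog addressing and permissions; names are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Unity Catalog uses a three-level namespace: catalog.schema.table.
txns = spark.table("banking_prod.payments.transactions")

# Access can be granted declaratively and is audited centrally.
spark.sql("GRANT SELECT ON TABLE banking_prod.payments.transactions TO `analysts`")
```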
Every tool has room for improvement. Normally, a solution will claim it can do ETL and everything else, but you encounter limitations when you actually start, so you keep interacting with the vendor and they continue to upgrade it. For example, we haven't fully implemented Unity Catalog, a newly introduced feature. Once we check how it works, there may be improvements to suggest there as well.
Databricks may not be as easy to use as other tools, but if you simplify a tool too much, it won't have the flexibility to go in-depth. Databricks is completely in the programmer's hands. I prefer flexibility rather than simplicity.
I have been using Databricks for a year.
Databricks relies on scalability and performance. Every cloud vendor prioritizes scalability, high availability, performance, and security. These are the most important reasons to move to the cloud.
Deploying Databricks on the cloud is straightforward. It's not like an on-premise solution, where you must create a cluster and all those other prerequisites for big data.
I don't think it's challenging to maintain, but you need an expert programmer because Databricks isn't GUI-based. With GUI-based tools, building ETLs is drag-and-drop. Databricks relies entirely on coding, so you need skilled programmers to build your code, ETLs, etc.
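To illustrate what code-first ETL means in practice, here is a minimal PySpark sketch; the source path, column names, and target table are hypothetical.

```python
# A minimal code-first ETL sketch, as opposed to a drag-and-drop GUI; paths and names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Extract: read raw files from cloud storage.
raw = spark.read.json("/mnt/raw/transactions/")

# Transform: cleanse and derive columns in code.
clean = (
    raw.dropDuplicates(["transaction_id"])
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .filter(F.col("amount") > 0)
)

# Load: write the curated data as a Delta table.
clean.write.format("delta").mode("overwrite").saveAsTable("curated.transactions")
```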
The price of Databricks is based on the computing volume. You also need to pay storage costs for the cloud where you're hosting Databricks, whether it is AWS, Azure, or Google.
I rate Databricks nine out of 10. Databricks is one of the best tools on the market.
We use Databricks for batch data processing and stream data processing.
Databricks gives data engineers a consistent interface and a consistent language on a single integrated platform for ingesting, processing, and serving data to the end user.
The flexibility of Databricks is the most valuable feature. It gives us the ability to write analytics code in multiple languages.
There is a single workspace for different data roles like data engineers, machine learning engineers, and the end user, who can connect to the same system.
Databricks separates compute from storage, so you are not coupled to the underlying data sets, allowing multiple processes and programs to run against the same data.
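A minimal sketch of what that separation looks like in practice, assuming the data sits in an ADLS container; the storage path, column names, and target table are hypothetical.

```python
# A minimal sketch of compute/storage separation: the cluster runs the computation,
# while the data lives in cloud object storage. The ADLS path is hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Two independent jobs (or clusters) can each read the same underlying files.
events = spark.read.parquet("abfss://datalake@myaccount.dfs.core.windows.net/events/")

daily_counts = events.groupBy("event_date").count()
daily_counts.write.format("delta").mode("overwrite").saveAsTable("serving.daily_event_counts")
```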
I would like to see improvement with the UI. It is functional and useful, but it's a bit clunky at times. It should be more user-friendly.
In future releases, Databricks would benefit from enhanced metrics and tighter integration with Azure's diagnostics.
I have been using Databricks for eight months.
Databricks is very stable.
The scalability of this solution is good. In our organization, users include analysts, data engineers, and data scientists.
I would give Databrick service and support a four and a half out of five overall.
Positive
Prior to using Databricks, we used Azure Stream Analytics. We made the switch because of the scalability and integrated platform.
The initial setup of Databricks is more complex. I would rate it a four out of five on the complexity of the setup. It took two days to deploy the solution.
We used a third party for some of the implementations of Databricks. The number of staff required to deploy and maintain this solution depends on the number of processes you have. Due to the cloud nature of the technology, it is easy to deploy and maintain.
Databricks has a tiered licensing regime, so it is flexible. I feel their pricing is a five out of five.
Databricks is a one-stop shop for everything data related, and it can scale with you.
I would rate this solution a 9.5 out of 10 overall.
Our use case is confidential, but I can say we use it for a deep learning model for machine learning.
The solution is built on Spark and integrates with MLflow, which is important for our use case.
Databricks is also user-friendly, providing customizable code and models that allow people to experiment quickly.
Integration of Delta Lake is another useful feature.
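As a rough sketch of the MLflow integration in a notebook, assuming a scikit-learn model; on Databricks the tracking server is preconfigured, while elsewhere you would point MLflow at your own tracking URI.

```python
# A minimal sketch of MLflow experiment tracking with autologging; the model and data are illustrative.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_diabetes
from sklearn.linear_model import Ridge

X, y = load_diabetes(return_X_y=True)

mlflow.sklearn.autolog()  # log parameters, metrics, and the model automatically

with mlflow.start_run(run_name="ridge-baseline"):
    model = Ridge(alpha=1.0)
    model.fit(X, y)
```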
Writing pandas-profiling reports could be easier.
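For context, this is roughly how such a report is produced today; the DataFrame below is hypothetical, and the package is now distributed as ydata-profiling.

```python
# A minimal sketch of generating a pandas-profiling (now ydata-profiling) report; the data is hypothetical.
import pandas as pd
from ydata_profiling import ProfileReport

df = pd.DataFrame({"feature_a": [1, 2, 3, 4], "feature_b": [10.0, 9.5, 8.0, 7.2]})

report = ProfileReport(df, title="Feature profile")
report.to_file("feature_profile.html")  # or report.to_notebook_iframe() inside a notebook
```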
The ability to customize our own pipelines would enhance the product, similar to what's possible using ML files in Microsoft Azure DevOps.
I have been using this product for one and a half years.
For now the solution seems stable.
The solution is easy to scale horizontally and it has a useful auto-scaling feature. For vertical scaling, you need to bring the system down and make some adjustments.
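A rough sketch of what that horizontal auto-scaling looks like in a cluster definition, expressed as a Clusters API-style payload; the runtime version, node type, and worker counts are hypothetical. Vertical scaling means changing the node type, which requires the cluster to restart.

```python
# A minimal cluster-definition sketch; values are hypothetical placeholders.
cluster_spec = {
    "cluster_name": "mlops-autoscaling",
    "spark_version": "13.3.x-scala2.12",   # example runtime version
    "node_type_id": "Standard_DS3_v2",      # vertical scaling = changing this, which needs a restart
    "autoscale": {                           # horizontal scaling happens automatically within this range
        "min_workers": 2,
        "max_workers": 8,
    },
}
```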
On my current project I have a team of 30 members under me, including data engineers and data science people. Our data science, engineering, and MLOps projects are expanding, so we are planning to do some vertical scaling to increase the team size to over 100 members. In our company, we are trying to certify more and more people in Databricks because it's cloud-agnostic.
We have never needed to contact customer support; online resources have been sufficient to solve our problems.
The initial setup of the solution is straightforward, once you understand the UI it is easy to implement. I would rate Databricks a four out of five for ease of setup.
One migration project took two to three months, including writing all the code and implementing end-to-end pipelines.
We are planning to deploy the solution in stages over the next 15 months to completely implement MLOps for our organization.
I'm not involved in the financing, but I can say that the solution seemed reasonably priced compared to the competitors. Similar products are usually in the same price range. With five being affordable and one being expensive, I would rate Databricks a four out of five.
I find that deployed systems work out cheaper than having to operate manually, which appeals to our customers.
I would rate this solution an eight out of ten.
There is an issue where clusters are automatically deleted after termination or after 100 days of non-usage. This could be more user-friendly: they could include an option to pin the clusters you want to keep, instead of leaving you to research why clusters were deleted after implementing the product. That documentation needs to be right in front of the user to avoid issues.
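For what it's worth, the Databricks REST API does expose a cluster-pin call that keeps a terminated cluster's configuration from being removed; a minimal sketch follows, with a hypothetical workspace URL, token, and cluster ID.

```python
# A minimal sketch of pinning a cluster via the REST API; host, token, and cluster_id are placeholders.
import requests

host = "https://adb-1234567890123456.7.azuredatabricks.net"  # hypothetical workspace URL
token = "<personal-access-token>"
cluster_id = "<cluster-id>"

resp = requests.post(
    f"{host}/api/2.0/clusters/pin",
    headers={"Authorization": f"Bearer {token}"},
    json={"cluster_id": cluster_id},
)
resp.raise_for_status()
```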
I definitely recommend this product to other users.