Azure Stream Analytics is easier for us than Databricks. In the ecosystem we are developing in this project, we use many transformations, and we are capturing data from different sources in a particular area of commodities. Converging everything into one resource group, or even two, is easier and quicker with Azure Stream Analytics. Everything can be brought together in a high-level overview in terms of cost, performance, and scaling up the teams. I think it is more interesting in that particular area because, purely in terms of performance, the technology has other solutions; but given the variables I described, those solutions would add complexity in terms of organization and networking, and that could affect performance, not only in the streaming of the data but also in the continuous development process.

I use SQL-based queries with Azure Stream Analytics, and we also do analysis and filtering in the Databricks process, a batch process that executes very quickly. In this case, we have no filtering analytics on the demand front end. Based on the requests, we already know exactly which transaction we need to generate, and the filtering is almost entirely defined there. To catch up on all the records, some transactions need currency conversion rates, for example when a customer has an invoice in dollars but will pay in euros. For that particular transaction, we apply filtering to get the total amount of the debt based on the list of invoices, prioritized by due date. That filtering is quite quick; it is one of the tests we made, and it is very easy for the development team.

The built-in monitoring tools in Azure Stream Analytics help me maintain the analytics workflows. The team is using Azure Insights for that, and they are quite lean. I am not yet fully involved, as I need to mature this process first and understand a little more about the monitoring.
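The filtering step described above can be sketched in a few lines of Python. This is a minimal illustration, not the team's actual implementation: the invoice fields, the `RATES_TO_EUR` table, and the rate values are all assumptions made for the example.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical invoice record; the field names are assumptions for illustration.
@dataclass
class Invoice:
    amount: float
    currency: str       # e.g. "USD" or "EUR"
    due_date: date

# Assumed exchange rates into the payment currency (EUR); real rates would
# come from a market-data feed, not a hard-coded table.
RATES_TO_EUR = {"EUR": 1.0, "USD": 0.92}

def total_debt_eur(invoices):
    """Order invoices by due date, convert each amount into EUR,
    and return the total debt together with the prioritized list."""
    ordered = sorted(invoices, key=lambda inv: inv.due_date)
    total = sum(inv.amount * RATES_TO_EUR[inv.currency] for inv in ordered)
    return round(total, 2), ordered

invoices = [
    Invoice(100.0, "USD", date(2024, 3, 1)),
    Invoice(50.0, "EUR", date(2024, 2, 1)),
]
total, ordered = total_debt_eur(invoices)
# The invoice with the earliest due date comes first, and the USD amount
# is converted at the assumed rate before summing.
```

In a streaming setting the same logic would be expressed as a Stream Analytics SQL query; the Python version just makes the ordering-plus-conversion idea concrete.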
But the team is very familiar with it, and they always say they really appreciate it because it is much more effective. What I am concerned about are the accounts and the logs; that is also very important. The retention threshold for the logs we need is still in discussion with the team, but I am not so familiar with it yet.

Regarding integration with Azure Event Hubs and Azure Blob Storage: we use Blob storage. Some requests come in through a pipeline based on shared CSV and text files, and we use Blob storage to receive them. We also have some target systems that receive files from us, and we convert that data. It is a very small percentage, but it occurs.
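The CSV-based request pipeline mentioned above can be sketched with Python's standard `csv` module. This is a hypothetical example: the column names and the inline file content are assumptions, and the in-memory string stands in for a file downloaded from Azure Blob Storage.

```python
import csv
import io

# Hypothetical request file content; in the real pipeline this would be the
# body of a CSV blob fetched from Azure Blob Storage.
raw = """request_id,customer,amount,currency
R-001,ACME,100.00,USD
R-002,Globex,250.50,EUR
"""

def parse_requests(text):
    """Parse a CSV request file into a list of dicts,
    converting the amount column to a float."""
    reader = csv.DictReader(io.StringIO(text))
    rows = []
    for row in reader:
        row["amount"] = float(row["amount"])
        rows.append(row)
    return rows

requests = parse_requests(raw)
# Each row becomes a dict keyed by the header names, ready for the
# downstream conversion step.
```

Using `DictReader` keeps the parsing resilient to column reordering, since fields are addressed by header name rather than position.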