My use cases were primarily in the banking domain, where we tracked real-time transactions. We also had another project in the retail domain. The client was McDonald's, and their focus was on package delivery. By "package," I mean the customized burgers prepared at their restaurants that needed to be delivered to various restaurant locations.

When customers visited a restaurant, they would order personalized burgers, sometimes requesting specific ingredients that were not available in the pre-made burgers displayed at the counter. The restaurant struggled to understand why customers were not choosing the ready-made burgers or patties. To address this, they aimed to enhance their product by offering customized burgers prepared according to each customer's specific preferences. They started taking such orders, but it was difficult to determine exactly what tastes and requirements customers were looking for. To overcome this, they streamed all of the customer order data to a Kafka server. The restaurant's backend system could then analyze the data and understand customers' new tastes and requirements. By collecting this data, they were able to enhance their product and make the changes needed to improve customer satisfaction.
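To make the order-streaming step concrete, here is a minimal sketch of how one of those customized-burger orders might be serialized before being published to a Kafka topic. The function name, field names, and topic name are all hypothetical, not taken from the client's actual system:

```python
import json


def encode_order(order_id, customer_id, ingredients):
    """Serialize a customized-burger order into the JSON bytes
    that would be published as a Kafka message value."""
    event = {
        "order_id": order_id,
        "customer_id": customer_id,
        # Normalize ingredient order so downstream analytics can
        # group identical customizations together.
        "ingredients": sorted(ingredients),
    }
    return json.dumps(event).encode("utf-8")


# With a broker available, the event could be sent roughly like this
# (requires the kafka-python package; broker address and topic name
# are assumptions for illustration):
#
#   producer = KafkaProducer(bootstrap_servers="localhost:9092")
#   producer.send("burger-orders",
#                 encode_order("o-1", "c-42", ["jalapeno", "bacon"]))
```

The backend consumers can then deserialize these events and aggregate ingredient requests to spot emerging customer preferences.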
What is Streaming Analytics? Streaming analytics, also known as event stream processing (ESP), is the analysis and processing of large volumes of data through continuous queries. Traditionally, data is moved in batches. While batch processing can be an efficient way to handle huge pools of data, it is not suitable for time-sensitive, "in-motion" data that could otherwise be streamed, since that data can expire by the time it is processed. By using streaming...
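The core idea of a continuous query can be sketched without any streaming framework: instead of waiting for a batch, each event is folded into a running aggregate as it arrives. The sketch below, a pure-Python illustration with made-up event names, counts events per key in fixed (tumbling) time windows, which is one of the most common continuous-query shapes:

```python
from collections import defaultdict


def tumbling_window_counts(events, window_seconds):
    """Count events per key in fixed-size (tumbling) time windows.

    `events` is an iterable of (timestamp_seconds, key) pairs, assumed
    to arrive in timestamp order, as they would from a live stream.
    Returns {window_start: {key: count}}.
    """
    counts = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Assign the event to the window containing its timestamp.
        window_start = ts - (ts % window_seconds)
        counts[window_start][key] += 1
    return {w: dict(per_key) for w, per_key in counts.items()}


# Hypothetical stream: the 0-10s window holds three events,
# the 10-20s window holds one.
stream = [(0, "login"), (3, "click"), (7, "click"), (12, "login")]
print(tumbling_window_counts(stream, 10))
# → {0: {'login': 1, 'click': 2}, 10: {'login': 1}}
```

Real engines such as Kafka Streams or Flink add what this sketch omits: out-of-order handling, state persistence, and incremental emission of window results while data is still in motion.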
We use the software to facilitate building integrations between systems.
We are currently running tests and experimenting with other solutions.
We are only using Amazon MSK for basic use cases.
Behind the scenes, we use Kafka. We tried using MSK to collect historical inventory data from e-commerce websites, for example.