We mainly focus on Microsoft Azure Data Lake. We also use Microsoft Azure Stream Analytics, but not extensively.
We have multiple use cases for Microsoft Azure Object Storage. We store streaming data on Microsoft Azure Blob Object Storage. Our media files for AI/ML models, datasets, and PDF files are uploaded to Blob Storage because of its elasticity. Previously, our data was stored in our data center on a SAN; for elastic capacity we moved to S3, and then, due to organization-level changes in our group, we moved from S3 to Microsoft Azure.
In one scenario, we integrated Microsoft Azure Blob Storage with our IoT devices, which capture data directly to a blob. When I generated a SAS token for the blob, I forgot to create an access policy first, and the expiry was set to a default of 60 seconds. The portal did not prompt me about anything during SAS token generation, which was cumbersome. It would be easier if I could first create a user-level policy and then attach an access policy for that user to the blob. When creating a SAS token directly with the default seven-day validity, there was also no indication of whether data was uploading or downloading correctly. This caused issues with alert notifications, as I wanted to configure alerts on the blob for different containers and alert mechanisms.
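The expiry problem above can be avoided by always setting the expiry explicitly rather than relying on a default. The snippet below is a minimal, standard-library-only sketch of the idea behind a SAS token (an HMAC-signed query string with permissions and an explicit expiry); it is not the real Azure signing scheme, and the function name `make_sas_query` and the demo key are hypothetical. In practice, the Azure Python SDK's `generate_blob_sas` helper accepts an `expiry` datetime for the same purpose.

```python
import base64
import hashlib
import hmac
from datetime import datetime, timedelta, timezone
from urllib.parse import urlencode

def make_sas_query(account_key_b64: str, string_to_sign: str,
                   expiry: datetime, permissions: str = "r") -> str:
    """Build a simplified SAS-style query string with an explicit expiry.

    Conceptual sketch only: real Azure SAS tokens sign a canonicalized
    string with more fields, but the structure is the same.
    """
    key = base64.b64decode(account_key_b64)
    sig = base64.b64encode(
        hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    ).decode()
    return urlencode({
        "sp": permissions,                            # permissions (read)
        "se": expiry.strftime("%Y-%m-%dT%H:%M:%SZ"),  # explicit expiry time
        "sig": sig,                                   # HMAC-SHA256 signature
    })

# Set a deliberate seven-day expiry instead of accepting a short default.
expiry = datetime.now(timezone.utc) + timedelta(days=7)
demo_key = base64.b64encode(b"demo-account-key").decode()  # hypothetical key
token = make_sas_query(demo_key, "mycontainer/device-data.json", expiry)
```

Pinning the expiry in code makes the token's lifetime visible at the call site, which would have flagged the 60-second default before it caused access failures.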
Object Storage and Geo-redundant Storage are features I appreciate most.
The Hot and Cool storage tiers allow us to implement different budgeting approaches. When planning our data budget, we can move less frequently used data to the Archive tier. While retrieving data from Archive takes some time, it provides a cost-effective storage solution.
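The tiering trade-off described above can be sketched as a simple cost comparison. The per-GB prices below are hypothetical placeholders, not real Azure rates (actual pricing varies by region and redundancy option); the point is only that for rarely accessed data, a cheaper storage rate outweighs the retrieval fee.

```python
# Hypothetical per-GB monthly storage and retrieval prices (USD);
# real Azure Blob pricing differs by region and redundancy level.
TIER_PRICE_PER_GB = {"hot": 0.018, "cool": 0.010, "archive": 0.002}
RETRIEVAL_PER_GB = {"hot": 0.000, "cool": 0.010, "archive": 0.020}

def monthly_cost(tier: str, stored_gb: float, retrieved_gb: float) -> float:
    """Storage cost plus retrieval cost for one month in a given tier."""
    return (stored_gb * TIER_PRICE_PER_GB[tier]
            + retrieved_gb * RETRIEVAL_PER_GB[tier])

# 1 TB stored, only 10 GB read back per month: archive wins despite
# its higher retrieval fee and slower rehydration time.
costs = {t: round(monthly_cost(t, 1000, 10), 2) for t in TIER_PRICE_PER_GB}
```

With these assumed rates, the archive tier is cheapest for this access pattern, which matches the budgeting approach described above: archive what you rarely touch, and accept the slower retrieval.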
For security solutions in Microsoft Azure, I particularly appreciate the Microsoft Azure AI foundation. Since I work mostly with AI/ML, data pipelines, data integrations, and ETL tools, these features are valuable. Additionally, the O365 Copilot features are very beneficial.