We haven't released the Amazon S3 platform in a production environment. It's more for prototyping and POC so far.
Normally we use it for data dumps. It's something similar to a data lake, in that we use it as the place from which we obtain the data.
The most valuable feature is the storage.
I would like to see translations or descriptive context for the options.
As a consultant, it is difficult for me to explain to the customer the science behind the configuration processes, or why I am provisioning a given amount of power or capacity. It takes some effort to describe how you arrived at that sizing.
We have some problems with connectivity. In my country, most of the issues we have are with stable connections rather than stable platforms.
In the next release, I think it would be good to have wizards that integrate with specific applications, for example, a one-touch configuration in Pagemaker. That would mean you don't need to activate S3 and then do a separate configuration on the page. You would want a single wizard that handles all of the necessary applications.
Normally I work with on-premises solutions; I started using Amazon S3 just over a year ago.
This solution is stable and I haven't had any issues with it slowing down.
There are many options, but there will be a learning curve to appropriately size one for a specific use case.
We have teams divided into application, web design, web technology, and cloud applications.
I haven't been in a situation where I have needed to contact technical support.
The initial setup is straightforward.
Even while splitting my time with other work, I was able to deploy this solution in one afternoon.
Azure is another solution that we were thinking of applying.
We have heard of Google Apps, but we haven't tried that solution.
You have to know specifically what it is being used for.
One of the issues we face is that it is a developer's environment.
In our country, the Philippines, brands such as Microsoft, Zadara, and even Oracle are often forced upon people.
Azure is something that developers explore on their own. The onus is on the formal organization to bring those people back in.
I would rate this solution a fair eight out of ten. It does what I need it to do, but it is a developer's tool. It's not something that I have heard many people harp about.
Set up a performance and stress-based test suite to measure the endurance of a web application on Amazon AWS cloud system using S3 as the back-end storage infrastructure.
It is not possible to buy the real hardware at the infant stage of product development just to test the application and measure its performance. Hence, I used the Amazon AWS/S3 solution to prove my concept before deploying it on real hardware. It was a great experience, and the platform was really scalable.
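As a rough illustration of the kind of probe such a test suite runs, here is a minimal throughput sketch against the S3 API using boto3. The bucket name, object size, and iteration count are assumptions for the sketch, not my actual test parameters.

```python
# Minimal S3 throughput probe, assuming boto3 and default AWS credentials.
# Bucket name, object size, and iteration count are illustrative only.
import time

import boto3

BUCKET = "my-loadtest-bucket"   # hypothetical bucket
OBJECT_SIZE = 1024 * 1024       # 1 MiB per object
ITERATIONS = 50

s3 = boto3.client("s3")
payload = b"x" * OBJECT_SIZE

# Time sequential uploads.
start = time.monotonic()
for i in range(ITERATIONS):
    s3.put_object(Bucket=BUCKET, Key=f"loadtest/obj-{i}", Body=payload)
put_secs = time.monotonic() - start
print(f"PUT throughput: {ITERATIONS * OBJECT_SIZE / put_secs / 1e6:.1f} MB/s")

# Time sequential downloads of the same objects.
start = time.monotonic()
for i in range(ITERATIONS):
    s3.get_object(Bucket=BUCKET, Key=f"loadtest/obj-{i}")["Body"].read()
get_secs = time.monotonic() - start
print(f"GET throughput: {ITERATIONS * OBJECT_SIZE / get_secs / 1e6:.1f} MB/s")
```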
S3 provides capabilities beyond what a traditional web app stack offers. The downside is that the web app needs to be aware of all the features S3 provides in order to leverage its full potential.
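One example of such a capability is having S3 serve downloads directly through presigned URLs instead of proxying the bytes through the application. A minimal boto3 sketch, where the bucket and key names are illustrative:

```python
# Sketch: let S3 serve a download directly via a time-limited presigned URL,
# rather than streaming the file through the web application itself.
import boto3

s3 = boto3.client("s3")
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "my-app-assets", "Key": "reports/summary.pdf"},
    ExpiresIn=3600,  # link valid for one hour
)
print(url)  # hand this URL to the browser; S3 serves the bytes
```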
It is a great product now. Maybe they need to expand their range of physical locations.
Speed, size, and security are the most valuable features.
This was the company’s first foray into storing data offsite. The old way of thinking was that this was too dangerous to contemplate, even though you can get hacked either way.
This solution indirectly allowed us to reclaim expensive internal document storage and replace it with cheaper offline storage.
Capacity-wise, we’re looking at 200GB of transactional data in Redshift. More importantly, we have a lot of other assets in storage, some slow and some fast. These include document archives and web images: several years of documents totaling more than 500GB, most of which will remain untouched.
That stuff ends up in S3 Glacier storage. It is not really that large in the grand scheme of things, but certainly does not warrant the use of expensive internal storage systems or hiding the data on backup tape somewhere.
The web interface was frustrating when dealing with large numbers of files. We ended up using an interface client (via FTP, I think) which also had its own issues.
How do you make it easy to manage 50K documents in one folder using a web interface? I guess some more advanced filtering and selecting capabilities would have been nice, but it was in the early days. It would only read about 120 files into the cache. If you wanted to remove 1000 out of 2000 documents, you had to continually repeat your actions.
This happened surprisingly regularly when you have a live data transfer that ships 100 files per cycle and does 20 cycles per hour. It could eventually delete them itself, but we didn’t have time to engineer that piece.
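For reference, the "delete them itself" piece we never got to can be engineered in a few lines against the S3 API. A hedged sketch with boto3, where the bucket, prefix, and age cutoff are assumptions:

```python
# Sketch: bulk-delete aging transfer files through the API instead of the
# web console. Bucket, prefix, and the 7-day cutoff are illustrative.
from datetime import datetime, timedelta, timezone

import boto3

BUCKET = "my-transfer-bucket"
PREFIX = "incoming/"
cutoff = datetime.now(timezone.utc) - timedelta(days=7)

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

batch = []
for page in paginator.paginate(Bucket=BUCKET, Prefix=PREFIX):
    for obj in page.get("Contents", []):
        if obj["LastModified"] < cutoff:
            batch.append({"Key": obj["Key"]})
        if len(batch) == 1000:  # delete_objects accepts at most 1000 keys
            s3.delete_objects(Bucket=BUCKET, Delete={"Objects": batch})
            batch = []
if batch:
    s3.delete_objects(Bucket=BUCKET, Delete={"Objects": batch})
```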
Amazon’s approach was to delete the old files after a certain number of days. That is money in the bank for them right there.
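That age-based deletion, along with the transition of older documents into Glacier mentioned above, can be expressed as a single bucket lifecycle rule. A sketch with boto3; the bucket name, prefix, and day counts are illustrative, not our production values:

```python
# Sketch: one lifecycle rule that moves aging objects to Glacier and then
# expires them after a set number of days. All names and counts are examples.
import boto3

s3 = boto3.client("s3")
s3.put_bucket_lifecycle_configuration(
    Bucket="my-archive-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-then-expire",
                "Filter": {"Prefix": "documents/"},
                "Status": "Enabled",
                # After 90 days, store on the cheaper Glacier tier.
                "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
                # After a year, delete the object entirely.
                "Expiration": {"Days": 365},
            }
        ]
    },
)
```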
I have used this solution for three years.
There were no stability issues.
Technical support always met my expectations.
There were business intelligence solutions available via the web. We had been running similar home-grown reporting applications on in-house hardware for over 10 years prior, but that was directly impacting our ERP resources.
It was exceptionally simple to configure servers, although most of this was done by the boss.
Even though it appears cheap, be careful about how you use it. Optimizing early will save money on storage and resources in the long term, so make it part of the design process. The beauty is that you can control it at a very fine level.
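One example of that fine-level control is choosing a storage class per object at upload time, so rarely-read data lands on a cheaper tier from day one. A minimal boto3 sketch with illustrative bucket, key, and payload names:

```python
# Sketch: per-object storage class selection at upload time.
import boto3

s3 = boto3.client("s3")

# Hot asset served constantly: the default STANDARD class.
s3.put_object(Bucket="my-assets", Key="img/logo.png", Body=b"<image bytes>")

# Rarely-read archive document: infrequent-access class is cheaper to store.
s3.put_object(
    Bucket="my-assets",
    Key="archive/old-invoices.pdf",
    Body=b"<document bytes>",
    StorageClass="STANDARD_IA",
)
```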
Follow the guidance. The documentation is excellent. Take the time to get it right.