We haven't released the Amazon S3 platform in a production environment. It's more for prototyping and POC so far.
Normally we use it for the data dump. It's something similar to a data lake wherein we use it to obtain the data.
The most valuable feature is the storage.
I would like to see translations or description context for the options.
As a consultant, it is difficult for me to explain to the customer the underlying details, the configuration processes, or why I am using a particular amount of power or storage size. It takes some effort to describe how you arrived at that sizing.
We have some problems with connectivity. In my country, most of the issues we have are with stable connections rather than stable platforms.
In the next release, I think it would be good to have wizards that integrate with specific applications, for example, a one-touch configuration in Pagemaker. That would mean you don't need to activate S3 and then do a separate configuration on the page. You would want a single wizard that handles all of the necessary applications.
Normally I work with on-premises solutions; I started using Amazon S3 just over a year ago.
This solution is stable and I haven't had any issues with it slowing down.
There are many options, but there will be a learning curve to appropriately size one for a specific use case.
We have teams divided into application, web design, web technology, and cloud applications.
I haven't been in a situation where I have needed to contact technical support.
The initial setup is straightforward.
Even while balancing the other things I have been doing, I was able to deploy this solution in one afternoon.
Azure is another solution that we were thinking of applying.
We have heard of Google Apps, but we haven't tried that solution.
You have to know specifically what it is being used for.
One of the issues we face is that it is a developer's environment.
In our country, the Philippines, people often have brands such as Microsoft, Zadara, and even Oracle forced upon them.
Azure is something that developers explore on their own. The onus is on the formal organization to bring those people back in.
I would rate this solution a fair eight out of ten. It does what I need it to do, but it is a developer's tool. It's not something that I have heard many people harp about.
The primary use case of this solution is for the storage of large amounts of data.
It's a public cloud deployment model that anyone can deploy.
There is no need to upgrade because there are no versions; it's a managed service, and it works in hybrid setups. Any changes we make are internal. It's Amazon S3.
The most valuable features are the high availability, high performance, IAM access control, and server-side encryption.
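As an illustration of how the IAM access control and server-side encryption features mentioned above are commonly combined, here is a sketch of a bucket policy that denies any upload not requesting server-side encryption. This is a well-known S3 pattern, but the bucket name and policy ID below are placeholder assumptions, not details from the review.

```python
# Illustrative S3 bucket policy (expressed as a Python dict) that
# denies PutObject requests lacking the server-side encryption header.
# "example-bucket" is a placeholder name.
deny_unencrypted_uploads = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "DenyUnencryptedPuts",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::example-bucket/*",
            "Condition": {
                # Reject uploads that do not request SSE with AES-256
                "StringNotEquals": {
                    "s3:x-amz-server-side-encryption": "AES256"
                }
            },
        }
    ],
}
```

Attached to a bucket, a policy like this makes encryption at rest mandatory rather than optional, which pairs naturally with IAM policies that scope who may read or write the data.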
Some of the areas that could be improved are the dashboard, and to have a richer functionality.
Because there are so many services offered by Amazon, they can do anything. If we were to add anything it wouldn't be anything inside of it, but services on top of it.
There is a concern with security. In one of our main use cases, we prescreen, but we have to create a gateway or layer on top of it to access the data in that particular case. Because users access the data directly, they need to be authenticated before doing so.
Users of our systems need to gain access to this data, and Amazon allows access with a mechanism called the Presigned URL.
We need to share files, so we upload the file and request a link from Amazon, which allows you to share it with anyone. The link is signed, which is why it is called "presigned." However, this signing mechanism is not compliant with the regulations we face.
In our company, we are concerned with privacy regulations. In the United States, there is a law called HIPAA, which regulates how to keep patients' data private and how to protect it. Amazon S3 is eligible for HIPAA compliance, but not with the Presigned URL.
This is very important. Because we cannot use the Presigned URL, we have to build a layer on top of Amazon S3, and as a result we lose performance, availability, and some of the other benefits of Amazon S3.
A feature that should be included is a HIPAA-compliant version of the Presigned URL.
This is a stable solution. We are not aware of any technical issues.
We are a large company and everyone uses this solution. Approximately one hundred users in different areas made up of developers and administrators.
While we have a small technical team, there is almost no maintenance with this solution. It's fully managed.
This solution is scalable.
Technical support is very good.
They are very professional and they respond quickly. If they don't have a solution, they will contact their development team at Amazon and provide you with a detailed response to resolve the issue.
They are very thorough and I really appreciated it. They helped me.
I am more than satisfied with the support and would recommend using the support.
We have not used any previous cloud-based solutions.
The initial setup is straightforward.
Getting set up is very easy, and it can be used within seconds.
I performed the implementation of this solution.
This solution has reasonable pricing and a low cost.
We didn't evaluate other options. Amazon accounts are straightforward.
There is always a choice on whether or not you will use the cloud. If you make the choice to use the cloud then you can choose if you are just using Google or Microsoft. When you already have an account with Amazon, your choice is Amazon S3.
There are other solutions, but for us the choice to use Amazon S3 was clear.
It's difficult to offer advice as it depends on the use case and what this solution is intended for.
This solution is fully managed, and there is no need for upgrades or anything. It's cloud-based. If you use Gmail, you don't upgrade Gmail.
I would recommend this solution for companies of any size. It's good for starting up because you only pay for what you use. It's internet-capable, making it good for any company.
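The "pay for what you use" point above reduces to simple arithmetic: monthly storage cost scales linearly with the bytes stored in each tier. The per-GB rates below are illustrative assumptions for the sake of the example; check current AWS pricing for real figures.

```python
# Back-of-the-envelope monthly S3 storage cost.
# Rates are illustrative assumptions, NOT current AWS prices.
STANDARD_PER_GB = 0.023  # USD per GB-month, example rate
GLACIER_PER_GB = 0.004   # USD per GB-month, example rate

def monthly_cost(standard_gb: float, glacier_gb: float) -> float:
    """Pay-for-what-you-use: cost grows linearly with stored data."""
    return standard_gb * STANDARD_PER_GB + glacier_gb * GLACIER_PER_GB

# e.g. 200 GB of hot data plus 500 GB of archived documents
print(round(monthly_cost(200, 500), 2))  # prints 6.6
```

This linearity is why the model suits startups: a small company with little data pays proportionally little, with no up-front hardware purchase.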
I would rate this solution a ten out of ten.
Set up a performance and stress-based test suite to measure the endurance of a web application on Amazon AWS cloud system using S3 as the back-end storage infrastructure.
To test an application and measure its performance, it is not feasible to buy real hardware at the infant stage of product development. Hence, I used the Amazon AWS/S3 solution to prove my concept before deploying it on real hardware. It was a great experience and really scalable.
S3 provides capabilities beyond what a traditional web app offers. The downside is that the web app has to be aware of all the features S3 provides in order to leverage its full potential.
It is a great product now. Maybe they need to expand their range of physical locations.
Speed, size, and security are the most valuable features.
This was the company’s first foray into storing data offsite. The old way of thinking was that this was too dangerous to contemplate, even though you can get hacked either way.
This solution, indirectly, allowed us to recover internal and expensive document storage and replace it with cheaper offline storage.
Capacity-wise, we’re looking at 200GB of transactional data in Redshift. More importantly, there is a lot of storage for other assets, some slow and some fast. These include document archives and web images: several years of documents totaling more than 500GB, most of which will remain untouched.
That stuff ends up in S3 Glacier storage. It is not really that large in the grand scheme of things, but certainly does not warrant the use of expensive internal storage systems or hiding the data on backup tape somewhere.
The web interface was frustrating when dealing with large numbers of files. We ended up using an interface client (via FTP, I think) which also had its own issues.
How do you make it easy to manage 50K documents in one folder using a web interface? I guess some more advanced filtering and selecting capabilities would have been nice, but it was in the early days. It would only read about 120 files into the cache. If you wanted to remove 1000 out of 2000 documents, you had to continually repeat your actions.
This happened surprisingly regularly, since we had a live data transfer shipping 100 files per cycle at 20 cycles per hour. The system could eventually have deleted them itself, but we didn’t have time to engineer that piece.
Amazon’s approach was to delete the old files after a certain number of days. That is money in the bank for them right there.
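The "delete after a certain number of days" approach Amazon suggested, together with the Glacier archiving mentioned earlier, is expressed through S3 lifecycle rules. Below is a sketch of such a rule; the prefix, day counts, and rule ID are illustrative assumptions, not values from this review.

```python
# Illustrative S3 lifecycle configuration (as a Python dict):
# move objects under documents/ to Glacier after 90 days,
# then delete them after 3 years. All values are examples.
lifecycle_rules = {
    "Rules": [
        {
            "ID": "archive-then-expire",
            "Filter": {"Prefix": "documents/"},
            "Status": "Enabled",
            "Transitions": [
                {"Days": 90, "StorageClass": "GLACIER"}
            ],
            "Expiration": {"Days": 1095},  # 3 years, then auto-delete
        }
    ]
}

# Applying it would look like this (needs real credentials, so not run here):
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="example-bucket", LifecycleConfiguration=lifecycle_rules)
```

A rule like this would have automated the cleanup that the team here had to repeat by hand through the web interface.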
I have used this solution for three years.
There were no stability issues.
Technical support always met my expectations.
There were business intelligence solutions via the web. We had similar home-grown reporting applications running on in-house hardware for over 10 years prior, but this was directly impacting our ERP resources.
It was exceptionally simple to configure servers, although most of this was done by the boss.
Even though it appears cheap, be careful about how you use it. Optimizing early will save money on storage and resources long term, so make it part of the design process. The beauty is that you can control it at a very fine level.
Follow the guidance. The documentation is excellent. Take the time to get it right.
