Since its inception last April, Amazon.com has been billing its Simple Storage Service (S3) as a tool for online companies and Web developers to store and serve content, but some Web 2.0 companies said the service has room for improvement when it comes to retrieving stored items quickly enough for e-business and maintaining reliability.
One of S3's most highly touted customers, photo-sharing service SmugMug.com, is still using the service but has seen performance and reliability issues serious enough to prompt the company to rethink how it uses S3, according to CEO Don MacAskill.
So far, SmugMug hasn't seen much of a problem with reads from its S3 storage. MacAskill said the difference, when S3 serves files from an East Coast data center through SmugMug's infrastructure in California, is acceptable to users. However, there have been some issues with write speeds, even to Amazon's closest data center in Seattle, he said.
That's on a day-to-day basis, but there have also been two specific "performance outages" over the last year that caused the service to slow drastically and had SmugMug's customers "at our doors with torches and pitchforks," he wrote in a blog post on his experiences with S3.
After these slowdowns, the latest of which occurred in early January, SmugMug moved "hot" storage -- data undergoing revision or being requested frequently -- back in-house, leaving the bulk of the company's "cold" storage, some 200 terabytes (TB) in all, on S3.
Outages also a problem
MacAskill said that a more significant problem than latency with Amazon has been downtime. Last year, there were two catastrophic outages in which Amazon lost a core switch at its data center, taking down both its main bookstore site and S3 for between 15 and 30 minutes.
"If they're down, we're down," MacAskill said, adding that it was a failure rate he was willing to accept, now that his S3 deployment has changed. It helps that he went into it assuming the new service would have its share of problems. "Anyone using this has to keep that possibility in mind when building an application around it," he said, adding that Amazon has asked him for feedback for as long as he's been a user. "I think we're an interesting use case for them. It's a fairly new product, and we've been working with them on these issues."
According to Bob Ippolito, chief technology officer and co-founder of Mochi Media LLC, a Web company that serves advertisements into online video games for sponsors through Mochiads.com, his company tested the S3 service late last year, attracted by low storage costs and an easy-to-use API. But in the course of that testing, Ippolito said, his company experienced an hour-long outage during one of the switch failures. "More than five minutes is unacceptable, really," he said. "Downtime is directly translatable into lost revenue for us and our publishers."
S3 "is not a really general solution to all [storage and content-delivery] problems," Ippolito added, due to high latency and what he called "significant downtime."
In a December blog post about his company's experiences with S3, Ippolito published latency measurements taken from his hotel room in Taipei: his company's own internal server responded in 0.803384 seconds and CacheFly, a rival online storage and content delivery service, in 0.526801 seconds, while S3 took 1.652920 seconds.
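Measurements like these can be reproduced with a short script that times HTTP round trips. The following is a minimal sketch in Python; the commented URL is a placeholder, not one of the endpoints Ippolito actually tested:

```python
import time
import urllib.request

def timed(fn):
    """Run fn once and return the elapsed wall-clock time in seconds."""
    start = time.monotonic()
    fn()
    return time.monotonic() - start

def fetch_latency(url, timeout=10):
    """Time a single HTTP GET round trip, including reading the body."""
    def fetch():
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            resp.read()
    return timed(fetch)

# Hypothetical usage against your own bucket and object:
# print(fetch_latency("https://s3.amazonaws.com/my-bucket/test-object"))
```

A single sample, as in the blog post, is only a rough indicator; averaging several requests from the same vantage point gives a fairer comparison between services.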
"In other words, it's great for backing up data," Ippolito told SearchStorage, "but I wouldn't recommend it for anything that needs to be on the public Internet or in continuous use." Right now, the company is still using S3 for some backups and archives, but has looked elsewhere to store its primary content.
Other items on the wish list
Another problem with Amazon, in Ippolito's experience, is that it doesn't use the same caching and content-delivery mechanisms as other Web-based services. Other services, such as CacheNetworks LLC's CacheFly, use rsync, a standard Unix tool for synchronizing files between servers. "The downside of S3 is that they have their own way of putting files on their service, so you will have to do integration work to use their service, and that integration work is not (currently) reusable with any other service," Ippolito said.
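The integration work Ippolito describes centers on S3's REST API, which requires each request to carry a custom HMAC-SHA1 signature rather than working with standard tools like rsync. A rough sketch of that signing step in Python, following the scheme S3 documented at the time (the credentials, bucket, and object names here are invented, and optional x-amz- headers are omitted for brevity):

```python
import base64
import hashlib
import hmac
from email.utils import formatdate

def sign_s3_request(secret_key, verb, resource, date,
                    content_md5="", content_type=""):
    """Build the signature for S3's original AWS authentication scheme:
    base64(HMAC-SHA1(secret, verb\\nmd5\\ncontent-type\\ndate\\nresource)).
    Omits the CanonicalizedAmzHeaders portion for simplicity."""
    string_to_sign = "\n".join([verb, content_md5, content_type, date, resource])
    digest = hmac.new(secret_key.encode(), string_to_sign.encode(), hashlib.sha1)
    return base64.b64encode(digest.digest()).decode()

# Hypothetical PUT of an object into a bucket:
date = formatdate(usegmt=True)
sig = sign_s3_request("EXAMPLE-SECRET", "PUT", "/my-bucket/photo.jpg", date,
                      content_type="image/jpeg")
auth_header = "AWS EXAMPLE-ACCESS-KEY:" + sig  # sent as the Authorization header
```

Because this signing logic is specific to S3, none of it transfers to a competing service — which is the lock-in Ippolito is pointing at, in contrast to the rsync workflows other providers support.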
MacAskill added that he is also hoping to see improvement in the service around customer support and pricing.
"For a company like us or MySpace, we buy bandwidth in gigabit-sized chunks at very low prices, so it's not cost effective for us the way S3 has priced its service, at 20 cents per GB of data transferred," he said. "That's great for a small company, though, so we're hoping to see them put a tiered pricing structure in place on a sliding scale for larger companies."
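The arithmetic behind MacAskill's complaint is easy to sketch. The following compares S3's flat 20-cents-per-GB transfer rate with a hypothetical tiered schedule of the kind he describes; the tier boundaries and discounted rates are invented for illustration, not anything Amazon has announced:

```python
FLAT_RATE = 0.20  # S3's then-current price per GB transferred

# Hypothetical sliding scale: (upper GB limit, price per GB within that band).
# These numbers are made up for illustration only.
TIERS = [(10_000, 0.20), (50_000, 0.15), (float("inf"), 0.10)]

def flat_cost(gb):
    """Transfer cost at S3's flat per-GB rate."""
    return gb * FLAT_RATE

def tiered_cost(gb):
    """Transfer cost under the hypothetical sliding scale above."""
    cost, prev = 0.0, 0
    for limit, rate in TIERS:
        if gb <= prev:
            break
        cost += (min(gb, limit) - prev) * rate
        prev = limit
    return cost

# A site pushing 100 TB (102,400 GB) of transfer in a month:
# flat:   102,400 * 0.20                                   = $20,480
# tiered: 10,000*0.20 + 40,000*0.15 + 52,400*0.10          = $13,240
```

At small volumes the two schedules cost the same, which matches MacAskill's point that the flat rate works well for small companies but not for sites buying bandwidth in gigabit-sized chunks.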
Finally, on his blog MacAskill added, "Amazon is not unique in terms of providing a great product but average support -- [but they do] need to get better about communicating with their customers. They need to have a page which shows the health of their systems and proactive notification of major issues, a 24/7 contact method, etc."
"We are constantly working to enhance Amazon S3, as we know that more and more companies are depending on it every day," said Andrew Herdener, senior public relations manager for Amazon.com. "We've had a few problems over the past year, and each time we learned something and instituted a new process or safeguard to prevent the problem from happening in the future."