IBM executives say that rather than looking to private cloud storage as a way to improve storage efficiency, companies must make their storage more efficient before they move to the cloud.
At its Storage Innovation Executive Summit last week in New York City, IBM execs outlined the best practices companies should follow on the path to the private cloud. Those practices include implementing the latest storage management features and defining the right service-level agreements (SLAs) before moving to the cloud.
“Cloud computing isn't about storage efficiency. It’s backward, because storage efficiency is how you enable the private cloud,” said Dan Galvan, vice president of marketing and strategy for IBM’s storage systems and technology group. “The private cloud is about improving service levels to end users of the enterprise. From a storage perspective, to build a private cloud you at least have to have efficiency of storage.”
Galvan said customers need to virtualize and consolidate server and storage infrastructures so they can take advantage of economies of scale. Galvan claims organizations can get a 30% increase in utilization with storage virtualization, a result similar to what they see with server virtualization.
“We're seeing this over and over with our clients,” Galvan said. “Consolidation and virtualization is the first step. You should virtualize to move toward efficiency because it gives you flexibility.”
He also recommends customers implement automated data tiering before moving to private cloud storage. He said IBM plans to extend the capabilities of its Easy Tier software, which launched last June for DS8000 enterprise arrays, making it available across all IBM systems. Easy Tier uses heuristic algorithms that monitor client workloads and predict which type of storage drive data should reside on to meet a particular performance level. Galvan said Easy Tier picks out the pieces of data most likely to need quicker access and automatically moves them to the fastest storage, such as solid-state drives (SSDs).
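The general idea behind this kind of heat-based tiering can be sketched in a few lines: count accesses per storage extent, then keep the hottest extents on the fast tier. This is an illustrative sketch only, not IBM's Easy Tier implementation; the class and method names are hypothetical.

```python
from collections import Counter

class TieringSketch:
    """Toy hot/cold data tiering: counts accesses per extent and
    promotes the hottest extents to the fast (SSD) tier.
    Illustrative only -- not IBM's actual Easy Tier algorithm."""

    def __init__(self, ssd_capacity_extents):
        self.ssd_capacity = ssd_capacity_extents
        self.heat = Counter()   # extent id -> access count
        self.ssd_tier = set()   # extents currently on SSD

    def record_access(self, extent_id):
        self.heat[extent_id] += 1

    def rebalance(self):
        # Keep only the hottest extents on SSD; everything else
        # stays on (or is demoted to) spinning disk.
        hottest = {e for e, _ in self.heat.most_common(self.ssd_capacity)}
        promoted = hottest - self.ssd_tier
        demoted = self.ssd_tier - hottest
        self.ssd_tier = hottest
        return promoted, demoted
```

A real tiering engine would also weight recency, I/O size, and migration cost before moving data; the point here is only the monitor-then-migrate loop.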
“We're finding with Easy Tier that with as little as 2% of the storage array stored on solid-state drives and the rest on spinning disk, clients can see a three times performance improvement,” Galvan said. “That's huge.”
Galvan said the ability for users to get storage on demand, without having to request it from a storage administrator, is another key requirement for the cloud. “We're big believers in the cloud but we think this is the way to do it,” he said.
Galvan said many corporations will need more time to trust the cloud. “I’m not aware of any corporation that's not concerned with putting their information on a public cloud,” Galvan said. “It’s going to take two things for it to happen. It will take certainty that information is secure and the client will need to be sure data is accessible when they need it, especially transactional data. That won’t go to public clouds until corporations are certain it's secure and accessible. The worst thing that can happen to a company is the data isn't secure or they just don’t have access to it.”
Mesabi Group analyst David Hill agrees that organizations need to create storage efficiency and define service levels before moving to the cloud. For instance, companies have to define data retention policies first because cloud service providers usually don't communicate with their customers’ legal departments.
He also said it's important not only to dedupe data before moving it to the cloud, but also to keep archived data separate from backups to avoid commingling inactive data with active data.
“That ultimately interferes with the ability to recover data in a faster way,” Hill said. “Inactive data doesn't have to be cluttered with the active production data. You're wasting I/O resources, it costs more money and it’s not efficient. IBM says ‘You have to walk before you leap.’ Customers first have to go through several steps or you're not going to get the full benefits of the cloud. The cloud isn't a magical way of doing business.”
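Hill's advice to dedupe data before sending it to the cloud rests on a simple mechanism: store each unique block once and keep an index mapping logical blocks to their stored copy. The sketch below illustrates that content-addressed approach under assumed names; it is not any vendor's product implementation.

```python
import hashlib

def dedupe_blocks(blocks):
    """Toy content-addressed deduplication: store each unique block
    once, keyed by its SHA-256 digest, and record which digest each
    logical block maps to. Illustrative sketch only."""
    store = {}   # digest -> block bytes (one copy per unique block)
    index = []   # logical block order -> digest
    for block in blocks:
        digest = hashlib.sha256(block).hexdigest()
        store.setdefault(digest, block)  # keep first copy only
        index.append(digest)
    return store, index

# A backup stream with repeated blocks shrinks to two unique copies,
# while the index preserves the original order for reconstruction.
store, index = dedupe_blocks([b"alpha", b"beta", b"alpha", b"alpha"])
```

Sending only `store` plus the compact `index` is what cuts the capacity and bandwidth a cloud migration consumes.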