What you’ll learn in this tip: For IT professionals or organizations still toying with the idea of testing the cloud, cloud-enabled storage arrays can be a great option. Whether it’s called tiering, a gateway or a hybrid cloud, each of these solutions allows companies not only to place data in the cloud, but to test cloud storage in their own environment. In this Storage magazine column by Enterprise Strategy Group (ESG) senior analyst Terri McClure, learn about the storage vendors that are making testing the cloud a possibility and why she believes this approach could become a large part of IT strategies in the very near future.
There seems to be a new category of data storage system on the horizon: the cloud-enabled storage array. The broad market isn’t using this terminology yet; some call it tiering, others a gateway or a hybrid cloud, but all of these systems have one thing in common: they give end users a low-risk opportunity to stick a toe in the water and experiment with cloud storage infrastructures.
Traditional storage vendors are using the cloud as a storage tier within the storage array; perhaps the best-known example is EMC Corp., which tiers to the cloud using Fully Automated Storage Tiering (FAST) with its Celerra product line. In that system, the cloud is treated as a storage tier for long-term archiving of infrequently accessed data. We’re also seeing F5 Networks Inc. take a similar approach with its ARX product line, which tiers file data to the cloud.
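The core idea behind this kind of tiering can be reduced to a simple policy decision: data that hasn’t been touched for a while moves to the cloud tier. The sketch below illustrates that concept only; the names and the 90-day threshold are hypothetical, not FAST’s or ARX’s actual policy engine.

```python
from datetime import datetime, timedelta

# Assumed policy threshold for illustration; real products expose
# configurable rules based on access age, file type, size, etc.
ARCHIVE_AFTER = timedelta(days=90)

def pick_tier(last_access: datetime, now: datetime) -> str:
    """Return 'local' for recently used (hot) data, 'cloud' for cold data."""
    return "cloud" if now - last_access > ARCHIVE_AFTER else "local"
```

The point of such a policy is that applications keep seeing one namespace: the array or file-virtualization layer decides placement behind the scenes.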
Similarly, on the gateway front, we have Panzura Inc., StorSimple Inc. and TwinStrata Inc. offering systems that can be used as on-premises storage systems or as a gateway to cloud service providers. And with these systems, the way cloud services are leveraged can be configured in multiple ways. Cloud can be the primary storage target, with the gateway only holding cached data to eliminate the latency associated with storing data off site. (These systems also typically encrypt, dedupe, compress and provide snapshot functionality.)
Alternatively, data can be pinned, with the local system holding the primary copy and the cloud used as a disaster recovery (DR) target. This approach gives cloud skeptics a low-risk way to test cloud services because the primary data stays on site and the remote copy is encrypted. It’s also much more affordable than using array-based remote copy tools and maintaining a remote site yourself.
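The two gateway configurations described above can be sketched in a few lines of code. In "cache" mode the cloud is the primary target and the local copy may be evicted; in "pinned" mode the local copy is primary and the cloud holds only an encrypted DR copy. Everything here is a hypothetical model, not any vendor’s API, and the XOR "encryption" is a stand-in for the real AES-class encryption these systems apply before data leaves the site.

```python
class CloudGateway:
    def __init__(self, mode: str):
        # cache: cloud is primary, local is a cache; pinned: local is primary
        assert mode in ("cache", "pinned")
        self.mode = mode
        self.local = {}  # on-premises store (cache or primary copy)
        self.cloud = {}  # stand-in for the cloud provider's object store

    @staticmethod
    def _encrypt(data: bytes) -> bytes:
        # Placeholder transform only; real gateways use strong encryption.
        # XOR with a fixed key is its own inverse, which keeps read() simple.
        return bytes(b ^ 0x5A for b in data)

    def write(self, name: str, data: bytes) -> None:
        # In both modes, what leaves the site is encrypted.
        self.local[name] = data
        self.cloud[name] = self._encrypt(data)

    def evict(self, name: str) -> None:
        # Only cache mode may drop the local copy under capacity pressure;
        # pinned data must remain on site as the primary copy.
        if self.mode == "cache":
            self.local.pop(name, None)

    def read(self, name: str) -> bytes:
        if name in self.local:  # cache hit or pinned primary
            return self.local[name]
        # Cache miss: fetch from cloud and decrypt (XOR applied twice).
        return self._encrypt(self.cloud[name])
```

The difference between the modes shows up in `evict()`: a cache-mode gateway can shed local copies because the authoritative data lives in the cloud, while a pinned-mode gateway keeps the primary on site and uses the cloud purely as an encrypted DR target.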
There’s also an emerging category of software to consider, designed for virtual use. It can leverage storage capacity and characteristics regardless of whether the storage is on site or in the cloud, creating a stretch or geo-distributed cluster. Cloud services offer different capabilities depending on the type of storage media behind them, and this software recognizes those differences on the user’s behalf. Some of the gateways probably fall into this category, so there’s bound to be some overlap; but as storage becomes more virtualized and more of a software-layered-on-commodity-hardware play, this is an area to watch. Gluster, with its software-only offering, has made a notable early start here.
To buy into the hybrid cloud message, you first need to buy into private clouds. And the jury is still out on what exactly constitutes a private cloud. Some will tell you it’s simply that your IT department has deployed virtualization and transformed to a service-oriented architecture (SOA). Others will tell you that you must meet very strict criteria having to do with RESTful APIs, global namespaces and scalable object stores owned and operated by IT. In my storage vendor days, we used to represent the storage-area network (SAN) in PowerPoint charts with a cloud and describe it with many of the attributes we use to describe the cloud today. Indeed, many definitions of the SAN could also match the description of a private storage cloud. And a cloud-enabled array may not even be part of an overarching cloud strategy; it may just be a safe, easy way to enable remote replication data recovery services.
In an ESG survey, IT pros were asked why they thought public cloud computing services would have almost no impact on their organization’s IT strategy over the next five years. Multiple responses were accepted, and the top five answers were as follows:
- Data security/privacy concerns (43%).
- Feel like we would be giving up too much control (32%).
- Too much invested in current IT infrastructure and staff (32%).
- Cloud computing offerings need to mature (29%).
- Satisfied with existing infrastructure and process (28%).
Why do I think this cloud-enabled array approach may be a big part of IT strategy over the next few years? That part of the equation is simple to answer. IT is a combination of people, processes and technology. For existing storage users who want to stay with legacy arrays or keep existing processes in place, using the cloud as a storage tier behind what appears to be a conventional array gives them the extensibility and price points of the cloud without radically altering processes or having to retrain staff. And for those users who need a more comprehensive DR strategy but can’t afford a remote site, systems and the staff to manage them, using tools within a storage array and turning a few dials to mirror encrypted data to the cloud is low risk and affordable. Cloud-enabled storage arrays are a bridging technology, tying the present to the future, and a familiar, safe way to test a cloud strategy.
BIO: Terri McClure is a senior storage analyst at Enterprise Strategy Group, Milford, Mass.