Storage Magazine - UK

Faster, better, cheaper - smarter

From STORAGE Magazine Vol 6, Issue 8 - October 2006

According to recent statistics, enterprise storage needs - embracing the storage, archiving, backup, protection and retrieval of data in large-scale environments - will increase by something like a factor of seven over the next few years. The challenges in coping with this are daunting and manifold, as Brian Wall reports

The pressures on companies seeking to manage their storage needs and obligations at the enterprise level have now reached staggering proportions - which will come as no surprise to anyone familiar with the sharp end of retaining vast amounts of data and making it available on demand.

But where does that leave CIOs and IT managers already struggling to maintain and administer effective, efficient organisations? With budgets unlikely to be expanded, they have been left with few options in their efforts to control costs and drive productivity. So what comfort can the new generation of data protection technology solutions afford them when it comes to finding innovative and cost-effective ways of handling burgeoning data?

How do they identify the best solutions and working practices that will control costs and drive productivity, while also meeting all of their customers' highest expectations? What role can enterprise storage play in helping them? And what structures/technologies do they need to introduce - and how will that affect current systems and methodologies?

"There are numerous factors that have to be considered when managing data storage and archiving requirements in a large enterprise," says Hugh Jenkins, enterprise marketing manager, Dell. "What types of data need to be stored and for how long? How quickly and easily can data be retrieved? Once stored, can the data still be modified? What processes need to be implemented (such as policy-based archiving) to move protection of data from an individual to corporate responsibility? And, once stored, how should the data be secured?"

IT managers are coming under increasing pressure with the introduction of new legislation and growing numbers of servers and applications all putting a strain on the company's enterprise storage needs. Jenkins believes that, in order to cope with these new demands, it is important that IT managers put in place a simple and manageable storage solution. "They need to decide which data is business critical and look to introduce a cost-effective and robust storage solution that will meet the company's growing needs. Today, Dell is seeing companies looking to better utilise their existing storage solutions, with a significant trend towards the implementation of pooled storage, such as sharing a Storage Area Network (SAN).

"In the last few years, these have become much more cost effective - Dell has driven down the cost of its own storage by over 90% in the last three years - and easy to manage, which will enable the IT manager to control costs whilst providing their customers with a good quality service. In addition to SAN, IT managers should also consider other processes, such as migration of data over a certain age into lower-cost disk or tape storage, as this will also enable them to reduce the overall cost of storage."
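The age-based migration Jenkins describes lends itself to a simple script. A minimal sketch, assuming hypothetical directory paths and a 90-day threshold (both illustrative, not figures from the article):

```python
import os
import shutil
import time

def migrate_aged_files(primary_dir, archive_dir, max_age_days=90):
    """Move files not modified within max_age_days from primary
    (expensive) storage to a lower-cost archive tier."""
    cutoff = time.time() - max_age_days * 86400
    moved = []
    os.makedirs(archive_dir, exist_ok=True)
    for name in os.listdir(primary_dir):
        src = os.path.join(primary_dir, name)
        # Only migrate plain files whose last modification predates the cutoff
        if os.path.isfile(src) and os.path.getmtime(src) < cutoff:
            shutil.move(src, os.path.join(archive_dir, name))
            moved.append(name)
    return moved
```

In practice such a sweep would run on a schedule, and the age threshold would follow the data-classification policy rather than a fixed constant.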

Once an effective storage solution has been implemented, he adds, IT managers need also to consider how to secure the data on an on-going basis. "A system or network that is relatively safe today may be insecure tomorrow as new security holes, viruses and threats are identified. Along with ensuring that the storage system is effectively backed up, IT managers should continually assess their ability to detect, protect and respond to all security threats and make the necessary adjustments."

With data growth ever increasing, best practices must be put in place in order to manage this on tight budgets. "The two primary drivers for data protection in the enterprise are compliance and the value of corporate information," says Nigel Ghent, marketing director, EMC UKI.

"Compliance covers everything from regulatory compliance and industry compliance to corporate governance and litigation. It is a company's responsibility to save and be able to recover requested information quickly and completely. It is also required that a company can discover certain data to comply with requests, and courts look suspiciously on any and all excuses; the consequences can be costly and severe. The business value of corporate information is also immense in large businesses: if data has business value, it must be protected, discoverable and kept available."

With compliance and business value the primary drivers for protecting data, the enterprise must institute procedures and strategies for both short-term and long-term storage. Short-term storage represents the critical time period for newly backed-up data, which should be kept immediately available and recoverable in case of data loss. Long-term storage covers disaster recovery requirements and archived data that must be kept protected and accessible for compliance and business reasons. Actual time periods will differ by the business and the nature of the data.

Ghent outlines what he considers to be the best practices that should be adopted at the enterprise level:
• Classify data to establish data management policies based on service level requirements. Classification includes disaster recovery requirements, point in time recovery objectives, retention periods, etc.
• Use disk technology for applications that have significant SLAs around recovery point objectives and recovery time
• Separate the concepts of backup and recovery from archive. Use backup and recovery for disaster and operational recovery, and archiving for long-term retention and discovery
• Clearly understand what your compliance requirements are: government, industry, corporate governance or litigation. For most large businesses, the answer is all of the above, so corporate IT should build storage management strategies around the differing requirements
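The first of these practices, classification driving policy, amounts to a lookup from data class to service level. A sketch with invented class names, tiers and periods (they are assumptions for illustration, not EMC's taxonomy):

```python
# Illustrative service-level policies keyed by data class; the class
# names, tiers and retention periods here are hypothetical.
POLICIES = {
    "business_critical": {"tier": "disk", "rpo_hours": 1,  "retention_years": 7},
    "operational":       {"tier": "disk", "rpo_hours": 24, "retention_years": 3},
    "archival":          {"tier": "tape", "rpo_hours": 72, "retention_years": 10},
}

def policy_for(data_class):
    """Return the management policy for a classified data set,
    falling back to the most conservative policy if unclassified."""
    return POLICIES.get(data_class, POLICIES["business_critical"])
```

Defaulting unclassified data to the strictest policy reflects the point above: without classification, everything must be treated as if it mattered, which is exactly the expensive outcome classification is meant to avoid.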

According to Alan Stuart, chief strategist, IBM Data Retention Solutions, a paradigm shift for enterprise storage is on the horizon. "When organisations, analysts, vendors or anyone else talk about storage these days, they use terms like Enterprise Storage, SANs, NAS, SATA, Virtualised and Archives. They also talk about terabytes and petabytes. Can exabytes be far behind? IT executives are also talking about Information Lifecycle Management [ILM].

"However, in implementation, ILM is typically just a shorthand term for moving data from expensive disks to less expensive disks. The paradigm shift will occur as two concepts become realised. The first realisation is that a storage archive, not just cheaper disks, is the proper repository at the end of an ILM chain for a very large portion of data. The second realisation is that archives are not backed up in the traditional sense.

"Before looking at this further, it is important to understand the concept of 'Just In Case' (JIC) Data. The emergence of JIC Data* has changed the preferred storage technology used in storage archives today from disk to a disk/tape tiered approach. Users realise that it is no longer affordable or necessary to keep terabytes or petabytes of data spinning on disk storage for years and years. The concept of JIC data shows us that the greatest percentage of data being archived today is not WORM (Write Once Read Many) but WORN (Write Once Read Never)**."

"A tiered storage archive can deliver significantly lower TCO by utilising tape or optical libraries as a storage media for JIC Data," states Stuart. "How and when would you back up a 4 petabyte archive? The answer is you don't back up a petabyte or any other archive; you create one or more replications. Replications are copies of the data objects in the archive and they are created at the time of ingest or shortly thereafter. Since data objects in most archives are managed individually, why would you ever need to back them up again? In many cases, even if you wanted to, the sheer size of the archive would prohibit that operation. So what would happen if we applied this concept to JIC files that reside on enterprise storage?"

When you simply move these files to less expensive storage, your IT staff will continue to take periodic volume backups of that media: the archived data itself does not change, but the content of the disk volume does, so the same files are backed up again and again. "However, when you consider an archive as the end of your ILM chain," he adds, "a number of positive effects can be observed. While the proportion of data that can be moved into an archive ranges from twenty per cent to eighty per cent, many customers are able to archive as much as fifty per cent of their enterprise files. You still get the traditional ILM benefit - you are freeing up expensive storage for other uses, possibly deferring a future storage purchase. With the data safely in an archive like the IBM DR550, all of that data is no longer repetitively backed up over time. This can result in a significant reduction in the time required for the backup window.

"By taking into account the idea of JIC files and moving them to an archive, your organisation can gain greater benefits. In addition to freeing up expensive disk space, it can significantly shorten backup windows and, when you take advantage of a tiered storage archive (disk/tape), the actual long-term cost of storing the data can be far less than the so-called cheaper disk."
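Stuart's replicate-at-ingest model can be sketched as follows: each object is copied to one or more replica stores as it enters the archive, with a checksum recorded so any copy can be verified later instead of re-backing the archive up. (The in-memory stores and function names are hypothetical; a real archive would use object storage or tape libraries.)

```python
import hashlib

def ingest(archive, replicas, object_id, data):
    """Write an object into the archive and create its replicas at
    ingest time. Returns the object's checksum so each copy can be
    verified independently later."""
    digest = hashlib.sha256(data).hexdigest()
    archive[object_id] = (data, digest)
    for replica in replicas:          # replicate once, at ingest
        replica[object_id] = (data, digest)
    return digest

def verify(store, object_id):
    """Check that a stored copy still matches its recorded checksum."""
    data, digest = store[object_id]
    return hashlib.sha256(data).hexdigest() == digest
```

Because each object carries its own checksum, the archive never needs a traditional full backup: integrity is checked per object, and a damaged copy is repaired from any surviving replica.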

Today's unprecedented storage demands are increasingly being exacerbated by regulatory compliance procedures and legislation. The latest in a long line is the Markets in Financial Instruments Directive (MiFID), due to come into effect in November 2007. The directive aims to build a single financial services trading market in the European Union. While benefiting traders, MiFID will drastically change how businesses manage data, as firms must be able to prove best execution on deals and keep records for five years.

"The adoption of MiFID alone is predicted to cost financial institutions up to £8 billion - with a further £1 million annual charge for larger firms," points out Laurence James, UK & IRL services marketing manager, Sun Microsystems. "At a recent customer event, Sun Microsystems asked delegates about the impact of MiFID on their organisations.

"Only twenty-two per cent thought the associated bandwidth and storage costs would be manageable. Unfortunately, such legislation cannot be ignored - there is no choice but to comply. The trick is how to do so efficiently and cost effectively. Compliance aside, unmanaged data reserves can undermine business processes and lead to bloated servers containing largely useless information.

"The most helpful and significant strategy CIOs can put in place to achieve efficient and cost-effective enterprise storage is ILM," he argues. "ILM is essentially a sustainable storage strategy that balances the cost of storing and managing information with its business value. Sun Microsystems has the most advanced ILM methodology, its IM3 model. This takes a strategic approach to archiving and can be applied through a three-step process: assessment of assets and uses; adapting the storage infrastructure accordingly; and, finally, maintaining the data balance by performing regular audits to monitor archiving and compliance standards. Together, these elements define a top-down and bottom-up model for information lifecycle management, integrating business intent with storage reality."

James states that ILM also enables existing storage elements to be reused at the appropriate place within the hierarchy, lowering costs and enabling a smooth transition from any pre-existing storage strategy. "Storage evolution, not revolution," he stresses.

"A well-executed ILM strategy also brings significant additional benefits: from the business perspective, a more agile organisation and reduced business risk; and from the technology perspective, reduced storage unit and storage management costs."

The challenge for CIOs and IT managers isn't storing data - it's retrieving it when requested and deleting it when no longer needed, says Paul Hargreaves, consultant systems engineer at NetApp.

"CIOs and IT managers are looking to reduce management cost and improve backup and recovery performance. The cost of raw disk storage has reduced dramatically over the years, primarily due to market demand. Technologies have been developed to bridge the gap between primary storage and offline storage, providing much faster data access than offline storage at a cost much lower than primary storage. New technologies such as inexpensive SATA disk drives have further enabled low-cost disk for long-term archiving."

So what role can enterprise storage play in helping them? "Enterprise storage can help businesses by providing systems that are able to migrate data easily," he responds. "A lot of data is required to be kept for over seven years, such as financial records; however, the average lifecycle for a storage array is only three years. This means the data must be migrated at least twice during its life. As the amount of storage continues to grow, the challenge facing IT managers will be moving this vast quantity of data without increasing application downtime."

Most businesses suffer from the problem of not having a clear understanding of their data, he maintains. Without data classification, effective data management policies cannot be defined. Changes need to come from the way the IT administrators use the systems they have access to.

"For example, a business can put into place a 'keep all financial data for seven years' policy that is very explicit, but the challenge is classifying financial data," states Hargreaves. "To a computer, it's all just bytes, so the email sent to a business partner with a copy of financial records should be recorded for seven years. But the mail I sent to my solicitor with personal information probably should be immediately deleted for data protection purposes."

His view is that businesses need to apply content classification and management systems that provide assessment and data access capabilities to simplify and automate the management of unstructured data. "Application vendors can help by developing systems that automatically produce data in formats that can be archived and deleted easily in a granular and controlled fashion that match the individual business requirements."
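The "archive, then delete in a granular and controlled fashion" pattern Hargreaves describes can be sketched as a retention sweep over classified records. The record format, class names and periods below are illustrative (the seven-year figure echoes his financial-data example):

```python
from datetime import date, timedelta

# Hypothetical retention periods per data class; a personal message
# held for data-protection reasons gets no retention at all.
RETENTION = {
    "financial": timedelta(days=7 * 365),
    "personal": timedelta(days=0),
}

def expired(records, today=None):
    """Return the ids of records whose retention period has elapsed,
    so they can be deleted individually rather than volume-by-volume."""
    today = today or date.today()
    out = []
    for rec in records:
        keep_until = rec["created"] + RETENTION[rec["class"]]
        if today > keep_until:
            out.append(rec["id"])
    return out
```

The hard part, as Hargreaves notes, is not this sweep but assigning the `class` field correctly in the first place: to the computer, financial records and personal email are "all just bytes".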

Also of vital importance to all organisations is the ability to meet business continuity requirements in the face of disaster. The overall demand for tape products continues to be strong with storage analysts estimating that there will be some 57 million enterprise and midrange data cartridges shipped in 2006. This will be enough to store nearly 8,500 petabytes of data.

"In order to meet the demand for increased storage capacity, Fujifilm makes continuous efforts to increase the storage capacities of single cartridges," says Roger Moore, strategic business unit manager, Recording Media, Fujifilm UK. "For example, advanced Fujifilm NANOCUBIC technology is currently being used by IBM researchers to demonstrate the potential of multiple-terabyte storage capacity on a single data tape cartridge. In May 2006, IBM researchers were able to demonstrate an areal density of 6.67 billion bits per square inch on data tape, which is more than fifteen times the recording density of current LTO Gen 3 tapes."

Fujifilm's 3592 Enterprise Tape Cartridges for IBM 3592 TotalStorage Enterprise Tape Drive Systems, based on Fujifilm's proprietary Nanocubic technology, have a native capacity of up to 300GB, a native data transfer rate of up to 40MB per second, and they are designed to provide data life of more than 30 years.

"With data storage demands exploding, in part driven by compliance and other issues, the capabilities of our research and development teams in Japan have been a great asset to our partners and customers," adds Moore. "By working closely together with end users, partners and technology companies like IBM, we can not only help support today's storage solutions, but also help our customers understand what future technologies may bring."

*Just In Case, or JIC, data is that data which is being kept 'just in case' someone, or some authority, might ask for it. This concept was first proposed in "Understanding JIC Data and Information Life Cycle Management", Alan Stuart, IBM, February 23, 2005. A Clipper Group White paper based on this notion has since been published, "IBM's Data Retention Appliance - The Case for Just In Case", David Reine, Clipper Group, March 13, 2006

**There is also WORR - or Write Once Read Rarely


©2006 Business and Technical Communications Ltd. All rights reserved.
No part of this site may be reproduced without written permission of the owners.