Amid all the talk about the brave new data environments
coming our way, there are still some hard facts that won't change no matter
what kind of data infrastructure is in place. One of these is that there is no such thing as unlimited resources. Sure, in
theory you may soon have access to everything you could conceivably need, but
the amount is still finite, and so is your ability to pay for it. This is
especially true for storage, which does not gain the same kind of benefit that
virtualization brings to servers and networking. A byte of data requires a byte
of storage, another one of those hard facts.
And yet storage needs to scale up with the rest of the data center, which has
left CIOs with two choices: make better use of what you have or buy more. It's
no surprise, then, that companies seeking to expand virtual infrastructure
the most are also rapidly deploying data reduction techniques like
deduplication in the storage farm.
According to the Aberdeen Group, highly virtualized organizations employ three times as much dedupe as less virtualized ones. But the benefits are not strictly limited to improved scalability. Users also report improved application mobility, better protection from server failure and easier remote data replication.
For most of deduplication's history, though, the technology has lacked a real-time component, which has relegated it largely to backup and archival platforms. Lately, however, a new generation of software has driven the technology into primary storage through direct control of CPU and I/O resources. The result, according to processor.com, is a storage environment in which duplicate copies are never written in the first place, rather than tracked down and eliminated later.
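To make that distinction concrete, here is a minimal, hypothetical sketch in Python (the InlineDedupStore class and its method names are invented for illustration, not drawn from any vendor's product): an inline approach hashes each block before it is written and, on a match, records only a reference, whereas a post-process job would have to hunt down the duplicate after it had already landed on disk.

```python
import hashlib


class InlineDedupStore:
    """Toy in-memory store illustrating inline (write-time) deduplication."""

    def __init__(self):
        self.blocks = {}     # content hash -> block data actually stored
        self.refcounts = {}  # content hash -> number of logical references

    def write_block(self, data: bytes) -> str:
        """Hash the block before writing; skip the write if it already exists."""
        key = hashlib.sha256(data).hexdigest()
        if key in self.blocks:
            # Duplicate caught at write time: bump the reference, store nothing new.
            self.refcounts[key] += 1
        else:
            self.blocks[key] = data
            self.refcounts[key] = 1
        return key

    def read_block(self, key: str) -> bytes:
        return self.blocks[key]


store = InlineDedupStore()
first = store.write_block(b"same payload")
second = store.write_block(b"same payload")  # duplicate: never physically written
assert first == second and len(store.blocks) == 1
```

Real products layer on persistence, hash-collision handling and garbage collection, but the write-time check is the piece that keeps duplicate copies off primary storage in the first place.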
While keeping storage consumption to a minimum is a worthwhile goal, it's really only part of the picture, says CTO Edge's Mike Vizard. How about the enormous amounts of time, money and effort that go into managing unnecessarily huge data volumes? A recent survey from Quantum put the cost of file restoration alone at about $9.5 billion worldwide, nearly two-thirds of which could be eliminated through dedupe and other measures.
As Vizard points out, many organizations are still holding
off on thorough dedupe platforms until major upgrades come due. That may make
sense provided the upgrade is scheduled for the near future and is accompanied
by an overhaul of the general data management platform.
But if you've been content to scale back your virtualization plans simply because tools like dedupe seem too complicated, you'll quickly find that the performance costs of doing nothing far outweigh whatever simplicity you gain by waiting.