“The evolution of storing ones and zeros—or bits and bytes—has definitely evolved,” says Matt Starr, chief technology officer at Spectra Logic. The sheer amount of data being stored today is stunning. Instead of storing megabytes or even gigabytes, some companies now have petabytes (equal to one million gigabytes) or even exabytes (one billion gigabytes) of data in their archives. Managing large data-storage programs and automating data backup are the specialty of the folks at Spectra Logic.
Cloud storage may be getting a lot of attention these days, but Starr says that a hardware storage platform still lives at the heart of every cloud solution. “Cloud-based backup is still using storage hardware,” he explains. Instead of end users storing information in a tape or disk archival solution themselves, physical storage is handled by a cloud vendor, who then provides access to the data.
But getting data into the cloud often isn’t an easy task. “You can’t move petabytes across the wire,” Starr says. As companies look to transmit larger amounts of data into the cloud, a tape format called the Linear Tape File System (LTFS) is coming to the forefront. It’s already popular in media and entertainment environments, and Starr expects it to gain traction in the supercomputing sector soon. Data can be written to an LTFS-format tape library, which is then shipped to a cloud provider. “They ingest those tapes and, voilà—your data is now in the cloud,” Starr says.
Recovering from a disaster is another area where tape and disk backup solutions shine. One primary reason is the dreaded bandwidth bottleneck. “If all of your data is in the cloud and you lose the data at your corporate site, you have to pull everything back over a relatively low-bandwidth pipe,” Starr explains. That transfer process could take days, weeks, or possibly even months for companies with large archives. “That’s a really good place to augment the cloud with tape backup,” Starr says. Tapes can be loaded into multiple drives and the data, even vast quantities of it, can be streamed back into an organization’s systems.
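To see why the bandwidth bottleneck matters, a back-of-the-envelope estimate helps. The figures below (a 1 PB archive, a fully saturated 1 Gbps link) are illustrative assumptions, not numbers from Starr:

```python
# Rough restore-time estimate: how long it takes to pull an archive
# back from the cloud over a network link, assuming sustained throughput.

def restore_days(archive_petabytes: float, link_gbps: float) -> float:
    """Days to transfer `archive_petabytes` at a sustained `link_gbps`."""
    bits = archive_petabytes * 1e15 * 8        # petabytes -> bits (decimal units)
    seconds = bits / (link_gbps * 1e9)         # divide by link speed in bits/sec
    return seconds / 86_400                    # seconds -> days

# A 1 PB archive over a saturated 1 Gbps pipe:
print(round(restore_days(1, 1)))   # ~93 days
```

Even under these generous assumptions the restore runs for roughly three months, which is why streaming the same data back from locally held tapes in parallel drives can be so much faster.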
Data storage solutions have evolved tremendously over the past decades, and Starr sees new innovations on the horizon. “I think in several years you’re going to see a move away from traditional backup,” he says. It all comes down to expanding storage needs. “Data growth rates are estimated at between 48 percent and 60 percent per year,” Starr says.
That works out to doubling the amount of data roughly every two years, and simply making another copy of large data libraries doesn’t work. “The solution is to move into an archive environment,” Starr explains. Instead of remaining on a primary drive or system, low-use data will likely be deleted from local servers and moved to an archive environment. “When I need it, I can bring it back to live storage, but it’s not sitting here taking up a bunch of expensive disk or system space when it isn’t being used,” Starr adds.
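The doubling claim is straightforward compound-growth arithmetic. Using Starr’s estimated range of 48 to 60 percent annual growth, a quick check (the calculation is standard; only the growth figures come from the article):

```python
import math

def doubling_time_years(annual_growth_rate: float) -> float:
    """Years for data volume to double at a constant annual growth rate."""
    return math.log(2) / math.log(1 + annual_growth_rate)

# Starr's estimated range of 48-60 percent annual growth:
for rate in (0.48, 0.60):
    print(f"{rate:.0%}: data doubles every {doubling_time_years(rate):.1f} years")
```

At 48 percent annual growth data doubles in about 1.8 years, and at 60 percent in about 1.5 years, consistent with the roughly-every-two-years figure.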
Not only is the amount of data being generated growing, but the value of that data is also increasing. A private Facebook page with 100 megabytes of pictures may not have much monetary value, but, Starr says, “that same 100 megabytes of storage could be a company’s most important digital asset.” This disparity in value is beginning to present problems when it comes to deciding on the appropriate service level agreement (SLA) for data storage. “What’s the SLA of some old pictures that somebody took on a phone versus the SLA of a corporate infrastructure?” Starr asks. That question will likely be sparking debates in the years to come.
A one-size-fits-all approach is not the way to solve today’s complex business challenges. Each industry presents unique challenges, and we have the experience and the knowledge to meet your every need. At McGladrey, we pride ourselves on helping companies like Spectra Logic succeed. Our solutions are rooted in a deep understanding of you, your business, and your industry.