Data Center Storage Predictions for 2014

Although we are already almost halfway through the year, some of these predictions are still valid and on track.

Data Center Storage Predictions for 2014

04 Apr 2014, KB Ng - Product Marketing Director, Asia Pacific, Cloud Storage & Enterprise Products, HGST, DATAQUEST
To enable and enhance advances in mobility, cloud computing, social media, and data analytics, innovation in how information is stored in enterprise and cloud data centers has moved at an accelerated pace over the last decade. As more businesses seek to realize the benefits of the ‘3rd Platform' across more applications and data sets, 2014 promises continued fast and exciting progress in data center storage products, technologies, and architectures.
Beyond the hype of ‘big data', the business benefits of data analytics for virtually every industry and business model have been studied and published. With data creation growing at a sustained annual rate of more than 40%, companies are challenged to retain the information that could bring valuable market insights and growth in profits. But whether they've deployed new solutions in their private data centers or turned to public cloud services, industry experts estimate that the shortfall in capacity to store all the data that's created will hit 60% in the coming years. That's a lot of valuable insight flowing down the drain.
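To put those two figures in perspective, here is a minimal back-of-the-envelope sketch in Python. The starting volumes and the capacity growth rate are illustrative assumptions; only the 40% data growth rate comes from the text above.

```python
# Back-of-the-envelope sketch: 40% annual data growth vs. a storage
# budget that grows more slowly. Starting values are illustrative
# assumptions, not figures from the article.
data_created_pb = 100.0   # hypothetical data created this year, in PB
capacity_pb = 100.0       # hypothetical deployable capacity, in PB
data_growth = 0.40        # >40% sustained annual growth (per the article)
capacity_growth = 0.20    # assumed slower growth in deployed capacity

for year in range(1, 7):
    data_created_pb *= 1 + data_growth
    capacity_pb *= 1 + capacity_growth
    shortfall = 1 - capacity_pb / data_created_pb
    print(f"Year {year}: shortfall = {shortfall:.0%}")
# Under these assumptions the gap reaches the 60% mark within six years.
```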

FACING THE CHALLENGES
At the heart of the challenge is not only the cost of the capacity, but also the cost to operate the data center housing that capacity. With regulatory requirements and long-term cyclical patterns for analytical insights, the operating cost is quickly rising as data longevity stretches out to several years or even a few decades. During this extended lifetime, data needs to be quickly accessed by analytics applications or for compliance purposes, making traditional methods of archiving unsuitable.
Overall, the rapid pace of innovation is being driven by three forces. The first is the volume, velocity, value, and longevity of data. The second is the total cost of deploying and operating data center storage systems. The third is the management of the high volume of data and more importantly, the accessibility of data by multiple applications over its extended lifetime.
For the coming year, here are some of the key innovations to look for that address these forces:
  • Ambient Data Centers will Breathe Fresh Air to Boost Power Efficiency: In India, it's been estimated that power is the biggest expense in running a data center, accounting for approximately 70% to 80% of the overall cost of running the facility. About 60% of that power goes simply to keeping the lights on, running the chillers 24x7, and keeping the servers from shutting down.
In addition to focusing on hardware that consumes less power, companies have started building data centers that use unconditioned outside air for cooling instead of traditional chillers and filtration systems. This approach has been demonstrated to reduce the power required to cool the data center by 96%. A key enabler for wider deployment of these ‘ambient' data centers is hardware that is resilient to more hostile levels of temperature, humidity, and dust.
To better withstand the elements, expect data center IT system providers to offer more in the way of coated circuit boards and sealed hard drives. When the latter are filled with helium, they provide the added benefit of up to 23% lower power consumption, as the rough arithmetic sketched below illustrates.
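In the following Python sketch, the facility load, cooling share, and drive power draw are invented assumptions for illustration; only the 96% cooling reduction and the 23% helium figure come from the text above.

```python
# Illustrative arithmetic only: the facility load, cooling share, and
# drive power draw below are assumed values, not figures from the article.
total_power_kw = 1000.0   # hypothetical facility power draw
cooling_share = 0.40      # assumed fraction of power spent on cooling
cooling_cut = 0.96        # ambient cooling: 96% less cooling power (per text)
drive_power_kw = 50.0     # hypothetical aggregate hard-drive power draw
helium_cut = 0.23         # helium-filled drives: up to 23% lower power (per text)

saved_kw = (total_power_kw * cooling_share * cooling_cut
            + drive_power_kw * helium_cut)
print(f"Estimated savings: {saved_kw:.0f} kW "
      f"({saved_kw / total_power_kw:.0%} of facility load)")
# -> Estimated savings: 396 kW (40% of facility load)
```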


  • The Emergence of Cold Storage Systems will Enable Fast Access to More Data at a Lower Cost: Data is only valuable if you can get to the information and knowledge locked inside of it. To address the need to readily access massive amounts of data for analytics or compliance, a new breed of storage systems will provide a new architectural layer with high-density ‘peta-scale' capacities at a cost that falls between traditional disk and tape systems. Energy-efficient designs and adaptive power management can reduce power costs and enhance longevity.
  • Storage-class Memory will Proliferate to Accelerate: When we think of analytics, we tend to visualize needles of insight being extracted from mountainous haystacks of historical data. However, real-time analytics and decision automation systems can be even more valuable for industries that are hyper-sensitive to time. These high-performance applications can benefit from shaving millionths of a second off the time it takes for data to reach central processing units. To meet these needs, storage-class memory (SCM) solutions bring the high capacity and persistence of flash memory to the CPU through the high-bandwidth PCI Express bus within the server.
Until recently, the capacity of these caching modules was limited and had to be dedicated to a particular server and its applications. For 2014, look for a new wave of deployment for SCM solutions that offer several terabytes of capacity per card, which can be pooled and shared across multiple servers and applications.
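Conceptually, SCM acts as a large, persistent cache sitting between the CPU and the disk tier. The Python sketch below illustrates the read path such a layer implies; the class, names, and latency comments are illustrative assumptions, not any vendor's actual API.

```python
# Conceptual sketch of a storage-class-memory (SCM) read path: check the
# PCIe flash cache first, fall back to the disk tier on a miss, and
# populate the cache for subsequent reads. Names and structure are
# illustrative assumptions, not a real product API.
class ScmCachedStore:
    def __init__(self, disk_tier: dict):
        self.scm_cache = {}         # stands in for terabytes of PCIe flash
        self.disk_tier = disk_tier  # stands in for the backing disk array

    def read(self, key: str) -> bytes:
        if key in self.scm_cache:       # microsecond-class access
            return self.scm_cache[key]
        value = self.disk_tier[key]     # millisecond-class access
        self.scm_cache[key] = value     # warm the cache for next time
        return value

store = ScmCachedStore({"trade-42": b"order book snapshot"})
store.read("trade-42")   # first read: disk, then cached
store.read("trade-42")   # second read: served from SCM
```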
  • Object Storage Systems will Bring Hyperscale Capacity to the Masses: Along with the rapid growth in data comes the burden of installing and managing the capacity to store it, and of ensuring that applications can easily and reliably find the data they need. Public cloud service providers, with their need to scale across millions of users, have led the charge in deploying object-based storage systems as an advancement beyond the file-based storage of typical network-attached storage (NAS) solutions.
  • Flash and Hard Disk Drives will Thrive Together: The life of data and storage used to be so simple. Data was created, backed up, and archived. In today's world, data lives an active life for years beyond its creation. During that time, it's accessed by several applications, not just the one that created it. The combination of longevity and activity creates new demands throughout the data center storage ecosystem. A single resting place from creation through years of cyclical access would either be too expensive or too slow. Luckily, a new generation of solutions, under the category Software Defined Storage 2.0 (SDS 2.0), will emerge in 2014.
Through a tight integration of software with pools of storage in multiple tiers along the cost-performance curve, SDS 2.0 solutions will be able to dynamically place data in the most cost-effective tier and cache layer based on its state in the usage cycle. Under this model, the combination of performance and capacity, with automated management at an optimal cost, promises to reach new heights.
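A minimal sketch of the kind of placement decision this tiering model implies is shown below; the thresholds and tier names are illustrative assumptions, not part of any published SDS 2.0 specification.

```python
# Minimal sketch of an SDS-style tiering decision: place each object in
# the cheapest tier that still matches its access pattern. Thresholds
# and tier names are illustrative assumptions only.
import time

def choose_tier(last_access_ts: float, accesses_per_day: float) -> str:
    age_days = (time.time() - last_access_ts) / 86400
    if accesses_per_day > 100:
        return "scm-flash"      # hot: performance tier
    if age_days < 30:
        return "disk"           # warm: capacity tier
    return "cold-storage"       # cold: peta-scale archive tier

print(choose_tier(time.time(), accesses_per_day=500))   # -> scm-flash
print(choose_tier(time.time() - 90 * 86400, 0.1))       # -> cold-storage
```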
Historically, servers and applications have been the stars of the data center. With the volume, velocity, value, and longevity of data, however, we're entering an era when data storage is taking over the spotlight as a key enabler for advancements in the data center. It's not that processing the data is easy; it's that data has become the currency of business insight and needs to be stored and readily accessible for companies to fully realize its value.
Here's to another exciting year and the dawn of a new age for data center storage.