Whether it is Data Center Infrastructure Management (DCIM), Software Defined Data Centers (SDDCs) or the new acronym in the air – Application Centric Infrastructure (ACI) – the cold and lone metal has been changing into so many forms, and so fast, that it is hard to recognise it as the same backyard where servers and other iron boxes were stacked and cables ran amok.
Bare metal days! They are long gone. From isolation, fragmentation, licenses, lock-ins and capex mountains, data centres have evolved into a new paradigm of modular, elastic, workload-centric and software-ised models.
There must be some direction to where all these modular and converged forces are going – and how they are folding the metal boxes so neatly into unprecedented forms. We all know where to start: Hyper Converged Infrastructure (HCI), which is no longer a new-fangled word.
HCI: The Paper-Crane Flies?
When HCI entered the fray, the idea of infrastructure suddenly became smarter, simpler and more flexible.
As HCI plaited server, storage and network functions together into a modular unit – armed, of course, with a software layer for all the heavy lifting of discovery, pooling and reconfiguration – abstraction and provisioning underwent a notable change in data centres.
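To make that software layer concrete, here is a minimal, purely illustrative Python sketch of its three conceptual jobs – discovering nodes, pooling their capacity and provisioning workloads from the pool. Every name in it is hypothetical; no vendor's HCI stack exposes this API.

```python
# Illustrative sketch only: all classes and names here are hypothetical,
# not any vendor's HCI API. It mimics the three jobs the software layer
# performs: discover nodes, pool their capacity, provision on demand.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    cpus: int          # free CPU cores
    storage_gb: int    # free storage in GB

class ResourcePool:
    """Aggregates compute and storage from discovered nodes."""
    def __init__(self):
        self.nodes = []

    def discover(self, node: Node):
        # In a real HCI stack, discovery happens over the network;
        # here we simply register the node.
        self.nodes.append(node)

    def provision(self, cpus: int, storage_gb: int) -> str:
        # Pick the first node with enough free capacity (first fit).
        for node in self.nodes:
            if node.cpus >= cpus and node.storage_gb >= storage_gb:
                node.cpus -= cpus
                node.storage_gb -= storage_gb
                return f"workload placed on {node.name}"
        raise RuntimeError("pool exhausted: scale out by adding a node")

pool = ResourcePool()
pool.discover(Node("node-1", cpus=16, storage_gb=2000))
pool.discover(Node("node-2", cpus=16, storage_gb=2000))
print(pool.provision(cpus=4, storage_gb=500))   # workload placed on node-1
```

Scaling out, in this toy model, is just another call to discover() – which is the modular, elastic promise in miniature.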
Amit Mehta, Director – Modern Data Center, Dell Technologies, tells us why it is not hard to believe the HCI market stood at $162M in FY 2019. IDC pegs the overall storage market at $407M, of which HCI is about 40 per cent, and projects HCI to grow at a CAGR of 37.1 per cent till FY 23. Dell EMC's market share in Q2 of FY 19 was 22 per cent.
Mehta insists that hybrid IT is the reality of today’s IT environments. “Hybrid IT is driving digital transformation. This is redefining the data centres of the future. Now you can simplify and standardize them. There is no need for multiple layers of architecture. Companies can scale out hardware easily. We have entered the era of intelligent, automated and AI-driven IT with agile infrastructure that learns from application environments. There are advantages of always-available uptime, service availability and data centres that are turning smaller and easily fitting into economic stacks.”
HCI can easily sound like pixie dust for managing the overwhelming sprawl and layers of stuff that affect data centres and are affected in turn. But there is a lot that can lurk behind this single pane of glass.
There is an underlying complexity and a slew of nuanced capabilities that an HCI packs within itself. In fact, much of this complexity can be cleverly hidden by the providers, thanks to the way they encapsulate a complete compute, virtualisation, management and storage stack. As Forrester analyst Richard Fichera noted in a report on hyperconverged infrastructure, enterprises of all sizes have heavily adopted HCI offerings as I&O teams have begun to understand the operational advantages for heavily virtualised environments. However, organisations can be resistant to the required structural transformations; to best use HCI, you must embrace everything it has to offer. Fichera recommends that a comprehensive economic model – including capex, opex, skilled labour and extrinsic factors – be part of any successful HCI evaluation.
So has HCI shaken up a lot in this space?
Unfolding DCIM and Abstraction
DCIM’s relevance and strategy focus are the first two areas to examine when reckoning HCI’s dent here. As Subram Natarajan, Chief Technology Officer at IBM India, spells out the new enterprise infrastructure strategy in the HCI world, “The main criteria should be around automation and ease of use. HCI is characterized by the simplicity with which we interact with computers, and increasingly the interactions will become seamless, context-sensitive and ubiquitous. Therefore HCI should focus on the appropriate design of the interfaces and task simplification.”
As far as DCIM is concerned, the ability to automate simple tasks will become the primary component of the strategy, he argues. “Breaking various management tasks into simpler functions, which renders them easily automated, will eventually allow human beings to focus on higher-value functions.”
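That decomposition is easy to picture in code. The sketch below is hypothetical – the step names and the pipeline are illustrative, not from any DCIM product – but it shows how one “big” management task becomes three small functions that are trivial to automate and test individually.

```python
# Hypothetical sketch of Natarajan's point: break one management task
# into small, independently automatable steps. The function names are
# illustrative, not from any real DCIM product.
def allocate_storage(req):
    req["volume"] = f"vol-{req['app']}"   # pretend-allocate a volume
    return req

def configure_network(req):
    req["vlan"] = 100                     # pretend-assign a VLAN
    return req

def register_monitoring(req):
    req["monitored"] = True               # pretend-enrol in monitoring
    return req

# Each step is trivial on its own, so each is easy to automate and test;
# the pipeline composes them back into the original "big" task.
PIPELINE = [allocate_storage, configure_network, register_monitoring]

def run(req):
    for step in PIPELINE:
        req = step(req)
    return req

print(run({"app": "billing"}))
```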
But for Naveen Chhabra, Senior Analyst, Forrester, HCI does not seem to redefine DCIM a lot. “Infrastructure management has been an integral part of any data centre since its inception.”
What has, perhaps, changed or deepened is the degree of abstraction that has gained a foothold in these hitherto-physical spaces. What virtualisation started is being taken to another level with containerisation. Natarajan agrees. “Absolutely, yes. Abstracting infrastructure elements such that they can be automatically deployed and changed is a real boon to enterprises, specifically when it comes to provisioning. Think of typical deployment cycle times (say, for a test environment) in the case of bare-metal set-ups, where the environment has to be hand-built, and compare them to a containerised environment, which is literally ready to be productive as soon as it is launched.”
He lists some other benefits of containerisation: building a strong orchestration layer on top gives the flexibility to scale automatically and to fail over in case of a container environment outage. “These are not inherent in a bare-metal environment: one has to explicitly build these functions.”
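The provisioning contrast is visible even in a trivial example. The sketch below assumes the Docker SDK for Python (pip install docker) and a running Docker daemon, and the image name is illustrative; a full orchestration layer such as Kubernetes generalises the same idea into autoscaling and multi-node failover.

```python
# A minimal sketch of the provisioning contrast, using the Docker SDK
# for Python. Assumes a local Docker daemon is running; the image name
# is purely illustrative.
import docker

client = docker.from_env()

# One call replaces the hand-built bare-metal environment: the container
# is productive as soon as it launches.
container = client.containers.run(
    "nginx:latest",              # illustrative image
    detach=True,
    restart_policy={             # crude failover: the daemon restarts
        "Name": "on-failure",    # the container if it crashes --
        "MaximumRetryCount": 3,  # richer behaviour needs an orchestrator
    },
)
print(container.status)
```

Everything the bare-metal world does by hand – install, configure, restart on failure – is reduced here to declared intent, which is exactly the point both analysts make.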
Chhabra leans more towards containerisation when the abstraction word comes up. “Virtualisation has helped, and it’s a real old story. Today firms and application developers are adopting containers, as they promise a number of benefits: they are lightweight, transportable across cloud providers, and easy to set up and organise.”
“Our research shows that container adoption in the public cloud has grown by 600 per cent in the last 12 months alone.”
He, however, observes that containers have not proven easy to manage, as the ecosystem around them is still developing.
With these industry changes happening, an ACI scenario looks inevitable. But would more abstraction mean more complexity? More costs?
Mehta believes that if the whole direction is changing, then we need software-defined and intelligent environments. This is not possible in a bare-metal set-up, he reasons. “That’s why virtualisation is important: it can abstract and give agility. Abstraction allows you to move away from lock-ins because the hardware choices change. Cloud is a no-brainer. You cannot do digital transformation without hybrid IT. ACI is a different ballgame.”
Once Folded, Ready For More
So what new challenges stare at data centres as we enter a world of SDDC, smart Data Centres (DCs), micro DCs and edge computing?
Speaking of SDDC, Chhabra points out that it requires the integration of a lot of infrastructural parts. “It is being achieved using automation (a big theme today) across a particular silo and orchestration across multiple silos. Easier said than done, as the approaches, technologies and point-integration challenges are aplenty. API-driven infrastructure and applications help, but firms still have a long way to go.”
Natarajan assesses that data centre evolution is bringing a level of abstraction that makes consumption of data centre resources much more straightforward, impactful and relevant. “The software definition of data centre elements allows consumers to not only define and use the resources they need, but also helps in fine-tuning optimal usage through a continuous and strong feedback mechanism. In a way, this has changed the way one looks at infrastructure management. Traditionally, provisioning happens at the infrastructure level and then consumption takes place through application deployments. In an SDDC world, the consumer drives the infrastructure requirement definition first, which then gets translated directly into infrastructure provisioning functions (in the form of templates and automated tasks).”
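That last sentence is the crux, and a toy example makes it concrete. The Python sketch below is hypothetical – the template fields and the translation step are illustrative, not a real SDDC API – but it shows the inversion Natarajan describes: the consumer states a requirement, and software turns it into provisioning tasks.

```python
# Hypothetical sketch of the SDDC flow Natarajan describes: the consumer
# states what they need as a template, and the software layer translates
# it into provisioning actions. Nothing here is a real SDDC API.
TEMPLATE = {                      # consumer-defined requirement
    "app": "web-tier",
    "vms": 3,
    "vcpus_per_vm": 2,
    "storage_gb": 200,
}

def provision_from_template(t):
    # Translation step: requirement definition -> provisioning tasks.
    tasks = []
    for i in range(t["vms"]):
        tasks.append(f"create vm {t['app']}-{i} with {t['vcpus_per_vm']} vCPUs")
    tasks.append(f"attach {t['storage_gb']} GB shared storage to {t['app']}")
    return tasks

for task in provision_from_template(TEMPLATE):
    print(task)
```

The feedback loop he mentions would close the circle: usage data flows back and the template, not the hardware, is what gets tuned.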
As he points out, the main challenge will be moving a legacy data centre towards a software-defined entity, so that it becomes smart enough to enable deployment of edge or micro DCs. There are, however, methodical ways to approach this migration.
As for edge computing, Chhabra calls it an infusion of new-age applications that firms want to run at remote locations: the laws of physics apply, and there are more than enough reasons for firms to run applications locally. “That is to say, it is an infusion of edge requirements driven by applications like artificial intelligence and machine learning. We are touching the tip of the iceberg in the edge computing space. We, at Forrester, believe that a lot of innovation – in terms of infrastructure form factors and chipsets designed for purpose – is on its way and waiting to hit the market.”
It turns out that there are many more shapes data centres will take as the hands of software, abstraction and intelligence fold them into new possibilities – now, and ahead, and furiously.
DCIM and HCI are great examples of how the confluence of key technologies can positively impact and revolutionise traditional practices, Natarajan reminds us. “With hybrid cloud, automation and the proliferation of open source-based tooling, new ways of thinking about how HCI and DCIM can bring real value to an enterprise – through significant cost savings, time-to-market and, above all, flexibility – are fast becoming a reality.”
Mehta maintains that infrastructure should be cloud-friendly, automated, AI-enabled and software-defined. There is no way and no room to go back.
The metal has melted.