I see a mixed data center environment in your future


At last week’s Gartner Data Center conference in Las Vegas, the usual suspects in the race for the data center of the future were present: the ever glamorous Ms. Virtualized Infrastructure; the enigmatic Cloud brothers (Private and Public), who were often bickering with each other; a hip-looking Mr. Converged Infrastructure; an ever-smiling Mr. Proprietary Stack, looking very much like a used-car salesman; the super-athletic Mr. Hyperscale Computing, dressed in black spandex to show off his 1% body-fat physique; and a sneak appearance by Mr. Physical Infrastructure, in suit and tie, who had not been invited to the race but had shown up anyway.

Who would win the race for the data center of the future?  This being Las Vegas, there were lots of bets, and the conference was full of presentations that said it would be one or the other.  Some vendors said it would be a sweetheart combination of Virtualization and Private Cloud.  Another vendor was pretty sure it would all be Public Cloud.  Some were certain the future would be based on Hyperscale Computing, given that all the super-cool customers (like Facebook, Google, and Zynga) had adopted it.  The vendors who sponsored Mr. Proprietary Stack talked him up as a contender and made it sound painless to adopt him (“good for you, and it won’t hurt a bit,” they said with a smile).  One thing was for sure: no one thought Physical Infrastructure had much of a chance.

There was, however, one voice with a different point of view from all the rest.  It came from Dell (and was presented by yours truly), and it said that in the data center of the future, all of these technologies would be present and get along just fine (except perhaps Mr. Proprietary Stack).  Dell’s point of view was that the data center of the future would most certainly be a mixed environment.  It would be hybrid.  It would have some physical infrastructure, a lot of virtualized infrastructure, much of it implemented as a private cloud, and links to one or several public clouds.  The mix of these technologies might differ from customer to customer, but for most of them (let’s say the bottom 99%, to use a currently popular phrase), it would definitely be a mix.

How could Dell be so sure?  Because Dell talks directly to a lot of customers, and one thing they say in common is that they want a pragmatic approach to their IT future.  That means they want a path that leverages what they already have—they don’t want forklift upgrades that require them to throw away their existing investments; they don’t want proprietary solutions that lock them in or won’t integrate with their existing infrastructure; they don’t want to have to find specialized IT skills or retrain their staff; and they don’t want to have to rewrite applications to run on some new technology platform, be it virtualization, hyperscale computing, or cloud.  This applications issue is a really big deal—companies rely on thousands of applications, many of them custom, to run their business.  They need to make sure the infrastructure they buy will run the applications they depend on without needing to rewrite them, which can be a lengthy, risky process.  For this reason, many customers stick with physical infrastructure even though virtualization could provide them with infrastructure cost savings.  This is also one reason customers may hesitate to rush to the cloud: they first want to ensure their applications will work in that new environment.

The pace at which customers adopt interesting new technology for their data centers is governed by what we can call their “technology absorption rate,” which is a lot slower than the rate at which vendors develop the technology.  It is slower for three main reasons:

  • Risk: Will the new technology really work as advertised?  Will it handle their applications under load?  How tried and tested is it?  Will it end up costing more because of unforeseen challenges?
  • People and Process: Any new technology that requires radical changes in customers’ IT processes, people, or organizational structures will find a slow ramp.  Changing process or people is a non-trivial task, and most IT leaders approach it cautiously.
  • Applications: As mentioned earlier, applications are king in a customer’s environment.  If the new technology won’t support their apps, custom or packaged, then it won’t be adopted broadly.

Dell understands these concerns, and our strategy is to provide a unique combination of leading-edge technology and pragmatism.  For example, we can help customers with the latest technologies, such as advanced tiered storage architectures, private cloud, self-service delivery of IT, Hadoop MapReduce to handle their big data needs, burst capacity to public clouds, and SaaS delivery; but we also approach all of this very pragmatically.  We don’t expect our customers to do forklift upgrades; we don’t expect them to hire specialized new skills; and we don’t expect them to take risks with their applications.

That is why the design philosophy for our enterprise products is to provide:

  1. Innovation that delivers and is uncompromising where it matters
  2. Technology that is simple to use every day, and
  3. Architectures that are open, that integrate with existing environments and allow choice. 

Some examples of this design philosophy: Dell’s Advanced Infrastructure Manager (from our Scalent acquisition), which converges and orchestrates storage, networking, and compute (the foundation for a private cloud) but does so in a heterogeneous and open manner; our storage arrays based on the Fluid Data architecture, which are designed to simply work; and our Boomi integration offerings, which help connect applications running on-premise and off-premise so you can run each application where it is best suited while ensuring that data flows coherently between them.  That’s being practical.

An illustrative case study is Carnival Cruise Lines.  Carnival Cruise Lines is about fun—indeed, their ships are called The Fun Ships.  Carnival operates over 20 cruise ships, each of which has a data center.  A Carnival cruise ship data center is unique in that when the ship is out to sea, its data center gets pretty remote.  If there is a major IT issue in this floating data center—say, the storage systems go down—it’s not as if you can just fly a specialized IT resource out to the ship to fix it.  Carnival Cruise Lines chose EqualLogic storage from Dell because of its superb uptime record (uncompromising where it matters) and because it is so simple to use every day (simply works) that general IT staff can manage it.  Dell helped Carnival modernize and virtualize their shipboard server farms with the latest x86-based (open) servers, allowing them to throw about 60% of their servers overboard and achieve significant cost savings.  Take a look at this video interview with Doug Eney, Carnival’s VP of Information Systems Engineering, for more insight.

Dell’s unique approach of innovative yet pragmatic technology could end up being the winning approach to the data center of the future, because it takes into account both the latest technologies and customers’ rate of absorbing them.

About the Author: Praveen Asthana
