Virtualization and “The Cloud”: No Silver Lining Ahead For Enterprise Data Center Vendors


Paul Sagawa / Artur Pylak

203.901.1633 / 203.901.1634

sagawa@ / apylak@sector-sovereign.com

January 19, 2012


  • Virtualization and “the cloud” are NOT equivalent concepts. Virtualization is a software technology that allows many users to flexibly and securely share a computing resource. It can be implemented within a single private enterprise data center, or by a public host with a distributed network of Internet-attached shared data centers. It is the latter that is typically called “the cloud”, and the large-scale providers offering this service have very different technology needs and buying habits than enterprise IT departments do. As enterprises shift focus from virtualizing their own data centers, through web-connecting them as a “private cloud”, to relying on public cloud hosts, the basis of competition for most IT vendors will shift with them. With incremental demand growth coming from public cloud hosts, IT hardware markets will commoditize, commercial infrastructure software vendors will lose share to self-supported open source solutions, and application vendors will face the rise of new cloud-optimized rivals.
  • Virtualization software divides underlying computing, storage and network resources amongst many users, each with a secure “virtual machine” that appears as a dedicated resource. Users may not be aware that their applications are not running on their own device, but rather on a shared server, using shared storage, and appearing on their monitors via shared networks. The data center may be local or distant; it allocates resources as necessary and authorized while keeping individual sessions strictly separate. For the enterprise, virtualization offers many advantages, chief amongst them the avoidance of device hardware upgrades, efficient allocation of data center resources, greater ease of software maintenance and upgrades, and a reduced burden on IT organizations.
  • Cloud computing refers to data centers that are accessed across many locations using the Internet, typically using virtualization technology to efficiently share IT resources. Cloud computing is not synonymous with virtualization, although most cloud implementations employ virtualization technology. Today, most virtualization takes place in enterprise data centers accessed by a closed set of users across a proprietary enterprise network.
  • Enterprises are completing the virtualization of their privately operated data centers, and many are adding Internet accessibility, a configuration often called a “private cloud”. Private cloud implementations are a very modest step beyond data center virtualization. The computing and storage resources are shared only within the enterprise itself, limiting the opportunity to gain economies of scale, maximize capacity utilization or offload management responsibility. Moreover, the operational benefits possible from providing broader access may be muted if the applications are not designed to take proper advantage of opportunities for greater collaboration, more timely access, and location awareness.
  • Most CIOs anticipate moving much of their computing to commercial 3rd party cloud-based hosts offering significant long-term cost and flexibility advantages. For most organizations, virtualizing data centers is a step on a longer road to the public cloud. Relative to private solutions, public cloud data centers offer superior performance for users, reduced costs via scale and utilization, greatly improved flexibility, world-class support and maintenance, and opportunities for collaborative, time sensitive and location aware applications. As such, “The Cloud” has risen dramatically in recent CIO surveys, while “Virtualization”, meaning private data center virtualization, has fallen in priority.
  • Enterprise data center virtualization, including “private cloud” implementations, has been lucrative for IT hardware and software leaders, but spending may be peaking. Growth for major virtualization-linked vendors like VMWare, EMC, NetApp, Citrix, RedHat, Oracle, and the server arms of Dell and HP has been strong. However, this growth has been decelerating, making the drop in priority among CIOs and Oracle’s recent miss ominous. We believe that 2012 is likely to be disappointing for enterprise data center spending, with pessimism over future years likely to increase.
  • Spending on public cloud data centers will accelerate, spurring market shift to commodity hardware and open source software. The big bright spot in IT spending will likely be aggressive expansion by major public cloud hosting operations, such as Amazon, IBM, Microsoft, and Google. While many companies have turned to focus on this arena, we believe dramatic economies of scale and the ability to lever investments in enterprise cloud businesses will quickly separate leaders from pretenders. Unfortunately for data center hardware and software players, these leaders are amongst the most technology savvy companies on earth. As the cloud becomes the primary agent of industry growth, it will drive commoditization and an increasing reliance on open source and internally developed proprietary software, to the detriment of the existing market leaders.
  • Big public cloud will win. Specifically, the top cloud hosts, the best purpose-built SaaS vendors, the most cloud savvy IT consultants, and the lowest cost commodity component suppliers are positioned to capture a disproportionate share of enterprise IT market value.
  • Traditional enterprise IT vendors will lose. Similarly, premium hardware and software vendors into the enterprise IT data center will suffer, as customer relationships open to competition and commodity solutions pressure prices. This includes makers of PCs, servers, storage systems, and networking gear, as well as most traditional software players.

Virtualization is Real

The concept of virtualization dates back to the dawn of the mainframe.  Simply put, virtualization inserts a layer of software on top of a physical server or storage system that allows that resource to be shared by various users and applications while keeping the activity of each distinct usage wholly separate.  For example, a user pulls up an application on a desktop computer.  That application is actually running on a “virtual machine” managed by the virtualization software layer and running on top of a data center server (Exhibit 1).  To the user, it looks and feels like it is on the desktop PC, but by running it on the server, the software can be more quickly deployed, better maintained, more easily upgraded, more efficiently run, and at lower cost to boot.

Exh 1: The Traditional Desktop versus the Virtualized Desktop

There are several types of virtualization, which can be broadly classified in four categories: operating system, server, storage, and application (Exhibit 2). The most widely visible is operating system virtualization, which is essentially running more than one virtual machine on a single physical computer. A virtual machine is an independent instance of an operating system, with one or more applications running, that uses the local resources of the host machine. An operating system cannot distinguish between a virtual and a physical machine, nor can other applications or computers on a network. This allows multiple applications within separate virtual machines to run on a single physical computer. A familiar example is enabling a Mac OS X computer to run a Windows application like MS Access.
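For intuition, the hypervisor’s core job can be sketched in a few lines of toy code. This is purely illustrative – the `Host` and `VirtualMachine` classes are our own invented names, not any vendor’s API – but it shows how one physical machine’s fixed resources get carved into isolated virtual machines:

```python
# Conceptual sketch only: a toy "hypervisor" showing how one physical host
# carves out several isolated virtual machines. All names are illustrative.

class VirtualMachine:
    """An isolated slice of the host's resources."""
    def __init__(self, name, cpus, ram_gb):
        self.name = name
        self.cpus = cpus
        self.ram_gb = ram_gb
        self.apps = []          # each VM runs its own applications

    def run(self, app):
        self.apps.append(app)   # apps in one VM cannot see another VM's apps


class Host:
    """A physical server whose resources are shared among VMs."""
    def __init__(self, cpus, ram_gb):
        self.free_cpus = cpus
        self.free_ram = ram_gb
        self.vms = []

    def create_vm(self, name, cpus, ram_gb):
        # The hypervisor enforces the resource budget of the physical box.
        if cpus > self.free_cpus or ram_gb > self.free_ram:
            raise RuntimeError("insufficient physical resources")
        vm = VirtualMachine(name, cpus, ram_gb)
        self.free_cpus -= cpus
        self.free_ram -= ram_gb
        self.vms.append(vm)
        return vm


host = Host(cpus=16, ram_gb=64)
win_vm = host.create_vm("windows-guest", cpus=4, ram_gb=8)
mac_vm = host.create_vm("osx-guest", cpus=4, ram_gb=8)
win_vm.run("MS Access")                 # runs only inside its own VM
print(len(host.vms), host.free_cpus)    # 2 VMs created; 8 of 16 CPUs free
```

The key properties are that the VMs draw on a common pool of physical resources and that one VM’s applications are invisible to another’s – the isolation that makes secure sharing possible.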

Exh 2: Categories of Virtualization

On servers, virtualization can enable the efficient utilization of resources.  As users come onto a network, servers can be powered up and resources managed for optimal performance. Conversely, during off-peak times, servers can idle. And when a server or disk is being serviced or upgraded, resources can be shifted in the background, allowing an application to continue running uninterrupted. Storage virtualization is similar: a Storage Area Network (SAN) pools distributed storage so that it appears as a single physical device, and a file need not reside on the local user’s computer. Application virtualization works the same way: you may be running MS Office on your machine even though the suite is not installed there, yet it can still communicate with the local OS, middleware, plugins, and other applications. Your system provides the processing power and RAM to run the app, but nothing is stored locally.
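The utilization benefit described above can be illustrated with a toy first-fit packing sketch – our own simplification, not how any commercial hypervisor actually schedules: consolidate workloads onto as few servers as possible, and let the empty ones idle.

```python
# Illustrative sketch only: pack workloads (CPU demands) onto as few
# servers as possible, so unused servers can be powered down.

def place_workloads(workloads, server_capacity, n_servers):
    """First-fit packing: returns the load placed on each server.
    Servers left with zero load can idle or be powered off."""
    loads = [0] * n_servers
    for demand in workloads:
        for i in range(n_servers):
            if loads[i] + demand <= server_capacity:
                loads[i] += demand
                break
        else:
            raise RuntimeError("pool is out of capacity")
    return loads


# Ten small workloads fit on 2 of 4 servers; the other 2 can idle.
loads = place_workloads([2] * 10, server_capacity=10, n_servers=4)
idle = sum(1 for load in loads if load == 0)
print(loads, idle)   # [10, 10, 0, 0] 2
```

Without virtualization, each of those ten workloads might occupy its own lightly loaded physical server; the packing step is what converts shared capacity into the cost savings the text describes.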

Double Rainbow!  What Does It Mean!?

Over the past half decade, enterprises have moved aggressively to implement virtualization within their own internal data centers, enticed by cost efficiencies, increased flexibility and enhanced control over technology use within the organization. Gartner maintains that over 80% of enterprises have some virtualization program or project as of early 2011, and that 49% of x86-Architecture workloads are run in virtual machines (Exhibit 3). A study by Veeam Software is more aggressive, suggesting that 98% of organizations are using at least some virtualization and that nearly 40% of all servers worldwide are virtualized.

The central role of virtualization in enterprise IT has far-reaching implications for technology vendors.  First, it shifts processing demand from ever faster desktop devices to ever larger servers.  It shifts storage from local hard drives to shared storage systems.  It means software is deployed centrally rather than in individual copies on each PC.  It also means that the architecture of the data center becomes that much more complicated.

Exh 3: Percentage of x86-Architecture Workloads Running Virtual Machines

This has been good for most of the big players in data center IT.  Enterprise IT managers need help with the transition to virtualization, so they contract for consulting services from the likes of IBM and Accenture.  Enterprises buy turn-key virtualization solutions, favoring market leaders like VMWare, EMC, Brocade, NetApp, IBM, HP, Citrix and Cisco (Exhibit 4).  Virtualization supports seamless migration of most enterprise software, so major software vendors like Microsoft and Oracle have the opportunity to lever their products to take advantage of the new architecture.  On the flip side, virtualization reduces the urgency to upgrade PCs, so Microsoft, Dell, HP and others may take a hit, particularly as smartphones and tablets take hold in the enterprise market.

Meanwhile, Back in the Cloud …

While enterprises move applications onto virtual machines in the data center, consumers have embraced the cloud, which we will define as applications served from publicly accessible commercial data centers connected directly to the Internet.  This is in contrast to traditional applications which are stored and run directly on user devices – note that most “apps” downloaded to smartphones and tablets are merely access shortcuts to cloud-based services which rely on internet data centers for processing and storage.

Exh 4: Sales Growth of Major Virtualization Linked Vendors

Cloud-based applications like search, social networking, and streaming media have become the focus of the consumer computing experience. In response, leaders like Google, Amazon, Facebook and Apple have stepped up to billions of dollars in annual IT investments, while content delivery networks like Akamai, Level 3 and Limelight have built out their own data center assets to provide cloud service for consumer cloud application providers such as Netflix (Exhibit 5).  The rise of the smartphone and tablet, abetted by the introduction of 4G wireless broadband, is quickly expanding the opportunities for users to access their apps, and users are responding.  As such, the strong growth in IT spending by this segment can be expected to continue.

And Back to the Enterprise …

From many perspectives, virtualization is a stepping stone for enterprises on the way to the cloud (Exhibit 6).  The underlying technologies of virtualization make “private cloud” networks possible – where computing capabilities are pooled through an in-house data center and made accessible via secure Internet links. In a private cloud, PCs act as dumb terminals, accessing enterprise applications via their browsers or customized “apps” as they would a consumer service like Facebook.  This approach standardizes access for users and allows organizations to replace aging PCs with inexpensive tablets or netbooks rather than pricey upgraded PCs.  While it offers some incremental benefit vs. simple data center virtualization, scale economies and efficient utilization are very limited and the onus of IT management remains within the organization.

Exh 5: Plant, Property, and Equipment Positions of Major Data Center Players

The next step is the public cloud, which, at a minimum, involves moving computing and storage out of internally operated data centers and into commercially operated, Internet-accessible data centers.  The least aggressive move is closely akin to traditional data center outsourcing – an organization’s software platforms and applications are ported, as is, directly onto a 3rd party host’s infrastructure, and the host takes responsibility for server, storage and network access capacity.  A more aggressive approach is to move just the applications and rely on a cloud host for basic software platforms – i.e. operating systems, database engines, etc.  The most aggressive approach is to rely on the cloud provider for applications as well, a la Salesforce.com.  At all three of these levels – infrastructure only, platform, and applications – the cloud capability and software is typically paid for as an on-going service, rather than a one-time purchase.
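These three levels are easy to confuse, so a schematic may help – this is our own simplification of the infrastructure/platform/application split, not any provider’s official taxonomy. At each level, the host manages the stack from the bottom up to a certain layer, and the customer keeps everything above it:

```python
# Schematic sketch (our own simplification): who manages which layer of the
# stack under the three public cloud service models described above.

STACK = ["hardware", "virtualization", "operating system",
         "database/middleware", "application"]

# For each model, the number of layers (from the bottom up) that the
# cloud host manages; everything above remains the customer's job.
HOST_MANAGES = {
    "infrastructure only": 2,   # host runs hardware + virtualization
    "platform": 4,              # host also runs OS and database/middleware
    "applications": 5,          # host runs the full stack, a la Salesforce.com
}


def responsibility(model, layer):
    """Return who manages a given stack layer under a given service model."""
    idx = STACK.index(layer)
    return "host" if idx < HOST_MANAGES[model] else "customer"


print(responsibility("infrastructure only", "operating system"))  # customer
print(responsibility("platform", "application"))                  # customer
print(responsibility("applications", "application"))              # host
```

The pattern matches the progression in the text: each step up hands another slice of the stack to the host, and each is paid for as an on-going service rather than a one-time purchase.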

As solutions rely more on the cloud host, the benefits build. Hosts with enormous consumer cloud operations, like Amazon, IBM, Microsoft and Google, can offer scale economies and resource utilization far beyond the reach of enterprise data centers. Their architectures distribute data center resources broadly, giving users superior performance across many locations and geographies. Their deep expertise becomes an asset to their customers in an era of scarce technical talent. Applications designed for the cloud can take full advantage of the breadth of access to support collaboration, mobility and other increasingly important attributes in ways that traditional enterprise applications cannot.

Exh 6: Virtualization to Cloud Road Map

Go Big or Go Home

A substantial shift in enterprise IT from building internal virtualized data centers to moving applications to public cloud hosts has major repercussions for all of the businesses upstream. The enterprise data center is a great customer. Remembering the old adage “No one ever got fired for buying IBM”, enterprise IT buyers are sticky customers, heavily impressed by track record. Most lack the expertise and the manpower to adequately evaluate competing technology solutions on their own, so they rely on longstanding relationships and external consultants to inform their purchase decisions, usually to the benefit of industry leaders. Enterprise IT shops, lacking leading-edge expertise, are also prone to buy turnkey solutions to avoid the risks of incompatibilities and the headache of managing too many vendors. If they can possibly justify it, they also prefer to buy top-of-the-line solutions, again to save headaches, but also to establish bragging rights at the annual users group boondoggles. This culture has long rewarded sales-heavy companies in the IBM tradition, like EMC, Cisco, Oracle, SAP and VMWare. Even competitors that had traditionally been less “salesy”, like HP and Dell, have moved emphatically in that direction over the past decade.

The big cloud operators do not buy this way. Data centers are painstakingly designed to deliver maximum performance at the lowest possible lifetime cost. Bells and whistles are eschewed, and value added is the province of proprietary software written by the customer itself. Hardware is commoditized, bought in huge volumes from interchangeable vendors. Some cloud hosts – Google comes to mind – actually buy bare components in bulk and engage contract manufacturers to assemble them into boards that slide into generic racks. If a systems-level vendor is to prosper here, it must offer the best value for money and accept the price that the market will bear. This is true for servers. This is true for storage. This is true for networking.

Software is just as bad. Cloud hosts will use branded software if customers demand it, but they have considerably more leverage over price than the typical enterprise buyer and are very unlikely to buy a maintenance contract. If a customer doesn’t specify brand-name infrastructure software, most cloud hosts will go with low-cost open source options – again, without the maintenance. Note that these highly sophisticated IT operations do not require a 3rd party to package open source software for their needs, a la RedHat; rather, open source is used as a framework for internally developed, proprietary software that serves as a competitive advantage. For example, in database management, a major infrastructure software platform, Google and Facebook are arguably the biggest, fastest and most sophisticated operators in the world, with the technology underlying their massive search and social networking offerings. Companies like these are NOT good candidates for turnkey software solutions.

Who Wins

As the enterprise shift to the cloud plays out over the next several years, several categories of IT players are positioned to take advantage (Exhibit 7). Chief amongst these are the leading cloud hosts. Here, we prefer well established players with scale, skill and leverage against consumer cloud franchises. Amazon, Google and Microsoft are obvious, with IBM able to leverage its considerable consulting franchise into success in its fourth era of computing. While it has shown no interest in the market, we would also watch Facebook as a dark horse, given its enormous data processing infrastructure and its towering competence in “big data”.

Exh 7: Winners and Losers

We also see leading SaaS players as winners, of which Salesforce.com stands out. The ranks of independent SaaS plays are thinning, with traditional software giants on the M&A trail. Just this year both RightNow (Oracle) and SuccessFactors (SAP) were taken out. NetSuite is the most prominent of the remaining public SaaS plays, with a handful of private players like Workday and Intacct possible acquisition bait. Of the bigger software players, we see Microsoft and IBM as having established the best beachheads in SaaS.

Given the complexity of a move to the cloud, we believe most CIOs will rely on IT consultants to map out their transition and aid in the implementation. IBM and Accenture stand out, with a raft of would-be rivals jockeying for position behind.

While the commoditization of hardware markets is a difficult scenario for most IT vendors, those with cost leadership could see share gains as a result of the shift to the cloud. Contract manufacturers could benefit, as could storage and memory component makers like Seagate, Western Digital and SanDisk.

Who Loses?

The big losers will be traditional IT vendors who don’t make a strong successful transition to the cloud. On the hardware side, we see big names like HP, Dell, EMC, Cisco, NetApp and others as particularly exposed to commoditization pressures in the transition to the cloud. Software vendors will also be vulnerable to the shift, as cloud providers eschew commercial packages in favor of their own open-source based infrastructure solutions. Even application vendors will be under pressure, as a shift to the cloud opens established accounts to new solutions. We see Oracle, SAP, RedHat and others as potentially disadvantaged by long term trends.
