About the HP cloud computing strategy

Posted by virtualization.info Staff   |   Thursday, October 21st, 2010   |  

At the beginning of September, cloudcomputing.info was invited by HP to a closed-door, intimate event for a few influencers in Cupertino. During the two-day mini-conference, very high-profile executives introduced the HP approach to cloud computing, the product and services portfolio, and the go-to-market strategy.

It’s uncommon for a company of this size to pack vice presidents, CTOs, and directors into a single room to clarify its vision, and answer questions about it for hours, just for a very restricted audience (fewer than 10 people were invited, from all around the world). It’s a welcome, novel approach considering how fragmented the message can be, buried deep inside tens or even hundreds of press announcements coming from multiple business units and departments that approach a technology like cloud computing from many different angles.

A few interesting details emerged during the two days; what follows is a list of the things worth reporting.

The very first thing to report is how HP positions itself at the beginning of the cloud computing era.

Right now, the primary goal doesn’t seem to be acting as the technology provider for cloud infrastructures. Selling hardware and software to build cloud computing infrastructures is of course part of the strategy, but HP doesn’t seem to have set it as its primary goal in this phase. The main goal seems rather to be acting as the business partner that assists during the transition to the cloud, so HP has a strong focus on consulting services.
The company offers five services: the Discovery Workshop, the Roadmap Service, the Design Service, the Security Service, and the CloudStart Solution.

cloudcomputing.info was particularly impressed by the first one, the Discovery Workshop, which was briefly demonstrated during the event.
It is basically a consulting day where multiple representatives of a single company are invited to the HP campus, in a special training room. The room is filled with giant posters describing cloud computing from many different perspectives: the 101 definitions, the infrastructure requirements, the business goals, the security risks, the identity management approaches, etc.
The purpose of the day is not to educate a customer about what cloud computing is. The purpose, rather, is to do a reality check and make sure that, internally, business and technical people are on the same page about what cloud computing is and why the company wants to embrace it.

At the beginning of the demo it seemed a silly exercise, but after half an hour the value of this approach became much more evident. The customer is invited to walk through the posters and openly discuss them, while HP staff is primarily there to facilitate the discussion.
HP reports that in many cases the activity reveals profound conflicts in how cloud computing is perceived by different people in the same company, with the workshop becoming the very first opportunity to compare points of view and try to realign expectations and goals. It’s easy to believe.

The Discovery Workshop was particularly impressive not because of the approach, which is probably not unique in the industry, but for the quality of the content provided, and the knowledge HP demonstrated in describing current cloud computing trends, challenges, and risks.
The presentation was very objective and, remarkably, completely lacking any specific reference to HP’s portfolio. Not even for a single minute did HP use the workshop to pitch its products, and we were told that this is on purpose: for the company, the workshop represents an opportunity to increase awareness and open a discussion with customers, not an opportunity to place sales orders.

 

cloudcomputing.info didn’t find the HP Cloud Security Analysis Service as convincing.

The company can perform a wide range of assessments, on behalf of the customer, on any third-party cloud infrastructure selected by the customer. While this should be done before a customer selects its cloud provider, HP reports that it’s usually requested after adoption.
These assessments, usually three weeks long, may include various activities, from review of compliance policies to infrastructure assessment. HP generates reports and presents findings to customers, providing guidance on remediation if requested.

The problem is that these assessments are only available on demand, and they seem part of a rather slow, manually driven process. There’s no way for the customer to request some sort of automated assessment triggered by the occurrence of a specific event. The customer has to recognize the need for an assessment and deal with HP consulting services to book it. Once completed, it can be easily renewed, but that still involves interaction between the parties.
The whole process seems far from the dynamism required to deal with the rapidly changing infrastructures of cloud computing.
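To make the contrast concrete, here is a minimal sketch of the kind of event-triggered assessment scheduling the article finds missing. This is purely illustrative: HP offers no such API, and every name below (`CloudEvent`, `AssessmentScheduler`, the event kinds) is hypothetical.

```python
# Hypothetical sketch: automatically queue a security assessment when a watched
# cloud event occurs, instead of the manual booking process described above.
# All names and event kinds are illustrative, not an actual HP (or any) API.

from dataclasses import dataclass, field

@dataclass
class CloudEvent:
    kind: str        # e.g. "provider_policy_change", "infrastructure_change"
    provider: str    # which cloud provider the event concerns

@dataclass
class AssessmentScheduler:
    # Event kinds that should automatically trigger a new assessment
    triggers: set = field(default_factory=lambda: {
        "provider_policy_change", "infrastructure_change"})
    queued: list = field(default_factory=list)

    def on_event(self, event: CloudEvent) -> bool:
        """Queue an assessment when a watched event occurs; ignore the rest."""
        if event.kind in self.triggers:
            self.queued.append(f"assess:{event.provider}:{event.kind}")
            return True
        return False

scheduler = AssessmentScheduler()
scheduler.on_event(CloudEvent("provider_policy_change", "example-cloud"))
print(scheduler.queued)   # one assessment queued for example-cloud
```

The point of the sketch is the design choice: the trigger logic, not the customer, recognizes when an assessment is needed, which is the dynamism the manual process lacks.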

Moreover, HP doesn’t rely on any third-party, external security provider to assess its own assessment infrastructure. So if the HP tools are compromised, and the customers’ assessments are compromised accordingly, there’s no way to know. The customer has to blindly trust HP, without any chance to verify the integrity of the systems.
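For illustration only, here is one shape such verification could take: if an independent party signed each assessment report, the customer could detect tampering. No such mechanism exists in HP’s service per the article; the key handling and flow below are assumptions.

```python
# Illustrative sketch: signed assessment reports a customer could verify
# independently. Hypothetical; the article notes no such mechanism is offered.

import hashlib
import hmac

def sign_report(report: bytes, key: bytes) -> str:
    """An independent party signs the report with a key shared out-of-band."""
    return hmac.new(key, report, hashlib.sha256).hexdigest()

def verify_report(report: bytes, signature: str, key: bytes) -> bool:
    """Customer recomputes the signature; a mismatch means tampering."""
    return hmac.compare_digest(sign_report(report, key), signature)

key = b"shared-verification-key"     # assumed out-of-band key exchange
report = b"assessment findings ..."
sig = sign_report(report, key)
print(verify_report(report, sig, key))        # True: report is intact
print(verify_report(b"tampered", sig, key))   # False: tampering detected
```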

 

Another notable presentation was the one about the new Cloud Maps extensions for the Matrix Orchestration Environment (MOE).

MOE is the orchestration framework that HP offers to automate service provisioning inside its BladeSystem Matrix converged/unified infrastructure.
Rather than just providing architecture blueprints to its customers, HP launched at the end of August a number of extensions for MOE, called Cloud Maps, that completely offload system architects from infrastructure design duties.
They are basically predefined workflows that leverage MOE to automate the deployment of a few popular software solutions, including Citrix XenApp, Microsoft Exchange/SharePoint/SQL Server, Oracle RAC/PeopleSoft/Fusion Middleware, Red Hat JBoss, SAP BusinessObjects/NetWeaver, SAS Enterprise BI, and even VMware vCloud Director.

Once deployed in MOE, customers find these applications available as business services in the self-service provisioning portal. A Cloud Map tells MOE exactly how to provision blades, networks, storage, operating system instances, and of course application instances to serve a certain number of users.
Additionally, Cloud Maps are developed to scale the service if needed: customers can request more capacity through the portal while MOE adds physical resources behind the scenes.
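The idea behind a Cloud Map can be sketched as a declarative template that an orchestrator expands into a sized provisioning plan. The structure, field names, and sizing numbers below are hypothetical illustrations, not HP’s actual format.

```python
# Hypothetical sketch of a Cloud-Map-like template: resources are declared per
# block of users, and the orchestrator scales the plan to the requested count.
# Field names and sizing figures are illustrative, not HP's actual format.

EXCHANGE_MAP = {
    "service": "Microsoft Exchange",
    "per_500_users": {          # resources sized per block of 500 users
        "blades": 1,
        "os_instances": 2,
        "storage_gb": 400,
    },
}

def provision_plan(cloud_map: dict, users: int) -> dict:
    """Expand a map into a provisioning plan for the requested user count."""
    blocks = -(-users // 500)   # ceiling division: number of 500-user blocks
    sizing = cloud_map["per_500_users"]
    return {resource: amount * blocks for resource, amount in sizing.items()}

print(provision_plan(EXCHANGE_MAP, 1200))
# 1200 users -> 3 blocks: {'blades': 3, 'os_instances': 6, 'storage_gb': 1200}
```

Requesting more capacity through the portal would then amount to re-running the same expansion with a larger user count, which is the scaling behavior described above.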

While Cloud Maps are developed only by HP and third-party vendors in the AllianceONE program, so customers can’t easily create their own, they remain great examples of the capabilities and potential of orchestration frameworks, inspiring the creation of custom, sophisticated workflows for the BladeSystem Matrix.

The Cloud Maps approach greatly simplifies the implementation and use of a private cloud infrastructure, and it’s a shame that HP only offers them to BladeSystem Matrix customers.
The company stated that Cloud Maps may in the future also become available to customers that don’t have this specific hardware. Meanwhile, there are a couple of alternatives available today: the HP Server Automation solution packs and the HP SiteScope solution templates. Of course both are limited compared to Cloud Maps, as they cannot control the provisioning of the hardware components.

 

Overall the impression was positive: HP seems well positioned to provide guidance and tools to embrace cloud computing, though the offering could mature further in a few areas, like security.
Another point that seems evident is the fragmentation of the HP offering: the plethora of services and tools that operate in or just support the cloud is impressive (and growing), and in some ways reflects the vastness of the company’s overall portfolio. Without guidance, even just browsing the company’s website, customers can get lost easily. Here the company has to work harder to simplify the message and demonstrate the clarity and transparency that it showed during its Discovery Workshop.

