In early September, cloudcomputing.info was invited by HP to a closed-doors, intimate event for a small group of influencers in Cupertino. During the two-day mini-conference, very high-profile executives introduced HP's approach to cloud computing, its product and services portfolio, and its go-to-market strategy.
It's uncommon for a company of this size to pack vice presidents, CTOs, and directors into a single room to clarify its vision, and to answer questions about it for hours, for such a restricted audience (fewer than 10 people were invited, from all around the world). It's a welcome, novel approach considering how fragmented the message can be, buried deep inside tens or even hundreds of press announcements coming from the multiple business units and departments that approach a technology like cloud computing from many different angles.
A few interesting details emerged during the two days; what follows is a list of the things worth reporting.
The very first thing to report is how HP positions itself at the beginning of the cloud computing era.
Right now, the primary goal doesn't seem to be acting as the technology provider for cloud infrastructures. Selling hardware and software to build cloud computing infrastructures is of course part of the strategy, but HP doesn't seem to have made it its primary goal in this phase. The main goal seems to be acting as the business partner that assists customers during the transition to the cloud, so HP has a strong focus on consulting services.
The company offers five services: the Discovery Workshop, the Roadmap Service, the Design Service, the Security Service, and the CloudStart Solution.
cloudcomputing.info was particularly impressed by the first one, the Discovery Workshop, which was briefly demonstrated during the event.
It is basically a consulting day where multiple representatives of a single company are invited to the HP campus, in a special training room. The room is filled with giant posters describing cloud computing from many different perspectives: the 101 definitions, the infrastructure requirements, the business goals, the security risks, the identity management approaches, and so on.
The purpose of the day is not to educate a customer about what cloud computing is. Rather, the purpose is to perform a reality check and make sure that, internally, business and technical people are on the same page about what cloud computing is and why the company wants to embrace it.
At the beginning of the demo it seemed a silly exercise, but after half an hour the value of the approach became much more evident. The customer is invited to walk through the posters and openly discuss them, while the HP staff is primarily there to facilitate the discussion.
HP reports that in many cases the activity reveals profound conflicts in how cloud computing is perceived by different people in the same company, with the workshop becoming the very first opportunity to compare points of view and try to realign expectations and goals. It's hard not to believe it.
The Discovery Workshop was particularly impressive not because of the approach, which is probably not unique in the industry, but because of the quality of the content provided and the knowledge HP demonstrated in describing current cloud computing trends, challenges, and risks.
The presentation was very objective and, remarkably, completely lacked any specific reference to HP's portfolio. Not even for a single minute did HP use the workshop to pitch its products, and we were told that this is on purpose: for the company, the workshop represents an opportunity to increase awareness and open a discussion with customers, not an opportunity to place sales orders.
cloudcomputing.info didn't find the HP Cloud Security Analysis Service as convincing.
The company can perform a wide range of assessments, on behalf of the customer, on any 3rd-party cloud infrastructure the customer selects. While this should be done before a customer selects its cloud provider, HP reports that it's usually requested after adoption.
These assessments, usually 3 weeks long, may include various activities, from a review of compliance policies to an infrastructure assessment. HP generates reports and presents its findings to customers, providing guidance on remediation if requested.
The problem is that these assessments are only available on demand, and they seem to be part of a rather slow, manually driven process. There's no way for the customer to request some sort of automated assessment, triggered by the occurrence of a specific event. The customer has to recognize the need for an assessment and deal with HP consulting services to book it. Once completed, the assessment can easily be renewed, but renewal still involves interaction between the parties.
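An event-driven trigger of the kind that seems to be missing could be quite simple in principle. The following is a minimal, purely illustrative sketch; every class, event, and field name here is hypothetical and not part of any HP offering:

```python
# Illustrative sketch of an event-triggered assessment request queue.
# All names (AssessmentScheduler, event types, fields) are invented for
# illustration; this is not based on any actual HP API.

import datetime


class AssessmentScheduler:
    """Queues an assessment request whenever a relevant change occurs."""

    # Infrastructure changes that should trigger a fresh assessment
    TRIGGER_EVENTS = {"provider_config_change", "new_region", "policy_update"}

    def __init__(self):
        self.requests = []

    def on_event(self, event_type, resource):
        """React to an infrastructure event; ignore non-triggering ones."""
        if event_type in self.TRIGGER_EVENTS:
            self.requests.append({
                "resource": resource,
                "reason": event_type,
                "requested_at": datetime.datetime.utcnow().isoformat(),
            })


scheduler = AssessmentScheduler()
scheduler.on_event("provider_config_change", "eu-west storage tier")  # queued
scheduler.on_event("heartbeat", "eu-west storage tier")               # ignored
```

The point is not the code itself but the model: the assessment is requested by the infrastructure's own change events, not by a human remembering to book a consulting engagement.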
The whole process seems far from the dynamism required to deal with the rapidly changing infrastructures of cloud computing.
More than that, HP doesn't rely on any third-party, external security provider to assess its own assessment infrastructure. So if the HP tools are compromised, and the customers' assessments are compromised accordingly, there's no way to know. The customer has to trust HP blindly, without any way to verify the integrity of its systems.
Another notable presentation was the one about the new Cloud Maps extensions for the Matrix Orchestration Environment (MOE).
MOE is the orchestration framework that HP offers to automate service provisioning inside its BladeSystem Matrix converged/unified infrastructure.
Rather than just providing architecture blueprints to its customers, at the end of August HP launched a number of extensions for MOE, called Cloud Maps, that completely offload infrastructure design duties from system architects.
They are basically predefined workflows that leverage MOE to automate the deployment of a few popular software solutions, including Citrix XenApp, Microsoft Exchange/SharePoint/SQL Server, Oracle RAC/PeopleSoft/Fusion Middleware, Red Hat JBoss, SAP BusinessObjects/NetWeaver, SAS Enterprise BI, and even VMware vCloud Director.
Once they are deployed in MOE, customers find these applications available as business services in the self-service provisioning portal. A Cloud Map tells MOE exactly how to provision blades, networks, storage, operating system instances, and of course application instances to serve a certain number of users.
Additionally, the Cloud Maps are developed to scale the service if needed: customers can request more capacity through the portal while MOE adds physical resources behind the scenes.
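Conceptually, then, a Cloud Map behaves like a declarative template that maps a target user count to the concrete resources to provision. The following is a minimal, purely illustrative sketch of that idea; the field names and resource ratios are invented, since HP's actual Cloud Map format is proprietary:

```python
# Illustrative sketch of a declarative provisioning template in the spirit
# of a Cloud Map. The template, its fields, and the per-unit ratios are all
# invented for illustration; this is not HP's actual format.

import math

# Hypothetical template: resources needed per 500 users of an Exchange-like
# service (numbers are made up).
EXCHANGE_TEMPLATE = {
    "service": "Microsoft Exchange",
    "users_per_unit": 500,
    "per_unit": {
        "blades": 2,
        "storage_gb": 2048,
        "vlans": 1,
        "os_instances": 2,
    },
}


def provisioning_plan(template, target_users):
    """Scale the per-unit resources to cover target_users, rounding up."""
    units = math.ceil(target_users / template["users_per_unit"])
    return {resource: qty * units for resource, qty in template["per_unit"].items()}


# 1200 users -> 3 units of 500, so every resource is tripled
plan = provisioning_plan(EXCHANGE_TEMPLATE, 1200)
```

Scaling the service up, as described above, then amounts to re-evaluating the same template for a larger user count and provisioning the difference behind the scenes.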
Cloud Maps are developed only by HP and by third-party vendors that are part of the AllianceONE program, so customers can't easily create their own. Even so, they remain great examples of the capabilities and potential of orchestration frameworks, and they may inspire the creation of custom, sophisticated workflows for the BladeSystem Matrix.
The Cloud Maps approach greatly simplifies the implementation and use of a private cloud infrastructure, and it's a shame that HP only offers it to BladeSystem Matrix customers.
The company stated that Cloud Maps may become available in the future also for customers that don't have this specific hardware. Meanwhile, a couple of alternatives are available today: the HP Server Automation solution packs and the HP SiteScope solution templates. Of course, both are limited compared to Cloud Maps, as they cannot control the provisioning of the hardware components.
Overall, the impression was positive: HP seems well positioned to provide guidance and tools for embracing cloud computing, even though the offering could mature further in a few areas, like security.
Another point that seems evident is the fragmentation of HP's offering: the plethora of services and tools that operate in or merely support the cloud is impressive (and growing), and in some ways it reflects the vastness of the company's overall portfolio. Without guidance, even just browsing the company's website, customers can easily get lost. Here the company has to work harder to simplify its message and demonstrate the clarity and transparency it showed during the Discovery Workshop.