How Google implements disaster recovery for Apps

Crafted the right way, the message that running IT infrastructure in a public cloud beats keeping it on-premises is an easy sell to a potential customer.
A typical argument, for example, is the cost of disaster recovery.

In March Google published a post on the subject, educating readers about the Recovery Point Objective (RPO), how much data you’re willing to lose when things go wrong, and the Recovery Time Objective (RTO), how long you’re willing to go without service after a disaster.
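To make the two terms concrete, here is a toy illustration with hypothetical numbers (not from Google's post): the worst-case data loss equals the interval between backups, and the worst-case downtime equals the time a restore takes.

```python
# Toy illustration with hypothetical numbers (not from Google's post):
# worst-case data loss equals the backup interval (RPO exposure),
# worst-case downtime equals the restore time (RTO exposure).

def meets_objectives(backup_interval_h, restore_time_h, rpo_h, rto_h):
    """Return True if a backup schedule satisfies the RPO/RTO targets (hours)."""
    return backup_interval_h <= rpo_h and restore_time_h <= rto_h

# Nightly backups with an 8-hour restore, against a 24h RPO / 4h RTO target:
print(meets_objectives(24, 8, rpo_h=24, rto_h=4))  # False: the restore is too slow

# Hourly backups with a 2-hour restore, against 4h RPO / 4h RTO:
print(meets_objectives(1, 2, rpo_h=4, rto_h=4))    # True
```

Tightening either objective is what drives the costs Google's post goes on to compare.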

Google compared the RPO, the RTO and the costs of performing disaster recovery between a traditional on-premises data center and its white label SaaS platform Apps:

In larger businesses, companies will add a storage area network (SAN), which is a consolidated place for all storage. SANs are expensive, and even then, you’re out of luck if your data center goes down. So the largest enterprises will build an entirely new data center somewhere else, with another set of identical mail servers, another SAN and more people to staff them…


Google introduces Apps for Government

Earlier this week, Google announced a new edition of its white label Software-as-a-Service (SaaS) offering Apps (formerly Apps for Domains): Google Apps for Government.

The new edition, which costs $50 per user per year, includes some special security features to comply with US regulations.

Thanks to these features Google Apps for Government is the first SaaS suite to receive the US Federal Information Security Management Act (FISMA) certification:

FISMA assigns specific responsibilities to federal agencies, the National Institute of Standards and Technology (NIST) and the Office of Management and Budget (OMB) in order to strengthen information system security. In particular, FISMA requires the head of each agency to implement policies and procedures to cost-effectively reduce information technology security risks to an acceptable level…


SaaS market to grow to $40.5B by 2014 says IDC

In June IDC released a new market analysis titled Worldwide Software as a Service 2010-2014 Forecast: Software Will Never Be the Same.

It includes some very interesting data points. For example, SaaS represented just over 73% of IT public cloud services revenue in 2009, reaching $13.1B.

It also includes remarkable predictions:

  • In 2010 annual perpetual license revenue will decline by nearly $7B as the industry shifts to subscription models
  • By 2012 nearly 85% of net-new software firms will be built around the SaaS model
  • By 2014 revenue will grow to $40.5B, at a compound annual growth rate (CAGR) of 25.3%
  • By 2014 about 65% of new products from established ISVs will be delivered through a SaaS model
  • By 2014 SaaS will account for just over half of market revenue. The rest will be divided among IaaS and PaaS
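IDC's headline numbers are internally consistent: $13.1B of 2009 revenue compounding at the stated 25.3% CAGR for five years lands almost exactly on the 2014 figure.

```python
# Sanity check of IDC's figures: compound the 2009 revenue at the stated CAGR.
revenue_2009 = 13.1   # $B, 2009 SaaS revenue from the report
cagr = 0.253          # 25.3% compound annual growth rate
years = 5             # 2009 -> 2014

revenue_2014 = revenue_2009 * (1 + cagr) ** years
print(round(revenue_2014, 1))  # 40.5
```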

A private cloud is still a giant computer for VMware

Who remembers VMware’s marketing message in 2008? It was all about a giant computer for which VMware provided the operating system, called the Virtual Datacenter OS or VDC-OS.

The VDC-OS rapidly disappeared because it had too many similarities with the mainframe, and was slowly replaced by the ubiquitous private cloud computing concept we have today. Despite the name change, the idea of a giant computer remains.

Yesterday InformationWeek published an exclusive interview with VMware CEO Paul Maritz. He offered some interesting details, such as the desire of some of the world’s largest banks to abandon the mainframe:

“We’re starting to get—for the first time ever—some very significant companies saying, ‘We have decided in principle—we don’t know when, but we have decided—to move off of the mainframe.’ This is one of the world’s three largest banks telling us this,” Maritz said.


Yahoo! to move to Xen for cloud computing

In a roundtable hosted by the Association for Computing Machinery (ACM) last month, Surendra Reddy, Vice President of Cloud Computing at Yahoo!, revealed that the company is adopting the open source hypervisor Xen.

The information was provided during a discussion about the challenges of networking management inside cloud computing infrastructures:

We are moving to Xen and building a new data-center architecture with flat networks. We tried to use VLANs, but we have taken a different approach and are going to a flat layer 2 network. On top of this we are building an open vSwitch model placing everything in the fabric on the server.

My problem is in responding to the service requirements of my applications and addressing things such as latency and throughput. The data needed to address these issues is not available from either a network virtualization solution or the hypervisor.

Also, my uplink from the switches is 10 gigabits per second or multiple 10 gigabits per second, but my NICs are only one gig. If I run 10 VMs on a box, then all of the bandwidth aggregates on one or two NICs.

Amazon allows penetration testing against EC2

Trying to address the chronic lack of transparency that plagues today’s public cloud computing offerings, Amazon has just published a new policy that allows customers (and security researchers) to perform penetration testing inside EC2.

The company already defines what is considered a security attack, or network abuse, in its Acceptable Use Policy. An EC2 customer that wants to simulate a real-world attack without violating that policy has to request permission to run a penetration test. Amazon keeps the request confidential and answers within 24 hours in a non-automated fashion.

In its reply Amazon requires specific information about the penetration test, such as the targeted Amazon Machine Images (AMIs) and the attack timeframe. The company also tells the customer which security tools are allowed during the attack (the published policy doesn’t include this list).

Amazon has also published a policy for reporting vulnerabilities discovered in any of its Amazon Web Services (AWS) platforms, including, of course, EC2.


C12G Labs (negatively) reacts to the Rackspace OpenStack announcement – UPDATED

The OpenStack cloud computing infrastructure announced by Rackspace earlier this week certainly shook the ecosystem, pushing a few players to respond.
VMware, for example, published a vague statement about the value of open source without really validating the OpenStack platform.

Another vendor that reacted to the announcement is C12G Labs, creator and maintainer of the OpenNebula management framework for Infrastructure-as-a-Service (IaaS) clouds.
Despite being open source, OpenNebula wasn’t included in the OpenStack platform alongside Citrix XenServer and NASA Nebula.

So C12G Labs published a sort of “me too” statement reminding everybody that OpenNebula was one of the first projects in cloud computing, that it uses Apache licensing, that it is open, flexible and production-ready, and that the existence of a commercial Enterprise edition doesn’t imply that the free edition has fewer features.


Axibase launches a reporting tool for Amazon EC2

Axibase is a US company, founded in 2004, that develops sophisticated reporting tools.
Its flagship product, Fabrica, turns availability and performance data coming from all sorts of IT assets, including entire Configuration Management Databases (CMDBs), into Microsoft Visio diagrams that update in real time.
Fabrica works with HP OpenView Operations (OVO), IBM Tivoli Monitoring, HP Network Node Manager (NNM) and Microsoft System Center Operation Manager (SCOM).

The company just entered the cloud computing market with a new Software-as-a-Service (SaaS) monitoring solution that is hosted on Amazon EC2 and reports about Amazon EC2.

Called Cloud Reporter, the solution tracks resource usage inside the Amazon Infrastructure-as-a-Service (IaaS) cloud and reports on Amazon Machine Images (AMIs), the Relational Database Service (RDS), Elastic Load Balancers (ELBs) and the Elastic Block Store (EBS).

Axibase is able to track EC2 resources thanks to the CloudWatch web service that Amazon offers.


VMware organizes its channel to sell Zimbra starting next month

As part of its new cloud computing strategy VMware acquired Zimbra from Yahoo! in January.

Zimbra is an online/offline collaboration suite, acquired by Yahoo! in September 2007 for $350M in cash, that competes with the Software-as-a-Service (SaaS) PIMs offered by the likes of Google and Zoho.

Zimbra also offers an open source mail client that competes with products such as Microsoft Outlook and Mozilla Thunderbird.
Yahoo! had reportedly been trying to sell Zimbra since September 2008.

The new VMware that Paul Maritz has been building since June 2008, when he replaced co-founder Diane Greene as CEO, believes that SaaS applications must be part of the platform as much as the underlying Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) foundation blocks. So the company is developing and acquiring in many areas, far from its core area of technical expertise, to build an end-to-end software stack.

But VMware has been quiet about Zimbra so far. Training a channel that is used to selling hypervisors and virtual machines is probably a significant challenge that required some time.
The company seems ready now.


Web Hosting Talk to launch another benchmark for IaaS clouds

Earlier this week the hosting community Web Hosting Talk (a division of iNET Interactive), in collaboration with the hosting provider The Planet, announced the private beta of a new benchmark for cloud computing called Bench the Cloud.

The new benchmark framework, which will officially launch as a private beta in mid-August, seems designed to measure the performance of public cloud infrastructures only, through a crowdsourcing approach.
It is built on a suite of open source benchmarking tools and measures three criteria essential to running cloud applications: CPU performance, disk read speed and disk write speed.
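As a rough sketch of what the disk-write portion of such a benchmark might look like (my own illustration; the actual tool suite behind Bench the Cloud isn't named here):

```python
import os
import tempfile
import time

def disk_write_mb_s(size_mb=64, block_kb=256):
    """Crude sequential-write throughput estimate in MB/s.
    A real benchmark would bypass the page cache (O_DIRECT),
    average several runs, and also measure read speed."""
    block = b"\0" * (block_kb * 1024)
    blocks = size_mb * 1024 // block_kb
    fd, path = tempfile.mkstemp()
    try:
        start = time.perf_counter()
        with os.fdopen(fd, "wb") as f:
            for _ in range(blocks):
                f.write(block)
            f.flush()
            os.fsync(f.fileno())  # force the data to disk before stopping the clock
        elapsed = time.perf_counter() - start
        return size_mb / elapsed
    finally:
        os.remove(path)

print(disk_write_mb_s())  # throughput varies by machine and storage backend
```

The crowdsourcing approach presumably runs tests like this from many customer instances and aggregates the results per provider.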

Bench the Cloud is only the latest attempt to rate the performance of public clouds. In June CloudHarmony published the first performance analysis covering most IaaS public clouds, and in July Compuware launched CloudSleuth, a public tracking service for IaaS and PaaS clouds.

The Planet recently adopted KVM as the virtualization platform of choice for part of its public cloud infrastructure, so it will be interesting to see whether Bench the Cloud measures that portion of the infrastructure and compares it against Xen-based and ESX-based offerings.