When it comes to defining 'cloud computing', IBM is both revisionist and futurist.
As organisations like Amazon and Salesforce.com continue to take business away from traditional vendors, the approach of many a vendor product marketing manager has been to obfuscate the meaning of the word ‘cloud’ such that it no longer refers to a business model, but to a box.
IBM – to the credit of its more conservative marketers – hasn’t been nearly as guilty as some of its industry peers. Given that 'utility computing' and 'on-demand' have been buzzwords in the company’s lexicon for some years, it’s perhaps understandable that some at IBM would bristle at young web start-ups claiming to have invented the model, just as the entrepreneurs at those start-ups despair at phrases like 'private cloud' and 'cloud-in-a-box'.
This week, I spoke with IBM Australia's distinguished engineer Michael Shallcroft about what IBM offers in this space, and what the vendor's future might hold.
IBM currently attempts to play a role in every enterprise computing workload, via any delivery model.
First, there is the traditional vendor model, in which IBM sells customers the components (servers, storage, switches, software) to build their own infrastructure.
Then there is the traditional systems integrator model, under which IBM builds an infrastructure stack for the customer to own and operate.
It also operates under the outsourcing model, by not only building that same stack, but also maintaining and operating it on behalf of the customer.
And finally there is what might genuinely be defined as ‘cloud computing’ services - compute, storage or software offered as a service, over the network.
Like HP and countless others, IBM wants to label all of these services as 'cloud', since they are all based on the same common reference architecture.
This ‘cloud-washing’ effort shields both vendors’ decades-old product, integration and outsourcing businesses from the cheaper (and sometimes nastier) public cloud compute services taking the industry by storm.
Today, IBM's Australian customers can use virtual servers hosted in IBM data centres as part of the vendor’s six-year-old Virtual Server Services (VSS) offerings.
VSS provisions virtual servers (based on VMware vSphere 4.0) with a baseline configuration of one virtual CPU, 2GB of RAM and 30GB of storage.
It isn’t pitched at the software start-up seeking rapid application deployment; rather, VSS targets large corporate and government customers seeking a slightly more flexible approach to managed IT services and outsourcing.
VSS falls short of the NIST definition [pdf] of a 'cloud service' in that customers must commit to three- to five-year contracts.
It offers a two- to five-day window for adjusting virtual server configurations, a task completed by IBM staff rather than by the customer via a web browser.
IBM profits by offering services on top of VSS – such as monitoring and helpdesk – services Shallcroft said have “more manual involvement than pressing a button”. These extend to everything from management of operating systems, databases (SQL, DB2, Oracle) and the WebSphere application server, right up to full application management.
“A lot of our traditional outsourcing customers have already started using this infrastructure,” Shallcroft said, highlighting particular interest from retail and manufacturing sectors.
Shallcroft said the “less sexy” end of enterprise IT attracted most interest from IBM customers: long-standing managed security and storage services; remote backup of desktop and server environments; and shared data centre facilities for organisations to use as targets for disaster recovery.
IBM also provides a vendor-agnostic virtual desktop service – customers of which include some large public sector agencies.
Read on for IBM's plan for a cloudy future...