HPE brings supercomputing to the masses

Hewlett Packard Enterprise's GreenLake high performance computing (HPC) pay-per-use cloud service was unveiled yesterday, adding another of the vendor's solutions to its as-a-service offerings.

HPE said the service would cut the big-ticket capital expenditure for HPC systems, which previously cost hundreds of thousands or even millions of dollars, by 40 percent. The company also said the new service would speed up HPC deployments by 75 percent.

The vendor is offering three "T-shirt" sized options for GreenLake HPC: small, medium and large. HPE executives told CRN that the sizes relate not to the size of the organisation but to the size of its data sets, so compute power can be scaled up or down to match each data set's needs.

"We've seen very large, global 2000 companies that just need a small amount of incremental HPC capacity, maybe for an AI or very intensive analytics application. And then we've seen a startup that needs an immense amount of computing power, even though they have 30 or 50 people within the organization," Greenlake boss Keith White told CRN.

HPE head of HPC and Mission Critical Systems Peter Ungaro said, "It's really about a whole new era of computing, one where we'll take these exascale-era technologies that we're building for these massive high performance systems, and use that very same technology to harness the explosion of data that's going on in every enterprise, large or small, and help them to process their data and insights faster."

The GreenLake HPC service is aimed at bringing supercomputing technologies “from the peak of the pyramid” to a single rack or a single server for any small or midsize data center or even for HPE colocation providers, Ungaro said.

The execs told media that they were seeing customers bristle at high-priced public cloud egress fees for massive data sets. "What people are finding, especially in these HPC scenarios, is that the data sets are massive and there is a charge when you bring data back down to do certain things with it," said White, referring to public cloud data egress charges. "Many customers are finding [that those egress charges] are quite expensive."

Another big issue with the public cloud providers is the latency that comes with moving big data sets from the edge to the public cloud, said White.

"You need that instantaneous computation," White said of HPE GreenLake HPC's on-premises advantage. "You don't want to be dependent on things going over the wire and coming back."
