AWS, Microsoft enable clouds to run larger SAP databases


AWS and Microsoft Azure this week elevated their SAP hosting games, enabling their clouds to run much larger SAP HANA in-memory databases than previously possible.

During the SAP Sapphire Now conference, both AWS and Microsoft claimed their clouds offer the most memory-rich environments for hosting the database platform used by many of the world's largest companies.

AWS' powerful high-memory instances, the X1 family, were certified on Monday to run much larger SAP clusters in the cloud than ever before. The public cloud leader also revealed plans to introduce virtual servers stocked with even more RAM, later this year and into the next, that will be better suited to powering SAP workloads.

On Tuesday, the day after AWS' announcement, Microsoft expanded its virtual machine portfolio with a new instance type, the M-Series, which significantly ramps up the memory resources available for customers to provision.

SAP's latest certification for AWS X1 instances (specifically x1.32xlarge) will enable scaling out HANA clusters to 17 nodes, delivering 34 TB of memory to those database workloads. The previous limit was 7 nodes delivering 14 TB.
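The capacity arithmetic behind those figures is simply nodes multiplied by per-node memory; a quick sketch using the numbers from the certification announcement:

```python
# Scale-out HANA capacity = node count x RAM per node.
# Figures from SAP's certification of AWS x1.32xlarge instances.
MEM_PER_NODE_TB = 2  # each x1.32xlarge instance carries 2 TB of RAM

previous_limit_tb = 7 * MEM_PER_NODE_TB   # old ceiling: 7-node cluster
new_limit_tb = 17 * MEM_PER_NODE_TB       # new ceiling: 17-node cluster

print(previous_limit_tb, new_limit_tb)  # 14 34
```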

The X1 family, introduced at Amazon's re:Invent conference in October of 2015, is optimised for running in-memory applications. With 2 TB of RAM, the instances offer 8 times more memory than any other EC2 instance.

HANA, SAP's popular in-memory database management platform, thrives in high-memory conditions. But there's a tightly controlled process for validating the software that powers massive transactional databases for specific cloud environments.

"Because SAP installations are unfailingly mission-critical, SAP certifies their products for use on certain EC2 instance types and sizes," said Jeff Barr, AWS' chief evangelist, on the company's blog.

"We work directly with SAP in order to achieve certification and to make AWS a robust & reliable host for their products," Barr said.

Amazon's ability to scale out to 17 nodes, with 2 TB per node, delivers far more memory to a HANA database than can any other cloud provider, Barr wrote in the blog post.

Azure's new M-Series machines, however, can be configured with 3.5 TB of RAM for HANA and other high-end databases. The G-Series instances previously designated as SAP hosting environments in Azure could, at best, scale out across several nodes to half a terabyte of memory.

Microsoft told CRN USA on Tuesday it is working with SAP to certify the M-Series to scale out to clusters of 34 TB, matching AWS.

For customers that need more memory, Azure also offers purpose-built, bare metal configurations that can deliver 20 TB on a single node, or scale up to 60 TB.

"We’ve invested deeply to ensure that Azure is the best public cloud for our customers’ SAP HANA workloads. Azure provides the most powerful and scalable infrastructure of any public cloud provider for HANA," said Jason Zander, Microsoft's corporate vice president of Azure, in a blog.

The arms race to win SAP workloads will continue between the two largest public cloud providers.

Enterprises running SAP HANA can expect to access even more cloud resources from Amazon in the future, according to AWS' Barr.

AWS is extending the X1 family, he said. Later this year, the public cloud kingpin will release new X1 instances across several regions, in both On-Demand and Reserved form, offering 4 TB of DDR4 memory and support for 128 virtual CPUs.

Looking further down the road, Amazon is working on bulking up the X1s even more.

"Throughout 2017 and 2018, we plan to launch EC2 instances with between 8 TB and 16 TB of memory. These upcoming instances," Barr said, "will allow you to create larger single-node SAP installations and multi-node SAP HANA clusters, and to run other memory-intensive applications and services."

The massive memory configurations coming to market appeal to only an elite few potential customers, said Jagadish Bandla, leader of an SAP HANA practice group at global systems integrator Deloitte.

Very few enterprise clients consider scaling beyond 4 TB, Bandla said.

But giant SAP customers like Walmart, ExxonMobil and Procter & Gamble might need 10 TB just to test workloads before sending them into production, and the cloud can deliver those environments in days.

"First thing these cloud providers can accelerate is the test drive," he said.

Some Fortune 100 leaders that have spent heavily in the past on on-premises HANA deployments are at a crossroads, looking to embrace the cloud as a new platform for running those memory-hungry transactional databases in production.

"These cloud providers are absolutely important for SAP customers," Bandla said.

For giant integrators like Deloitte that can "put this puzzle together" for them, it's always best to offer the most comprehensive capabilities, even if they go beyond what 90 percent of potential customers would ever need, he said.

This article originally appeared at crn.com

Copyright © 2018 The Channel Company, LLC. All rights reserved.