IBM intros Spectrum Fusion

IBM on Tuesday said it plans to launch container-native software-defined storage technology this year to bring together its general parallel file system and data protection technologies to improve data management across on-premises, edge, and cloud environments.

Big Blue also unveiled a new entry-level model in its IBM Elastic Storage System all-flash storage family.

The introduction of IBM Spectrum Fusion, initially as IBM’s first hyper-converged infrastructure appliance and eventually as a software-only technology, is part of a larger strategy to help clients and channel partners better manage data across hybrid environments, said Eric Herzog, chief marketing officer and vice president of global storage channels for IBM’s Storage Division.

“Our storage is designed to simplify data availability spanning the cloud, the core, and the edge,” Herzog told CRN. “IBM is also expanding its partner base looking for partners who can handle hybrid cloud, core, and edge deployments with AI and analytics.”

IBM Spectrum Fusion is a container-native storage technology designed to seamlessly span the core, edge, and cloud, Herzog said. It was developed with AI and analytics as primary drivers, he said.

IBM Spectrum Fusion software brings together the container-native unstructured data management capabilities of Spectrum Scale, the metadata management capabilities of Spectrum Discover, and the data protection capabilities of Spectrum Protect Plus, fusing them into a single offering, hence the name Spectrum Fusion, Herzog said. It also adds the ability to work with S3-compatible object storage, he said.

IBM Spectrum Fusion is slated to be released in the next quarter as IBM’s first hyper-converged infrastructure appliance with Red Hat OpenShift and Kubernetes container management software, Herzog said. The software-defined version is slated to be released early next year, he said.

IBM’s decision to release it first as a hyper-converged appliance stemmed from partner and user feedback, but this is not a normal HCI offering, Herzog said.

“This is the only fully-loaded HCI built on OpenShift,” he said. “This is container-led, not virtualization-led. Customers like the simplicity of HCI and the ability to place one call for support. It supports virtualization, but is leading with containers.”

IBM wanted to make it easy for customers to adopt IBM Spectrum Fusion, Herzog said.

“We include OpenShift, Kubernetes, and Red Hat virtualization,” he said. “We even ship the rack with it. This makes it easy to train partners and educate users.”

When the software-defined storage version of IBM Spectrum Fusion ships next year, it will include the ability to discover and catalog metadata on any S3-compliant object storage, Herzog said. It will also include an API so partners can gather metadata to leverage for AI workloads. Partners will be able to deploy it on any x86-based hardware on a hardware compatibility list IBM will publish, he said.

Whether as a hyper-converged infrastructure appliance or as software-defined storage, IBM Spectrum Fusion will give businesses a single copy of data they can leverage across cloud, core, and edge environments, Herzog said.

“There will be no more silos of data,” he said. “Without silos, businesses can lower their capital expenses and operating expenses and simplify storage management. And the partner is the hero. No more data sets with 12 copies of data floating around.”

IBM Spectrum Fusion is proof that IBM continues to evolve its storage software technology, said John Zawistowski, global systems solutions executive at Sycomp, a Foster City, Calif.-based solution provider and IBM channel partner.

“It combines Spectrum Scale, Spectrum Discover, and Spectrum Protect Plus into a single product with a single pane of glass for container storage,” Zawistowski told CRN. “IBM is hitting its stride.”

It was important for IBM to develop a best-of-breed offering for container storage, Zawistowski said.

“We’ve seen an uptick in container storage with IBM Spectrum Scale,” he said. “Over the next few years, we’ll see more. There are a lot of gaps in container storage, but IBM is closing the gaps faster than its competitors.”

IBM on Tuesday also introduced the latest in its IBM Elastic Storage System family, the ESS 3200, which doubles the performance of its predecessor, the ESS 3000, which was introduced in late 2019.

The ESS 3200 is a 2U storage system delivering up to 80GB per second of throughput per node, Herzog said. Performance scales linearly, so a 10-node system offers up to 800GB per second. It comes with IBM Spectrum Scale software, which lets customers mix and match the ESS 3200 with older ESS systems, he said.
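The linear scaling Herzog describes is straightforward arithmetic; a minimal sketch (the 80GB-per-second per-node figure comes from IBM's stated spec, while the function name and ideal-scaling assumption are illustrative, since real aggregate throughput depends on workload and network):

```python
def aggregate_throughput_gbps(nodes: int, per_node_gbps: float = 80.0) -> float:
    """Aggregate throughput for an ESS 3200 cluster, assuming ideal linear scaling."""
    if nodes < 1:
        raise ValueError("need at least one node")
    return nodes * per_node_gbps

print(aggregate_throughput_gbps(1))   # 80.0 GB/s for a single node
print(aggregate_throughput_gbps(10))  # 800.0 GB/s for a 10-node system
```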

IBM has spent the last few years streamlining its storage line to better fit customers’ requirements, and the ESS 3200 is an example of how it is improving that line, Zawistowski said.

“As IBM moves to take AI inferencing to the edge to throw away unneeded data and keep the important data, Spectrum Fusion and the ESS 3200 make the transition easier,” he said. “There’s a lot of processing done on the edge, making the ESS 3200 a great fit. Or, when customers do inferencing in the data center where they need fast metadata servers, the ESS 3200 complements customers’ data lakes.”

Copyright © 2018 The Channel Company, LLC. All rights reserved.
