HPE unveils 160TB memory-driven computing ‘Machine’ built for big data
Hewlett Packard Enterprise has unveiled the world’s largest single-memory computer – all 160 terabytes of it.
The prototype, the product of what HPE says is the largest R&D programme in the vendor’s history, is designed to deliver an architecture custom-built for the big data era, with what HPE calls memory-driven computing.
Memory-driven computing puts memory, rather than the processor, at the centre of the computing architecture, with HPE claiming it can dramatically reduce the time needed to process complex problems and deliver real-time intelligence.
Meg Whitman, Hewlett Packard Enterprise chief executive, says “The secrets to the next great scientific breakthrough, industry-changing innovation or life-altering technology hide in plain sight behind the mountains of data we create every day.
“To realise this promise, we can’t rely on the technologies of the past, we need a computer built for the big data era.”
Packing 160TB of memory spread across 40 physical nodes interconnected by a high-performance fabric protocol, The Machine – as HPE has dubbed the prototype – is capable of simultaneously working with the data of approximately 160 million books – or, in HPE’s (and United States) terms, the data held in every book in the Library of Congress five times over.
The company says The Machine offers a glimpse of the ‘immense potential’ of memory-driven computing, with HPE saying the architecture could easily scale to an exabyte-scale single-memory system and, beyond that, to a nearly limitless pool of memory – 4,096 yottabytes, or 250,000 times the entire digital universe today.
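The scale figures quoted above hang together arithmetically. A quick back-of-envelope check, using only the numbers in the article (and assuming base-10 SI prefixes, which is an assumption on our part), shows the implied memory per node, the implied average book size, and the ‘digital universe’ size the 250,000× comparison implies:

```python
# Sanity-check the article's scale claims using decimal (SI) prefixes.
TB = 10**12   # terabyte
ZB = 10**21   # zettabyte
YB = 10**24   # yottabyte

prototype_bytes = 160 * TB              # The Machine prototype's total memory
nodes = 40                              # physical nodes in the prototype
per_node_tb = prototype_bytes / nodes / TB   # memory per node

books = 160_000_000                     # "approximately 160 million books"
bytes_per_book = prototype_bytes / books     # implied average book size

pool_bytes = 4096 * YB                  # the claimed near-limitless pool
# If that pool is 250,000x the digital universe, the universe works out to:
digital_universe_zb = pool_bytes / 250_000 / ZB

print(per_node_tb)         # 4.0  -> 4TB of memory per node
print(bytes_per_book)      # 1000000.0  -> roughly 1MB per book
print(digital_universe_zb) # 16.384 -> about 16 zettabytes
```

The implied ~16 zettabytes for today’s digital universe is consistent with industry estimates circulating at the time the article was written.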
Mark Potter, HPE CTO and director of Hewlett Packard Labs, says the architecture can be applied ‘to every computing category, from intelligent edge devices to supercomputers’.
“We believe memory-driven computing is the solution to move the technology industry forward in a way that can enable advancements across all aspects of society,” Potter says.