
HPE unveils 160TB memory-driven computing ‘Machine’ built for big data

17 May 2017

Hewlett Packard Enterprise has unveiled the world’s largest single-memory computer – all 160 terabytes of it.

The prototype, the product of what HPE says is the largest R&D program in the vendor’s history, is designed to deliver an architecture custom-built for the big data era, with what HPE calls memory-driven computing.

Memory-driven computing puts memory, rather than the processor, at the centre of the computing architecture. HPE claims the approach can dramatically reduce the time needed to process complex problems, delivering real-time intelligence.

Meg Whitman, Hewlett Packard Enterprise chief executive, says: “The secrets to the next great scientific breakthrough, industry-changing innovation or life-altering technology hide in plain sight behind the mountains of data we create every day.

“To realise this promise, we can’t rely on the technologies of the past, we need a computer built for the big data era.”

Packing 160TB of memory spread across 40 physical nodes interconnected by a high-performance fabric protocol, The Machine – as HPE has dubbed the prototype – can simultaneously work with the data of approximately 160 million books, or, in HPE’s (and United States) terms, the data held in every book in the Library of Congress five times over.

The company says The Machine offers a glimpse of the ‘immense potential’ of memory-driven computing, with HPE saying the architecture could easily scale to an exabyte-scale single-memory system and, beyond that, to a nearly limitless pool of memory – 4,096 yottabytes, or 250,000 times the entire digital universe today.
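The scale claims above can be sanity-checked with some rough arithmetic. A minimal sketch follows; the per-book size (~1MB of text) and the digital-universe estimate (~16 zettabytes, a commonly cited 2017 figure) are assumptions, not figures from the article:

```python
# Back-of-the-envelope check of the article's scale claims.
# Assumptions (not stated in the article):
#   - one book of plain text is roughly 1 MB
#   - the "entire digital universe" is roughly 16 zettabytes (circa 2017 estimate)

TB = 10**12   # terabyte
ZB = 10**21   # zettabyte
YB = 10**24   # yottabyte

machine_memory = 160 * TB
book_size = 10**6  # ~1 MB per book (assumption)
books = machine_memory // book_size
print(f"{books:,} books")  # 160,000,000 books, matching the article

memory_pool = 4096 * YB
digital_universe = 16 * ZB  # assumed estimate
ratio = memory_pool / digital_universe
print(f"{ratio:,.0f}x the digital universe")  # ~256,000x, close to the quoted 250,000
```

Under these assumptions the numbers hang together: 160TB divided by ~1MB per book gives the 160 million books quoted, and 4,096 yottabytes is roughly a quarter-million times a 16-zettabyte digital universe.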

Mark Potter, HPE CTO and director of Hewlett Packard Labs, says the architecture can be applied ‘to every computing category, from intelligent edge devices to supercomputers’.

“We believe memory-driven computing is the solution to move the technology industry forward in a way that can enable advancements across all aspects of society,” Potter says.
