Hewlett-Packard Enterprise recently announced it has created the world’s largest single-memory computer. Aptly named “The Machine,” this computer represents the largest R&D program in the history of HPE and was created with big data in mind.
“The secrets to the next great scientific breakthrough, industry-changing innovation, or life-altering technology hide in plain sight behind the mountains of data we create every day,” said Meg Whitman, CEO of Hewlett-Packard Enterprise, in a statement. “To realize this promise, we can’t rely on the technologies of the past; we need a computer built for the Big Data era.”
The prototype of The Machine unveiled contains 160 terabytes (TB) of memory, which means it can simultaneously work with the data held in every book in the Library of Congress five times over (around 160 million books). According to HPE, it has never before been possible to hold and manipulate whole data sets of this size in a single-memory system, a feat the company says offers just a glimpse of the immense potential of what it calls Memory-Driven Computing.
“We believe Memory-Driven Computing is the solution to move the technology industry forward in a way that can enable advancements across all aspects of society,” said Mark Potter, CTO at HPE and Director of Hewlett-Packard Labs, in a statement. “The architecture we have unveiled can be applied to every computing category — from intelligent edge devices to supercomputers.”
Memory-Driven Computing, as HPE defines it, puts memory, rather than the processor, at the center of a computer’s architecture. This approach can deliver real-time intelligence on complex problems in a fraction of the time processing currently takes.
“The Machine is architected the way devices will be built in the future,” said Patrick Moorhead, analyst at Moor Insights & Strategy. “That is, with a massive memory footprint and a combination of memory and storage. This helps analytical and machine learning workloads and also allows accelerators to get direct access to a massive memory-storage footprint. Much of the industry is coming at it from a storage point of view, as with Intel’s 3D XPoint, which is speeding up storage. I expect the different approaches to mesh in three to five years.”