Could memory-driven computing change the face of tech?

HPE's non-traditional architecture could facilitate new technology development

(Image: HPE)

The announcement by Hewlett Packard Enterprise (HPE) on 29 November that it had successfully demonstrated memory-driven computing sees the company embark on an IT architecture that departs from the processor-centric model that has held sway for decades.

The idea behind memory-driven computing is that memory, not processing, sits at the centre of the computing platform as the driving computational force. According to HPE, this new architecture could pave the way for performance and efficiency gains that are not possible today.
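HPE has not publicly detailed the mechanics, but a loose single-machine analogy for a pool of memory shared by many compute elements can be sketched with POSIX shared memory, where independent processes map the same region and see one another's writes without copying data between private tiers. The sketch below is only an analogy, not HPE's fabric; the segment name /mdc_pool is invented for illustration, and older glibc versions need linking with -lrt.

```c
/* Single-machine analogy for a shared memory pool: processes map the
 * same POSIX shared-memory object and operate on it in place, rather
 * than each keeping a private copy of the data. The segment name
 * "/mdc_pool" is hypothetical; this is not HPE's actual fabric. */
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

int main(int argc, char **argv)
{
    const size_t pool_size = 4096;

    /* Create or open the shared segment; every process using the same
     * name addresses the same underlying memory. */
    int fd = shm_open("/mdc_pool", O_CREAT | O_RDWR, 0600);
    if (fd < 0) { perror("shm_open"); return EXIT_FAILURE; }
    if (ftruncate(fd, pool_size) < 0) { perror("ftruncate"); return EXIT_FAILURE; }

    char *pool = mmap(NULL, pool_size, PROT_READ | PROT_WRITE,
                      MAP_SHARED, fd, 0);
    if (pool == MAP_FAILED) { perror("mmap"); return EXIT_FAILURE; }

    if (argc > 1) {
        /* Writer: publish data into the pool for other processes. */
        snprintf(pool, pool_size, "%s", argv[1]);
    } else {
        /* Reader: see whatever another process last wrote. */
        printf("pool contains: %s\n", pool);
    }

    munmap(pool, pool_size);
    close(fd);
    return EXIT_SUCCESS;
}
```

Fabric-attached memory extends this idea beyond a single box: in HPE's design, every compute node addresses one large pool directly, instead of shuttling copies of the data between private memories and storage.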

HPE said it has run new software programming tools on existing products using the prototype, seeing execution speeds improve by as much as 8,000 times on a variety of workloads. The company expects to achieve similar results as it expands the capacity of the prototype with more nodes and memory.

Developed as part of HPE’s ambitious ‘The Machine’ research programme, the memory-driven computing proof-of-concept prototype represents a big step in the company’s ongoing efforts to design new architectures.

“With this prototype, we have demonstrated the potential of memory-driven computing and also opened the door to immediate innovation,” said HPE Enterprise Group general manager, Antonio Neri. “Our customers and the industry as a whole can expect to benefit from these advancements as we continue our pursuit of game-changing technologies.”

According to Hewlett Packard Labs chief architect and HPE Fellow, Kirk Bresniker, the aim of the project is to “fuse memory and storage, flatten complex data hierarchies, bring processing closer to the data, embed security control points throughout the hardware and software stacks, and enable management and assurance of the system at scale”.

If HPE’s claims are to be taken at face value – and with the working prototype of its memory-driven computing architecture going online in October – it appears that the company has gone some way to achieving its goals.

A big part of HPE’s research is aimed at creating technology that can handle the ever-increasing masses of data at humanity’s disposal – a tantalising incentive in an industry as competitive as the one in which HPE operates.

The prototype features compute nodes that access a shared pool of fabric-attached memory, an optimised Linux-based operating system running on a customised ‘System on a Chip’, photonics/optical communication links, and new software programming tools designed to take advantage of abundant persistent memory.
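Those programming tools have not been released with the announcement, but the programming model that abundant, byte-addressable persistent memory invites can be illustrated with standard Linux primitives. The sketch below is not HPE's software; the DAX-style device path /dev/dax0.0 is an assumption, standing in for any persistent-memory region that can be mapped directly into a process's address space.

```c
/* Minimal sketch of a memory-centric programming model using standard
 * Linux mmap(2). The device path /dev/dax0.0 is an assumption: any
 * DAX-capable persistent-memory device or file would serve. This is
 * illustrative only, not HPE's memory-driven computing toolchain. */
#include <fcntl.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

int main(void)
{
    const size_t region_size = 1UL << 30;   /* map a 1 GiB region */

    /* Open the (hypothetical) byte-addressable persistent-memory device. */
    int fd = open("/dev/dax0.0", O_RDWR);
    if (fd < 0) {
        perror("open");
        return EXIT_FAILURE;
    }

    /* Map the region straight into the address space: from here on,
     * ordinary loads and stores operate on the "storage" directly. */
    char *pmem = mmap(NULL, region_size, PROT_READ | PROT_WRITE,
                      MAP_SHARED, fd, 0);
    if (pmem == MAP_FAILED) {
        perror("mmap");
        close(fd);
        return EXIT_FAILURE;
    }

    /* Data structures live in the mapped region itself; there is no
     * separate serialise-to-disk step, which is what collapsing the
     * memory/storage hierarchy means in practice. */
    strcpy(pmem, "record #1: hello, fabric-attached memory");
    printf("%s\n", pmem);

    munmap(pmem, region_size);
    close(fd);
    return EXIT_SUCCESS;
}
```

On real persistent memory, making an update durable additionally means flushing CPU caches, for example via msync(2) or dedicated flush instructions; higher-level persistent-memory libraries hide that detail. The essence is that data is reached through ordinary loads and stores rather than through a block I/O stack.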

For Sean Carroll, a theoretical physicist at the California Institute of Technology, the new technology may end up being a bit like parallel processing – an advance that does not merely increase speed, but also brings within our grasp a whole class of problems that had previously been out of reach.

“Every scientist knows that the amount of data we have to deal with has been exploding exponentially – and that computational speeds are no longer keeping up,” Carroll said in an interview with HPE.

“We need to be imaginative in thinking of ways to extract meaningful information from mountains of data,” he said. “Memory-driven computing will help us find new surprises in how nature works.”

According to Carroll, who is also a consultant to the makers of the television series The Big Bang Theory, modern ultra-large data sets carry large costs, not to mention the continuing struggle to find meaning in growing masses of data.

Further, the problem of analysing increasingly large volumes of data is not faced by the scientific community alone; it is an issue for everybody.

But Carroll believes the new architecture has the potential not only to find meaning in otherwise impenetrable data, but also to facilitate the development of advanced new technologies.

“Big data doesn't just come from scientific instruments; it's generated by ordinary human beings going through their day, using their phones and interacting online,” he said.

“I'm very excited by the prospects of artificial intelligence and brain-computer interfaces. Imagine a search engine that you can talk with like an ordinary person, one that understands what you're really after.

“We'll be connected not just through devices, but through virtual-reality environments and perhaps even direct interfaces with our brains,” he said.

