IBM Scientists Demonstrate “Computational Memory”

“In-memory computing,” or “computational memory,” is an emerging concept that uses the physical properties of memory devices to both store and process information. It departs from the conventional von Neumann architecture, named in honor of physicist and computer scientist John von Neumann, a pioneer of modern digital computing, in which data is shuttled back and forth between memory and a separate computing unit. Ordinary desktop computers, laptops and even smartphones work this way, which makes them slower and less energy efficient.

A team from IBM Research reports a breakthrough in computational memory: it successfully used one million phase-change memory (PCM) devices to run an unsupervised machine learning algorithm. “This prototype technology is expected to yield 200x improvements in both speed and energy efficiency, making it highly suitable for enabling ultra-dense, low-power, and massively-parallel computing systems for applications in AI,” according to a post on IBM Research’s blog. The ability to complete computations faster will clearly benefit overall computer performance. For IBM, that means better computing power for AI applications. “This is an important step forward in our research of the physics of AI, which explores new hardware materials, devices and architectures,” IBM Fellow and co-author of the study Evangelos Eleftheriou said in a statement quoted in the blog.

Computational memory opens the door to more “real-time” processing of information, a much-needed improvement in today’s world, where more companies are putting a premium on data analytics. With Amazon and Google placing AI at the center of their businesses, faster computing for AI applications is indeed a welcome development. According to IBM, in-memory computing is key. “Memory has so far been viewed as a place where we merely store information. 
But in this work, we conclusively show how we can exploit the physics of these memory devices to also perform a rather high-level computational primitive,” lead author Abu Sebastian said. “The result of the computation is also stored in the memory devices, and in this sense the concept is loosely inspired by how the brain computes.”
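To give a flavor of the kind of unsupervised primitive Sebastian describes, the toy Python simulation below sketches one way memory-side accumulation can learn which input streams are correlated: each simulated cell’s “conductance” grows a little every time its stream fires, in proportion to how active the whole array is at that instant, loosely echoing how charge or crystallization accumulates in a physical device. This is an illustrative sketch only; the stream counts, probabilities, and update rule are assumptions for the example, not IBM’s actual algorithm or device physics.

```python
import random

random.seed(0)

# 20 binary input streams: streams 0..7 tend to fire together (correlated),
# the rest fire independently at a low background rate.
N = 20
CORRELATED = set(range(8))
STEPS = 2000

conductance = [0.0] * N  # one simulated memory cell per stream

for _ in range(STEPS):
    group_fires = random.random() < 0.1  # shared event for the correlated group
    fired = [
        (i in CORRELATED and group_fires) or random.random() < 0.05
        for i in range(N)
    ]
    # "In-memory" update: each firing cell accumulates in proportion to
    # the fraction of the array that is active right now, so cells that
    # fire during collective events grow much faster than loners.
    activity = sum(fired) / N
    for i in range(N):
        if fired[i]:
            conductance[i] += activity

# After streaming the data, correlated cells have visibly higher
# conductance, so a simple threshold on the stored state separates them.
threshold = sum(conductance) / N
detected = {i for i, g in enumerate(conductance) if g > threshold}
print(sorted(detected))
```

The point of the sketch is that the result of the computation lives in the same place the data was “written”: reading out the accumulated conductances is both the memory access and the answer, which is the sense in which the approach is loosely brain-inspired.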