
Scientists invent computer that operates just like the human brain

By Eric Ralls, Earth.com staff writer

In a significant leap forward for artificial intelligence (AI) and machine learning, a collaborative team from Northwestern University, Boston College, and the Massachusetts Institute of Technology (MIT) has unveiled a cutting-edge synaptic transistor that processes information much like the human brain. This novel device, inspired by the brain’s architecture, marks a new era in computing technology.

Computing with human brain-like functionality

The synaptic transistor is engineered to replicate the brain’s ability to process and store information concurrently. Unlike traditional computers, where data shuttles between the processor and memory, consuming excessive energy and creating bottlenecks, this device integrates these functions.

This new approach, mimicking the brain’s efficiency, allows the transistor to perform complex machine-learning tasks and associative learning, a higher-level cognitive process. Previous attempts at creating brain-like computing devices relied on transistors that only functioned at cryogenic temperatures, limiting their practical application.

However, the newly developed transistor operates effectively at room temperature. It boasts rapid processing speeds, minimal energy consumption, and the ability to retain information even when powered off, making it an ideal candidate for real-world applications.

Architecture of the brain vs. a computer

Mark C. Hersam, the Walter P. Murphy Professor of Materials Science and Engineering at Northwestern’s McCormick School of Engineering, co-led the research alongside Qiong Ma of Boston College and Pablo Jarillo-Herrero of MIT.

“The brain has a fundamentally different architecture than a digital computer,” said Hersam. “In a digital computer, data move back and forth between a microprocessor and memory, which consumes a lot of energy and creates a bottleneck when attempting to perform multiple tasks at the same time.”

Hersam continued, “On the other hand, in the brain, memory and information processing are co-located and fully integrated, resulting in orders of magnitude higher energy efficiency. Our synaptic transistor similarly achieves concurrent memory and information processing functionality to more faithfully mimic the brain.”

Redefining electronic paradigms

The creation of this synaptic transistor aligns with the growing need to rethink computing hardware, especially for AI and machine-learning applications. Current digital computing, heavily reliant on separate processing and storage units, faces challenges in handling data-intensive tasks efficiently.

Hersam pointed out, “For several decades, the paradigm in electronics has been to build everything out of transistors and use the same silicon architecture. Significant progress has been made by simply packing more and more transistors into integrated circuits.

“You cannot deny the success of that strategy, but it comes at the cost of high power consumption, especially in the current era of big data where digital computing is on track to overwhelm the grid. We have to rethink computing hardware, especially for AI and machine-learning tasks,” Hersam concluded.

Harnessing moiré physics

The researchers turned to moiré physics: the interference-like patterns that emerge when two periodic lattices are overlaid with a slight twist or mismatch. By stacking bilayer graphene on hexagonal boron nitride and manipulating the layers into a moiré pattern, they achieved unprecedented control over the material’s electronic properties at the atomic scale. This approach enabled the transistor to perform neuromorphic functions at room temperature.
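To get a rough sense of the length scales involved, the moiré period of a graphene-on-hBN stack can be estimated from graphene’s lattice constant, the roughly 1.8 percent lattice mismatch between the two materials, and their relative twist angle. The sketch below is a back-of-the-envelope illustration using the standard small-angle approximation, not the researchers’ own calculation, and the twist angles shown are assumed example values.

```python
import math

def moire_period(a=0.246, delta=0.018, theta_deg=0.0):
    """Approximate moiré superlattice period (in nm) for graphene on hBN.

    a         -- graphene lattice constant in nm
    delta     -- fractional lattice mismatch between graphene and hBN (~1.8%)
    theta_deg -- relative twist angle in degrees (assumed example value)

    Uses the common small-angle approximation
        lambda ~ a / sqrt(delta**2 + theta**2),
    valid only for small mismatch and small twist.
    """
    theta = math.radians(theta_deg)
    return a / math.sqrt(delta**2 + theta**2)

# At zero twist, the lattice mismatch alone gives a period of roughly 14 nm,
# far larger than graphene's 0.246 nm atomic spacing.
print(f"{moire_period(theta_deg=0.0):.1f} nm")  # ~13.7 nm
print(f"{moire_period(theta_deg=0.5):.1f} nm")  # period shrinks as twist increases
```

The point of the estimate is simply that the superimposed lattices create a much longer repeating pattern than either layer alone, which is what gives researchers a new knob for tuning electronic behavior.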

The team’s tests focused on the transistor’s ability to perform associative learning. They trained the device to recognize patterns and differentiate between similar sequences, showcasing its potential for more complex AI applications.

Hersam noted, “If AI is meant to mimic human thought, one of the lowest-level tasks would be to classify data, which is simply sorting into bins. Our goal is to advance AI technology in the direction of higher-level thinking. Real-world conditions are often more complicated than current AI algorithms can handle, so we tested our new devices under more complicated conditions to verify their advanced capabilities.”

In one experiment, the device was trained on the sequence 000 and then asked to compare the test patterns 111 and 101 against it, a demonstration of associative learning.

“If we trained it to detect 000 and then gave it 111 and 101, it knows 111 is more similar to 000 than 101,” Hersam explained. “000 and 111 are not exactly the same, but both are three digits in a row. Recognizing that similarity is a higher-level form of cognition known as associative learning.”
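As a loose software analogy for the kind of similarity Hersam describes, the toy sketch below scores three-digit patterns by how uniformly their digits repeat rather than by digit-for-digit matches. It is purely illustrative of the idea in his example, not a model of the transistor’s physics, and the scoring rule is an assumption made for the sketch.

```python
def uniformity_score(pattern: str) -> float:
    """Fraction of adjacent digit pairs that match: '000' and '111'
    both score 1.0, while the alternating '101' scores 0.0."""
    pairs = list(zip(pattern, pattern[1:]))
    return sum(a == b for a, b in pairs) / len(pairs)

def more_similar_to(trained: str, candidates: list[str]) -> str:
    """Pick the candidate whose uniformity score is closest to the
    trained pattern's score -- a stand-in for associative similarity."""
    target = uniformity_score(trained)
    return min(candidates, key=lambda c: abs(uniformity_score(c) - target))

# Trained on '000', this rule ranks '111' (all one digit, like '000')
# as more similar than '101', mirroring the behavior Hersam describes.
print(more_similar_to("000", ["111", "101"]))  # -> 111
```

Note that a simple digit-by-digit comparison would rank 101 closer to 000, which is exactly why recognizing the shared "three in a row" structure counts as a higher-level, associative judgment.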

Is mimicking human brains the future of computing?

The successful demonstration of this synaptic transistor opens new horizons in AI, particularly in contexts where current technologies fall short. For example, in self-driving vehicles facing complex weather conditions, this advanced technology could significantly enhance safety and reliability.

“Current AI can be easy to confuse, which can cause major problems in certain contexts,” Hersam said. “Imagine if you are using a self-driving vehicle, and the weather conditions deteriorate. The vehicle might not be able to interpret the more complicated sensor data as well as a human driver could. But even when we gave our transistor imperfect input, it could still identify the correct response.”

In summary, this impressive innovation from Northwestern University, Boston College, and MIT represents a significant step forward in the quest for AI that closely mimics human cognitive processes. As AI continues to evolve, such advancements in brain-inspired computing hardware will be crucial in addressing the increasingly complex challenges of the digital age.

The full study was published in the journal Nature.

 
