Analog deep learning: more efficient hardware with a new electrolyte

Many large machine learning models rely on ever-increasing amounts of processing power to achieve their results, which carries huge energy costs and produces large amounts of heat. One proposed solution is analog deep learning, which works like a brain by using neuron-like electronic devices.

However, such devices have so far not been fast, small, or efficient enough to offer any advantage over digital machine learning. Building on an earlier proof of concept, Murat Onen of MIT and colleagues have created a nanoscale resistor that transmits protons from one terminal to another, improving on the previous design in terms of power and efficiency thanks to a new electrolyte.

Analog deep learning: challenges and features

There are big differences between the human brain and the machine learning models that try to reproduce how it works. Despite the effort to imitate it, our organ weighs just over a kilogram and uses about as much energy as a light bulb. Furthermore, it needs only a short time to learn a new task.


By contrast, training a neural network with current AI techniques takes weeks, megawatt-hours of electricity, and racks of specialized hardware. All of this is fueling growing interest in redesigning the underlying hardware that AI runs on. The researchers turned to an analog solution, which gave rise to components called programmable proton resistors. They hope that such “neuromorphic” processors will be far better suited to running AI than today's conventional chips.

The analog approach seeks to design components capable of exploiting their internal physics to process information. This is much more efficient and direct than performing complex logic operations the way conventional chips do. In the human brain, learning takes place through the strengthening or weakening of the connections between neurons, the synapses. Deep neural networks mimic this strategy: the weights (the numerical value of each connection) are adjusted by training algorithms.
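As a rough illustration of that last point, here is a minimal sketch (hypothetical, not the MIT team's code) of how a training algorithm nudges a single weight, the numerical strength of one connection, to reduce prediction error:

```python
import numpy as np

# Minimal illustrative sketch: one gradient-descent step on a single
# "synapse" weight w, strengthening or weakening the connection based on
# how wrong the prediction was. All names and values are assumptions.
def update_weight(w, x, target, learning_rate=0.1):
    prediction = w * x                    # output of the connection
    error = prediction - target           # how far off it was
    gradient = error * x                  # derivative of 0.5 * error**2 w.r.t. w
    return w - learning_rate * gradient   # nudge the weight to reduce the error

w = 0.0
for _ in range(50):
    w = update_weight(w, x=2.0, target=6.0)
print(round(w, 3))  # converges toward 3.0, the "learned" connection strength
```

Repeated over millions of weights and examples, this kind of update is what training a deep neural network boils down to.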

In the case of MIT's analog processor, it is the increase and decrease of the electrical conductance of the proton resistors that enables machine learning operations. The resistors, measured in nanometers (billionths of a meter), are arranged in a matrix, like a chessboard. It is precisely the movement of the protons that controls the electrical conductance: it rises as more protons enter the resistor channel and falls as they leave.
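To see why such a matrix of conductances maps so naturally onto machine learning, here is an idealized sketch (an assumption for illustration, not a model of the actual MIT hardware) of how a crossbar of resistors performs a matrix-vector multiply using only Ohm's and Kirchhoff's laws:

```python
import numpy as np

# Idealized crossbar sketch: apply input voltages V to the rows and sum the
# currents flowing into each column. By Ohm's law each cell contributes
# G[i, j] * V[i], and by Kirchhoff's current law the column adds them up,
# so the column currents are I = G.T @ V -- a multiply-accumulate done
# directly in the analog domain, with each conductance acting as a weight.
rng = np.random.default_rng(0)
G = rng.uniform(0.1, 1.0, size=(4, 3))   # programmed conductances (weights)
V = np.array([0.2, 0.5, 0.1, 0.8])       # input voltages on the rows

I = G.T @ V                              # column currents = weighted sums
print(I)
```

Raising or lowering a resistor's conductance by moving protons in or out of its channel is therefore equivalent to rewriting one entry of the weight matrix.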

The material behind the components

To modulate the conductance, the researchers used an electrolyte (similar to that of a battery) that “blocks” electrons while letting protons pass. After a long search among different materials, Onen focused his attention on phosphosilicate glass (PSG).

It is essentially silicon dioxide, the powdery desiccant found in sachets that keep moisture away from new products. Once optimized, PSG exhibits high proton conductivity at room temperature without needing water, making it the perfect electrolyte for this task. The ultra-fast movement of these subatomic particles is possible thanks to a multitude of nanometer-sized pores, whose surfaces create pathways for proton diffusion.

Analog deep learning processor powered by ultra-fast protonic resistors. Credits: MIT

An important feature of PSG is its ability to withstand very strong electric fields, which lets protons move at very high speeds: about a million times faster than in the previous resistor design, and far faster than in human synapses. Fortunately, all this high-speed movement does not damage the device, thanks to the protons' tiny size and mass. Finally, since PSG is an insulator for electrons, no electron current flows through the device, a feature that keeps it cool and energy efficient.

Future prospects for analog deep learning

The researchers plan to produce these resistors in larger quantities in order to build arrays and study their characteristics, and perhaps to shrink them further so they can be incorporated into smaller devices.

The project leader said jokingly:

Once you have an analog processor, you will no longer be training the networks everyone else is currently working on. You will be training networks of unprecedented complexity that no one else can afford, and that will therefore surpass all the others. In other words, this is not a faster car, this is a spaceship.

Murat Onen

Finally, their hope is that these energy-efficient ionic devices can emulate the neural circuits and synaptic plasticity studied in neuroscience.
