An artificial intelligence capable of creating a computer within itself to augment its capabilities


Using a new low-level neural machine code that they developed specifically for the purpose, two researchers at the University of Pennsylvania have designed a neural network that can run a program just like an ordinary computer. They have shown that this artificial network can thereby speed up certain computations, play a game of Pong, and even run another artificial intelligence inside itself.

Neural networks are designed to mimic the functioning of the human brain and can solve a wide range of problems. They consist of several layers of artificial neurons (nodes) connected to one another; each node is associated with weights and a threshold value: if the node's output exceeds the threshold, the data is passed on to the next layer, and so on. These artificial neural networks must be trained to become increasingly effective.
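
To make the idea concrete, here is a minimal sketch in Python of a single threshold node of the kind described above (it is not taken from the study; the inputs, weights and threshold are arbitrary illustrative values):

# Minimal sketch of one artificial neuron (threshold unit).
# The inputs, weights and threshold are arbitrary illustrative values.

def neuron(inputs, weights, threshold):
    # Weighted sum of the data received from the previous layer
    activation = sum(x * w for x, w in zip(inputs, weights))
    # The node passes data to the next layer only above the threshold
    return 1 if activation > threshold else 0

print(neuron([0.5, 1.0, 0.2], [0.4, 0.3, -0.6], threshold=0.3))  # -> 1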

The neural network behind an AI must therefore be trained for the task it is designed to perform. A network developed to classify images, for example, must learn to recognize different patterns and tell them apart from thousands of examples: this is machine learning. When the examples shown to the network are labelled, we speak of "supervised" learning. Jason Kim and Dani Bassett at the University of Pennsylvania now propose a different approach, in which a neural network is trained to execute code, like an ordinary computer.
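
As a purely illustrative example of supervised learning (again, not the researchers' code), the following Python sketch trains a single artificial neuron on labelled examples of the logical AND function, nudging its weights whenever its prediction disagrees with the label:

# Minimal sketch of supervised learning: a single neuron learns the
# labelled examples of a logical AND (all values are illustrative).

examples = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w = [0.0, 0.0]
b = 0.0
lr = 0.1  # learning rate

for _ in range(20):                      # repeated passes over the examples
    for (x1, x2), label in examples:
        pred = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
        err = label - pred               # supervision: compare to the label
        w[0] += lr * err * x1
        w[1] += lr * err * x2
        b += lr * err

print([1 if w[0] * x1 + w[1] * x2 + b > 0 else 0 for (x1, x2), _ in examples])
# -> [0, 0, 0, 1]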

A new language for implementing logic circuits

An AI trained to reproduce the logic circuits of a standard computer within its neural network could, in theory, run code inside itself and thus speed up certain computations. "However, the lack of a specific low-level programming language for neural networks prevents us from taking full advantage of the neural computing framework," the researchers stress in the preprint version of their article.


Jason Kim and Dani Bassett therefore set out to develop a new programming language allowing software virtualization and the logic circuits of a computer to be implemented in a fully distributed way on a neural network. "We bridge the gap between how neural computers and silicon computers are conceived and implemented," they explain.

Their language relies on reservoir computing, a computational framework derived from the theory of recurrent neural networks (neural networks with recurrent connections). The researchers began by working out the effect of each neuron in order to build a very basic neural network capable of performing simple tasks, such as addition. They then wired many of these networks together so that they could carry out more complex operations, reproducing the behavior of logic gates: elementary operations on single bits which, combined into logic circuits, make more complex operations possible.
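
For readers unfamiliar with reservoir computing, the following Python/NumPy sketch shows the general idea under typical assumptions (the network size, scaling and the sine-wave task are arbitrary and not taken from the paper): a fixed random recurrent "reservoir" is driven by an input signal, and only a simple linear readout is fitted. It is the readout that turns the otherwise fixed random network into a useful computational unit.

import numpy as np

# Illustrative echo-state-network sketch (not the authors' framework).
rng = np.random.default_rng(0)
n = 200                                   # reservoir size (arbitrary)
W_in = rng.uniform(-0.5, 0.5, (n, 1))     # fixed, random input weights
W = rng.uniform(-0.5, 0.5, (n, n))        # fixed, random recurrent weights
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # rescale for stable dynamics

u = np.sin(0.2 * np.arange(3000))         # input signal: a plain sine wave
states = np.zeros((len(u), n))
x = np.zeros(n)
for t in range(1, len(u)):
    x = np.tanh(W @ x + W_in.ravel() * u[t - 1])  # reservoir state update
    states[t] = x

# Only the linear readout is fitted (ridge regression), here to predict u[t]
X, y = states[200:], u[200:]              # discard the initial transient
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n), X.T @ y)

print(float(np.mean(np.abs(X @ W_out - y))))  # small error: the readout works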

The network obtained in this way is thus able to do everything a conventional computer can do. In particular, the researchers used it to host another virtual neural network and to run a copy of the game Pong.
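
The reason a handful of logic gates is enough to reach this level of generality can be illustrated with a short Python sketch (plain functions stand in here for the small gate-emulating networks; this is not the authors' code): NAND gates alone can be combined into every other gate and, for instance, into a 1-bit adder.

# Illustrative only: once a network behaves like elementary logic gates,
# those gates can be wired together into circuits.

def NAND(a, b):
    return 1 - (a & b)

def NOT(a):    return NAND(a, a)
def AND(a, b): return NOT(NAND(a, b))
def OR(a, b):  return NAND(NOT(a), NOT(b))
def XOR(a, b): return AND(OR(a, b), NAND(a, b))

def full_adder(a, b, carry_in):
    # A 1-bit full adder: the kind of circuit that, repeated, performs addition
    s = XOR(XOR(a, b), carry_in)
    carry_out = OR(AND(a, b), AND(carry_in, XOR(a, b)))
    return s, carry_out

print(full_adder(1, 1, 0))  # -> (0, 1), i.e. 1 + 1 = binary 10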

Faster networks thanks to neural computing

"By decomposing the internal representation and dynamics of the reservoir into a symbolic basis of its inputs, we define a low-level neural machine code that we use to program the reservoir to solve complex equations and store chaotic dynamical systems as random-access memory," the two experts sum up. This neural network could also greatly simplify the splitting-up of very large computational tasks: such tasks are usually distributed over several processors to increase computation speed, but doing so also demands more power.


In addition, neuromorphic computing (or neural computing) could make these virtual networks run faster. In a conventional computer, data storage (memory) and data processing (the processor) are separate, and data is processed sequentially and synchronously. In a neuromorphic computer, designed to better mimic the functioning of the human brain, storage and computation instead take place within artificial neurons that communicate with one another: large amounts of information are processed in parallel and asynchronously, which reduces the number of operations to be carried out. Such a computer can therefore learn and adapt with very low latency, even in real time.
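
As a loose illustration of that difference (a toy model, not a description of real neuromorphic hardware), the Python sketch below processes events as they arrive, with each "neuron" holding its own state instead of fetching it from a separate memory:

import random

class Neuron:
    def __init__(self, threshold):
        self.potential = 0.0          # local state: memory lives in the neuron
        self.threshold = threshold
        self.targets = []             # downstream neurons

    def receive(self, value, queue):
        self.potential += value       # processing happens where the state is
        if self.potential >= self.threshold:
            self.potential = 0.0
            for t in self.targets:    # emit events rather than wait for a clock
                queue.append((t, 0.6))

random.seed(1)
neurons = [Neuron(threshold=1.0) for _ in range(5)]
for n in neurons:
    n.targets = random.sample([m for m in neurons if m is not n], 2)

# Events are handled as they arrive, not in a fixed, globally clocked order
queue = [(neurons[0], 1.0)]
for _ in range(20):
    if not queue:
        break
    target, value = queue.pop(0)
    target.receive(value, queue)

print([round(n.potential, 2) for n in neurons])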

Asked about this work, Francesco Martinozzi of the University of Leipzig, a specialist in machine learning, confirmed that neural networks that run code, like the one developed by Kim and Bassett, could get better performance out of neural chips, adding that in certain specific areas such computers could significantly outperform standard PC hardware.

But before their computing power can be exploited, these neural networks will first have to be scaled up: while the two researchers have managed here to mimic the workings of a few logic gates, a conventional computer microprocessor contains several billion transistors!

Source: J. Kim and D. Bassett, arXiv

Frank Mccarthy

