How artificial intelligence is redefining computing as we know it

The emergence of the personal computer and the Internet in the twentieth century changed our daily lives. Today we have access to information anywhere and at any time, and many of our tasks are now automated.

In recent years, another, quieter revolution has worked its way into our daily lives, one that could completely transform information technology as we know it, the Technology Review website points out: artificial intelligence (AI).

Over the last 40 or 50 years, computers have become more compact and faster, but they have remained essentially boxes loaded with microprocessors that execute instructions written by human programmers.

Now, artificial intelligence is redefining computing on three fronts: how computers are made, how they are programmed, and how they are used, and even what they are used for.

The first change concerns hardware design. For decades, advances in computing have followed Moore's Law, with the number of transistors on a chip doubling roughly every two years. AI calls for a different approach: it relies on enormous numbers of calculations carried out in parallel.
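
To put that doubling in perspective, here is a back-of-the-envelope sketch (my own illustration, not from the article) of how transistor counts compound under Moore's Law:

```python
# A rough illustration of Moore's Law: doubling every two years multiplies
# the transistor count by 2 ** (years / 2).
def transistors_after(initial_count, years, doubling_period=2):
    return initial_count * 2 ** (years / doubling_period)

# Starting from one million transistors, twenty years of doubling yields
# roughly a billion (1_000_000 * 2**10 = 1.024 billion).
print(transistors_after(1_000_000, 20))
```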

This means that a new type of chip is needed, the Technology Review website notes. Manufacturers such as Google, Intel, and Nvidia now offer chips designed specifically for AI, such as Google's TPU (Tensor Processing Unit), built to perform very fast but lower-precision calculations of the kind used by neural networks.
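
As a rough illustration of that "fast but less accurate" trade-off (my own sketch, not tied to any particular chip), the same sum computed in 16-bit floating point drifts further from the true value than the 32-bit version, yet half-precision arithmetic is far cheaper to implement in hardware:

```python
import numpy as np

# The exact answer is 10_000 * 0.1 = 1000, but 0.1 is not exactly
# representable in binary floating point, and the rounding error grows
# as precision shrinks.
values = np.full(10_000, 0.1)

print(np.sum(values.astype(np.float32)))  # very close to 1000
print(np.sum(values.astype(np.float16)))  # visibly off: half precision
```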


New laws

The second change concerns programming. "For forty years we have programmed computers; for the next forty years we will train them," says Chris Bishop, head of Microsoft Research in the UK. There is no longer any need to write the rules: the neural network learns them on its own from the examples it is given.
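
As a toy example of that shift (my own sketch, not from the article or from Microsoft Research), the rule "output 1 when the first number is larger" is never written down; a single artificial neuron infers it from labelled examples:

```python
# Training examples: (a, b) pairs labelled 1 when a is larger than b.
examples = [((5, 3), 1), ((2, 7), 0), ((9, 1), 1), ((4, 6), 0),
            ((1, 8), 0), ((6, 2), 1), ((7, 5), 1), ((3, 9), 0)]

w = [0.0, 0.0]  # one weight per input
bias = 0.0

def predict(x):
    return 1 if w[0] * x[0] + w[1] * x[1] + bias > 0 else 0

# Perceptron learning rule: nudge the weights whenever a prediction is wrong.
for _ in range(50):
    for x, label in examples:
        error = label - predict(x)
        w[0] += 0.1 * error * x[0]
        w[1] += 0.1 * error * x[1]
        bias += 0.1 * error

print(predict((10, 2)), predict((3, 9)))  # the learned rule prints: 1 0
```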

Developed by the company OpenAI, the GPT-3 system can learn a language on its own, write news articles, write computer programs, or compose an entirely new piece of music.
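
GPT-3 itself is only reachable through OpenAI's commercial API, but the idea of prompting a pretrained language model can be sketched with its freely available predecessor GPT-2, here via the Hugging Face transformers library (my own illustration, not from the article):

```python
from transformers import pipeline

# Download a small pretrained language model and generate a continuation
# of a prompt. The output varies from run to run and is far weaker than
# GPT-3, but the principle (predicting plausible next words) is the same.
generator = pipeline("text-generation", model="gpt2")

result = generator("Artificial intelligence is redefining computing because",
                   max_new_tokens=40, num_return_sequences=1)
print(result[0]["generated_text"])
```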

Finally, the third revolution concerns how we use devices. Today, devices no longer need a keyboard or mouse to interact with us, the Technology Review website points out. Virtually any everyday object (a refrigerator, a vacuum cleaner, a doorbell, a toothbrush) can become "smart".

Objects able to guess our desires before we even formulate them, and to communicate with each other. "When I was a kid, my favourite scene was the one in The Sorcerer's Apprentice when Mickey asks a broom to help him tidy up," says Daniela Rus, director of MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL). "Soon we won't need any magic to make that happen." The computer has come out of its box and turned into a broom.

Frank Mccarthy
