AI is re-inventing what a computer is

Texas News Today

Fall 2021: the season of pumpkins, pecan pie, and shiny new phones. Every year, right on cue, Apple, Samsung, Google, and others drop their latest releases. These fixtures on the consumer-tech calendar no longer inspire the surprise and wonder of the early days. But behind all the marketing glitz, something remarkable is going on.

Google’s newest product, the Pixel 6, is the first smartphone to ship with a separate AI-dedicated chip alongside its standard processor. The chips that run recent iPhones likewise include what Apple calls a “Neural Engine,” also dedicated to AI. Both are suited to the kinds of computation involved in training and running machine-learning models on the device, such as the AI that powers the camera. Almost without our noticing, AI has become part of our daily lives. And it is changing how we think about computing.

What does that mean? Computers haven’t changed much in 40 or 50 years. They are smaller and faster, but they are still boxes containing processors that execute instructions from humans. AI changes that in at least three ways: how computers are made, how they are programmed, and how they are used. Ultimately, it will change what they are for.

“The core of computing is changing from number-crunching to decision-making,” says Pradeep Dubey, director of Intel’s Parallel Computing Lab. Or, as Daniela Rus, director of MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), puts it, AI is freeing computers from their boxes.

hurry up and slow down

The first change concerns how computers, and the chips that control them, are made. Traditional gains in computing came as machines got faster at carrying out one calculation after another. For decades, chipmakers kept pace with Moore’s Law, and the world benefited from faster chips delivered with metronomic regularity.

However, the deep-learning models behind today’s AI applications require a different approach: they need vast numbers of less precise calculations to be carried out all at the same time. That calls for a new kind of chip, one that moves data around as quickly as possible and makes it available when and where it is needed. When deep learning burst onto the scene about a decade ago, there was already a specialized chip that was rather good at this: the graphics processing unit (GPU), designed to refresh a full screen of pixels dozens of times a second.
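To make the contrast concrete, here is a minimal sketch (illustrative only, not any vendor’s actual workload) of why this kind of hardware fits deep learning: a neural-network layer boils down to a matrix multiplication, millions of independent multiply-adds that can run side by side and that tolerate reduced precision.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal((64, 512)).astype(np.float32)   # a batch of inputs
w = rng.standard_normal((512, 256)).astype(np.float32)  # one layer's weights

# One bulk operation: roughly 8.4 million multiply-adds, all independent of
# one another -- exactly the shape of work GPUs and TPUs are built for.
y_full = x @ w

# The same arithmetic at half precision barely changes the answer, which is
# why AI-oriented chips happily trade exactness for throughput.
y_half = (x.astype(np.float16) @ w.astype(np.float16)).astype(np.float32)
print(np.abs(y_full - y_half).max())  # small compared with the values themselves
```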

A computer can be anything. Indeed, most household objects, from toothbrushes to light switches to doorbells, already come in a smart version.

Today, chipmakers such as Intel, Arm, and Nvidia, which supplied many of the first GPUs, are pivoting to make hardware tailored specifically for AI. Google and Facebook are also forcing their way into this industry for the first time, in a race to find an AI edge through hardware.

For example, the chip in the Pixel 6 is a new mobile version of Google’s Tensor Processing Unit (TPU). Unlike traditional chips, which are geared toward ultrafast, precise calculations, TPUs are designed for the high-volume, low-precision calculations that neural networks require. Google has used these chips internally since 2015: they process people’s photos and natural-language search queries, and Google’s sister company DeepMind uses them to train its AIs.

Over the past few years, Google has made TPUs available to other companies, and these chips, as well as similar ones developed by others, are becoming the default in data centers around the world.

AI is even helping to design its own computing infrastructure. In 2020, Google used a reinforcement-learning algorithm, a type of AI that learns to solve tasks through trial and error, to design the layout of a new TPU. The AI eventually came up with strange new designs that no human would have thought of, but they worked. This kind of AI could one day develop better, more efficient chips.
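For a sense of what “trial and error” means here, the toy sketch below (purely illustrative, not Google’s chip-floorplanning system) shows the basic reinforcement-learning loop: try an option, observe a noisy reward, and shift future choices toward whatever worked.

```python
import numpy as np

rng = np.random.default_rng(2)
true_quality = np.array([0.2, 0.5, 0.8])  # hidden payoff of three candidate designs
estimates = np.zeros(3)                    # the agent's running estimate of each
counts = np.zeros(3)

for step in range(2000):
    # Mostly exploit the best-looking option, occasionally explore at random.
    if rng.random() < 0.1:
        a = int(rng.integers(3))
    else:
        a = int(np.argmax(estimates))
    reward = float(rng.random() < true_quality[a])  # noisy feedback from a trial
    counts[a] += 1
    estimates[a] += (reward - estimates[a]) / counts[a]  # update running average

print(np.argmax(estimates))  # almost always 2, the genuinely best option
```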

show, don’t tell

The second change concerns how we tell computers what to do. For the past 40 years we have been programming computers; for the next 40 we will be training them, says Chris Bishop, head of Microsoft Research in the UK.

Traditionally, to get a computer to do something like recognize speech or identify objects in an image, programmers first had to come up with the rules for the computer themselves.

With machine learning, programmers no longer write those rules. Instead, they create a neural network that learns the rules for itself. It is a fundamentally different way of thinking.
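A minimal sketch of that difference, using made-up data: instead of writing the rule ourselves, we hand the computer labelled examples and let a one-neuron model recover the rule on its own.

```python
import numpy as np

rng = np.random.default_rng(1)

# The rule we *could* have written by hand: label is 1 when x0 + x1 > 1.
def hand_written_rule(points):
    return (points[:, 0] + points[:, 1] > 1.0).astype(float)

# Training data: examples plus labels; the rule itself is never given to the model.
X = rng.uniform(0, 1, size=(1000, 2))
y = hand_written_rule(X)

# A one-neuron "network" trained by gradient descent learns the boundary itself.
w = np.zeros(2)
b = 0.0
for _ in range(5000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid prediction for every example
    grad = p - y                             # gradient of the cross-entropy loss
    w -= 0.5 * (X.T @ grad) / len(X)
    b -= 0.5 * grad.mean()

test = np.array([[0.9, 0.4], [0.2, 0.3]])
print(1.0 / (1.0 + np.exp(-(test @ w + b))) > 0.5)  # expected: [ True False]
```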
