AI Smarts Now Come With a Big Price Tag

Texas News Today

Calvin Qi works at a search startup called Glean and would love to use the latest artificial intelligence algorithms to improve his company’s products.

Glean provides tools for searching through applications such as Gmail, Slack, and Salesforce. Qi says new AI techniques for parsing language would help Glean’s customers find the right files and conversations much faster.

However, training such state-of-the-art AI algorithms can cost millions of dollars, so Glean uses smaller, less capable AI models that cannot extract as much meaning from text.

Qi says it is difficult for smaller companies with smaller budgets to match the results of giants like Google and Amazon. Training the most powerful AI models, he says, is “out of the question.”

AI has produced exciting breakthroughs over the past decade. Programs can now beat humans at complex games, drive through city streets under certain conditions, respond to spoken commands, and write coherent text from a short prompt. Writing in particular draws on recent advances in computers’ ability to parse and manipulate language.

These advances are largely the result of feeding algorithms more text as examples to learn from, and giving them more chips with which to digest it. And that costs money.

Consider OpenAI’s language model GPT-3, a large, mathematically simulated neural network that was fed reams of text scraped from the web. GPT-3 finds statistical patterns that let it predict, with striking consistency, which words should follow others. Out of the box, GPT-3 is significantly better than previous AI models at tasks such as answering questions, summarizing text, and correcting grammatical errors. By one measure, it is 1,000 times more capable than its predecessor, GPT-2. By some estimates, training GPT-3 cost around $5 million.
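As a toy illustration of the core idea of predicting which words follow others from statistics over training text, consider the sketch below. It is not how GPT-3 works internally (GPT-3 learns far richer patterns with a huge neural network); this hypothetical example just counts word pairs in a tiny corpus.

```python
from collections import Counter, defaultdict

# Toy next-word predictor: count which word follows which in a corpus.
# This bigram model only illustrates the basic idea of predicting the
# next word from statistical patterns; GPT-3 does something far more
# sophisticated at a vastly larger scale.
corpus = "the cat sat on the mat and the cat slept on the mat".split()

follows = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    follows[current_word][next_word] += 1

def predict_next(word):
    """Return the most frequent follower of `word` seen in the corpus."""
    candidates = follows.get(word)
    return candidates.most_common(1)[0][0] if candidates else None

print(predict_next("the"))  # -> 'cat' (ties broken by first occurrence)
```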

“If GPT-3 were accessible and cheap, it would completely supercharge our search engine,” Qi says. “It’s really, really powerful.”

The soaring cost of training advanced AI is also a problem for established companies looking to build out their AI capabilities.

Dan McCreary leads a team within one division of Optum, a health IT company, that uses language models to analyze transcripts of calls in order to identify higher-risk patients and recommend referrals. He says that training a language model even one-thousandth the size of GPT-3 can quickly eat up a team’s budget. Models need to be trained for specific tasks, and renting the necessary computers and programs from cloud computing companies can cost as much as $50,000.
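To see how a cloud training bill reaches that scale, here is a rough, hypothetical back-of-envelope sketch. Every number below (GPU count, hourly rate, training duration) is an illustrative assumption, not Optum’s workload or any provider’s actual pricing.

```python
# Hypothetical back-of-envelope estimate of a cloud fine-tuning bill.
# All figures are illustrative assumptions, not real pricing.
gpus = 64           # assumed number of rented accelerators
hourly_rate = 3.00  # assumed dollars per GPU-hour
hours = 24 * 11     # assumed ~11 days of training

cost = gpus * hourly_rate * hours
print(f"${cost:,.0f}")  # -> $50,688, in the ballpark of the $50,000 figure
```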

McCreary says cloud computing providers have little reason to lower those costs. “We can’t trust that cloud providers are working to reduce the costs of building our AI models,” he says. He is looking into buying specialized chips designed to accelerate AI training.

One reason AI has progressed so rapidly recently is that many academic labs and startups could download and use the latest ideas and techniques. Algorithms that produced breakthroughs in image processing, for example, emerged from academic labs and were developed using off-the-shelf hardware and openly shared datasets.

Over time, though, it has become increasingly clear that advances in AI are tied to an exponential increase in the underlying computing power.

Of course, large companies have always had advantages in budget, scale, and reach. And large amounts of computing power are a staple of industries such as drug discovery.

Some are now pushing things further still. Microsoft said this week that, working with Nvidia, it had built a language model more than twice the size of GPT-3. Researchers in China say they have built a language model four times larger than that.

“The cost of training AI is absolutely going up,” says David Kanter, executive director of MLCommons, an organization that tracks the performance of chips designed for AI. The idea that larger models can unlock valuable new capabilities can be seen in many areas of the tech industry, he says. It may explain why Tesla is designing its own chips just to train AI models for autonomous driving.

Some worry that the rising cost of tapping the latest and greatest technology could slow the pace of innovation by reserving it for the biggest companies, and those that lease their tools.

“I think it undermines innovation,” says Chris Manning, a Stanford professor who specializes in AI and language. “When there are only a handful of places where people can play with the innards of models at that scale, that has to massively reduce the amount of creative exploration that happens.”
