Artificial Intelligence

OpenAI chief seeks new Microsoft funds to build ‘superintelligence’

Author: Editors Desk Source: Financial Times
November 13, 2023 at 11:25
OpenAI’s chief Sam Altman said the partnership with Microsoft would ensure ‘that we both make money on each other’s success, and everybody is happy’ © FT montage/Bloomberg
Sam Altman expects big tech group will back start-up’s mission to create software as intelligent as humans
OpenAI plans to secure further financial backing from its biggest investor Microsoft as the ChatGPT maker’s chief executive Sam Altman pushes ahead with his vision to create artificial general intelligence (AGI) — computer software as intelligent as humans.

In an interview with the Financial Times, Altman said his company’s partnership with Microsoft and its chief executive Satya Nadella was “working really well” and that he expected “to raise a lot more over time” from the tech giant, among other investors, to keep up with the punishing costs of building more sophisticated AI models.

Microsoft earlier this year invested $10bn in OpenAI as part of a “multiyear” agreement that valued the San Francisco-based company at $29bn, according to people familiar with the talks.

Asked if Microsoft would keep investing further, Altman said: “I’d hope so.” He added: “There’s a long way to go, and a lot of compute to build out between here and AGI . . . training expenses are just huge.”

Altman said revenue growth had been good this year, without providing financial details, and that the company remained unprofitable because of training costs. But he said the Microsoft partnership would ensure “that we both make money on each other’s success, and everybody is happy”.

In the latest sign of how OpenAI intends to build a business model on top of ChatGPT, the company announced a suite of new tools and upgrades to its existing GPT-4 model for developers and companies at a November 6 event attended by Nadella.

The tools include custom versions of ChatGPT that can be adapted and tailored for specific applications, and a GPT Store, a marketplace for the best apps. The eventual aim is to split revenues with the most popular GPT creators, in a business model similar to Apple’s App Store.

“Right now, people [say] ‘you have this research lab, you have this API [software], you have the partnership with Microsoft, you have this ChatGPT thing, now there is a GPT store’. But those aren’t really our products,” Altman said. “Those are channels into our one single product, which is intelligence, magic intelligence in the sky. I think that’s what we’re about.”

To build out the enterprise business, Altman said he had hired executives such as chief operating officer Brad Lightcap, who previously worked at Dropbox and the start-up accelerator Y Combinator.

Altman, meanwhile, splits his time between two areas: research into “how to build superintelligence” and ways to build up computing power to do so. “The vision is to make AGI, figure out how to make it safe . . . and figure out the benefits,” he said.

Pointing to the launch of GPTs, he said OpenAI was working to build more autonomous agents that can perform tasks and actions, such as executing code, making payments, sending emails or filing claims.

“We will make these agents more and more powerful . . . and the actions will get more and more complex from here,” he said. “The amount of business value that will come from being able to do that in every category, I think, is pretty good.”

The company is also working on GPT-5, the next generation of its AI model, Altman said, although he did not commit to a timeline for its release.

It will require more data to train on, which Altman said would come from a combination of publicly available data sets on the internet, as well as proprietary data from companies.

OpenAI recently put out a call to organisations for large-scale data sets that “are not already easily accessible online to the public today”, particularly long-form writing or conversations in any format.

While GPT-5 is likely to be more sophisticated than its predecessors, Altman said it was technically hard to predict exactly what new capabilities and skills the model might have.

“Until we go train that model, it’s like a fun guessing game for us,” he said. “We’re trying to get better at it, because I think it’s important from a safety perspective to predict the capabilities. But I can’t tell you here’s exactly what it’s going to do that GPT-4 didn’t.”

To train its models, OpenAI, like most other large AI companies, uses Nvidia’s advanced H100 chips, which became Silicon Valley’s hottest commodity over the past year as rival tech companies raced to secure the crucial semiconductors needed to build AI systems.

Altman said there had been “a brutal crunch” all year due to supply shortages of Nvidia’s $40,000-a-piece chips. He said his company had received H100s, and was expecting more soon, adding that “next year looks already like it’s going to be better”.

However, as other players such as Google, Microsoft, AMD and Intel prepare to release rival AI chips, the dependence on Nvidia is unlikely to last much longer. “I think the magic of capitalism is doing its thing here. And a lot of people would like to be Nvidia now,” Altman said.

OpenAI has already taken an early lead in the race to build generative AI — systems that can create text, images, code and other multimedia in seconds — with the release of ChatGPT almost a year ago.

Despite that consumer success, Altman said OpenAI’s priority remains making progress towards artificial general intelligence. Large language models (LLMs), which underpin ChatGPT, are “one of the core pieces . . . for how to build AGI, but there’ll be a lot of other pieces on top of it”.

While OpenAI has focused primarily on LLMs, its competitors have been pursuing alternative research strategies to advance AI.

Altman said his team believed that language was a “great way to compress information” and therefore a route to developing intelligence, a factor he thought the likes of Google DeepMind had missed.

“[Other companies] have a lot of smart people. But they did not do it. They did not do it even after I thought we kind of had proved it with GPT-3,” he said.

Ultimately, Altman said “the biggest missing piece” in the race to develop AGI is what is required for such systems to make fundamental leaps of understanding.

“There was a long period of time where the right thing for [Isaac] Newton to do was to read more math textbooks, and talk to professors and practice problems . . . that’s what our current models do,” said Altman, using an example a colleague had previously used.

But he added that Newton was never going to invent calculus by simply reading about geometry or algebra. “And neither are our models,” Altman said.

“And so the question is, what is the missing idea to go generate net new . . . knowledge for humanity? I think that’s the biggest thing to go work on.”