Jensen Huang responds to OpenAI CEO Altman's $7 trillion chip plan with a laugh

No sooner had Altman been revealed to be raising $7 trillion to compete with NVIDIA and reshape the global semiconductor landscape than Huang responded: old friend, that's an exaggeration.

Let's get specific. His response came with a touch of sarcasm (tongue firmly in cheek):


$7 trillion can obviously buy all the GPUs.

If you assume that computers won't get faster, you might conclude that we'd need 14 planets, 3 galaxies, and 4 suns to fuel it all. But computer architecture continues to advance.

In short, Huang believes that more efficient, lower-cost chips will continue to emerge, which will make Altman's “$7 trillion” massive investment less necessary.

That said, Huang didn't rule anything out. He also emphasized that investment growth in the AI field will not stop in the short term, and predicted that AI data centers will double in size within five years.

In fact, netizens were already stunned by the revelation of Altman's $7 trillion plan. Gartner's forecast put the global semiconductor industry's total 2023 revenue at US$533 billion; US$7 trillion is more than 13 times that figure.
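That multiple can be sanity-checked with a few lines of arithmetic (a quick back-of-the-envelope sketch using the figures quoted above):

```python
# Back-of-the-envelope check of Altman's reported raise against
# Gartner's 2023 semiconductor revenue forecast (figures from the
# article above; both in US dollars).
altman_raise = 7_000_000_000_000      # $7 trillion
semi_revenue_2023 = 533_000_000_000   # $533 billion (Gartner forecast)

multiple = altman_raise / semi_revenue_2023
print(f"{multiple:.1f}x the industry's annual revenue")
```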

Netizens estimate that this sum would be enough not only to swallow NVIDIA, TSMC, Intel, Samsung, Qualcomm, Broadcom, AMD, ASML, and a string of other leading semiconductor companies, but to buy another Meta with the change left over.


So what specific information did Lao Huang share this time? If you are interested, here is the transcript~

(Compiled by Kimi and ChatGPT, assisted by human editors)

Jen-Hsun Huang: The most important AI event last year was Llama 2

Host: I want to start with a question that has always been on my mind: how many GPUs can $7 trillion buy?

Jen-Hsun Huang: Obviously, all GPUs.

Host: I'd love to ask Sam this question; it's a very big number (laughs). Speaking of ambition, we don't lack it, but how should today's governments plan for artificial intelligence? What suggestions do you have?

Jen-Hsun Huang: First of all, this is an amazing time, because we are at the beginning of a new industrial revolution. In the past, such revolutions were driven by steam engines, electricity, PCs, and the Internet; now it is artificial intelligence.

Unprecedentedly, we are experiencing two transformations simultaneously: the end of general-purpose computing and the beginning of accelerated computing.

Using CPU computing as the basis for all work is no longer feasible today. The reason is that 60 years have passed since the CPU was invented in 1964, the year the IBM System/360 was released. We've been riding that wave of technology for 60 years, and now we're at a new beginning with accelerated computing.

If you want to achieve sustainable computing, energy-efficient computing, high-performance computing, cost-effective computing, you can no longer rely on general-purpose computing. You need specialized domain-specific acceleration, and that's what's driving the growth in accelerated computing. It makes possible a new type of application – artificial intelligence.

The question is which is cause and which is effect. Accelerated computing came first and made new types of applications possible; many applications are being accelerated today.

Now that we are at the beginning of this new era, what happens next?

Currently, the total value of global data centers is approximately US$1 trillion. In the next 4-5 years, this number will grow to US$2 trillion, and these data centers will become the source of global software operations. All of this will be accelerated, and this accelerated computing architecture is ideally suited for the next generation of software, generative artificial intelligence. This is the core change that is happening right now.
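Growing from $1 trillion to $2 trillion in four to five years implies a steady double-digit annual growth rate, which a short sketch can derive (assuming smooth compounding over five years; the per-year rate is my calculation, not a figure from the talk):

```python
# Implied compound annual growth rate (CAGR) if global data-center
# value doubles from $1T to $2T over five years.
start_value = 1.0   # trillions of US dollars
end_value = 2.0
years = 5

cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied annual growth: {cagr:.1%}")  # about 14.9% per year
```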

In the process of replacing general-purpose computing, remember that the performance of the architecture is improving at the same time. So you can't just assume you'll buy more computers; you have to assume computers will get faster. Therefore, the actual computing resources required are not that large. Otherwise, if you assume that computers won't get faster, you might conclude that we'd need 14 planets, 3 galaxies, and 4 suns to fuel it all.

One of our greatest contributions over the past 10 years has been to advance computing and artificial intelligence a million times. So, whatever needs you think are driving the world, you have to consider that it's going to be a million times faster and more efficient.
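To put "a million times in 10 years" in perspective, the equivalent year-over-year improvement factor works out as follows (a simple compounding calculation of mine, not a figure from the talk):

```python
# A million-fold improvement over 10 years corresponds to a constant
# yearly multiplier m such that m ** 10 == 1_000_000.
total_gain = 1_000_000
years = 10

yearly_factor = total_gain ** (1 / years)
print(f"Equivalent gain per year: {yearly_factor:.2f}x")  # ~3.98x
```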

Host: Regarding the fear of AI taking over the world, I think we need to clarify which concerns are real and which are hype. What do you think is the biggest problem right now?

Jen-Hsun Huang: Very good question. It is absolutely right that, first and foremost, we must develop new and creative technologies safely. Whether it's aircraft, automobiles, manufacturing systems, or medicine, all of these industries are heavily regulated today. Those regulations must be expanded and enhanced to account for the ways AI will reach us through products and services.

Now, some interest groups are trying to scare people and mystify AI to stop others from acting on the technology. I think this is a mistake; we want to democratize AI.

If you ask me what the most important AI event was last year, I think it was Llama 2, an open-source model. Or Falcon, another excellent model. And Mistral, and so on. All of these are built on transparency and explainability. Because of these open-source models, many different innovations in safety, alignment, guardrails, reinforcement learning, and more become possible.

Getting everyone on board with the progress of AI is probably the most important thing, rather than convincing people that AI is too complex, too dangerous, too mysterious, and that only two or three people in the world can do it. That, I think, would be a huge mistake.

Host: Do you think the next era of AI will continue to be built on GPUs? What breakthroughs do you think will happen in the future?

Jen-Hsun Huang: In fact, almost all big companies in the world are doing in-house development. Google, AWS, Microsoft, Meta are all making their own chips.

NVIDIA GPUs get the attention because they are the only platform that is open to everyone.

One unified architecture covers all of these areas. Our CUDA architecture can adapt to any emerging model architecture, whether CNN, RNN, LSTM, or now the Transformer. New variants such as Vision Transformers and bird's-eye-view Transformers keep being created, and all of them can be developed on NVIDIA GPUs.

Host: The thing about AI is that it has evolved enormously in a very short time, so the infrastructure used five years ago may be very different from the infrastructure used today.

But Huang's point stands: NVIDIA will always have a place.

Host: Let's change the topic and set AI aside for a moment to talk about education. Being at the forefront of technology, what should people pay attention to in education? What should people learn, and how should they educate their children?

Jen-Hsun Huang: Wow, that's a great question, but my answer might sound like the exact opposite of what people expect.

You may remember that over the past 10 or 15 years, almost everyone who answered this question in a formal setting would say that computer science and programming are something that everyone should learn.

But in reality, it's almost the exact opposite: our job is to create computing technology so that no one needs to "program" in the traditional sense, so that everyone in the world can become a programmer.

This is the miracle brought by artificial intelligence. For the first time, we are closing the technology gap (in programming) and making AI accessible to more people, which is why AI is being talked about almost everywhere.

For the first time, everyone in a company can become a technology expert. It's a perfect time, because the technology gap has closed.

Problems that required specialized expertise in areas such as digital biology, education of young people, manufacturing or agriculture are now within everyone's reach.

Because people now have computers that follow human instructions and help automate work, boosting productivity and efficiency, I think this is an excellent time. Of course, people need to learn to use such tools right away; that is the urgent issue.

Also realize that it is now easier to participate in AI than at any time in the history of computing, and society has a responsibility to improve everyone's skills. At the same time, I believe that this process of improvement will be enjoyable and surprising.

Host: So, if I were to choose a college major, what advice would you give me?

Jen-Hsun Huang: I would start from one question: I think the most complex science to understand is biology, especially human biology.

It not only covers a wide range of content, but is also very complex and difficult to understand. The key is that it will have a huge impact.

We call this field (biology) the life sciences, and the discipline related to medicine is called drug discovery.

But in industries like automobiles and computing, no one says "car discovery" or "computer discovery" or "software discovery"; they call it engineering.

Every year, our software, chips, and infrastructure get better than the year before, but progress in life sciences is sporadic.

If I were given the chance to choose again, I would realize that the engineering of life science, call it life engineering, is coming soon. It will become an engineering field, not just a purely scientific one.

So, I hope today's young people will enjoy working with proteins, enzymes and materials, using engineering techniques to make them more energy-efficient, lightweight, durable and more sustainable.

In the future, all these inventions will be part of engineering rather than scientific discoveries.

One More Thing

Just this Monday, Nvidia's market value surpassed Amazon's, making it the fourth most valuable company on the U.S. stock market, behind Microsoft, Apple, and Google parent Alphabet.

However, Amazon reclaimed fourth place by the close, finishing with a market value of US$1.79 trillion to Nvidia's roughly US$1.78 trillion.

Since the beginning of 2024, Nvidia's stock price has risen steadily, up nearly 50%, driven by strong global demand for chips. By one estimate, Nvidia's market value has grown by approximately US$600 billion this year, exceeding its gain over the last seven months of 2023.
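Those two figures are roughly consistent with each other, as a quick check shows (using the article's approximate numbers; real market data will differ slightly):

```python
# Sanity check: if Nvidia's market value rose ~50% in 2024 and the
# gain was ~$600B, the implied start-of-year and current values follow.
gain = 0.6    # trillions of US dollars (~$600B increase)
rise = 0.50   # ~50% year-to-date rise

start_value = gain / rise          # implied value at the start of 2024
end_value = start_value + gain     # implied current value
print(f"Start of 2024: ${start_value:.2f}T, now: ${end_value:.2f}T")
```

The implied current value of about $1.8 trillion matches the closing market cap quoted above.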
