Annual revenue exceeds US$1 billion? OpenAI’s “money-making trick”

Original Source: Light Cone Intelligence

Authors: Hao Xin, Liu Yuqi

Image source: Generated by Unbounded AI

"I saw him for less than three minutes, and I was thinking, ah, this is probably what happened to 19-year-old Bill Gates!"

In the eyes of YC founder Paul Graham, the 28-year-old Sam Altman was just like the 19-year-old Bill Gates: out of a prestigious school, a maverick, a firm believer that the world can be changed, altruism and extreme ambition intertwined. If anything, Altman is more radical, with ambitions that spill past even the boundaries Silicon Valley can accommodate.

In short, Graham was the first to recognize Altman's talent: in 2014 he picked Altman to succeed him as CEO of the startup incubator. As is well known, after founding OpenAI, Altman found his second patron: Satya Nadella, the current CEO of Microsoft.

The facts have borne out Graham's judgment. In the five years after Altman took over as CEO of the YC incubator, he opened up a string of new business lines, driving the total valuation of YC's portfolio to roughly US$150 billion, with an investment network spanning more than 4,000 founders and more than 1,900 companies. Now OpenAI, which has drawn extreme verdicts from both admirers and critics, has handed in an early report card of its own.

**On August 30, the foreign outlet The Information reported that OpenAI expects to bring in more than US$1 billion in revenue over the next 12 months from selling its artificial intelligence software and the computing power behind it.**

As soon as the news came out, there was an uproar.

After all, just three months earlier OpenAI was still said to be walking a "life-and-death line." A report by the Indian media outlet Analytics India Magazine claimed that running its AI service ChatGPT alone costs OpenAI about US$700,000 a day, that the company is burning through cash, and that if it does not speed up commercialization it could be forced to file for bankruptcy by the end of 2024.

Such speculation was not baseless. Public data show that OpenAI's revenue in 2022 was about US$36 million, while its spending that year reached US$544 million. In other words, in that one year alone the company ran a net loss of roughly US$500 million.

So when OpenAI, long described as a "gold-swallowing beast," suddenly began pulling in revenue at scale, everyone grew curious: what money-making trick has Altman pulled off?

**More importantly, OpenAI is a barometer of the commercial potential of large language models. Tracing the sources of its revenue also pries open the long-awaited "Pandora's box" of how general-purpose large models get commercialized.**

For the industry as a whole, this boosts confidence on one hand; on the other, once OpenAI's business model is fully up and running and sets a new template, another wave of the "hundred-model commercialization war" can be expected before long, pushing AGI quickly into its second stage.

OpenAI's commercial landscape

From the moment it launched ChatGPT, OpenAI has trained a hunter's unblinking gaze on commercialization.

GPT-3.5 arrived on November 30 last year. Little more than two months later, OpenAI moved to monetize it, launching the ChatGPT Plus subscription plan.

Since May this year, OpenAI has pushed further down the commercial road, rolling out one "big move" after another:

  • On May 15, the ChatGPT iOS app was launched.
  • On May 31, third-party plugin support (plugins) for GPT-4 was opened up across the board.
  • On June 21, it was reported that OpenAI planned to launch a large-model marketplace akin to an "App Store."
  • On June 23, it was reported that OpenAI planned a "personal work assistant" version of ChatGPT.
  • On August 29, OpenAI released ChatGPT Enterprise, positioned squarely against Bing Chat Enterprise.

(Source: OpenAI official website)

According to OpenAI's official website, its products currently fall into two main categories. The first is API-based: callable GPT models, DALL·E (text-to-image) models, and Whisper (speech recognition), along with developer-facing capabilities such as Chat (dialogue), Embeddings (vectorization), Analysis, and Fine-tuning. The second is built around the ChatGPT chatbot itself, offered in a personal edition and an enterprise edition.

Drawing on OpenAI's official website and a review of public information, Light Cone Intelligence finds that OpenAI currently rests on two main revenue pillars.

(Chart by Light Cone Intelligence)

**First, since ChatGPT's birth, the model OpenAI has leaned on most heavily is pay-per-use API calls.** Under this model, users can tap virtually all of the multimodal capabilities OpenAI has built, spanning the underlying large language models, model deployment, and model development, at a friendly price of only a few cents per call. OpenAI does not say whether these users are individuals or enterprises, but according to foreign media reports, beyond a large base of individual users, well-known companies such as Jasper, Slack, Salesforce, and Morgan Stanley were all early customers.
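
For a sense of what "a few cents per call" means in practice, here is a minimal sketch of the pay-per-use pattern using the pre-1.0 `openai` Python package; the per-1,000-token prices are illustrative assumptions rather than official quotes, and the API key is a placeholder.

```python
# Minimal sketch of OpenAI's pay-per-call API model (pre-1.0 "openai" package style).
# The prices below are illustrative assumptions, not official quotes.
import openai

openai.api_key = "sk-..."  # placeholder key

# Assumed per-1K-token prices, used only to illustrate the billing logic
PRICE_PER_1K_INPUT = 0.0015   # USD, prompt tokens
PRICE_PER_1K_OUTPUT = 0.002   # USD, completion tokens

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Summarize OpenAI's revenue model in one sentence."}],
)

usage = response["usage"]  # token counts returned with every call
cost = (usage["prompt_tokens"] / 1000) * PRICE_PER_1K_INPUT \
     + (usage["completion_tokens"] / 1000) * PRICE_PER_1K_OUTPUT

print(response["choices"][0]["message"]["content"])
print(f"Estimated cost of this call: ${cost:.5f}")
```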

It is worth noting that under this charging model OpenAI also supplies its biggest backer, Microsoft, with a range of capabilities including code generation, GPT-4, text-to-image, and ChatGPT, which Microsoft integrates into its cloud services, search, office software, and many other products. How much OpenAI earns from this is not yet known, but taking the Azure cloud business as an example, what Microsoft pays to use these OpenAI capabilities is in line with the listed prices. At the same time, all of OpenAI's technology runs free of charge on Microsoft's Azure cloud infrastructure.

**The second pillar is subscription fees built on the ChatGPT product.** In its early days, ChatGPT brought OpenAI a huge volume of training data at no cost, and within nine months of launch it had broken the user-growth records of TikTok and Instagram, becoming the fastest application ever to reach 100 million users.

After all, sewing wedding clothes for Microsoft and letting users have a free taste is not OpenAI's end goal. To turn a profit, it has to find ways to raise its paying-user ratio. After ChatGPT's user numbers peaked in June and then declined, OpenAI began shifting its commercial focus from consumers to businesses, in effect going after its patron's own customers.

OpenAI says many large companies are interested in its new enterprise-grade product. Since ChatGPT's launch it has been adopted by teams at more than 80% of Fortune 500 companies, and large firms including Block, Canva, Estée Lauder, and PwC tried the beta of ChatGPT Enterprise in advance. OpenAI also plans to follow up with a business edition of ChatGPT for smaller organizations, offering more customization options.

**In Light Cone Intelligence's view, of everything launched since ChatGPT, the most anticipated product is without doubt the enterprise edition. After being criticized over data privacy and security, OpenAI has adjusted the product accordingly.**

ChatGPT Enterprise is driven by GPT-4, OpenAI's most advanced language model. Enterprise users get priority access to GPT-4 with the usage cap removed, and it runs up to twice as fast as standard GPT-4. The enterprise edition also accepts longer inputs: the context window is expanded to 32,000 tokens, or roughly 25,000 words.

**OpenAI promises that customer prompts and all other customer data will not be used for model training.** Users control how long data is retained, and any deleted conversations are automatically purged from ChatGPT's systems within 30 days.

On the deployment side, the enterprise edition adds a brand-new admin console for managing users in bulk, with single sign-on, domain verification, and a usage-statistics dashboard, making it suitable for large-scale rollouts. It also bundles a fuller toolchain, including vectorization tools and advanced data-analysis tools.

**Seen this way, OpenAI is trying to shift from its earlier low-priced, per-token API billing toward a diversified model built on higher-priced, higher-stickiness B2B subscriptions plus charges for customized solutions.**

**The more it grows, the more it loses?**

Rising revenue does not mean OpenAI has truly started to make money. Set against what OpenAI has invested, US$1 billion is little more than a drop in the bucket. And once commercialization is fully under way, with the user base growing and GPT-4 research continuing to drive demand for computing power, costs will keep rising with scale; like many technology companies before it, OpenAI will find it hard to escape the curse of "the more revenue grows, the more it loses."

OpenAI's high costs are plain for all to see. Based on Light Cone Intelligence's analysis, they currently break down into the following parts:

  1. **Talent costs:** OpenAI has 375 permanent employees in San Francisco, most of them leading figures in machine learning, and their salaries alone come to US$200 million a year. According to a survey by an overseas salary-data site, the median compensation of an OpenAI software engineer is US$920,000.
  2. **Training costs:** Reported figures put a single GPT-3 training run at US$4.6 million, with the corresponding cloud spending approaching nine figures (hundreds of millions of dollars). According to SemiAnalysis, a semiconductor research firm, if OpenAI's cloud compute costs about $1 per A100-hour, a single training run works out to roughly $63 million, and that excludes all the experiments and failed runs as well as other costs such as data collection, RLHF, and human labor (see the back-of-the-envelope sketch after this list).
  3. **Inference and operating costs:** Citing Forbes, the operating or inference costs of ChatGPT's large language models "far exceed the training costs when deploying any reasonably sized model"; in fact, "ChatGPT's inference costs exceed training costs every week."
  4. **Investments:** According to The Information, at the start of the year OpenAI had invested in at least 16 companies through a US$100 million venture fund backed by Microsoft and other investors, and its accelerator Converge had invested in 10. Light Cone Intelligence also found that in the first half of this year OpenAI publicly made three investments in its own name; only established players such as Microsoft, Google, and Nvidia ranked ahead of it.
  5. **Acquisitions:** On August 17, OpenAI announced the acquisition of Global Illumination, a game studio, reportedly its first public acquisition since the company was founded in 2015.
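
As a rough illustration of how a training-cost estimate like the one in item 2 comes together, the sketch below converts an assumed total FLOP budget, hardware throughput, and utilization into A100 GPU-hours and then into dollars at the $1-per-A100-hour figure cited above. Every number other than that price is an illustrative assumption, not an OpenAI disclosure; the point is only that the result lands in the same tens-of-millions range as the $63 million figure.

```python
# Back-of-the-envelope training-cost sketch under the $1 per A100-hour assumption.
# The FLOP budget and utilization figures are assumptions for illustration only.

PRICE_PER_A100_HOUR = 1.0     # USD, the SemiAnalysis figure quoted above
A100_PEAK_FLOPS = 312e12      # A100 BF16 peak throughput, FLOPs per second
UTILIZATION = 0.35            # assumed effective hardware utilization
TRAINING_FLOPS = 2e25         # assumed total training compute budget

effective_flops_per_gpu_hour = A100_PEAK_FLOPS * UTILIZATION * 3600
gpu_hours = TRAINING_FLOPS / effective_flops_per_gpu_hour
cost = gpu_hours * PRICE_PER_A100_HOUR

print(f"GPU-hours needed: {gpu_hours:,.0f}")
print(f"Single-run compute cost: ${cost:,.0f}")
```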

Public information shows that since its founding OpenAI has raised more than US$15 billion in investment, money used to fill the holes left by the high cost of training and developing large models.

To reach the destination of AGI, OpenAI does need money, but burning cash is a bottomless pit, and bleeding does not necessarily buy growth. That is precisely why OpenAI has to accelerate commercialization.

But revenue is not the same as profit. Founder Securities has worked through ChatGPT's key metrics using public data, and its analysis points out: **the broad logic of OpenAI's path to profitability is to raise the share of users paying for GPT-4 while driving down the cost of GPT-3.5, which is OpenAI's main source of cost.**

With GPT-3.5 costs compressed, OpenAI could reach breakeven if the ratio of daily to monthly active users hits 35% and the monthly-active paying rate exceeds 12%. For the cost-compressed GPT-3.5 and GPT-4 models, lifting the monthly paying rate by 0.5% a month could also turn things around.

As of July 12, 2023, daily visits to the ChatGPT website had plateaued at just over 50 million. As of June 19, 2023, the ChatGPT iOS app in the United States had averaged 946,000 daily active users over its first 30 days.

According to data.ai, between May 21 and June 19, 2023 the ChatGPT iOS app in the United States averaged about 946,400 daily active users, with roughly 41,300 cumulative paying users. The daily-active paying rate (paying users ÷ daily active users) is therefore about 4.36% (4.13 ÷ 94.64, with figures in units of 10,000 users). Questmobile data put the Baidu app's DAU/MAU ratio at about 37%; if ChatGPT's DAU/MAU ratio is also 37%, its monthly active users come to about 2.5578 million (94.64 ÷ 37%), and the monthly-active paying rate (paying users ÷ monthly active users) is about 1.61% (4.13 ÷ 255.78).
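
To make the arithmetic above easy to verify, here is a tiny Python sketch that reproduces it; all inputs are the data.ai and Questmobile figures quoted in this section, expressed in units of 10,000 users.

```python
# Reproducing the payment-rate arithmetic above (figures in units of 10,000 users)
dau = 94.64           # average daily active users, ChatGPT iOS (US), per data.ai
paying_users = 4.13   # cumulative paying users over the same window
dau_mau_ratio = 0.37  # Baidu app DAU/MAU ratio from Questmobile, used as a proxy

daily_paying_rate = paying_users / dau        # ≈ 4.36%
mau = dau / dau_mau_ratio                     # ≈ 255.78, i.e. ~2.5578 million users
monthly_paying_rate = paying_users / mau      # ≈ 1.61%

print(f"Daily-active paying rate:   {daily_paying_rate:.2%}")
print(f"Implied monthly actives:    {mau * 10_000:,.0f}")
print(f"Monthly-active paying rate: {monthly_paying_rate:.2%}")
```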

These figures show that OpenAI's commercialization remains a difficult, long-term undertaking: only by lifting the paying-user ratio to a certain threshold can it break even. Revenue alone does not tell the whole story.

OpenAI and Microsoft scramble for the same cake

It is gratifying that the business model is starting to run, but problems follow close behind. What was once a hidden worry has moved to center stage: the delicate relationship with Microsoft.

One premise has to be made clear: however sweet their honeymoon, OpenAI and Microsoft are two independently operated organizations. Their cooperation goes beyond an ordinary acquisition or investment relationship; Nadella even overrode objections and shut down some businesses to build an AI computing center for OpenAI. But once OpenAI commercializes on its own, the two are sharing the same cake, and fights will be unavoidable.

**Unlike companies facing each other head-on in the market, OpenAI and Microsoft are bound together in a closer and more complicated relationship.**

In 2019, OpenAI shifted from a pure non-profit to a hybrid structure in which a company called OpenAI LP is responsible for commercializing the products developed by the research lab. That same year, Microsoft invested US$1 billion in the partnership, and its Azure AI supercomputing platform became the infrastructure on which the GPT models are trained. At the start of 2023 the partnership deepened further, with Microsoft investing another US$10 billion in OpenAI.

(OpenAI LP, a for-profit entity established by OpenAI)

According to a person familiar with the terms of the partnership, after OpenAI repays its first investors, Microsoft is entitled to 75 percent of profits until its principal investment is recouped, then 49 percent until a theoretical cap is reached. People familiar with the matter have also told foreign media that starting around 2025 the profit-sharing cap will rise by 20% a year rather than acting as a hard ceiling on returns. Investors with knowledge of the deal say Microsoft could effectively own more than one-third of the company.

Since March this year, Light Cone Intelligence has also heard from many companies that there are two roads to using ChatGPT's capabilities: one is to call OpenAI's API directly and pay OpenAI by the number of tokens consumed; **the other is to use OpenAI's services on Azure, running on public-cloud computing power.** At the same price, the Azure route is more secure and comes with more complete supporting facilities, a "difference" that Microsoft's cloud salespeople keep emphasizing.
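
To make the two routes concrete, the sketch below shows how the same chat request is wired up against OpenAI directly versus through an Azure OpenAI deployment, again using the pre-1.0 `openai` Python package; the endpoint, deployment name, API version, and keys are placeholders assumed for illustration.

```python
import openai

# Route 1: call OpenAI directly and pay OpenAI per token consumed.
openai.api_key = "sk-..."  # placeholder OpenAI key
resp = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello"}],
)

# Route 2: call the same capability through Azure OpenAI Service,
# billed and governed via the company's Microsoft cloud account.
openai.api_type = "azure"
openai.api_base = "https://my-resource.openai.azure.com/"  # placeholder Azure resource
openai.api_version = "2023-05-15"
openai.api_key = "azure-key..."  # placeholder Azure key
resp_azure = openai.ChatCompletion.create(
    engine="my-gpt35-deployment",  # placeholder Azure deployment name
    messages=[{"role": "user", "content": "Hello"}],
)
```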

**Even so, surveys show that a large number of enterprises still choose to access the API directly. One reason is that it is simple and convenient, and the pay-as-you-go model is more cost-effective for individual developers and small and medium-sized businesses. More importantly, calling an API and buying cloud services follow completely different logic: the former can be decided by developers themselves, while the latter has to be escalated for approval and weighed against how it integrates with the business.**

As noted above, outsiders have seen the two as complementary: Microsoft supplies OpenAI with funding, resources, and technical support, while OpenAI helps Microsoft reclaim its place at the top of the technology world. But as OpenAI's commercialization has progressed through 2023, that picture has changed.

In June this year, according to The Information, an internal Microsoft document directed Azure sales staff to tell customers that Microsoft can provide more services than OpenAI, among other talking points intended to defend its turf. It was the first time the subtle rift between the two sides had been visible to the outside world.

**In brutal market competition, conflicts of interest between the two are unavoidable. A sales pitch by itself proves nothing, but the strategy behind it says a great deal.**

Take March this year: only after OpenAI had first signed contracts with companies such as Snap and Instacart did Microsoft's cloud service release a preview of its ChatGPT feature, a week later; likewise, Microsoft's cloud gained access to GPT-4 only after OpenAI's own paying customers did.

The technology sits in OpenAI's hands and must be shared with Microsoft, yet nothing spells out when, at what milestones, or to what extent. OpenAI clearly hopes to exploit this "time lag" to grab flagship customers for itself.

Put bluntly, the question is how OpenAI manages its relationship with Microsoft: where the two draw boundaries, set rules, and find workable forms of cooperation and points of balance. Handled carelessly, it could become a crisis for them both.

**In business, however, there are no permanent friends or enemies, only permanent interests.** When interests collide, what people really want to see is shrewd "geniuses" like Altman and Nadella conjuring a new kind of business miracle, rather than falling into the cliché of a public falling-out.
