AIGC is at an inflection point: What’s next for real-world adoption?
According to a report from Bloomberg Intelligence analysts earlier this year, the AI industry could expand at a compound annual rate of 42 percent over the next decade, driven first by demand for the infrastructure needed to train AI systems, and then by demand for the follow-on devices, advertising, and other services that use AI models. The release of consumer-focused AI tools such as ChatGPT and Google's Bard will fuel a decade-long boom that boosts AIGC market revenue from $40 billion last year to an estimated $1.3 trillion by 2032.
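Those two revenue figures are consistent with the quoted growth rate; here is a quick back-of-the-envelope check of the arithmetic in Python (illustrative only, not taken from the report):

```python
# Sanity-check Bloomberg Intelligence's projection: $40B in revenue
# compounding at 42% per year for ten years.
revenue_2022 = 40e9   # reported 2022 AIGC market revenue, in USD
cagr = 0.42           # projected compound annual growth rate
years = 10

revenue_2032 = revenue_2022 * (1 + cagr) ** years
print(f"Projected 2032 revenue: ${revenue_2032 / 1e12:.2f} trillion")
# -> Projected 2032 revenue: $1.33 trillion, matching the ~$1.3T estimate
```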
Generative AI (AIGC) is gaining wider adoption, especially in the business sector.
Walmart, for example, recently announced that it would roll out an AIGC app to 50,000 non-store associates. The app, which combines Walmart's data with a third-party large language model (LLM), can help employees with a range of tasks, from speeding up drafting to acting as a creative partner to summarizing large documents, Axios reported.
Such deployments help drive demand for the graphics processing units (GPUs) needed to train powerful deep learning models. GPUs are specialized processors that execute programmed instructions in parallel, rather than sequentially as traditional central processing units (CPUs) do.
According to the Wall Street Journal, training these models "can cost companies billions of dollars because of the massive amounts of data they need to ingest and analyze." This includes all deep learning foundational LLMs, from GPT-4 to LaMDA, which power the ChatGPT and Bard chatbot applications, respectively.
01. Riding the wave of generative AI
The AIGC trend has given major GPU supplier Nvidia a powerful boost: the company reported eye-popping earnings for its most recent quarter. These are boom times for Nvidia, at least, as nearly every big tech company is trying to get its hands on high-end AI GPUs.
Erin Griffith writes in the New York Times that startups and investors are taking extraordinary measures to get their hands on these chips: "What tech companies are desperate for this year isn't money, engineering talent, hype, or even profit, but GPUs."
Ben Thompson calls it "Nvidia on the Mountaintop" in this week's Stratechery newsletter. The momentum was further fueled by the announcement of a partnership between Google and Nvidia that will give Google's cloud customers greater access to technology powered by Nvidia GPUs. All of this points to the current scarcity of these chips in the face of surging demand.
Does current demand mark the culmination of a new generation of AI, or does it herald the beginning of the next wave of development?
02. How generative technologies are shaping the future of computing
Nvidia CEO Jensen Huang said on the company's most recent earnings call that this demand marks the dawn of "accelerated computing." He added that it would be wise for companies to "divert capital investment from general-purpose computing and focus it on generative AI and accelerated computing."
General-purpose computing refers to CPUs designed to handle a wide variety of tasks, from spreadsheets to relational databases to ERP. Nvidia argues that CPUs are now legacy infrastructure, and that developers should instead optimize their code for GPUs, which can perform tasks more efficiently than traditional CPUs.
GPUs can perform many calculations simultaneously, making them ideal for workloads such as machine learning (ML), which runs millions of calculations in parallel. GPUs also excel at certain kinds of mathematics, such as linear algebra and matrix manipulation, which are the foundation of deep learning and AI.
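To make that concrete, here is a minimal sketch (using PyTorch, which the article does not name; any GPU-capable framework would do) of the same matrix multiplication dispatched to a CPU and, when one is available, a GPU:

```python
import torch

# Matrix multiplication is the core primitive of deep learning.
a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

# On a CPU, the multiply-adds run with limited parallelism.
c_cpu = a @ b

# On a GPU, thousands of cores execute the same multiply-adds in
# parallel, which is why large models are trained and served on GPUs.
if torch.cuda.is_available():
    c_gpu = a.cuda() @ b.cuda()
    torch.cuda.synchronize()  # GPU kernels are asynchronous; wait for completion
```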
03. GPUs offer little benefit to certain types of software
However, other classes of software, including most existing business applications, are optimized to run on CPUs and benefit little from the parallel instruction execution of GPUs.
Thompson seems to hold a similar view: "My interpretation of Huang's point is that all of these GPUs will be used for many of the same activities that currently run on CPUs; that is certainly an optimistic view for Nvidia, because it means any excess capacity built in pursuit of generative AI will be filled by current cloud computing workloads."
He continued: "That said, I doubt it: humans and companies alike are lazy, and CPU-based applications are not only easier to develop, they are mostly already built. I have a hard time imagining which companies would take the time and effort to port something that already runs on the CPU to the GPU."
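A toy example suggests why: typical business logic is a chain of sequential, branching steps in which each result depends on the last, leaving nothing for a GPU's thousands of cores to work on in parallel (an illustrative sketch, not drawn from the article):

```python
# Line-of-business logic: each iteration depends on the result of the
# previous one, so the work cannot be spread across GPU cores.
def apply_transactions(balance: float, transactions: list[float]) -> float:
    for amount in transactions:
        new_balance = balance + amount
        if new_balance < 0:       # branch on the running result
            raise ValueError("insufficient funds")
        balance = new_balance     # serial data dependency
    return balance

print(apply_transactions(100.0, [50.0, -30.0, 20.0]))  # -> 140.0
```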
04. History repeats itself
InfoWorld's Matt Asay reminds us that we've seen this before. "When machine learning first emerged, data scientists applied it to everything, even when there were simpler tools," he writes. "As data scientist Noah Lorang once pointed out, 'only a small subset of business problems are best solved by machine learning; most just need good data and an understanding of what it means.'"
The point is, accelerated computing and GPUs don't meet all software needs.
Nvidia had a strong quarter, driven by the current rush to build a new generation of AI applications, and the company is naturally enthusiastic. However, as the recent Gartner Emerging Technology Hype Cycle suggests, generative AI is having a moment, sitting at the peak of inflated expectations.
Singularity University and XPRIZE founder Peter Diamandis has said that such expectations focus on a technology's future potential without any of the downside: "At that moment, hype starts to generate unwarranted excitement and inflated expectations."
05. Current Limitations
By that measure, we may soon reach the limits of the current AIGC craze. As venture capitalists Paul Kedrosky and Eric Norlin of SK Ventures wrote on their firm's Substack: "Our view is that we are at the tail end of the first wave of AI based on large language models. This wave began in 2017 with the [Google] transformer paper ('Attention Is All You Need'), and it is wrapping up sometime in the next year or two amid all sorts of constraints."
These limitations include "a tendency to hallucinate, insufficient training data in a narrow domain, training corpora from years ago that are out of date, or a myriad of other reasons." They add: "We are already at the tail end of the current AI wave."
To be clear, Kedrosky and Norlin don't think AI has reached a dead end. Instead, they argue that substantial technological improvements are needed to achieve anything better than "so-so automation" and limited productivity gains. They believe the next wave will include new models, more open source, and in particular "ubiquitous/cheap GPUs," which, if correct, may not bode well for Nvidia but would benefit those who need the technology.
As Fortune notes, Amazon has made clear its intention to challenge Nvidia's dominance in chipmaking directly. It is not alone: many startups are vying for market share, as are chip giants including AMD. Challenging a dominant incumbent is extremely difficult, but in any case, expanding the supply of these chips and lowering the price of a scarce technology will be key to developing and spreading the next wave of AIGC innovation.
06. The next wave of AI
Despite the limitations of the current generation of models and applications, the future of AIGC looks bright. There are likely several reasons behind this promise, but perhaps the most important is a generational shortage of workers across the economy, which will continue to drive the need for higher levels of automation.
Although AI and automation have historically been viewed as separate, that view is changing with the advent of AIGC. The technology is increasingly becoming a driver of automation and productivity. Mike Knoop, co-founder of workflow company Zapier, touched on this phenomenon on a recent Eye on AI podcast, saying: "AI and automation are collapsing into one thing."
McKinsey certainly believes this. "AIGC is poised to unleash the next wave of productivity," the firm said in a recent report. And it is not alone: Goldman Sachs, for example, has said that a new generation of artificial intelligence could increase global GDP by 7%.
Whether or not we are at the pinnacle of the current generation of AI, it is clearly an area that will continue to evolve and spark debate across the enterprise. As great as the challenges are, so are the opportunities—especially in a world hungry for innovation and efficiency. The race for GPU dominance is just a snapshot in this unfolding narrative, the prologue to a future chapter in artificial intelligence and computing.