The artificial intelligence boom unleashed by the launch of ChatGPT has been governed by a single rule: bigger AI models are better. That consensus has pushed Microsoft (MSFT), Google (GOOGL), Amazon.com (AMZN), Meta Platforms (META) and others into a spending war to source chips from Nvidia (NVDA) and others.

The competition could be about to change as the industry faces obstacles in its quest to build ever-larger AI models, Adam Clark writes in this week's edition of Barron's. Nvidia has been the chief beneficiary of the spending race, but the scaling law is now facing questions, the author adds.

At some point, AI's emphasis will shift from training to inference, the process of generating answers or results from trained models. Many in the industry now believe that dedicating more computing power to inference can yield gains similar to those from additional training. The shift toward inference has big implications for Nvidia: while training is uniquely suited to the company's GPUs, inference might be more readily handled by AI processors from Nvidia peers like AMD (AMD) and Intel (INTC), by custom chips from Amazon, or by a range of chip start-ups, the publication points out.
Published first on TheFly.