AI chips are hot. Here’s what they are, what they’re for, and why investors see gold



What's most interesting about the technology is an unassuming sliver of silicon closely related to the chips that power video game graphics. It's an AI chip, designed specifically to make building AI systems like ChatGPT faster and cheaper.

Such chips have suddenly risen to the center of what some experts see as an AI revolution that could reshape the tech industry and possibly the world along with it. Shares of Nvidia, the leading AI chip designer, jumped nearly 25% on Thursday after the company forecast a huge jump in revenue that analysts said signaled soaring sales of its AI products. The company's market value briefly topped $1 trillion on Tuesday.


So what, exactly, is an AI chip? That's not an easy question to answer. There really isn't a fully agreed-upon definition, said Hannah Dohmen, a research analyst at the Center for Security and Emerging Technology.

Broadly speaking, however, the term encompasses specialized computing hardware for handling AI workloads, such as training AI systems to tackle difficult problems that can choke conventional computers.


Three entrepreneurs founded Nvidia in 1993 to push the boundaries of computational graphics. Within a few years, the company developed a new chip called a graphics processing unit, or GPU, that dramatically accelerated both video game development and play by performing multiple complex graphics calculations simultaneously.

That technique, formally known as parallel processing, would prove to be key to the development of both games and AI. Two University of Toronto graduate students used a GPU-based neural network to win a prestigious 2012 AI competition called ImageNet, identifying photographic images with much lower error rates than their competitors.
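The core idea behind parallel processing can be sketched in a few lines of Python. This is purely an illustration, not Nvidia's actual code: the same simple calculation is applied to many values at once rather than one at a time, which is what lets a GPU race through graphics and AI workloads.

```python
# Illustrative sketch of parallel processing (a toy example, not GPU code):
# the same "graphics calculation" run one value at a time, then dispatched
# across workers at once, the way a GPU handles many pixels simultaneously.
from concurrent.futures import ThreadPoolExecutor

def shade(pixel):
    # Toy calculation: brighten a pixel value, capped at the 8-bit maximum.
    return min(255, pixel + 40)

pixels = [10, 120, 250, 60]

# Sequential: one calculation after another.
sequential = [shade(p) for p in pixels]

# Parallel: many calculations dispatched at once across a pool of workers.
with ThreadPoolExecutor() as pool:
    parallel = list(pool.map(shade, pixels))

# Both approaches produce the same answer; only the scheduling differs.
assert sequential == parallel == [50, 160, 255, 100]
```

Real GPUs take this idea much further, running thousands of such calculations in hardware at the same time, but the principle is the same: independent pieces of work that can all proceed at once.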

The victory ignited interest in AI-related parallel computing, opening a new business opportunity for Nvidia and its rivals while giving researchers powerful tools for exploring the frontiers of AI development.


Eleven years later, Nvidia is the leading supplier of chips for building and updating AI systems. One of its recent products, the H100 GPU, packs 80 billion transistors, about 13 million more than Apple's latest high-end processor for its MacBook Pro laptop. Unsurprisingly, the technology doesn't come cheap: one online retailer lists the H100 at $30,000.

Nvidia doesn't manufacture these complex GPU chips itself, a task that would require huge investments in new factories. Instead, it relies on Asian chip foundries such as Taiwan Semiconductor Manufacturing Co. and South Korea's Samsung Electronics.

Some of the biggest customers for AI chips are cloud computing services such as those operated by Amazon and Microsoft. By leasing out their AI computing power, these services let smaller businesses and groups that can't afford to build their own AI systems from the ground up use cloud-based tools for tasks ranging from drug discovery to customer management.


Parallel processing has many uses outside of AI. A few years ago, for example, Nvidia graphics cards were in short supply because cryptocurrency miners, who harnessed banks of computers to solve tricky math problems for bitcoin rewards, snapped up most of the supply. That problem faded when the cryptocurrency market crashed in early 2022.

Analysts say Nvidia will inevitably face tougher competition. One potential rival is Advanced Micro Devices, which already faces Nvidia in the computer graphics chip market. AMD has recently taken steps to beef up its lineup of AI chips.

Nvidia is headquartered in Santa Clara, California. Co-founder Jensen Huang remains the company’s president and chief executive officer.



