TSMC will reportedly build OpenAI's in-house chip using its A16 Angstrom node. | Image credit: TSMC
So how much better will TSMC's A16 Angstrom node be than the 2nm node that precedes it? At the same operating voltage, A16 is expected to run 8-10% faster, and at the same clock speed it should draw 15-20% less power. Based on 2023 data, each query sent to ChatGPT costs OpenAI about 4 cents, according to Bernstein analyst Stacy Rasgon. If ChatGPT usage keeps growing until it handles a tenth as many queries as Google Search, OpenAI would need roughly $16 billion worth of chips each year, which could make this a very profitable venture for TSMC.
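To get a sense of how those numbers fit together, here is a rough back-of-envelope sketch (not from the article). It assumes Google Search handles roughly 8.5 billion queries per day, a commonly cited 2023 estimate, and that the 4-cent cost scales linearly with query volume:

```python
# Rough back-of-envelope check of the figures above (a sketch, not from the article).
# ASSUMPTION: Google Search handles ~8.5 billion queries per day (a commonly cited
# 2023 estimate that does not appear in the article).

GOOGLE_QUERIES_PER_DAY = 8.5e9   # assumed, not from the source
COST_PER_CHATGPT_QUERY = 0.04    # 4 cents per query (Bernstein / Stacy Rasgon)
SHARE_OF_GOOGLE = 0.10           # "a tenth as many queries as Google Search"

chatgpt_queries_per_day = GOOGLE_QUERIES_PER_DAY * SHARE_OF_GOOGLE
annual_compute_cost = chatgpt_queries_per_day * COST_PER_CHATGPT_QUERY * 365

print(f"ChatGPT queries per day: {chatgpt_queries_per_day:,.0f}")
print(f"Annual compute cost: ${annual_compute_cost / 1e9:.1f} billion")
# Prints roughly $12 billion per year, the same order of magnitude as the
# ~$16 billion in annual chip spending cited in the article.
```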
Earlier this year, a rumor circulated that TikTok parent company ByteDance was developing an AI chip with its partner Broadcom. The chip would reportedly be manufactured by TSMC on its 5nm process node. You might be wondering how China-based ByteDance was able to get around US sanctions and join forces with Broadcom. Apparently, there is a legal loophole that allows custom application-specific integrated circuits (ASICs) to be produced for and exported to China.
ByteDance plans to use the in-house chip to run powerful new AI algorithms for TikTok and Douyin, the version of the app available in China. The company also operates a chatbot in China called Doubao. While ByteDance waits for TSMC to start producing the in-house chip, it still has a stockpile of chips it bought from NVIDIA before the US sanctions took effect.