Google’s parent company, Alphabet, is very close to becoming the fourth company to join the $4 trillion market cap club; the current members that have crossed that threshold are Apple, Microsoft, and Nvidia.
That’s thanks to a week of arguably great news for its AI efforts.
Fellow tech giant and significant Nvidia customer Meta is considering supplying some of its data centers with Google chips, The Information reported on Monday. The deal, potentially worth billions of dollars, would start in 2027, the report claims, though Meta could also rent chips from Google Cloud as early as next year.
Preceding that piece of news was a flashy product reveal. Last week, the tech giant released its latest AI model, Gemini 3, and announced some updates to its viral image generator Nano Banana Pro, both to much fanfare.
According to The Verge, AI benchmarking firm LMArena’s co-founder and CTO, Wei-Lin Chiang, said that the release of Gemini 3 represents “more than a leaderboard shuffle.”
Right now, two companies are generally seen as leading the AI industry. On the products side, you have OpenAI, whose ChatGPT has become nearly synonymous with the term “AI chatbot.” On the hardware infrastructure side, you have Nvidia, the world’s number one supplier of the graphics processing units (GPUs) used to power AI.
But Google, a Silicon Valley veteran with plenty of money and resources to spend and deep institutional knowledge to draw on, seems well positioned to put up a fight on both fronts.
Many people across the internet, including Salesforce CEO Marc Benioff, have claimed that Google’s Gemini 3 model is significantly better than OpenAI’s ChatGPT.
From the outside looking in, OpenAI is still the leading name in AI chatbots for the time being. But, according to a New York Times report, the head of ChatGPT, Nick Turley, told employees in October that the company was facing “the greatest competitive pressure we’ve ever seen.”
On the AI chips front, Nvidia is still the confident frontrunner, but Google could score a big win in its catch-up efforts if The Information’s report pans out.
Nvidia’s GPUs are the preferred AI chip right now, but Google’s custom tensor processing units (TPUs) are providing at least some competition.
While GPUs are considered versatile, Swiss Army knife-style chips that can accommodate a broad range of tasks, Google’s TPUs are specialized and considered more efficient for specific AI workloads. TPUs are a type of application-specific integrated circuit (ASIC). An industry expert told CNBC last week that he sees custom ASICs growing “faster than the GPU market over the next few years.”
On top of the GPUs it purchases from Nvidia, Google has been using its own TPUs to power its cloud computing business for several years now. The tech giant also rents out its TPUs to AI companies like Anthropic, which uses the chips for its chatbot Claude in tandem with Nvidia GPUs and Amazon’s Trainium chips.
Meta would no doubt be a significant addition to that customer list, and could perhaps give Google’s custom chips business a more competitive edge in a market dominated by a behemoth.