Today, we’re proud to introduce Maia 200, a breakthrough inference accelerator engineered to dramatically improve the economics of AI token generation. Maia 200 is an AI inference powerhouse: an ...
Microsoft Azure's AI inference accelerator, Maia 200, aims to outperform Google's TPU v7 and AWS Inferentia with 10 petaflops of FP4 compute.
Raspberry Pi sent me a sample of its AI HAT+ 2 generative AI accelerator, based on the Hailo-10H, for review. The 40 TOPS AI ...
China’s race to build homegrown artificial intelligence chips has collided head-on with Nvidia’s H200, the US company’s latest workhorse for training and running large models. The result is ...
What is the Maia 200 AI accelerator? The Maia 200 is Microsoft's custom-designed chip, specifically an AI inference ...
Microsoft has announced that Azure’s US central datacentre region is the first to receive a new artificial intelligence (AI) inference accelerator, Maia 200.
ServiceNow (NOW) showed in its fourth-quarter results that the rise of artificial intelligence is not dousing demand or costing it seats, though analysts noted that 21.5% subscription revenue growth ...
Raspberry Pi has started selling the AI HAT+ 2, an add-on board that is a significant upgrade over the AI HAT+ launched in 2024. While ...