Today, we’re proud to introduce Maia 200, a breakthrough inference accelerator engineered to dramatically improve the economics of AI token generation. Maia 200 is an AI inference powerhouse: an ...
Physical AI marks a transition from robots as programmed tools to robots as adaptable collaborators. That transition will unfold over years rather than months, but the foundation models emerging from ...
Maia 200 is the most efficient inference system Microsoft has ever deployed, with 30% better performance per dollar than the latest ...
AMD is hiring a Senior AI/ML Lead in Hyderabad to lead the design, development, deployment, and optimization of AI/ML ...
Microsoft has announced that Azure’s US central datacentre region is the first to receive a new artificial intelligence (AI) inference accelerator, Maia 200.
API frameworks reduce development time and improve reliability across connected software systems. Choosing the right framework ...
Calling it the highest performance chip of any custom cloud accelerator, the company says Maia is optimized for AI inference on multiple models.
With LiteBox, Microsoft has released a library OS written in Rust, which aims to reduce the attack surface through minimal ...
Microsoft unveils Maia 200, a custom AI chip designed to power Copilot and Azure, challenging Amazon and Google in the ...
Discover the leading AI code review tools reshaping DevOps practices in 2026, enhancing code quality, security, and team productivity with automated solutions.