• AutoTL;DR
    1 year ago

    This is the best summary I could come up with:


    The rumors are true: Microsoft has built its own custom AI chip that can be used to train large language models and potentially avoid a costly reliance on Nvidia.

    “We were excited when Microsoft first shared their designs for the Maia chip, and we’ve worked together to refine and test it with our models,” says Sam Altman, CEO of OpenAI.

    Microsoft is part of a group that includes AMD, Arm, Intel, Meta, Nvidia, and Qualcomm and that is standardizing the next generation of data formats for AI models.

    “At the scale at which the cloud operates, it’s really important to optimize and integrate every layer of the stack, to maximize performance, to diversify the supply chain, and frankly to give our customers infrastructure choices,” says Rani Borkar, who leads Azure hardware systems and infrastructure at Microsoft.

    Estimates have suggested OpenAI needed more than 30,000 of Nvidia’s older A100 GPUs for the commercialization of ChatGPT, so Microsoft’s own chips could help lower the cost of AI for its customers.

    As Microsoft pushes ahead with even more Copilot features this week and a Bing Chat rebranding, Maia could soon help balance the demand for the AI chips that power these new experiences.


    The original article contains 1,352 words; the summary contains 189 words. Saved 86%. I’m a bot and I’m open source!