
OpenAI working with Broadcom and TSMC to develop custom AI inference chip

October 30, 2024: OpenAI Partners for Custom AI Inference Chips - OpenAI is collaborating with Broadcom and TSMC to develop a custom AI inference chip, aiming to diversify its chip supply and reduce costs. After initially considering building its own foundries, OpenAI pivoted to designing chips in-house while relying on partners' expertise for manufacturing. The new chips will be optimized for AI model inference, which prioritizes speed and efficiency over the raw computational power that Nvidia GPUs provide for training. As AI demand grows, specialized chips will help OpenAI tailor its infrastructure to its technical and budget needs, in line with the broader industry shift toward custom, efficient hardware for AI workloads.
