Groq’s LPU chip emerges as a potential solution to the challenges faced by AI developers relying on GPUs, sparking comparisons with ChatGPT.
The latest artificial intelligence (AI) tool to capture the public’s attention is the Groq LPU Inference Engine, which became an overnight sensation after public benchmark tests went viral on social media, showing it outperforming top models from other Big Tech companies.
Groq, not to be confused with Elon Musk’s AI model called Grok, is not a model itself but a chip system on which models can run.
The team behind Groq developed its own “software-defined” AI chip, which it calls a language processing unit (LPU), built specifically for inference. The LPU allows Groq to generate roughly 500 tokens per second.
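To put that throughput figure in perspective, here is a back-of-envelope sketch of how long responses of various lengths would take at a steady ~500 tokens per second (the rate cited above). The response lengths are illustrative assumptions, not benchmark data:

```python
# Back-of-envelope generation-time estimate.
# 500 tokens/s is the rate reported for Groq's LPU in the article;
# the response lengths below are illustrative assumptions.

def generation_time(num_tokens: int, tokens_per_second: float = 500.0) -> float:
    """Return seconds needed to generate num_tokens at a steady rate."""
    return num_tokens / tokens_per_second

for tokens in (100, 500, 1000):
    print(f"{tokens} tokens -> {generation_time(tokens):.1f} s")
# 100 tokens -> 0.2 s
# 500 tokens -> 1.0 s
# 1000 tokens -> 2.0 s
```

By this rough measure, even a long, 1,000-token answer would stream out in about two seconds, which is why the demos spread so quickly online.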