
This is the biggest grift of the century by the Groq team. They never shared actual TCO numbers, and I remember a SemiAnalysis article showing the power consumption was genuinely insane. That makes sense: because the chips have no DRAM, they have to scale the number of chips just to fit a single model's weights in on-chip SRAM. The inference latency is good, but there was no way the economics were going to work out. Meanwhile Nvidia, with every advantage in the world, decides they're worth $20B? It doesn't make sense at all. The only scenarios where the Groq system would be worth it are exactly the throughput-optimized scenarios Nvidia already thrives in.
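To make the chip-scaling point concrete, here's a rough back-of-envelope sketch in Python. The SRAM-per-chip figure, model size, and weight precision below are my assumptions for illustration, not numbers Groq has published:

    # Back-of-envelope: how many SRAM-only chips does one model need?
    # Assumed figures (not official): ~230 MB of on-chip SRAM per chip,
    # a 70B-parameter model, and 1 byte per parameter (8-bit weights).
    SRAM_PER_CHIP_GB = 0.23   # assumed on-chip SRAM per accelerator
    PARAMS_BILLION = 70       # assumed model size
    BYTES_PER_PARAM = 1       # assumed 8-bit quantized weights

    weights_gb = PARAMS_BILLION * BYTES_PER_PARAM   # ~70 GB of weights
    chips = weights_gb / SRAM_PER_CHIP_GB           # minimum chips needed
    print(f"~{chips:.0f} chips just to hold the weights, before KV cache")

Even ignoring activations and KV cache, that works out to roughly 300 chips pinned to a single model, each drawing power, versus one or two HBM-equipped GPUs holding the same weights. That gap is what drives the power and TCO numbers the SemiAnalysis piece was pointing at.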

