
No, and that is widely known. The actual problem is that the margins at that scale are not sufficient to cover the gargantuan cost of training their SOTA model.


They are large enough to cover their previous training costs, but not their next-gen training costs.

i.e., they made more money on 3.5 than 3.5 cost to train, but didn't make enough money on 3.5 to train 4.0.
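The claim above can be sketched as simple arithmetic. All figures here are hypothetical placeholders (in arbitrary units), chosen only to illustrate the shape of the argument, not OpenAI's actual numbers:

```python
# Hypothetical illustration: revenue from model N can exceed model N's
# training cost while still falling short of model N+1's training cost.
train_cost_35 = 100   # assumed cost to train "3.5"
revenue_35 = 150      # assumed lifetime revenue earned on "3.5"
train_cost_40 = 500   # assumed cost to train "4.0"; next-gen runs cost far more

profit_35 = revenue_35 - train_cost_35        # 50: "3.5" paid for itself...
covers_own_cost = revenue_35 > train_cost_35  # True
covers_next_gen = profit_35 > train_cost_40   # False: ...but can't fund "4.0"

print(covers_own_cost, covers_next_gen)
```

The point of the sketch is that each generation's training cost grows faster than the previous generation's profit, so a positive margin on the current model does not imply the business funds the next one.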


Source on that?

Because based on OpenAI's reports and intuition, inference revenue is outpacing training cost.


Net inference revenue would need to be outpacing training cost to counter his point about margins.
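The gross-vs-net distinction matters here. A minimal sketch with hypothetical figures (arbitrary units, assumed for illustration only) shows how gross inference revenue can outpace training cost while net revenue does not:

```python
# Hypothetical illustration of gross vs. net inference revenue.
gross_inference_revenue = 200
inference_serving_cost = 140   # assumed cost of GPUs, energy, etc. to serve the model
training_cost = 100

net_inference_revenue = gross_inference_revenue - inference_serving_cost  # 60

gross_outpaces = gross_inference_revenue > training_cost  # True
net_outpaces = net_inference_revenue > training_cost      # False

print(gross_outpaces, net_outpaces)
```

So a report that revenue exceeds training cost only supports the margin argument if it is net of serving costs.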





