Hacker News

At this point, subsidizing Chinese open-weights vendors by paying for their models is simply the right thing to do. Maybe they too will go closed-weights once they reach SotA, but they're pretty close now and haven't done it.


I am wondering what kinds of harnesses are best for GLM, DeepSeek, Qwen, and Kimi.


OpenCode is great in general. At least one of them was specifically trained on Claude Code - I think it was Qwen - so for that one, Claude Code should give the best results.


Claude Code has worked better than OpenCode with GLM models for me.
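For anyone wanting to try this: Claude Code can be pointed at an Anthropic-compatible third-party endpoint via environment variables. A minimal sketch, assuming the provider exposes such an endpoint (the URL and model names below are placeholders, not real values - check your provider's docs):

```shell
# Point Claude Code at an Anthropic-compatible API endpoint.
# Both the base URL and the token here are hypothetical examples.
export ANTHROPIC_BASE_URL="https://example-provider.com/api/anthropic"
export ANTHROPIC_AUTH_TOKEN="your-provider-api-key"

# Then launch Claude Code as usual; it will route requests
# to the configured endpoint instead of Anthropic's API.
claude
```

Several open-weights vendors advertise Anthropic-compatible endpoints precisely so Claude Code can be used as the harness this way.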


OpenCode with Kimi has been great for me.



