deaux | 22 days ago | on: Improving 15 LLMs at Coding in One Afternoon. Only...
At this point, subsidizing Chinese open-weights vendors by paying for them is just the right thing to do. Maybe they, too, will go closed-weights once they become SotA, but they're pretty close now and haven't done it.
DeathArrow | 22 days ago
I am wondering what kind of harness works best for GLM, DeepSeek, Qwen, and Kimi.
deaux | 22 days ago
OpenCode is great in general. At least one of them is specifically trained on CC - I think it was Qwen - so for those that should give the best results.
azuanrb | 22 days ago
Claude Code has been better than OpenCode for GLM models, for me.
JSR_FDED | 21 days ago
OpenCode with Kimi has been great for me.