
Llama3.2 3B feels a lot better than other models of the same size (e.g. Gemma2, Phi3.5-mini).

For anyone looking for a simple way to test Llama3.2 3B locally with a UI, install nexa-sdk (https://github.com/NexaAI/nexa-sdk) and type in a terminal:

nexa run llama3.2 --streamlit

Disclaimer: I am from Nexa AI, and nexa-sdk is open source. We'd love your feedback.



It's a great tool. Thanks!

I tested it with Llama3.1 and it was really easy. At first glance, though, Llama3.2 didn't seem available: the command you provided did not work, raising "An error occurred while pulling the model: not enough values to unpack (expected 2, got 1)".


Thanks for reporting. We are investigating this issue. Could you submit an issue on our GitHub with a screenshot of the terminal (including the output of pip show nexaai)? That would help us reproduce the problem faster. Much appreciated!



Or grab LM Studio.


For people who really care about open source: LM Studio is not.



