Hacker News
huydotnet | 83 days ago | on: LM Studio 0.4
Yup, I've been using llama.cpp for that on my PC, but on my Mac I've found some cases where MLX models work best. I haven't tried MLX with llama.cpp, so I'm not sure how that would work out (or whether it's even supported yet).