Experimenting with Local LLMs on macOS
A developer's guide to downloading and running LLMs on macOS, for experimentation and privacy.
Laurie has a great post about what LLMs are actually good for, which I highly recommend, but in summary they are generally good at summarizing text, regurgitating home-maintenance advice from Reddit, or telling you that you have cancer. Joking aside, we accepted the concept of LLMs too quickly, when the truth is that we never expected computers to figure out human speech before robots were walking among us.

LM Studio has two runtimes on macOS: llama.cpp, which we covered earlier, and MLX, an ML engine developed by Apple that runs a bit faster but offers less configuration in the UI.
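If you want to poke at a model outside the LM Studio chat window, the app can also expose a local OpenAI-compatible server (port 1234 by default), and the same request works whether the loaded model uses the llama.cpp or the MLX runtime, since the server sits in front of both. Here's a minimal Python sketch assuming that server is running and a model is already loaded; the model name and prompt are placeholders, so swap in whatever you actually downloaded:

```python
# Minimal sketch: talk to a model loaded in LM Studio through its local
# OpenAI-compatible server (default port 1234). Requires `pip install openai`.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:1234/v1",  # LM Studio's local server
    api_key="lm-studio",                  # any non-empty string works locally
)

response = client.chat.completions.create(
    model="your-downloaded-model",  # placeholder: use the identifier shown in LM Studio
    messages=[
        {"role": "user", "content": "Summarize this paragraph: ..."},
    ],
    temperature=0.7,
)

print(response.choices[0].message.content)
```

Nothing here leaves your machine: the request goes to localhost, which is the whole point of running these models locally.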