'Forget ChatGPT: Why Researchers Now Run Small AIs On Their Laptops'
Nature published an introduction to running an LLM locally, starting with the example of a bioinformatician who's using AI to generate readable summaries for his database of immune-system protein structures. "But he doesn't use ChatGPT, or any other web-based LLM." He just runs the AI on his Mac....
Researchers might use such tools to save money, protect the confidentiality of patients or corporations, or ensure reproducibility... As computers get faster and models become more efficient, people will increasingly have AIs running on their laptops or mobile devices for all but the most intensive needs.

The article's list of small open-weights models includes Meta's Llama, Google DeepMind's Gemma, Alibaba's Qwen, Apple's DCLM, Mistral's NeMo, and OLMo from the Allen Institute for AI. "If you are able to craft a data set that is very rich in those reasoning tokens, then the signal will be much richer," says one researcher quoted in the article...

Sharon Machlis, a former editor at the website InfoWorld, who lives in Framingham, Massachusetts, wrote a guide to using LLMs locally, covering a dozen options.
Or read this on Slashdot