Llama.cpp guide – Running LLMs locally on any hardware, from scratch
Psst, kid, want some cheap and small LLMs?
If you came here with the intention of finding some piece of software that will allow you to easily run popular models on most modern hardware for non-commercial purposes - grab LM Studio, read the relevant section of this post, and go play with it.

Figure 1 from this paper nicely shows what each of the sampling algorithms does to the probability distribution of tokens.

Exclude Top Choices (XTC) - this is a funky one, because it works a bit differently from most other samplers.

The single-letter codes map to the samplers as follows:

- `d` - DRY
- `k` - Top-K
- `y` - Typical-P
- `p` - Top-P
- `m` - Min-P
- `x` - Exclude Top Choices (XTC)
- `t` - Temperature
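To make the idea concrete, here is a small Python sketch of how three of these samplers (Top-K, Min-P, and XTC) reshape a toy next-token probability distribution. The token dictionary and function names are made up for illustration, and the XTC function is a deliberately simplified version (it ignores the real sampler's activation-probability parameter) - this is not llama.cpp's actual implementation.

```python
def top_k(probs, k):
    """Keep the k most likely tokens and renormalize them to sum to 1."""
    kept = sorted(probs.items(), key=lambda kv: kv[1], reverse=True)[:k]
    total = sum(p for _, p in kept)
    return {tok: p / total for tok, p in kept}

def min_p(probs, ratio):
    """Drop every token whose probability is below ratio * (max probability)."""
    cutoff = ratio * max(probs.values())
    kept = {tok: p for tok, p in probs.items() if p >= cutoff}
    total = sum(kept.values())
    return {tok: p / total for tok, p in kept.items()}

def xtc(probs, threshold):
    """Simplified XTC sketch: of the tokens at or above the threshold,
    remove all but the least likely one, then renormalize. Unlike the
    other samplers, it cuts the *top* of the distribution, not the tail."""
    above = [tok for tok, p in probs.items() if p >= threshold]
    if len(above) <= 1:
        return dict(probs)  # nothing to exclude, distribution unchanged
    survivor = min(above, key=lambda t: probs[t])
    kept = {t: p for t, p in probs.items() if t not in above or t == survivor}
    total = sum(kept.values())
    return {t: p / total for t, p in kept.items()}

# Toy next-token distribution (made up for illustration).
dist = {"the": 0.50, "a": 0.30, "cat": 0.15, "zzz": 0.05}
print(top_k(dist, 2))     # only "the" and "a" survive
print(min_p(dist, 0.25))  # cutoff is 0.125, so "zzz" is dropped
print(xtc(dist, 0.20))    # "the" is excluded; "a" survives as the least likely top choice
```

Note how XTC inverts the usual logic: Top-K, Top-P, and Min-P all trim unlikely tokens from the tail, while XTC removes the most probable candidates, which is why it behaves so differently from the rest of the family.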