Edit: I’ll be off-grid and offline, likely without cell service depending on how deep I go.
I would like to have an LLM with me when I go camping as a solo beginner to help. I have an Apple M1 pro with 16gb of RAM.
What's the best way to do this?
What's the best model?
Gemma 3, Qwen 2.5 Instruct, or Qwen 2.5 Coder.
You should take multiple distills/quants. It's good to have a high-quality model for some tasks, but for most things you'll want something below 1 GB for fast response times. Quality from small modern models is close to (or better than) the original ChatGPT, and they support extremely long contexts if you have the memory for them. It might be good to take a VLM as well; I've been happy with Qwen's VLM, although it's slow.
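A minimal setup sketch, assuming you use Ollama as the runner (LM Studio or llama.cpp work too). The specific model tags below are examples, not confirmed picks from this thread; check the Ollama library for current names and sizes, and download everything while you still have internet:

```shell
# Sketch assuming Ollama; model tags are illustrative — verify with the
# Ollama library before relying on them.

# Small, fast model (~1 GB or under) for quick offline answers:
ollama pull gemma3:1b

# Higher-quality fallback that still fits comfortably in 16 GB of RAM:
ollama pull qwen2.5:7b

# A vision-language model for identifying plants, gear, etc. from photos:
ollama pull qwen2.5vl:7b

# Pull everything BEFORE you leave — downloads need a connection.
# Once the models are cached locally, inference runs fully offline:
ollama run gemma3:1b "How do I treat a blister on a multi-day hike?"
```

On an M1 Pro with 16 GB, a 7B model at 4-bit quantization uses roughly 4-5 GB of memory, so keeping one small model and one 7B-class model cached is a reasonable split between speed and quality.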