I've taken a similarly specced machine running Linux out camping and on boat trips "a number of times" (before you complain about me not enjoying nature: I was living this way for a while to save money on rent). Here are the models I like:
Gemma3
Qwen 2.5 instruct
Qwen 2.5 coder
You should take multiple distills/quants. It's good to have a high quality model sometimes, but for most stuff you'll want something below 1GB for the fast response times. The quality is close to or better than the original ChatGPT, and they support extremely long contexts (if you have the memory). It might be good to take a VLM as well (I've been happy with Qwen's VLM, although it's slow).
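In case it helps, this is roughly how I run the small quants offline: a minimal sketch using llama-cpp-python against a local GGUF file. The model filename is just a placeholder for whatever quant you actually download, and the context size is an assumption you'd tune to your RAM.

```python
from llama_cpp import Llama

# Load a small local quant from disk (no network needed once downloaded).
# The filename below is hypothetical -- point it at whichever GGUF quant you brought.
llm = Llama(
    model_path="./qwen2.5-0.5b-instruct-q4_k_m.gguf",
    n_ctx=4096,  # context window; raise it if you have the memory
)

# Chat-style completion; the response dict follows the OpenAI-like schema.
out = llm.create_chat_completion(
    messages=[
        {"role": "user", "content": "How do I treat a blister from hiking boots?"}
    ],
    max_tokens=256,
)

print(out["choices"][0]["message"]["content"])
```

The sub-1GB quants are what make this pleasant on battery: they load fast and respond quickly, and you only reach for the bigger model when the answer actually matters.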
You're not running a full DeepSeek model on your MacBook.
Gemma is Google's distillation of their larger Gemini models (at least that's my understanding). Qwen is Alibaba's model. Qwen is usually very good at code; Gemma tends to be a little better at everything else.
There are DeepSeek distills that use either Qwen or Gemma as a base. I haven't been impressed with them, though. TBH I've felt like most of the reasoning models are overhyped.
Cool, I'll try them out and see which I like best. Good to know that the DeepSeek distills aren't the move. I'm excited about being able to take pictures of plants/trees/other things and get information.
Any tips or fun ways you used your local model while camping?