Heh. Those aren't the weird ones. Those are just the base model sizes for popular semi-open foundation models. The weird ones are the "merged" models where people arbitrarily combine layers from multiple models to make larger ones with more layers, like SOLAR-10.7B (https://huggingface.co/upstage/SOLAR-10.7B-v1.0), a 10.7B built by stacking layers from Mistral 7B fine-tunes. Or the 22B "llamas".
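The layer-stacking trick is simpler than it sounds. A toy sketch of the idea, where each "layer" is just a label rather than real transformer weights (the slice indices below are illustrative, not the actual SOLAR recipe):

```python
# Toy illustration of depth-wise model merging: stack a slice of model A's
# decoder layers on top of a slice of model B's to get a deeper network.
# Real merges slice actual weight tensors; here layers are just strings.

def depth_merge(layers_a, layers_b, keep_a, keep_b):
    """Concatenate a slice of A's layers with a slice of B's layers."""
    return layers_a[keep_a[0]:keep_a[1]] + layers_b[keep_b[0]:keep_b[1]]

model_a = [f"A.layer{i}" for i in range(32)]  # one 7B-class fine-tune
model_b = [f"B.layer{i}" for i in range(32)]  # another fine-tune of the same base

# Keep A's first 24 layers and B's last 24: 48 layers total, which is
# roughly how two 32-layer 7B models become one ~10B "merged" model.
merged = depth_merge(model_a, model_b, (0, 24), (8, 32))
print(len(merged))  # 48
```

The overlapping middle slices are the usual choice because adjacent layers of two fine-tunes of the same base are still close in weight space, so the seam is less disruptive than splicing at the ends.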
> What are LLM Parameters?
LLM parameters are settings you can adjust to control how an LLM generates text. They affect the quality, diversity, and creativity of the output. Some of the common parameters are temperature, number of tokens, top-p, presence penalty, and frequency penalty.
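Two of these, temperature and top-p, act directly on the model's output distribution at each step. A minimal sketch of how they combine, using made-up logits over a 4-token vocabulary (the function name and values are illustrative):

```python
import math
import random

def sample_next_token(logits, temperature=0.8, top_p=0.9):
    """Temperature-scale the logits, then nucleus (top-p) sample a token id."""
    # Lower temperature sharpens the distribution; higher flattens it.
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Keep the smallest set of highest-probability tokens whose mass >= top_p.
    ranked = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, mass = [], 0.0
    for i in ranked:
        kept.append(i)
        mass += probs[i]
        if mass >= top_p:
            break
    # Renormalise over the kept "nucleus" and draw one token id from it.
    r = random.random() * sum(probs[i] for i in kept)
    for i in kept:
        r -= probs[i]
        if r <= 0:
            return i
    return kept[-1]

fake_logits = [2.0, 1.0, 0.2, -1.0]  # pretend scores over a tiny vocabulary
token_id = sample_next_token(fake_logits, temperature=0.8, top_p=0.9)
```

At temperature near zero with a small top-p, the nucleus collapses to the single most likely token and generation becomes effectively deterministic; raising either parameter widens the pool of candidates.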
In this post, I will explain what these parameters are and how to use them effectively. I will also show you examples of text generated by LLMs with different parameter values.