There is a very strong use case for less-powerful but local LLMs, and there's going to be a big expansion in that area in the next couple of years. So big that I'll bet all the major AI players will do everything they can to cripple them.
Unfortunately, I think you're overestimating how many people care enough about privacy to go through the effort of running LLMs locally, which likely means buying a GPU.