Non-technical people don't know how LLMs work and, more importantly, don't care about their privacy.
For a technology to be widely used, by definition, you need to make it appealing to the masses, and there is almost zero demand for private LLMs right now.
That's why I don't think local LLMs will win. There are narrow use cases where regulations can force local LLM usage (medical data, for example), but overall I think services will win, as they always do.
You need some really expensive hardware to run a local LLM, which is out of reach for the average user. The demand might simply be hidden, because these users neither know about local LLMs nor want to spend the resources to run one.
But I have hope that hardware costs will eventually come down enough to reveal the demand for local LLMs.
After all, I'd prefer my private questions to an LLM never be revealed.
We can have services and still keep private history/contexts. Those can be "local" (and encrypted).
I can't wait, actually. For me it's less about privacy than about being able to work offline.