> If I can't run a tool locally (that includes 'compiling' the model from data), it is a liability not a tool.
I don't have a GPU supercluster lying around to train my own DeepSeek. You can't be independent and self-sufficient in everything. Cooperation, delegation, and specialization are what make humanity so strong.
I also have gcc or Go on my machine.
Proprietary language models are the proprietary compilers or languages of old.
As the blog post suggests, let's learn the lessons of the past.
If I can't run a tool locally (that includes 'compiling' the model from data), it is a liability not a tool.
Even local models are like proprietary firmware blobs you're required to ship in your programming toolchain.
Useful or not, this situation is not desirable.
Yours, a local coding assistant user (ollama and friends)