
I think it's an alternative because Ollama has no UI and it's hard to use for non-developers who will never touch the CLI

simonw
Ollama added a chat UI to their desktop apps a week ago: https://ollama.com/blog/new-app
apitman
Their new app is closed source, right?
simonw
Huh, yeah it looks like the GUI component is closed source. Their GitHub version only has the CLI.
diggan
I think at this point it's fair to say that most of what Ollama does is closed source. AFAIK only the CLI is open source; everything else isn't.
conradev
Yeah, and they’re also on a forked llama.cpp
kanestreet
Yup. They haven't even acknowledged that it's closed, despite a ton of questions. People are downloading it assuming it's open source only to get a nasty surprise. No mention of it in their blog post announcing the GUI, no new license for it, and no privacy policy. Feels deceptive.
accrual
I have been using the Ollama GUI on Windows since release and have appreciated its simplicity. It recently received an update that puts a large "Turbo" button in the message box that links to a sign-in page.

I'm trying Jan now and am really liking it - it feels friendlier than the Ollama GUI.

dcreater
And Ollama's founder was on here posting that they are still focused on local inference... I don't see Ollama as anything more than a funnel for their subscription now
numpad0
I truly don't understand why it's supposed to be the end of the world. They need to monetize eventually, and at the same time their userbase wants good inference. It looks like a complete win-win to me. Anyone can fork it if they ever actually turn evil.

I mean, it's not like people enjoy the lovely smell of cash burning and bias their opinions heavily towards it... or is it like that?
