Does anyone know if there are any plans for browsers to natively integrate LLMs, LLM APIs, or LLM models like Llama for local use by web applications?

I feel there's a large opportunity here for a more privacy-friendly, on-device solution that doesn't send the user's data to OpenAI.

Is RAM the current main limitation?


Thank you! This is exactly what I was looking for. I hope these become part of the web platform APIs! With Google pushing this effort, it seems quite likely.
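
For reference, Chrome has been experimenting with exactly this in its built-in Prompt API (behind a flag / origin trial). The surface has changed between releases, so the shape below is a sketch rather than a stable contract:

    // Sketch of Chrome's experimental built-in Prompt API (origin trial).
    // Names have shifted across Canary builds; treat them as illustrative.
    declare const ai: any; // provided by the browser when the feature is enabled

    const session = await ai.languageModel.create({
      systemPrompt: "You are a concise, on-device assistant.",
    });
    const answer = await session.prompt("Summarize this page in one sentence.");
    console.log(answer); // inference runs locally; nothing leaves the device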

I hope Apple will follow suit with some of their small models (https://huggingface.co/apple/OpenELM).

And then maybe even Firefox will join them...

Every big tech company is trying to do this: FB (through WhatsApp), Google (through Chrome/Android), Apple (through Safari/iOS/etc.). As soon as they meet their internal metrics, they will release these to the public.
"Is RAM the current main limitation?"

(V)RAM + processing power + storage (I mean, what kind of average user wants to clog half their hard drive for a subpar model that outputs 1 token a second?)
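
To put rough numbers on the storage point (back-of-envelope only; real footprints vary with quantization scheme, runtime overhead, and KV cache):

    // Back-of-envelope weight size: parameter count x bytes per weight.
    // These numbers are illustrative, not measurements.
    const params = 7e9;         // e.g. a 7B-parameter model
    const bytesPerWeight = 0.5; // 4-bit quantization
    const gb = (params * bytesPerWeight) / 1e9;
    console.log(`~${gb} GB of weights`); // ~3.5 GB before runtime overhead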

check out https://github.com/mlc-ai/web-llm
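
For anyone curious what that looks like in practice, here's a minimal sketch based on web-llm's documented OpenAI-style API; the model ID and options are illustrative, and a WebGPU-capable browser is assumed:

    // Minimal web-llm sketch: weights download once, then run on WebGPU.
    import { CreateMLCEngine } from "@mlc-ai/web-llm";

    const engine = await CreateMLCEngine("Llama-3-8B-Instruct-q4f16_1-MLC", {
      initProgressCallback: (p) => console.log(p.text), // download/compile progress
    });

    const reply = await engine.chat.completions.create({
      messages: [{ role: "user", content: "Hello from the browser!" }],
    });
    console.log(reply.choices[0].message.content);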

IMO the main limitations are access to powerful GPUs for running models locally and the size of some models, which causes UX problems with cold starts.
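
One common mitigation for the cold-start side is to pay the download cost once and keep the weight shards around. A hypothetical helper using the standard Cache API (the function name and cache key are made up for illustration):

    // Hypothetical helper: serve multi-GB weight shards from the Cache API
    // so only the first visit pays the full cold-start download.
    async function fetchWeights(url: string): Promise<ArrayBuffer> {
      const cache = await caches.open("model-weights-v1");
      let res = await cache.match(url);
      if (!res) {
        res = await fetch(url);
        await cache.put(url, res.clone()); // clone: a body can be read only once
      }
      return res.arrayBuffer();
    }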
