lrvick:
Is there a way to do this with a local LLM, without any internet access needed?
Yes, Pipecat already supports that natively, so this is easy to do with Ollama. I've also wired it up through the `OLLAMA_BASE_URL` environment variable.
About Ollama in Pipecat: https://docs.pipecat.ai/server/services/llm/ollama
Also, check out the other providers they support; any of them can be onboarded in a few lines of code.
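For reference, here's a rough sketch of what the Ollama wiring looks like. The class name (`OLLamaLLMService`), import path, and default port here are based on the docs linked above, so double-check them against the Pipecat version you're running:

```python
import os

# Sketch: plugging a local Ollama model into a Pipecat pipeline.
# Import path and class name may differ between Pipecat versions;
# see the Ollama service docs linked above.
from pipecat.services.ollama.llm import OLLamaLLMService

llm = OLLamaLLMService(
    # Any model you've already pulled locally, e.g. `ollama pull llama3.1`
    model="llama3.1",
    # Ollama serves an OpenAI-compatible endpoint on port 11434 by default;
    # OLLAMA_BASE_URL lets you point at a different host or port.
    base_url=os.getenv("OLLAMA_BASE_URL", "http://localhost:11434/v1"),
)

# `llm` then slots into the pipeline wherever a hosted LLM service
# (e.g. OpenAI) would otherwise go, with no internet access required.
```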