popalchemist
The critiques about local inference are valid if you're billing this as an open-source alternative to existing cloud-based solutions.
Thanks for the feedback; I probably should have been clearer in my original post and in the README. Local inference is already supported via Pipecat: you can use Ollama or any custom OpenAI-compatible endpoint. Local STT is also supported via Whisper, which Pipecat will download and manage for you.
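For anyone wondering what "any custom OpenAI endpoint" means in practice: Ollama serves an OpenAI-compatible API at `http://localhost:11434/v1` by default, so any OpenAI-style client can be pointed at it. A minimal stdlib-only sketch (the model name `llama3` is just an example; use whatever model you've pulled locally):

```python
# Sketch: building an OpenAI-style chat completion request against a
# local Ollama server, which exposes an OpenAI-compatible API at
# http://localhost:11434/v1 by default.
import json
from urllib import request

OLLAMA_BASE_URL = "http://localhost:11434/v1"  # default Ollama endpoint

def build_chat_request(model, messages, base_url=OLLAMA_BASE_URL):
    """Build an OpenAI-style /chat/completions request for a local endpoint."""
    payload = json.dumps({"model": model, "messages": messages}).encode()
    return request.Request(
        f"{base_url}/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("llama3", [{"role": "user", "content": "Hello"}])
# Actually sending it requires a running Ollama server:
# resp = request.urlopen(req)
```

Pipecat's LLM services take a base URL in the same spirit, so swapping the cloud endpoint for a local one is a configuration change rather than a code change.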