
westurner
braintrust-proxy: https://github.com/braintrustdata/braintrust-proxy

LocalAI: https://github.com/mudler/LocalAI

E.g. promptfoo and ChainForge support multi-LLM workflows.

Promptfoo has a YAML configuration for prompts, providers, and tests: https://www.promptfoo.dev/docs/configuration/guide/
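
As a rough sketch (the provider IDs, model names, and assertion below are illustrative placeholders, not recommendations), a minimal promptfooconfig.yaml that runs one prompt against a hosted and a local model might look like:

    # promptfooconfig.yaml -- run one prompt against several providers
    prompts:
      - "Write pandas code to {{task}}"

    providers:
      - openai:chat:gpt-4o-mini   # hosted model (illustrative)
      - ollama:chat:llama3        # local model (illustrative)

    tests:
      - vars:
          task: "calculate normalized rates for column X grouped by column Y"
        assert:
          - type: contains
            value: "groupby"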

What is the system prompt, and how does a system prompt also bias an analysis?

/? "system prompt" https://hn.algolia.com/?dateRange=all&page=0&prefix=false&qu...


prasoonds
Thank you for the links! We'll take a look.

At the moment, the system prompt is hardcoded. I don't think it should bias analyses, because our goal is usually to return code that does something specific (e.g. "calculate normalized rates for column X grouped by column Y") rather than something generic ("why is our churn rate going up?"). So the idea is that the operator is responsible for asking the right questions; the AI merely acts as a facilitator to generate and understand code.
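
For context, the system prompt is just a fixed first message sent with every request, so it shapes the form of the answer without the end user seeing it. A rough sketch of that pattern, assuming an OpenAI-style chat API (the prompt text, model name, and helper function are illustrative, not our actual implementation):

    from openai import OpenAI

    client = OpenAI()

    # Hardcoded system prompt: every request is steered toward returning
    # runnable analysis code rather than open-ended narrative answers.
    SYSTEM_PROMPT = (
        "You are a data-analysis assistant. Respond only with pandas code "
        "that answers the user's question about the provided dataframe."
    )

    def generate_analysis_code(question: str) -> str:
        """Send the operator's specific question alongside the fixed system prompt."""
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # illustrative model name
            messages=[
                {"role": "system", "content": SYSTEM_PROMPT},
                {"role": "user", "content": question},
            ],
        )
        return response.choices[0].message.content

    # A specific, operator-driven question of the kind described above:
    print(generate_analysis_code(
        "calculate normalized rates for column X grouped by column Y"
    ))

Any bias question then comes down to what that fixed first message tells the model to prefer: swapping the system prompt changes the style of code returned even for identical user questions.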

Also, tangentially, we do want to give users some kind of local prompt management with easy access, potentially via slash commands. In our testing so far, the existing prompt largely works (it took a fair bit of trial and error to find something that mostly works for common data use cases), but we're hitting the limitations of a fixed system prompt, so these links will be useful.
