simonw
Every time you send a prompt to a model you actually send the entire previous conversation along with it, as an array of role/content messages.
You can see this yourself if you use their APIs.
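A sketch of that accumulated conversation, using the Chat Completions messages format (the specific content values are made up for illustration):

```json
[
  {"role": "system", "content": "You are a helpful assistant."},
  {"role": "user", "content": "What is 2 + 2?"},
  {"role": "assistant", "content": "4"},
  {"role": "user", "content": "And doubled?"}
]
```

Each new turn appends to this array, and the whole thing is re-sent with every request, which is why the model appears to "remember" earlier messages.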
That is true unless you use the Responses API endpoint...
That's true; the signature feature of that API is that OpenAI can now manage your conversation state server-side for you.
You still have the option to send the full conversation JSON every time if you want to.
You can send "store": false to turn off server-side persistence of your conversation.
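A minimal sketch of a Responses API request body with persistence turned off (the model name and input text here are illustrative, not from the thread):

```json
{
  "model": "gpt-4.1",
  "input": "Hello",
  "store": false
}
```

With "store": false you are back to managing conversation state yourself, re-sending the history with each request as described above.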