Reusable Prompts were added in commit 85ef34f, but they do not actually work, because of how the response is parsed. When you call createModelResponse with a plain request,
```
POST https://api.openai.com/v1/responses
{
  "model": "gpt-4o-mini",
  "input": "Say hello"
}
```
the API will return
```
{
  ...
  "instructions": null,
  ...
}
```
which is fine.
However, if you use a Reusable Prompt
```
POST https://api.openai.com/v1/responses
{
  "model": "gpt-4o-mini",
  "prompt": {
    "id": "PROMPT_ID",
    "variables": {},
    "version": "YOUR_VERSION"
  }
}
```
the response body will include the Reusable Prompt's instructions, which look like this:
```
{
  ...
  "instructions": [
    {
      "type": "message",
      "content": [
        {
          "type": "input_text",
          "text": "SOME MESSAGE."
        }
      ],
      "role": "system"
    },
    {
      "type": "message",
      "content": [
        {
          "type": "input_text",
          "text": "SOME OTHER MESSAGE"
        }
      ],
      "role": "user"
    },
    {
      "type": "message",
      "content": [
        {
          "type": "input_text",
          "text": "YET ANOTHER MESSAGE"
        }
      ],
      "role": "assistant"
    }
  ],
  ...
}
```
Since the Response model expects instructions to be an (optional) String, JSON parsing fails and throws an exception, even though the request itself succeeds in the background.
The parsing should handle this scenario so that Reusable Prompts can actually be used.
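As a rough illustration of what "handle this scenario" could mean, here is a minimal sketch of a tolerant deserializer that accepts instructions either as a plain string or as a list of message objects. It assumes a kotlinx.serialization-based model, which may not match the library's actual stack, and all names below (Instructions, InstructionsSerializer, ResponseSketch) are hypothetical rather than the library's real types.

```kotlin
// Minimal sketch only: assumes a kotlinx.serialization-based model.
// Instructions, InstructionsSerializer and ResponseSketch are hypothetical names.
import kotlinx.serialization.KSerializer
import kotlinx.serialization.Serializable
import kotlinx.serialization.descriptors.SerialDescriptor
import kotlinx.serialization.descriptors.buildClassSerialDescriptor
import kotlinx.serialization.encoding.Decoder
import kotlinx.serialization.encoding.Encoder
import kotlinx.serialization.json.*

// "instructions" can arrive as a plain string (or null) or as a list of message objects.
sealed interface Instructions {
    data class Text(val value: String) : Instructions
    data class Items(val value: List<JsonObject>) : Instructions
}

object InstructionsSerializer : KSerializer<Instructions> {
    override val descriptor: SerialDescriptor =
        buildClassSerialDescriptor("Instructions")

    override fun deserialize(decoder: Decoder): Instructions {
        // Branch on the raw JSON shape instead of assuming a String.
        val input = decoder as? JsonDecoder
            ?: error("Instructions can only be decoded from JSON")
        return when (val element = input.decodeJsonElement()) {
            is JsonArray -> Instructions.Items(element.map { it.jsonObject })
            is JsonPrimitive -> Instructions.Text(element.content)
            else -> error("Unexpected JSON for instructions: $element")
        }
    }

    override fun serialize(encoder: Encoder, value: Instructions) {
        val output = encoder as? JsonEncoder
            ?: error("Instructions can only be encoded to JSON")
        val element: JsonElement = when (value) {
            is Instructions.Text -> JsonPrimitive(value.value)
            is Instructions.Items -> JsonArray(value.value)
        }
        output.encodeJsonElement(element)
    }
}

@Serializable
data class ResponseSketch(
    val model: String,
    @Serializable(with = InstructionsSerializer::class)
    val instructions: Instructions? = null,
)

fun main() {
    val json = Json { ignoreUnknownKeys = true }
    // Plain request: "instructions" is null -> decodes to null.
    println(json.decodeFromString(ResponseSketch.serializer(),
        """{"model":"gpt-4o-mini","instructions":null}"""))
    // Reusable Prompt: "instructions" is an array -> decodes to Instructions.Items.
    println(json.decodeFromString(ResponseSketch.serializer(),
        """{"model":"gpt-4o-mini","instructions":[{"type":"message","role":"system",
           "content":[{"type":"input_text","text":"SOME MESSAGE."}]}]}"""))
}
```

The point is simply to look at the raw JSON element first and branch on its shape, so the string, array, and null cases all decode without throwing.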