This topic was automatically generated from Slack. You can find the original thread here.
Thanks for the great service you provide.
I am using the gpt-4-vision-preview model in my “Chat with OpenAI (ChatGPT) API” action. This action was updated from version 0.1.3 to 0.1.4/0.1.5 to support the “Response Format” field.
However, the “Response Format” / “JSON Mode” option that this field exposes is not supported by all models. At the moment it is only supported by gpt-4-1106-preview and gpt-3.5-turbo-1106. https://platform.openai.com/docs/guides/text-generation/json-mode
To prevent these errors and improve model performance, set response_format to { "type": "json_object" } when calling gpt-4-1106-preview or gpt-3.5-turbo-1106, which enables JSON mode.
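As a sketch of what that looks like, here is a minimal Chat Completions request body with JSON mode enabled (the model name, system instruction, and prompt are illustrative; note the docs also require that one of the messages explicitly instruct the model to produce JSON):

```python
import json

# Request body for POST https://api.openai.com/v1/chat/completions
# response_format alone is not enough: the messages must also tell
# the model to respond in JSON, or the API returns an error.
payload = {
    "model": "gpt-4-1106-preview",  # must be a model that supports JSON mode
    "response_format": {"type": "json_object"},
    "messages": [
        {"role": "system", "content": "You are a helpful assistant. Respond in JSON."},
        {"role": "user", "content": "List three primary colors."},
    ],
}

print(json.dumps(payload, indent=2))
```

Models without JSON-mode support (such as gpt-4-vision-preview, per the linked guide) should omit response_format entirely.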