This topic was automatically generated from Slack. You can find the original thread here.
Hello, when running a Pipedream workflow, I got this error. How do I fix it? - Error - Request failed with status code 400
{"error":{"message":"This model's maximum context length is 4097 tokens. However, your messages resulted in 13924 tokens. Please reduce the length of the messages.","type":"invalid_request_error","param":"messages","code":"context_length_exceeded"}}
I assume you're using the OpenAI action. For this, you will need to use a Node.js code step to split your message into multiple parts and send them to the OpenAI API. This requires some coding knowledge to implement.
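A minimal sketch of the splitting step, assuming a rough estimate of ~4 characters per token (for accurate counts you'd use a tokenizer library such as tiktoken; the function name and limits here are illustrative, not from the thread):

```javascript
// Split long text into chunks small enough to fit the model's context window.
// charsPerToken is a crude approximation; real token counts require a tokenizer.
function splitIntoChunks(text, maxTokens = 3000, charsPerToken = 4) {
  const maxChars = maxTokens * charsPerToken;
  const chunks = [];
  for (let i = 0; i < text.length; i += maxChars) {
    chunks.push(text.slice(i, i + maxChars));
  }
  return chunks;
}

// Each chunk could then be sent as its own chat completion request, e.g.:
// for (const chunk of splitIntoChunks(longMessage)) {
//   await openai.chat.completions.create({
//     model: "gpt-3.5-turbo",
//     messages: [{ role: "user", content: chunk }],
//   });
// }
```

Note that splitting only works if each part can be processed independently; for tasks that need the full text at once (e.g. one summary of the whole document), you'd summarize each chunk and then combine the partial results.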