How to fix Error 400 in Pipedream workflow due to exceeding model's context length?

This topic was automatically generated from Slack. You can find the original thread here.

Hello, when running a Pipedream workflow, I got this error. How do I fix it? - Error - Request failed with status code 400

```
{"error":{"message":"This model's maximum context length is 4097 tokens. However, your messages resulted in 13924 tokens. Please reduce the length of the messages.","type":"invalid_request_error","param":"messages","code":"context_length_exceeded"}}
```

Hi, it means the message you passed in is too long. You will need to try a shorter message.

And is there no way for me to bypass this?

I assume you’re using the OpenAI action. For this, you will need to use a Node.js code step to split your message into multiple parts and send each part to the OpenAI API. This needs some coding knowledge to implement.
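A minimal sketch of the splitting idea, assuming a character-based chunking helper (the function name `splitIntoChunks` and the character budget are illustrative, not a Pipedream or OpenAI API):

```javascript
// Rough rule of thumb: ~4 characters per token, so ~12000 characters is
// roughly 3000 tokens, leaving headroom under the 4097-token limit for
// the model's reply. Tune this for your own messages.
const MAX_CHARS = 12000;

// Split a long string into pieces of at most maxChars characters.
function splitIntoChunks(text, maxChars = MAX_CHARS) {
  const chunks = [];
  for (let i = 0; i < text.length; i += maxChars) {
    chunks.push(text.slice(i, i + maxChars));
  }
  return chunks;
}

// In a Pipedream Node.js code step you would then loop over the chunks
// and send one request per chunk to the OpenAI chat completions endpoint
// (sketch only, not executed here):
//
// for (const chunk of splitIntoChunks(longMessage)) {
//   await fetch("https://api.openai.com/v1/chat/completions", { /* ... */ });
// }
```

Note that each request only sees its own chunk, so this works best when the parts can be processed independently (e.g. summarizing each piece separately).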

ok

Is this character limit from Pipedream or OpenAI?
And if it is from Pipedream, why can’t it be expanded to allow more characters?