Does anyone have any experience writing ChatGPT code to get around token limits?
I’m working on this code:
```javascript
const { Configuration, OpenAIApi } = require("openai");

const apiKey = process.env.OPENAI_API_KEY; // Set your OpenAI API key here
const openai = new OpenAIApi(
  new Configuration({
    apiKey,
  })
);

const largeText = steps.fetch_content.value.content;
const prompt = `Summarize the following text:\n\n${largeText}`;

(async () => {
  try {
    const completion = await openai.createChatCompletion({
      model: "gpt-3.5-turbo",
      messages: [{ role: "user", content: prompt }],
      temperature: 0.2,
    });
    // In the openai v3 Node SDK, the response body lives under `.data`
    const summary = completion.data.choices[0].message.content;
    console.log("Summary:", summary);
    // You can take further actions with the summary, such as sending it to another service or storing it.
  } catch (error) {
    console.error("Error:", error.message);
  }
})();
```
The workflow is: Trigger (new PDF file added to Google Drive) > PDF_to_Anything_converter (outputs the extracted text to an HTML page) > HTTP/Webhook > ChatGPT code (what's written above).
Basically, I want to be able to segment a big document into smaller chunks for GPT to analyze.
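Something like this map/reduce pattern is what I'm picturing. Rough sketch, untested: the 8,000-character chunk size and the `chunkText` / `summarizeChunk` helpers are just my own placeholders, and a token-aware splitter (e.g. the gpt-3-encoder package) would be more precise than counting characters:

```javascript
const { Configuration, OpenAIApi } = require("openai");

const openai = new OpenAIApi(
  new Configuration({ apiKey: process.env.OPENAI_API_KEY })
);

// Naively split the text into fixed-size chunks. Character counts are a
// workable approximation of tokens (~4 characters per token for English).
function chunkText(text, chunkSize = 8000) {
  const chunks = [];
  for (let i = 0; i < text.length; i += chunkSize) {
    chunks.push(text.slice(i, i + chunkSize));
  }
  return chunks;
}

// Ask the model to summarize a single chunk.
async function summarizeChunk(chunk) {
  const completion = await openai.createChatCompletion({
    model: "gpt-3.5-turbo",
    messages: [
      { role: "user", content: `Summarize the following text:\n\n${chunk}` },
    ],
    temperature: 0.2,
  });
  return completion.data.choices[0].message.content;
}

(async () => {
  const largeText = steps.fetch_content.value.content;

  // Map: summarize each chunk independently, in sequence.
  const partialSummaries = [];
  for (const chunk of chunkText(largeText)) {
    partialSummaries.push(await summarizeChunk(chunk));
  }

  // Reduce: combine the partial summaries in one final pass.
  const finalSummary = await summarizeChunk(partialSummaries.join("\n\n"));
  console.log("Summary:", finalSummary);
})();
```

The final combining pass could itself blow past the context window on a very long document, in which case the reduce step would have to recurse (summarize the summaries in batches), but for a single PDF one pass should usually be enough.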
Any thoughts? Am I explaining myself extremely poorly? Let me know.