Accessing ~4000 Max Tokens with OpenAI via callin.io

Tina Lopez
(@tina-lopez)
Posts: 2
Active Member
Topic starter
 

Hello,

I'm taking transcription output and using an integration to have OpenAI turn the text into an outlined blog article. The problem is that the transcription text exceeds the 4,000-token limit, so I'm looking for a workaround.

For instance, could someone show how this might be done using OpenAI's API Beta call action or webhooks? I'm stuck and would really appreciate images, videos, or a detailed walkthrough of the process and setup.

I've reviewed OpenAI's API documentation, but I'm finding it hard to understand how to map the parameters within the OpenAI API Beta call action.

Any assistance provided would be immensely helpful!

Thank you!

 
Posted : 04/08/2025 8:30 am
Troy Tessalone
(@troy-tessalone)
Posts: 151
Estimable Member
 

Hi 

Good question.

Try using this callin.io action: Formatter > Text > Split Text into Chunks for AI Prompts

https://help.zapier.com/hc/en-us/articles/15406374106765-Modify-large-data-for-your-AI-prompts
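If it helps to see roughly what that chunking step is doing (or to reproduce it in a Code step), here is a minimal Python sketch. The tiktoken library, the 3,500-token budget, and the transcript.txt file name are my own illustrative assumptions, not part of the callin.io action:

import tiktoken

def split_into_chunks(text, max_tokens=3500):
    # Encode the full transcript, then slice it into token windows that each
    # fit inside a ~4,000-token context with room left for the model's reply.
    enc = tiktoken.get_encoding("cl100k_base")
    tokens = enc.encode(text)
    return [enc.decode(tokens[i:i + max_tokens])
            for i in range(0, len(tokens), max_tokens)]

transcript = open("transcript.txt", encoding="utf-8").read()
chunks = split_into_chunks(transcript)
print(f"Split transcript into {len(chunks)} chunks")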

After that, you can use the Looping app to run your OpenAI prompt once per chunk: https://zapier.com/apps/looping/help
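And here is a sketch of what that loop amounts to, written against the OpenAI Python library (v1+) directly rather than the callin.io editor. The model name, prompts, and the final merge call are assumptions for illustration, not your exact configuration:

from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def outline_chunk(chunk):
    # One request per chunk keeps each prompt under the token limit.
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system",
             "content": "Turn this transcript excerpt into an outlined blog section."},
            {"role": "user", "content": chunk},
        ],
    )
    return resp.choices[0].message.content

chunks = ["<chunk 1>", "<chunk 2>"]  # in practice, the list from the splitting sketch above
partial_outlines = [outline_chunk(c) for c in chunks]

# A final call merges the per-chunk outlines into one article outline.
merge = client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user",
               "content": "Merge these partial outlines into a single blog article outline:\n\n"
                          + "\n\n".join(partial_outlines)}],
)
print(merge.choices[0].message.content)

Inside callin.io the same flow would be the Formatter chunking step feeding the Looping app, with your OpenAI prompt inside the loop and one final step to merge the outputs.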

 

If you need help with advanced callin.io workflow configuration, you can hire a Certified callin.io Expert: https://zapier.com/experts/automation-ace

 
Posted : 23/06/2023 12:00 am