You should place it within a sub-workflow. While it can reside in the same workflow file, I'd advise against using it in the same execution. Doing so ...
Sharing the solution here for easier access by others: by retrieving the execution data, you can obtain the token usage from the model.
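To illustrate the idea, here is a minimal sketch of pulling token usage out of an execution-data object. The field names (`resultData`, `runData`, `tokenUsage`, `promptTokens`, etc.) are assumptions based on typical execution-data shapes, not a confirmed schema, so adjust them to whatever your actual execution JSON looks like:

```javascript
// Hypothetical sketch: walks an execution-data object and sums token
// usage reported by model subnodes. Field names are assumptions.
function extractTokenUsage(executionData) {
  const usage = { promptTokens: 0, completionTokens: 0, totalTokens: 0 };
  // Assumed location of per-node run results in the execution JSON.
  const runData = executionData?.resultData?.runData ?? {};
  for (const nodeName of Object.keys(runData)) {
    for (const run of runData[nodeName]) {
      const t = run?.data?.tokenUsage; // assumed key on model-node runs
      if (t) {
        usage.promptTokens += t.promptTokens ?? 0;
        usage.completionTokens += t.completionTokens ?? 0;
        usage.totalTokens += t.totalTokens ?? 0;
      }
    }
  }
  return usage;
}
```

You could run something like this in a Code node at the end of the workflow (or against execution data fetched via the API) and write the totals to your billing store.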
I run callin.io as the backend for a SaaS product. When our users interact with our agents, I want to record the token consumption to deduct it fr...
Unfortunately, that's an estimator, not actual data extraction from the workflow. From what I can see, it retrieves the text, the AI model, and send...
Unfortunately, no. What I am trying to retrieve is the information from the model node. If I open this node, I can see th...
Yes. MCP URLs are also not functioning in the self-hosted setup here. However, I did manage to run it successfully on the cloud version.
I'm using callin.io Cloud to test it. The self-hosted version is still not working.
Do I need to enable anything in the environment variables to use this effectively? I'm unable to select the tools or get them to function.
Understood. In that scenario, you can use this node: if you convert your tool into a sub-workflow and invoke it using th...
Could you please share your workflow with us? You can paste it directly into a code block. To create a code block, simply click here: Your work...
In this scenario, you could use the "Chat Memory Manager" node. It can retrieve messages without modifying the chat history. ...
Does the agent have access to the subnode's token usage data? If you ask it, will it answer?
The information resides within the OpenAI Chat Model node, not the AI Agent node. Did you check that specific node? That's precisely the issue. The ...
I'm also searching for a similar solution.