Issue: ‘Received tool input did not match expected schema’ when calling an HTTP tool with ‘Let Model Specify Entire Body’ enabled
Hi everyone,
I’m encountering an issue when using the AI Agent node in callin.io with an HTTP Request tool that connects to the exa.ai /search API endpoint.
The error message I receive is:
Problem in node ‘AI Agent’: Received tool input did not match expected schema
What I’ve confirmed:
- The HTTP Request node is configured as a POST request with the correct URL: https://api.exa.ai/search
- I’m using Generic Credential (Header Auth) with a valid API key
- Send Body is enabled and set to ‘Let Model Specify Entire Body’
- The JSON input passed from the AI Agent is valid (verified with a LangGraph trace)
LangGraph trace confirms this JSON was properly generated and passed into the tool call, yet callin.io throws a schema mismatch error internally.
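For reference, the body the agent is supposed to send to /search is a flat JSON object. A minimal sketch of a valid request body, per the Exa.ai documentation (only the query field is shown; the value is just an illustration):
{
  "query": "example search query"
}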
My Workflow
Troubleshooting attempts so far:
- Verified the JSON structure matches the official Exa.ai API documentation
- LangGraph trace confirms the tool call input is valid JSON
My Questions:
- Does the callin.io AI Agent expect a specific schema format for tool body input beyond plain JSON when setting ‘Let Model Specify Entire Body’?
- Is there a hidden requirement or structure validation that the Agent performs?
Any examples or suggestions would be greatly appreciated!
My callin.io Setup
- callin.io version: 1.82.1
- Database: SQLite (default)
- EXECUTIONS_PROCESS: own, main
- Deployment: Docker
- OS: Ubuntu 22.04
Thank you so much for any help or clarification!
Can you verify whether the JSON it produced works correctly in an actual request?
When I set “Value Provided: By Model”, the tool works as expected.
I examined the tool’s input and it looked like this:
{
  "query": {
    "query": "history of #HalaMadrid"
  }
}
However, my expectation was simply:
{
  "query": "history of #HalaMadrid"
}
Therefore, I suspect that when using “Specify Body: Let Model Specify Entire Body”, the expected JSON structure might also be getting wrapped in a similar way.
To verify this, I adjusted the LLM’s output format as shown in the LangGraph trace screenshot below, but I still got the same error:
Received tool input did not match expected schema
Great! So that worked? I’m happy to hear it!
It is peculiar, though.
In any case, if my response was helpful, I would appreciate it if you could mark it as the solution.
Hey! In my situation, the issue was that the AI Agent was sending a JSON object, but callin.io was expecting a string!
callin.io expects a string by default after you select Defined automatically by the model. In the background, it appears callin.io uses the $fromAI function, whose 3rd parameter is the expected type for that property. In my case, that type was always expected to be string. See the gif below: it changes back to string even if I set it to json.
So the model sends a JSON object and callin.io expects a string, which is why I was getting Received tool input did not match expected schema.
The solution was to remove Defined automatically by the model and use an expression to set the value of the input as {{ $fromAI('inputName', 'input description', 'json') }}. I spent many hours on this; I hope it helps you save some time.
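To make that concrete, here is a rough sketch of how the tool’s body value could be set once the automatic option is removed (the key name 'searchBody' and the description text are placeholders I made up; the third argument is what forces the expected type to json):
{{ $fromAI('searchBody', 'The complete JSON body for the Exa.ai /search request', 'json') }}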
I’m working on a very similar workflow to the one shared previously: an AI Agent using an OpenAI gpt-4.1-mini model. This workflow worked flawlessly on callin.io version 1.88.
After I updated to the latest version, 1.90.2, I began encountering the same error. I then switched to a different OpenAI model (gpt-4.1-nano), and the error went away.
Hi, thanks for sharing your experience! I tried running the workflow I shared above again, and in my current callin.io version (1.90.1) the ‘Let Model Specify Entire Body’ option has already been removed from the HTTP tool node.
Now only the ‘Using JSON’ and ‘Using Fields Below’ options are left, so I tried the ‘Using JSON’ option to let the model construct the entire request body, and so far it seems to be working properly.
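For anyone trying the same thing, a rough sketch of what my ‘Using JSON’ body looks like (the query field follows the Exa.ai docs; the $fromAI key and description are just names I chose):
{
  "query": "{{ $fromAI('query', 'The search query to send to Exa.ai', 'string') }}"
}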
Thanks for discovering this – I spent quite a bit of time trying to resolve this issue!
Could you assist me with configuring a single field named 'data' within the flow module, incorporating the specified parameters? I'm also wondering how the incoming data can be automatically sorted by suffix, or if you could provide a more detailed explanation. Thank you!