Troubleshooting HTTP tool input schema mismatch with 'Let Model Specify Entire Body' enabled

leiserson
(@leiserson)
Posts: 4
Active Member
Topic starter
 

Issue: "Received tool input did not match expected schema" when calling an HTTP tool with ‘Let Model Specify Entire Body’ enabled

Hi everyone,

I’m encountering an issue when using the AI Agent node in callin.io with an HTTP Request tool that connects to the exa.ai /search API endpoint.

The error message I receive is:

Problem in node ‘AI Agent’: Received tool input did not match expected schema

What I’ve confirmed:

  • The HTTP Request node is configured as a POST request, with the correct URL: https://api.exa.ai/search
  • I’m using Generic Credential (Header Auth) with a valid API key
  • Send Body is enabled and set to Let Model Specify Entire Body
  • The JSON input passed from the AI Agent is valid (verified with a LangGraph trace)

LangGraph trace confirms this JSON was properly generated and passed into the tool call, yet callin.io throws a schema mismatch error internally.


My Workflow


Troubleshooting attempts so far:

  • Verified the JSON structure matches the official Exa.ai API documentation ✅
  • LangGraph trace confirms the tool call input is valid JSON ✅

My Questions:

  • Does the callin.io AI Agent expect a specific schema format for the tool body input, beyond plain JSON, when ‘Let Model Specify Entire Body’ is set?
  • Is there a hidden requirement or structure validation that the Agent performs?

Any examples or suggestions would be greatly appreciated!


ℹ️ My callin.io Setup

  • callin.io version: 1.82.1
  • Database: SQLite (default)
  • EXECUTIONS_PROCESS: own, main
  • Deployment: Docker
  • OS: Ubuntu 22.04

Thank you so much for any help or clarification!

🙏

 
Posted : 23/03/2025 6:06 pm
ThinkBot
(@thinkbot)
Posts: 26
Eminent Member
 

Can you verify if the JSON it produced functions correctly with an actual request?

 
Posted : 23/03/2025 6:09 pm
leiserson
(@leiserson)
Posts: 4
Active Member
Topic starter
 

Yes, this JSON works with a real request.
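For anyone wanting to reproduce that check, the direct call can be sketched like this (a minimal sketch; it assumes Exa's /search endpoint takes the key in an x-api-key header, and the key value here is a placeholder):

```python
import json
import urllib.request

EXA_API_KEY = "YOUR_EXA_API_KEY"  # placeholder; the real key lives in the Header Auth credential

# The flat body the model is expected to produce.
body = {"query": "history of #HalaMadrid"}

req = urllib.request.Request(
    "https://api.exa.ai/search",
    data=json.dumps(body).encode("utf-8"),
    headers={"Content-Type": "application/json", "x-api-key": EXA_API_KEY},
    method="POST",
)
# urllib.request.urlopen(req) would actually send it; omitted here so the
# sketch runs without a valid key.
print(req.full_url, req.get_method())
```

If this request succeeds outside callin.io, the body itself is fine and the mismatch is on the agent side.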

 
Posted : 23/03/2025 6:13 pm
ThinkBot
(@thinkbot)
Posts: 26
Eminent Member
 

You could try constructing the fields using the options provided below. This might help by defining a schema behind the scenes:

 
Posted : 23/03/2025 6:16 pm
leiserson
(@leiserson)
Posts: 4
Active Member
Topic starter
 

When I set “Value Provided: By Model”, the tool functions as anticipated.

I examined the tool's input and observed that it appeared as follows:


{
  "query": {
    "query": "history of #HalaMadrid"
  }
}

However, my expectation was simply:


{
  "query": "history of #HalaMadrid"
}

Therefore, I suspect that when using “Specify Body: Let Model Specify Entire Body”, the expected JSON structure might be wrapped in an extra layer in the same way.

To verify this, I adjusted the LLM's output format as depicted in the LangGraph trace screenshot below, but I still encountered the same error:

Received tool input did not match expected schema
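If the wrapper really is added by the agent, it could in principle be stripped before the request goes out. A hypothetical sketch (flatten_tool_input is my own helper name, not a callin.io function):

```python
def flatten_tool_input(tool_input: dict) -> dict:
    """Unwrap {"query": {"query": ...}} into the flat {"query": ...} shape."""
    inner = tool_input.get("query")
    # If the single parameter was wrapped in an object keyed by its own
    # name, lift the inner value up one level.
    if isinstance(inner, dict) and "query" in inner:
        return {"query": inner["query"]}
    return tool_input

wrapped = {"query": {"query": "history of #HalaMadrid"}}
print(flatten_tool_input(wrapped))  # prints {'query': 'history of #HalaMadrid'}
```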

 
Posted : 23/03/2025 6:47 pm
ThinkBot
(@thinkbot)
Posts: 26
Eminent Member
 

Great! So that was successful? I'm happy to hear that!

That is peculiar, though.

However, if my response was helpful, I would appreciate it if you could mark it as the solution.

🙏

 
Posted : 26/03/2025 5:58 pm
jonathanmv
(@jonathanmv)
Posts: 1
New Member
 

Hey! In my situation, the issue was that the AI Agent was sending a JSON object, but callin.io was expecting a string!

callin.io expects a string by default after you select 'Defined automatically by the model'. Behind the scenes, it appears callin.io uses the $fromAI function, whose third parameter is the expected type for that property. In my case, that type was always forced to string. See the gif below: it changes back to string even if I set it to json.

So the model sends a JSON object while callin.io expects a string, which is why I was getting Received tool input did not match expected schema.

The solution was to remove 'Defined automatically by the model' and instead use an expression to set the value of the input: {{ $fromAI('inputName', 'input description', 'json') }}. I spent many hours on this; I hope it helps you save some time.

(gif: changes-back-to-string)
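That type mismatch can be illustrated in a few lines (illustrative only; matches_expected_type is a made-up stand-in, not callin.io's internal validator):

```python
def matches_expected_type(value, expected_type: str) -> bool:
    """Check a tool-input value against a declared $fromAI-style type."""
    checks = {
        "string": str,
        "number": (int, float),
        "boolean": bool,
        "json": (dict, list),
    }
    return isinstance(value, checks[expected_type])

model_output = {"query": "history of #HalaMadrid"}  # the model sends an object

print(matches_expected_type(model_output, "string"))  # prints False -> schema mismatch
print(matches_expected_type(model_output, "json"))    # prints True once the type is 'json'
```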

 
Posted : 30/04/2025 2:27 pm
Ezequiel_Perez_Rovir
(@ezequiel_perez_rovir)
Posts: 2
New Member
 

I'm working on a very similar workflow to the one shared above: an AI Agent using an OpenAI gpt-4.1-mini model. This workflow worked flawlessly on callin.io version 1.88.

Subsequently, I updated to the latest version, 1.90.2, and began encountering the error. Switching to a different OpenAI model (gpt-4.1-nano) resolved it.

 
Posted : 30/04/2025 8:29 pm
leiserson
(@leiserson)
Posts: 4
Active Member
Topic starter
 

Hi~ thanks for sharing your experience! I tried running the workflow I shared above again, and in my current callin.io version (1.90.1), the ‘Let Model Specify Entire Body’ option has been removed from the HTTP tool node.

Only the ‘Using JSON’ and ‘Using Fields Below’ options remain. So I tried the ‘Using JSON’ option to let the model construct the entire request body, and so far it seems to be working properly.
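For reference on the newer versions, the ‘Using JSON’ body can hand a single field to the model with the same $fromAI expression mentioned earlier in the thread; a sketch (the field name and description are illustrative, not required values):

```json
{
  "query": "{{ $fromAI('query', 'search query for the exa.ai /search endpoint', 'string') }}"
}
```

Declaring the type explicitly as the third parameter keeps the expected schema aligned with what the model actually sends.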

 
Posted : 02/05/2025 5:50 pm
mipster
(@mipster)
Posts: 1
New Member
 

Thanks for discovering this – I spent quite a bit of time trying to resolve this issue!

 
Posted : 04/06/2025 11:45 pm
Cergio_Monasterio
(@cergio_monasterio)
Posts: 1
New Member
 

Could you help me configure a single field named 'data' within the flow module, using the parameters described above? I'm also wondering how the incoming data can be automatically sorted by suffix; a more detailed explanation would be appreciated. Thank you!

 
Posted : 22/07/2025 4:13 am