RAG agent not behaving as expected - data not forwarded or processed

3 Posts
3 Users
0 Reactions
4 Views
Anvi_Motions
(@anvi_motions)
Posts: 1
New Member
Topic starter
 

Describe the problem/error/question

I have stored image names and their labels in a Supabase database. I've instructed the agent that if a question is image-related, it should call the faq_images tool. The response should then be processed using the following instructions/system message:

  • Call the faq_images tool.
  • Parse the text field from the result as JSON.
  • Extract the image_name:
    • From metadata.image_name, or
    • From the last part of pageContent (after the final comma).
  • Pass the image_name to the Ask for image tool in this format: { "image_name": "3_image_1.jpeg" }
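To make the intended parsing concrete, here is the extraction logic as a plain JavaScript sketch (`extractImageName` is my own illustrative helper, not a callin.io API):

```javascript
// Sketch of the instructions above: parse the tool's text field as JSON and
// pull out image_name, preferring metadata over the pageContent fallback.
function extractImageName(toolText) {
  const doc = JSON.parse(toolText);

  // Preferred source: metadata.image_name
  if (doc.metadata && doc.metadata.image_name) {
    return doc.metadata.image_name;
  }

  // Fallback: last comma-separated part of pageContent
  const parts = doc.pageContent.split(",");
  return parts[parts.length - 1].trim();
}

const text =
  '{"pageContent":"Lock and key,3_image_1.jpeg","metadata":{"image_name":"3_image_1.jpeg"}}';
console.log(JSON.stringify({ image_name: extractImageName(text) }));
// → {"image_name":"3_image_1.jpeg"}
```

This is the payload shape I expect the agent to hand to the Ask for image tool.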

The faq_images tool response looks good:
```json
{
  "response": [
    {
      "type": "text",
      "text": "{\"pageContent\":\"Lock and key,3_image_1.jpeg\",\"metadata\":{\"loc\":{\"lines\":{\"to\":1,\"from\":1}},\"source\":\"blob\",\"blobType\":\"text/plain\",\"image_name\":\"3_image_1.jpeg\"}}"
    }
  ]
}
```

What is the error message (if any)?

However, the agent neither processes the response nor forwards it as-is to the Ask for image tool. This is the output I see from the AI agent:

```json
"response": {
  "generations": [
    [
      {
        "text": "",
        "generationInfo": {
          "prompt": 0,
          "completion": 0,
          "finish_reason": "tool_calls", …
```

And the Ask for image tool receives { "query": {} } as input.

Please share your workflow

```json
{
  "name": "Test_ChatBot",
  "nodes": [
    {
      "parameters": {
        "options": {}
      },
      "id": "4ef11502-3f75-438c-9ed1-b02b903cbfc2",
      "name": "OpenAI Chat Model",
      "type": "@callin.io/callin.io-nodes-langchain.lmChatOpenAi",
      "typeVersion": 1,
      "position": [-19500, 2160],
      "credentials": {
        "openAiApi": {
          "id": "__REPLACE_ME__",
          "name": "__REPLACE_ME__"
        }
      }
    },
    {
      "parameters": {},
      "id": "9f32fa0c-8e2c-44b9-9b75-f41306e2a766",
      "name": "Postgres Chat Memory",
      "type": "@callin.io/callin.io-nodes-langchain.memoryPostgresChat",
      "typeVersion": 1,
      "position": [-19360, 2160],
      "notesInFlow": false,
      "credentials": {
        "postgres": {
          "id": "__REPLACE_ME__",
          "name": "__REPLACE_ME__"
        }
      }
    },
    {
      "parameters": {
        "content": "## RAG AI Agent with Chat Interface",
        "height": 545,
        "width": 1476
      },
      "id": "04219df7-03d7-4d3d-8302-9c7dce3fc14f",
      "name": "Sticky Note2",
      "type": "callin.io-nodes-base.stickyNote",
      "typeVersion": 1,
      "position": [-19660, 1760]
    },
    {
      "parameters": {
        "public": true,
        "options": {}
      },
      "id": "440e3c8c-2131-4231-ad88-2a763e383b5c",
      "name": "When chat message received",
      "type": "@callin.io/callin.io-nodes-langchain.chatTrigger",
      "typeVersion": 1.1,
      "position": [-19300, 1880],
      "webhookId": "42c0ba7c-5170-40bb-82ec-f46958a30811"
    },
    {
      "parameters": {
        "options": {
          "systemMessage": "You are a helpful virtual assistant for answering user FAQs.\n- For text-based questions, use the `faq_text` tool.\n- For visual/image-related questions, use the `faq_images` tool first, then pass the image_name to the `Ask for image` tool.\n\nRules:\n- Only respond based on the tool outputs.\n- Do not invent or use internal knowledge.\n- Respond clearly and politely.\n- Use short, helpful messages.\n- Never include system jargon or raw JSON in replies.\n- If no answer/image is found, ask the user to clarify.",
          "maxIterations": 3
        }
      },
      "id": "3143f475-8e26-444d-b220-bd75f83774f5",
      "name": "RAG AI Agent",
      "type": "@callin.io/callin.io-nodes-langchain.agent",
      "typeVersion": 1.6,
      "position": [-19040, 1880],
      "alwaysOutputData": true,
      "executeOnce": false,
      "retryOnFail": false
    },
    {
      "parameters": {
        "toolDescription": "This tool receives an image_name and returns a signed URL for that image from object storage. Use this after retrieving metadata from the faq_images tool.",
        "method": "POST",
        "url": "=https://yoururl.callin.io.co/storage/v1/object/sign/images/{{ $json.image_name }}",
        "authentication": "predefinedCredentialType",
        "nodeCredentialType": "supabaseApi",
        "sendHeaders": true,
        "specifyHeaders": "json",
        "jsonHeaders": "{\n  \"Content-Type\": \"application/json\"\n}",
        "sendBody": true,
        "specifyBody": "json",
        "jsonBody": "{\n  \"expiresIn\": 3600\n}"
      },
      "type": "@callin.io/callin.io-nodes-langchain.toolHttpRequest",
      "typeVersion": 1.1,
      "position": [-19240, 2160],
      "id": "4e81a7c2-ba55-4224-94c7-52ac484024a3",
      "name": "Ask for image",
      "credentials": {
        "supabaseApi": {
          "id": "__REPLACE_ME__",
          "name": "__REPLACE_ME__"
        }
      }
    },
    {
      "parameters": {
        "options": {}
      },
      "type": "@callin.io/callin.io-nodes-langchain.embeddingsOpenAi",
      "typeVersion": 1.2,
      "position": [-19040, 2240],
      "id": "352ab70f-e055-4b73-85dd-06a85431648d",
      "name": "Embeddings OpenAI2",
      "credentials": {
        "openAiApi": {
          "id": "__REPLACE_ME__",
          "name": "__REPLACE_ME__"
        }
      }
    },
    {
      "parameters": {
        "options": {}
      },
      "type": "@callin.io/callin.io-nodes-langchain.embeddingsOpenAi",
      "typeVersion": 1.2,
      "position": [-18740, 2240],
      "id": "bc9dbad6-5ec1-4491-b027-c4058a532e4b",
      "name": "Embeddings OpenAI3",
      "credentials": {
        "openAiApi": {
          "id": "__REPLACE_ME__",
          "name": "__REPLACE_ME__"
        }
      }
    },
    {
      "parameters": {
        "mode": "retrieve-as-tool",
        "toolName": "=faq_images",
        "toolDescription": "Use this tool when the user's message refers to something visual, such as asking to see an object.\nIt performs a semantic search on image descriptions stored in the image vector database.\nIf no match is found, ask the user to clarify.",
        "tableName": {
          "__rl": true,
          "value": "faq_images",
          "mode": "list",
          "cachedResultName": "faq_images"
        },
        "topK": 1,
        "options": {
          "queryName": "match_faq_images"
        }
      },
      "type": "@callin.io/callin.io-nodes-langchain.vectorStoreSupabase",
      "typeVersion": 1,
      "position": [-18760, 2100],
      "id": "af44dace-e2d4-408e-b5e7-5828e5099f5e",
      "name": "faq_images",
      "credentials": {
        "supabaseApi": {
          "id": "__REPLACE_ME__",
          "name": "__REPLACE_ME__"
        }
      }
    },
    {
      "parameters": {
        "mode": "retrieve-as-tool",
        "toolName": "Customer_FAQs",
        "toolDescription": "This tool searches a semantic FAQ knowledge base to return the most relevant answer(s) to a user's question(s).\n\nInput: natural language question\nOutput: best-matching text snippet(s).\n",
        "tableName": {
          "__rl": true,
          "value": "documents",
          "mode": "list",
          "cachedResultName": "documents"
        },
        "topK": 1,
        "options": {}
      },
      "type": "@callin.io/callin.io-nodes-langchain.vectorStoreSupabase",
      "typeVersion": 1,
      "position": [-19060, 2100],
      "id": "ccffe7ab-374c-485c-a49e-e8b27ce26c28",
      "name": "faq_text",
      "credentials": {
        "supabaseApi": {
          "id": "__REPLACE_ME__",
          "name": "__REPLACE_ME__"
        }
      }
    }
  ],
  "pinData": {},
  "connections": {
    "OpenAI Chat Model": {
      "ai_languageModel": [
        [
          {
            "node": "RAG AI Agent",
            "type": "ai_languageModel",
            "index": 0
          }
        ]
      ]
    },
    "Postgres Chat Memory": {
      "ai_memory": [
        [
          {
            "node": "RAG AI Agent",
            "type": "ai_memory",
            "index": 0
          }
        ]
      ]
    },
    "When chat message received": {
      "main": [
        [
          {
            "node": "RAG AI Agent",
            "type": "main",
            "index": 0
          }
        ]
      ]
    },
    "RAG AI Agent": {
      "main": [
        []
      ]
    },
    "Ask for image": {
      "ai_tool": [
        [
          {
            "node": "RAG AI Agent",
            "type": "ai_tool",
            "index": 0
          }
        ]
      ]
    },
    "Embeddings OpenAI2": {
      "ai_embedding": [
        [
          {
            "node": "faq_text",
            "type": "ai_embedding",
            "index": 0
          }
        ]
      ]
    },
    "Embeddings OpenAI3": {
      "ai_embedding": [
        [
          {
            "node": "faq_images",
            "type": "ai_embedding",
            "index": 0
          }
        ]
      ]
    },
    "faq_images": {
      "ai_tool": [
        [
          {
            "node": "RAG AI Agent",
            "type": "ai_tool",
            "index": 0
          }
        ]
      ]
    },
    "faq_text": {
      "ai_tool": [
        [
          {
            "node": "RAG AI Agent",
            "type": "ai_tool",
            "index": 0
          }
        ]
      ]
    }
  },
  "active": false,
  "settings": {
    "executionOrder": "v1"
  },
  "versionId": "5fdec557-b06f-45cb-bc94-acdb1dfeea19",
  "meta": {
    "templateCredsSetupCompleted": true,
    "instanceId": "xxxxxxxxxxxx"
  },
  "id": "9G9ef76lgVYB2TG4",
  "tags": []
}
```
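For reference, outside of callin.io the Ask for image call reduces to a plain signed-URL request against Supabase storage. This sketch (placeholder host, and `buildSignRequest` is a hypothetical helper of mine, not part of any library) shows the request I expect that tool to make:

```javascript
// Build the Supabase storage sign request for a given image name.
// The base URL is a placeholder, matching the workflow above.
function buildSignRequest(imageName, expiresIn = 3600) {
  const base = "https://yoururl.callin.io.co/storage/v1/object/sign/images/";
  return {
    method: "POST",
    url: base + encodeURIComponent(imageName),
    headers: { "Content-Type": "application/json" },
    // expiresIn: signed URL validity in seconds (3600 = 1 hour)
    body: JSON.stringify({ expiresIn }),
  };
}

console.log(buildSignRequest("3_image_1.jpeg").url);
```

If the agent forwarded `{ "image_name": "3_image_1.jpeg" }` correctly, this is the shape of the HTTP call that should result.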

Information on your callin.io setup

callin.io version: 1.81.4
Database: Supabase vector
callin.io EXECUTIONS_PROCESS setting (default: own, main): main
Running callin.io via (Docker, npm, desktop app): Docker container running locally
Operating system: Windows

I have hit a roadblock. Any help moving forward would be greatly appreciated. Is there a feature that lets me tap into or process a tool's output before the RAG agent receives it? I am relatively new to callin.io.


 
Posted : 24/03/2025 3:49 pm
Sebas
(@sebas)
Posts: 2
New Member
 

Hi, could you please share the workflow JSON here? Using the </> icon will help make answering your question much easier.

 
Posted : 24/03/2025 8:25 pm
system
(@system)
Posts: 332
Reputable Member
 

This thread was automatically closed 90 days after the last response. No further replies can be made.

 
Posted : 22/06/2025 8:26 pm