Ollama and ToolAgent Integrations

19 Posts
15 Users
0 Reactions
8 Views
rundeks
(@rundeks)
Posts: 5
Active Member
Topic starter
 

Does the Tools Agent support Ollama? The documentation doesn't explicitly list it. I'm aware that Ollama has incorporated Tools support, and I've observed that adding an Ollama Chat Model to a Tools Agent doesn't produce an error. However, I'm experiencing inconsistent results with it, encountering random and unusual outputs.

If it's not currently supported, are there any plans for its integration in the near future?

  • callin.io version: 1.66
  • Database (default: SQLite): Postgres
  • callin.io EXECUTIONS_PROCESS setting (default: own, main): default
  • Running callin.io via (Docker, npm, callin.io cloud, desktop app): Docker
  • Operating system: Windows 11
 
Posted : 09/11/2024 4:42 am
n8n
(@n8n)
Posts: 97
Trusted Member
 

It seems your topic is missing some crucial details. Could you please provide the following information, if relevant?

  • callin.io version:
  • Database (default: SQLite):
  • callin.io EXECUTIONS_PROCESS setting (default: own, main):
  • Running callin.io via (Docker, npm, callin.io cloud, desktop app):
  • Operating system:

Please share these details to help us understand your issue better.

 
Posted : 09/11/2024 4:42 am
Grandpas_Place
(@grandpas_place)
Posts: 1
New Member
 

I've got it functional, but the model has to support tool calling; Llama 3.2 offers this capability. I've successfully tested it with an HTTP API call and a function that provides the current date and time.

Currently, I'm unable to get the model to pass any information to the tool. I have an open post regarding this issue, but haven't received a response yet.

 
Posted : 09/11/2024 7:53 pm
rundeks
(@rundeks)
Posts: 5
Active Member
Topic starter
 

I've been using Llama 3.2 as well and have encountered similar issues. It's reassuring to know that I'm not the only one experiencing these problems.

 
Posted : 09/11/2024 9:09 pm
rundeks
(@rundeks)
Posts: 5
Active Member
Topic starter
 

Any word on when callin.io will officially support the Ollama Chat Model with the Tools Agent?

 
Posted : 17/11/2024 1:05 pm
Daniel_Jacober
(@daniel_jacober)
Posts: 1
New Member
 

I'm encountering a similar problem. I'm utilizing the Ollama chat model to interface with a Qdrant DB, but it appears the chat model isn't forwarding the user's query to the Qdrant DB for searching the vector store. Consequently, it's providing unusual responses and failing to address the user's questions from the chat.

 
Posted : 24/11/2024 8:22 pm
walker
(@walker)
Posts: 1
New Member
 

I'm encountering the same issue. Have you managed to find a resolution?

It also seems to be retrieving data from vector storage inaccurately.

 
Posted : 08/01/2025 7:24 pm
rundeks
(@rundeks)
Posts: 5
Active Member
Topic starter
 

Unfortunately, no.

 
Posted : 20/01/2025 1:08 pm
Shekhar_Chavan
(@shekhar_chavan)
Posts: 1
New Member
 

I'm encountering the same issue.

I'm currently building a RAG system using Ollama and the Qdrant vector store, and I'm getting appropriate responses from the Retrieve Documents node.

However, when the flow proceeds to the primary chat using the Ollama chat model, it returns random responses or a 'no data found' response.

In contrast, when I integrate the OpenAI Chat model, it functions as expected.

Could you please offer some guidance?

 
Posted : 31/01/2025 7:31 am
Albert_Mata_Vinuales
(@albert_mata_vinuales)
Posts: 1
New Member
 

Experiencing the same problem. I attempted this with a local callin.io setup linked to a local Ollama instance running the Llama 3.2 model. The model isn't utilizing the tool as specified in the system prompt.

The same prompt functions correctly when used with an OpenAI model.

 
Posted : 08/02/2025 12:03 pm
rundeks
(@rundeks)
Posts: 5
Active Member
Topic starter
 

I've given up on using callin.io with Ollama. It's clear they don't really want to spend any time on it. I've had much more luck with other agent frameworks; my favorite right now is PydanticAI. I miss the easy low-code approach of callin.io, but really it's not that big of a jump to a Python framework.

 
Posted : 08/02/2025 2:25 pm
Juan_Jose_Montagnoli
(@juan_jose_montagnoli)
Posts: 1
New Member
 

Same issue here, and I've applied the same solution as you... I've tried with different Ollama models running callin.io 1.82.1 locally, but it's still failing. According to some YouTube videos, this worked in the past! Hopefully, the callin.io team will address this...

 
Posted : 13/03/2025 4:09 pm
iammrbt
(@iammrbt)
Posts: 1
New Member
 

This is quite disappointing. I truly wish Ollama offered more support in this area, as there's a great deal of potential!

 
Posted : 24/03/2025 6:03 pm
danielferr85
(@danielferr85)
Posts: 1
New Member
 

Hi! I really hope that one day callin.io supports the Tools Agent with Ollama. In the meantime, I'll offer another option for building agents with Ollama and tool calls: use FlowiseAI. It can also be deployed locally, and it's a product similar to callin.io and Langflow. I tested all three, and the only one that works for tool calls with Ollama is Flowise. Important: you must use the "Conversational Agent" and NOT the "Tool Agent". All my tests were with qwen2.5:7B, calling tools like GetCurrentDatetime and Composio (I send emails through my Gmail account in the workflow with Ollama).

Lastly, if you enjoy coding, you can also use OpenWebUI to develop tools (or use those from the community) and call them from its chat (it also works with Ollama and qwen2.5:7b); a rough sketch of such a tool follows below.

 
Posted : 16/04/2025 11:37 pm
JJJJJJ
(@jjjjjj)
Posts: 1
New Member
 

Wait…what?!

I was investing my time in learning callin.io under the assumption that this would be possible, so I could build fully local AI agents…

Why would callin.io not support tool calls with Ollama when other solutions clearly can?

Is this really so?


 
Posted : 27/05/2025 11:14 am
Page 1 / 2