Custom tool for embedding and models

hakanai
(@hakanai)
Posts: 1
New Member
Topic starter
 

The concept is:

Simply allow the code tool to connect to the embed and model inputs of AI nodes.

My use case:

This would unlock significantly more functionality. We could invoke any API endpoint, allowing the use of services like together.ai, or of local models beyond Ollama. It would also enable result post-processing, such as reranking.

I believe adding this would be beneficial because:

It would essentially cover all requests for utilizing different services and endpoints. This would allow the development team to concentrate on creating nodes solely for the most popular services.

Any resources to support this?

Are you willing to work on this?

As far as I can tell, all that's needed is to permit the existing code tool to connect to the AI node ports, and provide some documentation detailing the required data format.
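To illustrate what that "required data format" might look like: the actual port contract isn't documented, so the endpoint URL, payload keys, and response shape below are all assumptions modeled on OpenAI-compatible embedding APIs, not callin.io's real interface. A code-tool shim for the embed input could then be as small as:

```python
import json
import urllib.request


def parse_embedding_response(body):
    """Pull the vectors out of an OpenAI-compatible response:
    {"data": [{"embedding": [...]}, ...]}  ->  [[...], ...]."""
    return [item["embedding"] for item in body["data"]]


def embed_texts(texts, endpoint, api_key, model):
    """POST a list of texts to an arbitrary embedding endpoint
    (together.ai, a local server, etc.) and return one float
    vector per input text. Endpoint/payload are hypothetical."""
    payload = json.dumps({"model": model, "input": texts}).encode()
    req = urllib.request.Request(
        endpoint,
        data=payload,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return parse_embedding_response(json.load(resp))
```

The embed port would then only need to promise "texts in, vectors out"; the model input could be wrapped the same way with a chat/completion payload.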

 
Posted : 14/12/2024 1:30 am
(@lechusai)
Posts: 10
Member Admin
 

Thanks for sharing this idea.
At the moment, callin.io does not support connecting the code tool directly to AI node ports (embed/model inputs). The current path for invoking external services is via Custom Actions or Make/Zapier integrations.

Your suggestion makes a lot of sense:

  • It would allow calling any endpoint (together.ai, local models) and post-processing results (e.g. reranking).

  • It would cover most integration requests without needing a dedicated node for every service.

We’ll log this as a potential future enhancement. In the meantime, if you need to connect a different service, you can set it up through a Custom Action that formats input/output for your model.
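As a minimal sketch of the result processing mentioned above (reranking), assuming you already have a query vector and per-document vectors back from whichever endpoint the Custom Action calls, the rerank step itself needs no external service at all:

```python
def rerank(query_vec, docs):
    """docs: list of (text, vector) pairs.
    Returns the texts sorted by cosine similarity to
    query_vec, most similar first."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(y * y for y in b) ** 0.5
        return dot / (na * nb) if na and nb else 0.0

    ranked = sorted(docs, key=lambda d: cosine(query_vec, d[1]),
                    reverse=True)
    return [text for text, _ in ranked]
```

A dedicated reranker endpoint (e.g. a cross-encoder API) would replace the cosine step, but the Custom Action's job stays the same: format the inputs, call the service, reorder the results.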

 
Posted : 19/09/2025 8:16 am