Integrating with Portkey via Chat Model/Language Model

souzagaabriel
(@souzagaabriel)
Posts: 1
New Member
Topic starter
 

The idea is:

Portkey is an AI gateway that offers improved control, observability, and cost optimization for AI models. Currently, Portkey can be used within n8n by overriding the Base URL in the OpenAI Chat Model node. However, this process isn't straightforward: it involves configuring a Virtual Key, creating a corresponding Config, and linking it to an API Key. Furthermore, n8n displays errors during setup because it attempts to validate the API Key against OpenAI's platform, which can be misleading.

The proposal is to integrate Portkey as a native option in n8n's Chat Model or Language Model nodes, making it simpler for users to leverage Portkey's capabilities without these workarounds.

My use case:

I use Portkey to manage AI model requests efficiently, but configuring it within n8n required extra steps and debugging because of how n8n handles OpenAI API validation. A native integration would streamline the process and make it more accessible to other users facing similar challenges.

I think it would be beneficial to add this because:

  • It removes the confusion caused by n8n's OpenAI validation during Portkey setup.
  • It makes it easier for users to switch to Portkey without needing workarounds.
  • Portkey provides valuable features like fallback models and cost optimization, which could benefit n8n users.

Any resources to support this?

Are you willing to work on this?

I lack the expertise to assist directly, but I believe Portkey itself is interested in pursuing this integration.

 
Posted : 11/02/2025 2:41 pm
dailen
(@dailen)
Posts: 1
New Member
 

Just wanted to bump this with a comment. This tool has the potential to be really useful for troubleshooting and safely interacting with various LLMs by simplifying common guardrails. I see it as an upgraded version of OpenRouter.

 
Posted : 19/03/2025 7:37 pm
Anthony_Gerke
(@anthony_gerke)
Posts: 1
New Member
 

It sounds like you were able to get it to work by overriding the OpenAI chat model. I am trying to do the same but struggling to get it to work. Can you share any details on how you did it?

 
Posted : 27/04/2025 4:34 pm
sid_portkey
(@sid_portkey)
Posts: 1
New Member
 

Hello everyone! 👋

I'm from the Portkey team.

We recently released our n8n integration with Portkey:

n8n - Portkey Docs

Quick setup: override the OpenAI node's base URL to https://api.portkey.ai/v1 and add your Portkey config in the headers. The validation errors are annoying but harmless; just ignore them.
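To make the override concrete, here's a minimal sketch of the request the reconfigured node ends up sending, built with Python's standard library. The Portkey header names (`x-portkey-api-key`, `x-portkey-config`) and the placeholder values are my assumptions from reading Portkey's docs, so double-check them against the link above:

```python
import urllib.request

# Build (but don't send) the request the OpenAI node issues once its base
# URL points at Portkey's gateway. The Authorization header carries the
# (dummy) OpenAI key that n8n insists on validating; the actual routing is
# driven by the assumed Portkey headers below.
req = urllib.request.Request(
    "https://api.portkey.ai/v1/chat/completions",
    headers={
        "Authorization": "Bearer dummy-openai-key",  # placeholder
        "x-portkey-api-key": "<PORTKEY_API_KEY>",    # assumed header name
        "x-portkey-config": "<PORTKEY_CONFIG_ID>",   # assumed header name
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.full_url)  # https://api.portkey.ai/v1/chat/completions
```

This is only a sketch of what the override amounts to; in n8n itself you set the base URL and headers in the node's credential settings rather than writing code.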

What you’ll get with Portkey:

  • Automatic fallbacks (never get stuck if one model is down)
  • Real cost tracking across all your AI calls
  • Built-in guardrails to catch hallucinations
  • Switch between 1600+ models without changing your workflows
  • Add governance to your n8n workflows
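As a concrete illustration of the fallback feature above, here's a sketch of a Portkey routing config that retries a failed request against a second provider. The field names follow Portkey's config format as I understand it, and the virtual-key values are placeholders, so verify the shape against Portkey's docs before using it:

```python
import json

# Sketch of a Portkey config enabling automatic fallback: if the first
# target fails, the gateway retries the request against the second.
# Virtual-key values are placeholders, not real credentials.
config = {
    "strategy": {"mode": "fallback"},
    "targets": [
        {"virtual_key": "openai-virtual-key"},
        {"virtual_key": "anthropic-virtual-key"},
    ],
}
print(json.dumps(config, indent=2))
```

A config like this is what the "corresponding Config" step in the original post refers to; once saved in Portkey, it is referenced from the request headers by its ID.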

The native integration will make all this even better with direct observability in n8n.

 
Posted : 27/05/2025 12:46 pm