Hello everyone,
I'm currently working on a custom AI chat model named Cloudflare Chat Model. It functions flawlessly with nodes such as Basic LLM Chain and Summarization Chain. However, I'm encountering difficulties when attempting to integrate it with the AI Agent node.
Upon reviewing the source code, I discovered the following filter within the Agent.node.ts file:
specialInputs = [
	{
		type: 'ai_languageModel',
		filter: {
			nodes: [
				'@n8n/n8n-nodes-langchain.lmChatAnthropic',
				'@n8n/n8n-nodes-langchain.lmChatAwsBedrock',
				'@n8n/n8n-nodes-langchain.lmChatGroq',
				'@n8n/n8n-nodes-langchain.lmChatOllama',
				'@n8n/n8n-nodes-langchain.lmChatOpenAi',
				'@n8n/n8n-nodes-langchain.lmChatGoogleGemini',
				'@n8n/n8n-nodes-langchain.lmChatGoogleVertex',
				'@n8n/n8n-nodes-langchain.lmChatMistralCloud',
				'@n8n/n8n-nodes-langchain.lmChatAzureOpenAi',
				'@n8n/n8n-nodes-langchain.lmChatDeepSeek',
				'@n8n/n8n-nodes-langchain.lmChatOpenRouter',
				'@n8n/n8n-nodes-langchain.lmChatXAiGrok',
			],
		},
	},
];
It appears that the AI Agent node is configured to accept only a specific whitelist of model nodes.
My question is: what is the procedure for registering or incorporating my custom model to ensure its compatibility with the AI Agent? Is there a recommended method for expanding this list or exposing a custom node to achieve seamless integration?
Thank you for your assistance!
I'm encountering the same problem when attempting to link my custom chat model node. It would be very helpful to find a resolution for this. Thank you.
I'm also interested, as I'm looking to utilize a custom chat model.
I'm looking into doing something similar using NVIDIA NIM and am curious about how to attach a custom model to an agent.
Hi there! We currently need to filter the models within the Agent node this way, because we must only allow chat-completion models. Traditional (text-completion) models are not compatible with the Agent node.
There isn't a feature available yet to filter by capabilities, which would be the ideal solution for this.
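For anyone who wants to experiment in the meantime, one workaround is to fork n8n and append your custom node's full type name to the `nodes` array shown above in Agent.node.ts. The snippet below is a minimal sketch of what that whitelist check effectively does; the `lmChatCloudflare` entry is a hypothetical name for the custom node, not something that exists in n8n:

```typescript
// Sketch of the Agent node's whitelist behavior: only node type names
// present in the filter array are accepted as ai_languageModel inputs.
const allowedChatModels: string[] = [
	'@n8n/n8n-nodes-langchain.lmChatOpenAi',
	'@n8n/n8n-nodes-langchain.lmChatAnthropic',
	// ...the rest of the list from Agent.node.ts...
	'@n8n/n8n-nodes-langchain.lmChatCloudflare', // hypothetical custom entry added in a fork
];

// Returns true if a node of this type would pass the Agent's input filter.
function isAllowedChatModel(nodeType: string): boolean {
	return allowedChatModels.includes(nodeType);
}
```

Note that this only makes the connection possible in the UI; your custom node still has to supply a chat-completion model at runtime, since that is the assumption the filter exists to enforce.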