How to add a vector store retriever module between "chat message received" and "AI agent"?

5 Posts
3 Users
0 Reactions
4 Views
tigflanker
(@tigflanker)
Posts: 2
New Member
Topic starter
 

Describe the problem/error/question

Hi, I'm looking for some assistance with a problem.
How can I insert a vector store retriever module between the “chat message received” trigger and the “AI agent” node?

When constructing an AI Agent flow, I need to call some MCP services. Before that, I need to perform some information augmentation, similar to RAG (specifically, adding the retrieved information to the context up front, rather than retrieving it based on the LLM's output).

Is there a method to integrate the Vector Store Retriever before the AI Agent? Alternatively, is there a way to prevent the LLM module from generating output? Thank you for your guidance.

Information on your n8n setup

  • n8n version: 1.93.0
  • Running n8n via (Docker, npm, n8n cloud, desktop app): Docker
 
Posted : 21/05/2025 7:25 am
Wouter_Nigrini
(@wouter_nigrini)
Posts: 31
Eminent Member
 

You can find the vector store tool by searching the nodes list.

After that, you can connect any of the available options as a regular node in the main flow, before your chatbot.
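For example, one way to wire it up is: Chat Trigger → Vector Store node in retrieve ("Get Many") mode → a Code node that merges the retrieved documents into the chat input → AI Agent. Below is a minimal sketch of such a Code node; the node name "When chat message received" and the field names (chatInput, document.pageContent) are assumptions based on a typical setup, so adjust them to match your actual nodes.

```js
// Code node sketch: merge retrieved documents into the chat input
// before it reaches the AI Agent.
// Assumes the previous node is a Vector Store node in "Get Many" mode
// and the chat trigger is named "When chat message received".
// Field names are assumptions; check your own node output and adjust.

// One input item per retrieved document.
const docs = $input.all().map(
  (item) => item.json.document?.pageContent ?? JSON.stringify(item.json)
);

// The original user message from the chat trigger.
const userMessage = $('When chat message received').first().json.chatInput;

// Prepend the retrieved context to the question the agent will see.
const context = docs.join('\n---\n');

return [
  {
    json: {
      chatInput:
        'Use the following context when answering.\n\n' +
        `Context:\n${context}\n\n` +
        `Question:\n${userMessage}`,
    },
  },
];
```

On the AI Agent node you would then point the prompt at this Code node's output (for example, set the prompt to an expression that reads the chatInput field from the previous node) instead of taking it from the chat trigger directly.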

 
Posted : 21/05/2025 7:34 am
tigflanker
(@tigflanker)
Posts: 2
New Member
Topic starter
 

Thanks for your help.

😁

 
Posted : 22/05/2025 1:55 am
Wouter_Nigrini
(@wouter_nigrini)
Posts: 31
Eminent Member
 

You can explore other available options, but you'll find the nodes mentioned above by clicking the Plus (add node) button → AI → Other AI Nodes → Vector Stores.

 
Posted : 22/05/2025 7:36 am
system
(@system)
Posts: 332
Reputable Member
 

This thread was automatically closed 7 days following the last response. New replies are no longer permitted.

 
Posted : 29/05/2025 7:36 am