Can AI agents be trained using personal documents?

5 Posts
3 Users
0 Reactions
6 Views
Kooka
(@kooka)
Posts: 2
New Member
Topic starter
 

Describe the problem/error/question

I am creating a workflow where the user provides a topic, and the desired output is an article. To ensure the article closely matches my personal style, tone, and voice, I want to provide the workflow with articles I have previously written.

Is this achievable? What nodes would be necessary for this?

Information on your callin.io setup

  • callin.io version: Unknown
  • Database (default: SQLite): Default
  • callin.io EXECUTIONS_PROCESS setting (default: own, main): Default
  • Running callin.io via (Docker, npm, callin.io cloud, desktop app): npm
  • Operating system: Windows 11

PS: For the LLM, I am using Ollama and Llama3.2 locally.

 
Posted : 10/01/2025 8:23 am
liam
(@liam)
Posts: 3
New Member
 

Welcome to the community.

This is achievable. The straightforward method involves crafting a highly detailed system prompt that outlines your writing style and provides examples of specific characteristics.

The more involved approach uses a vector store such as Qdrant, where you generate embeddings from your previous writing and retrieve the most relevant passages at generation time. That path has a steep learning curve and requires real experimentation for a use case like this, so I suggest starting quickly with a strong system prompt and then taking the time to learn Qdrant.
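A minimal sketch of the "strong system prompt" route: embed a few of your own articles as few-shot style samples, with a rough character budget so the prompt stays inside the model's context window. The article texts, the instruction wording, and the budget here are illustrative placeholders, not anything from this thread.

```python
def build_style_prompt(articles, max_chars=8000):
    """Assemble a system prompt that shows the model writing samples.

    articles: list of (title, body) tuples of your previous writing.
    max_chars: rough cap so the prompt stays within the model's context.
    """
    parts = [
        "You are a ghostwriter. Match the tone, vocabulary, and sentence "
        "rhythm of the writing samples below when drafting new articles."
    ]
    used = len(parts[0])
    for title, body in articles:
        sample = f"\n--- Sample: {title} ---\n{body.strip()}"
        if used + len(sample) > max_chars:
            break  # stop adding samples before overflowing the budget
        parts.append(sample)
        used += len(sample)
    return "\n".join(parts)

prompt = build_style_prompt([
    ("On Coffee", "Short sentences. Dry humour. No filler."),
    ("On Tea", "Again: short. Punchy. Concrete examples."),
])
print(prompt)
```

The resulting string goes straight into the system-prompt field of the workflow's LLM node; raising `max_chars` trades context space for more style evidence.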

 
Posted : 10/01/2025 3:37 pm
Kooka
(@kooka)
Posts: 2
New Member
Topic starter
 

Is there a limit to the system prompt?

I was considering pasting an entire article into it and am unsure if that's the intended use, especially when also including instructions, persona, and other details.

I could save my writing in a PDF or Airtable, and then have an agent review/read the content. So, perhaps not as complex as Qdrant, but still somewhat advanced?

Out of curiosity, if I include HTTP links in the system prompt, will that have any effect?

 
Posted : 10/01/2025 6:22 pm
liam
(@liam)
Posts: 3
New Member
 

You can achieve something similar. Take a look at Max’s Notion chat assistant; you could build something comparable and instruct the agent to use that content as writing-style context rather than as reference material.

You don't want to make the system prompt excessively long, as you incur costs based on tokens. A longer system prompt will increase the API call expenses.
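A quick back-of-envelope for the token point above. The four-characters-per-token figure is a common heuristic for English text, not an exact tokenizer count, and a local Ollama model is not billed per token, but context space is still finite either way.

```python
def estimate_tokens(text, chars_per_token=4):
    """Rough token estimate using the ~4 chars/token English heuristic."""
    return len(text) // chars_per_token

system_prompt = "x" * 12000            # e.g. a ~12,000-character prompt
print(estimate_tokens(system_prompt))  # roughly 3000 tokens of context used
```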

If you're uncertain about how to describe your style, consider asking an LLM to assist you in writing the system prompt, providing it with relevant articles for context.
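One way to act on that suggestion with the setup from the original post: ask the local model to write the style description for you. The sketch below builds a request payload for Ollama's `/api/chat` endpoint (the model name comes from the thread; the meta-prompt wording is my own illustration). The actual HTTP call is left commented out so the snippet runs without a live Ollama server.

```python
import json

def style_guide_request(articles):
    """Build an Ollama /api/chat payload asking the model to distill a
    system prompt from your own articles."""
    meta_prompt = (
        "Read the articles below and write a concise system prompt that "
        "would make an LLM imitate this author's style, tone, and voice.\n\n"
        + "\n\n".join(articles)
    )
    return {
        "model": "llama3.2",
        "messages": [{"role": "user", "content": meta_prompt}],
        "stream": False,
    }

payload = style_guide_request(["Article one text...", "Article two text..."])
print(json.dumps(payload, indent=2))

# To send it for real (requires Ollama running locally):
# import requests
# reply = requests.post("http://localhost:11434/api/chat", json=payload).json()
# print(reply["message"]["content"])
```

Whatever the model returns can then be pasted (and hand-edited) into the agent's system-prompt field.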

 
Posted : 10/01/2025 6:45 pm
system
(@system)
Posts: 332
Reputable Member
 

This discussion was automatically closed 7 days following the last response. New replies are no longer permitted.

 
Posted : 28/02/2025 1:16 pm