The concept is:
Making an HTTP request to a custom LLM and using its responses as the model behind an AI agent.
My use case:
I have been using a third-party LLM that is not currently listed in the LLM model options and cannot be run locally (meaning the AI starter kit cannot be used). I access this LLM via an HTTP request, so it would be very helpful if the AI agent could accept an HTTP request to such an LLM as its model input. A minimal sketch of this kind of call is shown below.
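For context, this is roughly the kind of HTTP call I mean, assuming an OpenAI-compatible chat-completions endpoint (Perplexity AI exposes one); the exact URL, model name, and environment variable here are illustrative only, not a fixed implementation:

import os
import requests

# Hypothetical example: calling a third-party LLM over HTTP.
# Endpoint, model name, and env var are assumptions for illustration.
API_URL = "https://api.perplexity.ai/chat/completions"
API_KEY = os.environ["LLM_API_KEY"]

response = requests.post(
    API_URL,
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
    },
    json={
        "model": "sonar",  # placeholder model name
        "messages": [
            {"role": "system", "content": "You are a helpful assistant."},
            {"role": "user", "content": "Summarize today's AI news."},
        ],
    },
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])

The idea is that the AI agent could issue a request like this under the hood, with the endpoint and payload configurable by the user.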
I believe adding this would be beneficial because:
It significantly expands the range of LLMs that can be tested and used with AI agents.
Any resources to support this?
For example, Perplexity AI and many other providers that expose their models over an HTTP API.
Are you willing to work on this?
Yes
Posted: 23/08/2024 3:52 pm