It’s essentially a placeholder for upcoming features. The concept is to allow you to supply a list of “tools” for DeepSeek to utilize when responding to your prompt.
In theory, these tools could involve API calls to services like callin.io scenarios (via Custom Webhooks) to execute actions such as retrieving data from your CRM, verifying calendar availability, or dispatching emails.
However, currently, DeepSeek only permits you to specify calls to Python functions. This is practical if you are hosting the DeepSeek model on your own virtual machine and interacting with it via Python, but it is of little use when you are calling the publicly hosted DeepSeek service.
Once they expand this to include API calls, some interesting use-cases will emerge.
Thanks for your response! That would certainly enable many exciting possibilities!
This was a point of concern for me, so I investigated further. It turns out my initial assumption was incorrect.
DeepSeek's API, similar to many LLM services, essentially replicates the OpenAI API.
OpenAI's developer documentation offers more comprehensive information on function calling.
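Because the API mirrors OpenAI's, the request body has the same shape. Below is a minimal sketch of a chat-completion payload with a `tools` array; the `deepseek-chat` model name and the `get_weather` function schema are illustrative assumptions, not taken from this thread:

```python
import json

# OpenAI-style "tools" payload, which DeepSeek's chat completions
# endpoint is documented to accept as well.
payload = {
    "model": "deepseek-chat",  # assumed model name
    "messages": [
        {"role": "user", "content": "What's the weather in Berlin?"}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical function
                "description": "Look up the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

# In a real call this would be POSTed (with an Authorization header)
# to the chat completions endpoint; here we only serialize it.
body = json.dumps(payload)
```

The same JSON would work from a Custom Webhook or HTTP module, since nothing here is Python-specific — it is just the request body.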
Essentially, the model signals its intent to invoke a function within the completion message object. Handling that request is entirely your responsibility; you could implement a callin.io Router with a distinct path for each function. You then return the outcome of the function execution, along with its parameters, in a message back to the model.
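That round trip can be sketched in plain Python. The `assistant_message` below is a hand-written stand-in for what the model would return (following the OpenAI-style `tool_calls` schema), and `get_weather` is a hypothetical local function standing in for whatever each Router path would do:

```python
import json

# Hypothetical local implementation of the function the model may call.
def get_weather(city: str) -> str:
    return f"Sunny in {city}"  # stub result

# One entry per function -- the same job a Router with one path
# per function would do.
DISPATCH = {"get_weather": get_weather}

# Stand-in for the assistant message the API returns when the model
# decides to call a tool.
assistant_message = {
    "role": "assistant",
    "content": None,
    "tool_calls": [
        {
            "id": "call_0",
            "type": "function",
            "function": {
                "name": "get_weather",
                "arguments": json.dumps({"city": "Berlin"}),
            },
        }
    ],
}

# Route each tool call to the matching local function, then package
# the result as a "tool" role message tagged with the tool_call_id,
# ready to be appended to the conversation and sent back to the model.
tool_messages = []
for call in assistant_message["tool_calls"]:
    fn = DISPATCH[call["function"]["name"]]
    args = json.loads(call["function"]["arguments"])
    result = fn(**args)
    tool_messages.append(
        {"role": "tool", "tool_call_id": call["id"], "content": result}
    )

print(tool_messages[0]["content"])  # Sunny in Berlin
```

The key detail is the `tool_call_id`: the model can request several calls in one turn, and each result message must reference the call it answers.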
OK, that makes sense. Thanks for looking into this; it's super helpful. Again, thanks a lot!