feat: function calling support in a chat session's prompt
#101
Labels: help wanted, new feature, released on @beta, released, roadmap
Part of the roadmap for node-llama-cpp (https://github.com/orgs/withcatai/projects/1)
Make it possible to provide functions that the model can call as part of the response.
It should be as simple as something like this:
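As a rough illustration of what "as simple as something like that" could mean, here is a hypothetical sketch. The `functions` option shape and the commented-out `session.prompt()` signature are assumptions about how the feature could look, not the final API:

```typescript
// Hypothetical API sketch: the option name ("functions") and the
// session.prompt() signature are assumptions, not the implemented interface.
type ChatSessionFunction = {
    description: string,
    handler: (params: Record<string, any>) => any
};

const functions: Record<string, ChatSessionFunction> = {
    getDate: {
        description: "Retrieve the current date",
        handler() {
            return new Date().toISOString().split("T")[0];
        }
    }
};

// In the envisioned API, this map would be passed to the chat session, e.g.:
// const answer = await session.prompt("What is the date today?", {functions});
// For now, call the handler directly to illustrate the shape:
const today = functions.getDate.handler({});
console.log(today);
```

The idea is that the library would describe each function to the model, detect when the model asks to call one, run the handler, and feed the result back into the completion.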
If you have ideas for a text format I can use to prompt the model with, please share.
I'm looking for a format that can achieve all of these:
It would also be acceptable to give the model only a brief overview of the available functions and let it fetch more info about a function on demand.
I thought about implementing support for this format as part of `GeneralChatPromptWrapper`, but I'm not really sure whether this is the safest way to distinguish between text and function calling.

When the model writes text, it may go like this:
I'll then detect the function call in the model's response and evaluate this text:
So the model can then continue the completion:
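A minimal sketch of the detection step could look like this. The `[[call: name(args)]]` marker syntax here is an assumption used for illustration; it is not necessarily the syntax that will be settled on:

```typescript
// Sketch of detecting a function call inside the raw model output.
// The "[[call: name(args)]]" marker syntax is an illustrative assumption.
function extractFunctionCall(text: string): {name: string, rawArgs: string} | null {
    const match = text.match(/\[\[call:\s*([A-Za-z_]\w*)\((.*?)\)\]\]/);
    if (match == null)
        return null;

    return {name: match[1], rawArgs: match[2]};
}

const call = extractFunctionCall("Let me check. [[call: getDate()]]");
console.log(call);
```

Whatever syntax is chosen, it has to be unlikely to appear in ordinary model text, which is exactly the safety concern raised above.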
I plan to use grammar tricks to make sure the model can only call existing functions and with the right parameter types.
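One way such a grammar trick could work is generating a GBNF grammar (the grammar format llama.cpp understands) from the registered function names, so the model physically cannot produce a call to a function that doesn't exist. The rule layout below, and the `[[call: ...]]` syntax it encodes, are illustrative assumptions:

```typescript
// Sketch: build a GBNF grammar that only permits calls to known functions,
// so the model can't invent a function name. The rule layout is illustrative.
function buildFunctionCallGrammar(functionNames: string[]): string {
    const nameAlternatives = functionNames
        .map((name) => JSON.stringify(name))
        .join(" | ");

    return [
        'root ::= "[[call: " funcName "(" params ")]]"',
        `funcName ::= ${nameAlternatives}`,
        'params ::= [^)]*'
    ].join("\n");
}

const grammar = buildFunctionCallGrammar(["getDate", "getTime"]);
console.log(grammar);
```

The same approach could be extended per-function, generating a `params` rule from each function's parameter schema to enforce the right parameter types.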
How you can help
If you can come up with a function calling syntax that also fits the Llama chat (`LlamaChatPromptWrapper`) format, this would also be very helpful.

I'm currently working on a major change in this module, so if you'd like to help with implementing any of this, please let me know beforehand so your work won't become incompatible with the new changes.