What needs to be done to support function calling with models other than the Llama series? #1690
Unanswered · hpx502766238 asked this question in Q&A
I use langchain_openai to call the OpenAI-compatible API:
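A minimal sketch of the call (the base_url, api_key, and exact prompt are placeholders for my local setup):

```python
from langchain_core.tools import tool
from langchain_openai import ChatOpenAI


@tool
def check_weather(location: str) -> str:
    """Return the weather forecast for the specified location."""
    return f"Sunny at {location}"


# Point the client at the local OpenAI-compatible server
# (base_url and api_key here are placeholders).
llm = ChatOpenAI(
    model="../models/Qwen2-7B-Instruct",
    base_url="http://localhost:8000/v1",
    api_key="sk-no-key-required",
)

# Bind the tool and let the model decide when to call it.
llm_with_tools = llm.bind_tools([check_weather], tool_choice="auto")
result = llm_with_tools.invoke(
    "What is the weather at San Francisco International Airport?"
)
print(result)
```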
However, the result is:
content='' additional_kwargs={'function_call': {'arguments': '{"location": "San Francisco International Airport, CA, US”, "}', 'name': 'check_weather'}, 'tool_calls': [{'id': 'call__0_check_weather_cmpl-bf097969-d4e1-49c4-a23f-48e28b86a6e1', 'function': {'arguments': '{"location": "San Francisco International Airport, CA, US”, "}', 'name': 'check_weather'}, 'type': 'function'}], 'refusal': None} response_metadata={'token_usage': {'completion_tokens': 15, 'prompt_tokens': 28, 'total_tokens': 43}, 'model_name': '../models/Qwen2-7B-Instruct', 'system_fingerprint': None, 'finish_reason': 'tool_calls', 'logprobs': None} id='run-f339c75c-a5e9-4d8c-a7ec-915ede8f648c-0' tool_calls=[{'name': 'check_weather', 'args': {'location': 'San Francisco International Airport, CA, US”, '}, 'id': 'call__0_check_weather_cmpl-bf097969-d4e1-49c4-a23f-48e28b86a6e1', 'type': 'tool_call'}] usage_metadata={'input_tokens': 28, 'output_tokens': 15, 'total_tokens': 43}
content='Sorry, as an AI model, I cannot provide real-time information or data updates, including weather forecasts. Please visit a reliable weather website or use a weather app to get the latest conditions.' additional_kwargs={'refusal': None} response_metadata={'token_usage': {'completion_tokens': 34, 'prompt_tokens': 28, 'total_tokens': 62}, 'model_name': '../models/Qwen2-7B-Instruct', 'system_fingerprint': None, 'finish_reason': 'stop', 'logprobs': None} id='run-163d4985-31c0-44d7-9e25-d2d24056b621-0' usage_metadata={'input_tokens': 28, 'output_tokens': 34, 'total_tokens': 62}
#################################
It seems that the "auto" option of tool_choice is not supported, or is not compatible with my model. What should I do?
https://llama-cpp-python.readthedocs.io/en/latest/server/#function-calling
As the official documentation says, I should use a functionary model, which is fine-tuned from Llama 3; but I still need to use models other than Llama 3 (such as Qwen2-7B-Instruct).
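For reference, the function-calling setup from the linked doc looks roughly like this (a minimal sketch; the GGUF path and tokenizer repo follow the doc's functionary example, not my own files):

```python
from llama_cpp import Llama
from llama_cpp.llama_tokenizer import LlamaHFTokenizer

# Function calling in llama-cpp-python is wired to functionary-style models.
llm = Llama(
    model_path="../models/functionary-small-v2.4.Q4_0.gguf",  # hypothetical local path
    chat_format="functionary-v2",
    # functionary needs its Hugging Face tokenizer for correct prompt formatting
    tokenizer=LlamaHFTokenizer.from_pretrained("meetkai/functionary-small-v2.4"),
)
```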
I guess I should fine-tune my model myself, but I don't know which dataset to use. Or is there another way to solve the problem?
Replies: 1 comment
-
In vLLM 0.6.0+, automatic function calling is supported: https://docs.vllm.ai/en/latest/serving/openai_compatible_server.html#automatic-function-calling. It seems to support function calling across different models through a unified set of tool parsers.
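A minimal sketch of how that looks from the client side, assuming the vLLM server was started with tool parsing enabled (e.g. with the --enable-auto-tool-choice and --tool-call-parser flags described in the linked docs); the endpoint URL, API key, and tool schema below are illustrative:

```python
from openai import OpenAI

# Talk to the vLLM OpenAI-compatible endpoint (URL and key are placeholders).
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

# Standard OpenAI-style tool schema; vLLM's tool parser maps the model's
# output back onto this definition.
tools = [
    {
        "type": "function",
        "function": {
            "name": "check_weather",
            "description": "Get the weather forecast for a location.",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "City or airport name",
                    },
                },
                "required": ["location"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="Qwen/Qwen2-7B-Instruct",
    messages=[{"role": "user", "content": "What's the weather at SFO?"}],
    tools=tools,
    tool_choice="auto",  # the server-side parser emits structured tool_calls
)
print(response.choices[0].message.tool_calls)
```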