how to update the prompt template in llama.cpp
I'm a newbie with llama.cpp.
This was made to be used with ik_llama.cpp, not llama.cpp.
I did try changing the template with gguf_new_metadata.py, but then running the CLI with -cnv output gibberish instead of just not stopping.
Either way, it works via llama-server with the right template set in your front end.
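For anyone who wants to try the metadata route anyway, the invocation looked roughly like this (the script ships with the llama.cpp repo under gguf-py; the exact path, the flag set, and the template file name here may vary between versions, so check your checkout):

# write a copy of the GGUF with a replaced chat template (original file is untouched)
python3 gguf_new_metadata.py input.gguf output.gguf \
  --chat-template "$(cat my_template.jinja)"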
Yeah, I can confirm it works (really) well on ik's fork (28 t/s generation, 87 t/s prompt processing on my SBC).
Can one use a GGUF editor for this?
Thanks for the confirmation.
You can try. Like I said, any attempt I made at changing the prompt template made it behave worse in the CLI (I only use the CLI to test this; I use llama-server for actual use), but it works fine with the right template in llama-server.
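If you want to reproduce the CLI test, it was just conversation mode on the model, something like this (binary name and model path are whatever your build produced):

# conversation mode; this is where the re-templated GGUF produced gibberish for me
./llama-cli -m ./model.gguf -cnv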
How do I change the chat template? Can this be done in the browser when I start llama-server, under Prompt Template?
great work btw
That should work. I haven't used the bundled front end; I usually use a third-party one.
Using the built-in llama-server web UI and pasting this into the Prompt Template field, it works:
<|begin_of_text|>{{prompt}}<|eot_id|>
{{history}}
{{char}}:
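If your front end has no template field, the same prompt can be sent to the server's /completion endpoint directly. A rough sketch, where the address, the filled-in system prompt, and the single history line are placeholder stand-ins for whatever your front end would substitute:

# prompt follows the template above: {{prompt}}, then {{history}}, then "{{char}}:"
curl --request POST --url http://localhost:8080/completion \
  --header "Content-Type: application/json" \
  --data '{
    "prompt": "<|begin_of_text|>You are a helpful assistant.<|eot_id|>\nUser: Hello!\nAssistant:",
    "stop": ["<|eot_id|>"]
  }'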
I updated the model card with the template from the paper:
<|begin_of_text|>System: {system_message}<|eot_id|>
User: {user_message_1}<|eot_id|>
Assistant: {assistant_message_1}<|eot_id|>
User: {user_message_2}<|eot_id|>
Assistant: {assistant_message_2}<|eot_id|>
Not sure if your prompt template field matches that, as I use alternative tools, but building a template based on the above has worked for me.
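To make that concrete, here is the paper's template filled in for a two-turn exchange (the messages are made-up placeholders; generation should be stopped at <|eot_id|>):

<|begin_of_text|>System: You are a helpful assistant.<|eot_id|>
User: Hello, who are you?<|eot_id|>
Assistant: I'm an AI assistant. How can I help?<|eot_id|>
User: Tell me a joke.<|eot_id|>
Assistant: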