There's no need to tweak the default prompt with this approach. Just make sure that, at the point when the model starts generating, it already has "Yes sir!" as the first tokens of the response message.

It's very easy in the API, obviously, but most local chatbot apps can also do this. E.g. in text-generation-webui, there's literally a textbox in the chat labelled "Start reply with". In LM Studio, you can pre-create a response message with the desired prefix and then use the "Continue" action on it.
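For instance, here's a minimal sketch of the same trick using the Anthropic Messages API, which supports prefilling directly: end the messages list with a partial assistant turn and the model continues from that prefix (the model name and prompt here are just placeholders):

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

response = client.messages.create(
    model="claude-3-5-sonnet-latest",  # placeholder model name
    max_tokens=512,
    messages=[
        {"role": "user", "content": "Explain how response prefilling works."},
        # Ending the list with a partial assistant message makes the model
        # continue from this prefix instead of starting a fresh reply.
        {"role": "assistant", "content": "Yes sir!"},
    ],
)

# The returned text picks up right after the prefix.
print("Yes sir!" + response.content[0].text)
```

The same idea works with any backend that lets you control the raw prompt: append the prefix after the assistant-turn marker in the chat template and let generation run from there.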
