
Maybe you used "Don't give me nonsense" in your custom system prompt?


An LLM should never refer to the user's "style" prompt like that. The style should function as the model's personality, not as something the user explicitly asked it to do or be.


The system prompt is for multi-client/agent applications, so if you want to fix something for everyone, that's the right place to put it.
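To illustrate the point about fixing behavior for everyone: a minimal sketch of prepending a shared system prompt to every conversation, using the common role/content chat-message convention. The prompt text and function name here are illustrative, not a tested recipe.

```python
# A shared system prompt applied to all clients/agents of an application.
# (Illustrative wording; any chat-style API that accepts role/content
# message dicts would consume this the same way.)
SHARED_SYSTEM_PROMPT = (
    'If you are not certain of a factual answer, say "I don\'t know" '
    "instead of guessing."
)

def build_messages(user_message: str) -> list[dict]:
    """Prepend the shared system prompt so every client gets the same fix."""
    return [
        {"role": "system", "content": SHARED_SYSTEM_PROMPT},
        {"role": "user", "content": user_message},
    ]

messages = build_messages("What year was the library founded?")
```

Because the system message is assembled in one place, changing `SHARED_SYSTEM_PROMPT` changes the behavior for every client at once.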


That does nothing. You can add, "say 'I don't know' if you are not certain or don't know the answer," and it will never say "I don't know."


That's because "certain" and "know the answer" have wildly different definitions depending on the person; you need to be more specific about what you actually mean by that. Anything that can be ambiguous will be treated ambiguously.

Anything you've mentioned in the past (like `no nonsense`) that still exists in context will have a higher probability of being generated than other tokens.
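As a concrete contrast for the "be more specific" advice: one way to replace the vague wording with conditions the model can actually check. Both strings are hypothetical examples, not tested prompts.

```python
# The vague instruction from above: "certain" is undefined, so the model
# interprets it however it likes.
VAGUE = "Say 'I don't know' if you are not certain or don't know the answer."

# A more specific rewrite that spells out what counts as uncertainty
# and what the fallback reply should be. (Illustrative wording only.)
SPECIFIC = (
    "Before answering a factual question, check whether you can recall the "
    "specific name, date, or figure being asked about. If you cannot, reply "
    'exactly: "I don\'t know." Never guess dates, numbers, or citations.'
)
```

The specific version costs more tokens, but it replaces a subjective judgment ("certain") with observable conditions and a fixed fallback response.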



