My most commonly repeated prompt; would be nice if they baked it into the tool itself:
"No emojis. be concise. no suggestions unless I explicitly ask for them. answer questions like the machine you are. Don't try and add personality or humour; remember you're a robot."
It really does make you wonder why all the models seem to require that. In principle, it shouldn't be a property of LLMs, and lol no it's not an "emergent property".
Post-training and "human preference" according to "data". I don't know a single developer who uses these tools for work and prefers that, though. But I also don't know anyone who uses LLMs a lot just "for fun"; it might just be vastly different preferences between the two userbases.
An LLM is a text prediction engine. Starting the prompt with “you are a helpful assistant” helps make the subsequent text prediction more in line with that of a helpful assistant.
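A minimal sketch of the point above: a "system" preamble isn't a separate control channel, it just becomes the first tokens of the text the model continues. The `build_prompt` helper and the `<|system|>`-style delimiters here are illustrative assumptions, not any real library's API or chat template.

```python
def build_prompt(system: str, user: str) -> str:
    """Flatten a system instruction and a user message into one text
    sequence, roughly the way chat templates do before the model sees it.
    The delimiter tokens are made up for illustration."""
    return f"<|system|>\n{system}\n<|user|>\n{user}\n<|assistant|>\n"

prompt = build_prompt(
    "You are a helpful assistant. No emojis. Be concise.",
    "What does HTTP status 418 mean?",
)
# The model simply predicts the next tokens of `prompt`; the preamble
# shifts the distribution toward "helpful assistant"-style continuations.
```

Which is exactly why the phrasing of the preamble matters so much: it is the context every later token is conditioned on.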
lol, I once added a similar “you’re a machine so just do as you’re told” line to a prompt and it answered back: “I’m not a machine, I’m Claude, a helpful assistant” and refused to do what I asked, claiming I didn’t have the authority to make the decision I’d asked it to convey in writing.