this post was submitted on 08 Oct 2025
1109 points (98.5% liked)
Funny
You can tell it to switch that off permanently with custom instructions. It makes the thing a whole lot easier to deal with. Of course, that would be bad for engagement, so they're not going to do it by default.
You can, but in my experience it is resistant to custom instructions.
I spent an evening messing around with ChatGPT once, and fairly early on I gave it special instructions via the options menu to stop being sycophantic, among other things. It ignored those instructions for the next dozen or so prompts, even though I followed up every response with a reminder. It finally came around after a few more prompts, by which point I was bored of it, and feeling a bit guilty over the acres of rainforest I had already burned down.
I don't discount user error on my part, particularly that I may have asked too much at once, as I wanted it to dramatically alter its output with so many customizations. But it's still a computer, and I don't think it was unreasonable to expect it to follow instructions the first time. Isn't that what computers are supposed to be known for, unfailingly following instructions?