I highly recommend putting Anubis in front of your entire instance as a proxy. It's a little involved to set up, but it flat-out denies AI scrapers access. A robots.txt helps, but only so much, because some of these bots simply don't respect it. And honestly, given how Sam Altman talks about the people whose work he's scraped and stolen, I don't think anyone should be surprised.
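For what it's worth, the robots.txt approach looks something like the snippet below (GPTBot is OpenAI's crawler and CCBot is Common Crawl's). It's only a request, though; nothing forces a scraper to honor it:

```
# robots.txt: only stops bots polite enough to obey it
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /
```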
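Anubis, by contrast, sits in the request path itself. Here's a minimal sketch of the usual layout, assuming nginx is your reverse proxy, Anubis listens on 127.0.0.1:8923, and your actual instance on a local port behind it (the addresses and domain are placeholders; use whatever your setup binds to). Anubis is then configured to forward verified visitors on to the real app:

```nginx
server {
    listen 443 ssl;
    server_name example.com;  # placeholder domain

    location / {
        # every request hits Anubis first; it challenges the client
        # and only proxies real visitors through to the backend
        proxy_pass http://127.0.0.1:8923;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```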
But I have Anubis running on my personal website, and I've tested whether ChatGPT can see it. It can't. Good enough for me.