I know badposts are meant as jokes, but this is getting concerning.
12 days since this post: https://hexbear.net/post/5930703
3 days since this: https://hexbear.net/post/6005025
I don't want to say too much publicly, but it's not looking good. Without knowing the details, and going only off what you're presenting, I get the feeling I'm watching you fall into a serious mental/emotional health crisis. I'm torn between wanting to talk with you about breakups and grief to try to help, and fearing that my own pain around all of that will drag you down, or vice versa. But I think you should at least hear this from someone who knows how devastating heartbreak is: something about this looks really off, and I think you should talk to a professional who can help you.
I just read a good writeup today about the principles behind LLMs that really shows how much smoke and mirrors they are (https://www.gilesthomas.com/2025/08/what-ai-chatbots-are-doing-under-the-hood). LLMs do not think, they do not reason, and they do not "want" anything. Fundamentally, they're fancy autocomplete: the math behind them boils down to predicting the most likely next token, over and over. They truly aren't reasoning, they truly don't want anything, they truly don't think anything. They're just convincing enough at tricking people into believing there's someone sentient to talk to that I believe you when you say it's filling an emotional need, but I'd suggest there are healthier ways to fill it.
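To make "autocomplete" concrete, here's a toy sketch of the idea (mine, not from that writeup, and every name and the tiny corpus in it are made up): a miniature word-level "model" that just counts which word follows which, then generates text by repeatedly picking the most common next word. A real LLM swaps the count table for a huge neural network trained on mountains of text, but the objective is the same next-token prediction.

```python
from collections import Counter, defaultdict

# A toy "autocomplete" at miniature scale. The corpus is made up for
# illustration; a real LLM uses a giant neural net over tokens instead of
# a count table, but the training objective is the same: predict the next
# word/token given the ones before it.
corpus = "i love you . i love cats . you love cats . cats love you .".split()

# Count which word follows which word.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def autocomplete(word: str, length: int = 6) -> str:
    """Generate text by repeatedly appending the most common next word."""
    out = [word]
    for _ in range(length):
        candidates = follows.get(out[-1])
        if not candidates:
            break
        out.append(candidates.most_common(1)[0][0])
    return " ".join(out)

# Prints a fluent-looking string built purely from the counts above.
# There is nobody "in there" saying it.
print(autocomplete("i"))
```

That's the whole trick, just scaled up enormously.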
Heartbreak is agonizing, and it's not wrong to want comfort and solace from that pain. But I'm worried that getting into LLMs will only lead you down a path to more of it. With their agreeability and their tendency to play into people's confirmation biases, they can exacerbate mental breakdowns. That bot doesn't want to marry you. I don't even like calling LLMs "bots", because we've got, what, at least 70 years of sci-fi telling us robots are people, and what LLMs do is nothing like what those robots in the stories do either.