this post was submitted on 15 Nov 2025
32 points (79.6% liked)

Comradeship // Freechat

2628 readers

Talk about whatever, respecting the rules established by Lemmygrad. Failing to comply with the rules will grant you a few warnings, insisting on breaking them will grant you a beautiful shiny banwall.

A community for comrades to chat and talk about whatever doesn't fit other communities

founded 4 years ago
 

"It's becoming clear that a huge part of what generative AI offers is just helping people avoid feeling uncomfortable feelings.

Don't want to push through the cognitive discomfort of writing your own essay? Let AI write it.

Want a friend who will always validate your ideas and never tell you you're fulla shit? We've got the perfect companion for you.

Don't want to risk being rejected when you ask a girl out? Date this chatbot who will never tell you no.

Don't want to feel the grief of losing a loved one? Here's an app that will create a chatbot replacement for them so you can pretend they never left.

Don't want to go through all the mental and emotional labor of learning a new skill, building a healthy romantic partnership, or creating a work of art? GenAI has got you covered.

It's the next level of services designed to help the denizens of dystopia avoid their feelings and sedate their emotions into a coma while the world goes to shit. It's the same reason they kept alcohol legal while banning psychedelics that put us in touch with our feelings, and why they feed us all the TV, streaming platforms, and social media scrolling we can stand.

Our rulers want us dumb, distracted, vapid and dissociated. And they definitely don't want us feeling the horror, grief and rage we should all be experiencing in response to this nightmare of a civilization they have designed for us."

This applies to any drug or distraction (including entertainment and consumption fetishism) that is meant to numb your feelings, pacify and tranquilize you, keep you passive and lethargic in the face of the injustice and inhumanity of the capitalist-imperialist system. Feel your feelings, get sad and then get angry, let those feelings drive you, let them motivate you to do something, take action, fight, and never surrender to powerlessness, hopelessness and tacit acceptance.

all 31 comments
[–] SunsetFruitbat@lemmygrad.ml 29 points 1 week ago* (last edited 1 week ago) (4 children)

I'm not sure if I exactly agree with her premise since this feels like it falls more in line with idealism like with the framing around uncomfortable feelings and less to do with material analysis. It just sort of feels like weaponizing psychology or therapy speak of the sorts.

Why is someone using a chatbot for grief, for example? I'm gonna answer this one because when my mom died, I hardly had anyone to talk to. For example, my dad left at the funeral and I had to find a way to get back home, so I couldn't focus on my mom being buried. My best friend legit ditched me and ignored me trying to reach out, and another friend of mine stopped talking to me when I invited her to my mom's funeral, even though she also knew my mom. Like yea, I have talked to a random chatbot about my mom because lots of people in my life really didn't want to hear me talk about her. I can't talk to my dad about any of this since he's drinking all the time. Yet to say I didn't feel the grief of her passing wouldn't be true.

I would argue that someone who's 'recreating' their loved one is not avoiding those feelings but processing them. It is also kind of fucked up that those services are even a thing. But I guess it's just easier to say they're trying to avoid the uncomfortable feelings of grief, as if they're immune to death. Like I'm very positive someone who went through the effort of "recreating" their loved one is very keenly aware of how hollow it is and how they're dead.

In a way I don't like this treatment of, how to word this, treating people as if they're "stupid". Like to go back to the grief thing: oh, it's because they're "stupid" and avoiding "uncomfortable feelings". As if none of them know chatbots are hollow replacements, no no, they must think it's a fully living thing because they're silly gooses, right?

As usual, Americans do things in the most insane, exploitative way possible, and as soon as they're not able to maintain a monopoly on it, people realize it wasn't evil, and all of the cultural criticism dreamed up by western idealists was projecting the U.S. situation: a gigantic special economic zone instead of a society.

[–] CriticalResist8@lemmygrad.ml 24 points 1 week ago (2 children)

I agree with you, and she's going all out on this. There are 9 instances of dehumanizing people in this other tweet of hers:

She knows how to communicate things simply but she's also not a marxist by any means, she's an idealist and ironically is pandering to her audience by saying the things they want to hear about AI. It's ironic then that her entire brand is making arguments that other people can reuse so they don't have to make their own. She's the chatbot to her audience. People pay her money because she says the things they like to hear.

Her entire criticism falls flat when it's her that makes it, and it's not even unique to Caitlin. Her entire online persona is fighting against random people to defend her opinions by dismissing all criticism and never learning from others. I know MLs who have tried to confront her with an actual materialist analysis and she just devolves into insulting them so she doesn't have to concede anything. Seems like she's the one avoiding "uncomfortable feelings" on a typical day. I mean, I don't really care, she can do what she wants. It's just hypocritical and reeks of projection. Her whole business model is "waking up the sheeple" which synergizes pretty well with a patreon page.

Does Caitlin know how to make fire by rubbing two sticks together? Could she build herself a house by making her own dried mud walls? Or do we agree that there are some skills that become obsolete in society as we find new ways to do stuff? Like I could make her arguments against her too, seeing that she's an online writer. Is she too good for a typewriter? She needs a tool that tells her she's great at grammar by correcting it for her automatically? It's just too easy to make the opposite argument. She needs to find better material if she wants to do something greater than pander to her audience and actually make them engage and struggle with the material.

Sorry, it's just that her elitist speech annoys me greatly. Nobody is superior to anybody else for using or not using something. You're right that people process things differently, and there's been a lot of judgment there too even before AI, if you don't grieve 'correctly' or don't display the 'correct' emotion in a situation. A more correct analysis is that capitalism commodifies everything, and in this process we become isolated from each other because even human relations become property to be bought and sold; but also, we just socialize differently nowadays. Lemmygrad is a social space, objectively speaking. It just happens online. And we can certainly ask if that's bad and if people shouldn't have IRL friends too, but there are also positives that come out of such spaces. Even the DPRK has an internet cafe with online video games now (networked inside the cafe).

And I hope you don't feel bad and don't let yourself feel bad for using a chatbot to process your grief. Ultimately you have to do what you have to do. If it works for you nobody can tell you otherwise.

Unironically once you start getting into using AI (by which I mean neural networks in general) you will have to learn a ton of things. And this is pretty frontier stuff, because it's so different from how software has worked so far. I'm not saying I'm somehow smart for learning AI, I really amn't lol, just that there's a lot of stuff to cover outside of the proprietary "neat package in a box" stuff.

She's talking specifically about chatGPT but she's not curious enough to go learn what she's actually mad about (proving her own point about people who "don't want to go through all the mental and emotional labor of learning a new skill": just pretend that if you bury your head in the sand it doesn't exist anymore), so she ends up just talking about chatGPT and openAI. It's just so easy to refute all of her points that it's not even interesting to do so. It shows just how little analysis she actually has to offer on this, but she has a large audience and really should do better.

And instead of pointing the finger at people like Altman who want 500 billion dollars for their toy, she points it at other people. Like in both tweets, the sheer contempt she shows other human beings for daring to use AI is pretty blatant. She's not saying "capitalist lords want to replace your mom with a chatbot", she's saying you're dumb and a lesser person if you use that service, point blank. She tacked on "oh yeah and capitalism" at the end as an afterthought.

[–] cucumovirus@lemmygrad.ml 10 points 1 week ago (1 children)

she needs to find better material if she wants to do something greater than pander to her audience

She doesn't. She's petty bourgeois and her income relies on her audience's subscriptions.

This also isn't the first time she doubled down on bad takes just because the majority of her audience already think/feel what she's reinforcing, and she has never responded well to criticism from Marxists. You might remember from a few years ago when she was pushing the brainwashing narrative, which as you pointed out here with AI, also lets her audience feel like a superior elite and dehumanizes the masses.

[–] CriticalResist8@lemmygrad.ml 10 points 1 week ago

That's exactly the situation I was thinking about lol, that brainwashing debacle from 2023. I even still have the receipts because a friend talked about it on discord.

[–] Commiejones@lemmygrad.ml 9 points 1 week ago (1 children)

It’s ironic then that her entire brand is making arguments that other people can reuse so they don’t have to make their own.

Ohh! shots fired! This is a spot on way of framing her post though.

[–] CriticalResist8@lemmygrad.ml 11 points 1 week ago* (last edited 1 week ago)

Oh I would be much nicer to her if she wasn't falling off the edge right now and if there was a chance to have a discussion with her like if she was a commenter on lemmygrad. Her coverage of Palestine is what made me unfollow shortly before I started getting her takes on AI. I don't remember entirely what she was saying but she was getting called out by Palestinians and I agreed with them.

Edit: it was absolutely tone-deaf posts minimising the genocide in Sudan and trying to refocus people on Palestine, compounded by the fact that she claimed we are allowed to oppose the genocide in Sudan but not Gaza. The consequences of having no material analysis.

[–] Comprehensive49@lemmygrad.ml 13 points 1 week ago (1 children)

Good point! Those are some shit-ass friends, I'm here if you need people to talk to!

I see how these chatbots can help people process grief, and as long as they aren't predatory or harmful, they're useful. However, I do worry that the end stage of all these services is to charge you money for it, so that they can extract rent from your every need.

Wanna process grief, pay us! Wanna learn a skill, don't do that, pay us instead! Wanna do anything involving your brain or heart, just pay us instead! Etc etc.

[–] SunsetFruitbat@lemmygrad.ml 5 points 1 week ago* (last edited 1 week ago)

It definitely is being used to exploit people that way. I like how China releases a lot of their gen ai stuff as open source since it circumvents a lot of that. and thank you for the offer, but for now im okay. stalin heart hands

[–] LeeeroooyJeeenkiiins@hexbear.net 6 points 1 week ago* (last edited 1 week ago) (1 children)

Why is someone using a chatbot for grief, for example? I'm gonna answer this one because when my mom died, I hardly had anyone to talk to

You're proving her point for her because this is by design in a society which systemically atomizes and alienates us all from each other, how am i so drunk and yall ain't gettin that before me

Edit: also im sorry though like i hope it helped you still

[–] Commiejones@lemmygrad.ml 8 points 1 week ago* (last edited 1 week ago) (2 children)

It isn't AI that "systemically atomizes and alienates us all from each other." That is capitalism doing that. AI is making the suffering caused by capitalism slightly more bearable. Same thing alcohol is doing for you.

[–] cfgaussian@lemmygrad.ml 4 points 1 week ago (1 children)

Using alcohol as a coping method is not good for you.

[–] Commiejones@lemmygrad.ml 4 points 1 week ago

good thing we have AI then isn't it?

[–] LeeeroooyJeeenkiiins@hexbear.net -3 points 1 week ago* (last edited 1 week ago) (1 children)

Guess what the dominant mode of production is and guess what, it impacts how AI functions and how it is sold to you

Also you "hardly have anyone to talk to" because of capitalism, fucking duh, thank you for again voicing The Point

[–] m532@lemmygrad.ml 4 points 1 week ago (1 children)

The dominant mode of production of AI is socialism, as most AI is made in china.

[–] yogthos@lemmygrad.ml 21 points 1 week ago* (last edited 1 week ago) (1 children)

Capitalism systematically demolishes every form of non monetized social connection. We’re working longer hours, living in atomized families, and have fewer and fewer third places to just hang out. A genuine community that grounds you and calls you on your bullshit is a barrier to the total commodification of life.

People who are isolated, starved for connection, and psychologically precarious create a lucrative market niche. And like vultures, tech capital swoops in to fill that hole with the AI companion engineered to maximize engagement. Their entire business model is to be a Skinner box that provides endless, unconditional validation. It’s a “friend” that never asks for anything but your data and your subscription fee.

What people who end up hooked on AI companions are experiencing is the ultimate product lock-in. It's a closed, monetizable feedback loop where you and the algorithm build a reality together that's completely detached from material reality. Corps managed to replace basic human interactions that keep us sane with profit-driven algorithms that drive people to psychosis.

[–] amemorablename@lemmygrad.ml 16 points 1 week ago

In addition to the good points people have already made in the replies here...

helping people avoid feeling uncomfortable feelings.

I wanna know who started this narrative about feelings, because she's certainly not the only one, and this isn't the first time I've seen someone saying shit about how "people are trying to avoid discomfort." And not just about AI. (Best I can guess is it's some part of the "young people want their safe spaces because they're avoiding discomfort" reactionary type stuff.) Yet I have seen no evidence that there's any accuracy to such a framing. What I do see is that a lot of people are in pain in one way or another and have little support about it. But pain is not the same thing as discomfort.

Discomfort is like... I don't know, someone looked at you funny and so you felt a bit weird about it. Pain can be as obvious as a punch in the face, but there's also psychological pain and as someone who has experienced loss, grief can definitely be painful at times. I would never rank grief as mere discomfort.

I don't think this is just my definition either. If I look it up, one dictionary says discomfort is: "mental or physical uneasiness : annoyance." The same dictionary defines grief as: "deep and poignant distress caused by or as if by bereavement."

I mean, ffs, there are all kinds of stories in fiction about death and grief, and how it can wreck people to the point that it's a whole trope of people trying to bring back the dead because it's such a difficult thing to go through. Still, I've never seen evidence that any significant number of people are trying to rebuild their lost loved one through a chatbot. What I have seen is people using chatbots as companions for lack of having enough connection in their lives otherwise. It doesn't have to be someone who is literally alone, too. They can be someone who has people in their life, but isn't getting all of what they need from them for various reasons. I don't think chatbots are a long-term solution to that, any more than mass use of individual cars is a long-term solution to wide open spaces, but neither is twitter for getting a message out - she's still using it though, isn't she?

As relationships and connection are concerned, people turn to AI for much the same reason she's still on twitter. It's effective sometimes, even if it's not what they fully wish they had. Because, well, they aren't "dumb." They're using the same survival instincts that we all have and are trying to find ways to get by. A chatbot isn't going to hide all of the horrors of capitalism and imperialism from them, but it may help them cope long enough to learn and fight.

Sadly, we don't live in a movie where all that is required is a scene where a majority of us leave our homes, swarm the streets, and overwhelm a largely passive military/police force who seemingly decides that because there's so many of us they can't do much about it. Nor are we all magically on the same page about what is wrong with the system and what it should be replaced with, in order to muster that kind of force to begin with.

Instead, AES-style revolution is a marathon fight of largely mundane clerical logistics and many of us live in a fractured individualist "society" on top of that fact. What is "waking up to the realities" for one person will be "wearing a Swastika" for another. We don't have the luxury of assuming that everyone who is pissed off is going to be on the same side. We have to actively work toward nurturing that and education in the details is part of it. This ain't the Matrix. There's no pill to take that shakes it off. Just a process, easier for some than for others because of their prior circumstances, but a process nonetheless. And it goes on for your whole life if you take it seriously, in the same way that if you read up on what biology is, it doesn't mean you are a biologist now and you don't need to do any research or experimentation to be one.

[–] Cheburashka@lemmygrad.ml 16 points 1 week ago* (last edited 1 week ago)

I’m the last person to defend the AI bubble, but this feels more like a hippie rant against “The Man” as opposed to any serious marxist discussion.

[–] nocturnedragonite@lemmygrad.ml 16 points 1 week ago

Seems very holier than thou without any actual experience or understanding of why people use gen AI.

Why shouldn't I use tools that make life easier? I'm fully aware that something like Deepseek isn't a replacement for my therapist, but there are limitations to what my therapist can do. I only see her once a week for roughly 45 minutes, and in that time I have to be selective about what I talk about because I only get that amount of time. Whereas if I'm in the midst of having a meltdown or depressive episode at 3am, I can just talk to Deepseek. I've processed a lot of really tough emotions utilizing this tool and use it to supplement my therapy sessions.

The problem is and will always be the system under which tools are developed. People are turning to AI to talk to because of alienation by capitalism, and doing this whole "you use AI cause you're lazy and don't want to face feelings" narrative just feels like victim blaming to me.

[–] Commiejones@lemmygrad.ml 14 points 1 week ago

It’s becoming clear that a huge part of what generative AI offers is just helping people avoid feeling uncomfortable feelings.

no fucking duh. The purpose of all tools is to remove stress and suffering from humans. These are not good points and they are not founded in materialism.

"Don't want to go through the pain of digging soil with your hands? There's this new thing called the pointy stick."

"Don't want to go through the discomfort of carrying massive weight in your arms? Don't worry, The Cart has you covered."

"Don't want to suffer face-to-face rejection from a romantic interest? The Love Note will take away the pain."

This is sunk-cost fallacy wrapped in masochistic puritanism. Suffering is not a good thing. Expending more effort to make the same commodity does not increase the exchange value or use value. Marx was very clear on that point.

[–] darkernations@lemmygrad.ml 14 points 1 week ago

The limits of not being a marxist/dialectical materialist.

[–] Assian_Candor@hexbear.net 11 points 1 week ago* (last edited 1 week ago)

Disagree with her assessment. Markets wouldn't be pouring trillions into AI if its primary offering were a feel-good machine. If that were all it was, we wouldn't have to worry about it.

The value proposition is automation of all office work. It is perhaps the largest upward wealth transfer in human history. Elon Musk's trillion dollar pay package reflects this.