Off My Chest
RULES:
I am looking for mods!
1. The "good" part of our community means we are pro-empathy and anti-harassment. However, we don't intend to make this a "safe space" where everyone has to be a saint. Sh*t happens, and life is messy. That's why we get things off our chests.
2. Bigotry is not allowed. That includes racism, sexism, ableism, homophobia, transphobia, xenophobia, and religiophobia. (If you want to vent about religion, that's fine; but religion is not inherently evil.)
3. Frustrated, venting, or angry posts are still welcome.
4. Posts and comments that bait, threaten, or incite harassment are not allowed.
5. If anyone offers mental, medical, or professional advice here, please remember to take it with a grain of salt. Seek out real professionals if needed.
6. Please put NSFW behind NSFW tags.
I dislike guns. When used properly, they're really fun: you shoot spinning discs out of the sky. But that's not how they're actually used. And regardless of how the inventor of guns intended them to be used, and regardless of how much better off we'd all be if everyone just used them to shoot spinning discs out of the sky, people by and large use them for violence. So, I dislike guns.
I dislike AI.
My main point there is that when evaluating the impact of some tool, I look at how it is used rather than how it could be used. Arguments like 'if people were to use it like this or that...' are not so interesting to me. What I care about is what the actual impact of a thing is, and for that, the only thing that matters is how people actually use it.
Now, a separate thing is my assessment of how people actually use generative AI, and whether I consider the things they do with it a boon for society. I see:
I don't like these actual things that people are actually using gen AI for. Maybe you see LLMs having different effects and have a different, more positive, assessment. But you cannot separate the assessment of a tool from its users and how they use it, because they're exactly the ones that'll be using it, and they'll use it the way they use it.
It's not that I think there are no legitimate uses for AI, or that it couldn't be used as a learning tool.
It's that I doubt it's better than current learning tools largely because the nature of the medium seems to turn off the kind of critical thinking you're describing. The medium and language of a message can have a profound effect on how we understand and process information, often without us even realizing it, and AI seems to be able to make those changes far too easily.
I would ask it a careful question, and I would get a well worded, persuasive, but ultimately careless reply that's just repetition of information and devoid of any new reasoning or insight.
I would carefully ruminate on this reply, and find that at best, it's factually correct because it's an echo of the training data fed into the model, and although it sounds highly persuasive, it likely will need additional work to be adapted into the specific context and details of my situation.
But that's not my main complaint. My complaint is that the medium used seems to prevent people from doing that analysis. I think this is very much in line with what Neil Postman wrote about in Amusing Ourselves to Death and Technopoly. These tools seem to use us, sneakily adjusting our perceptions of what the information means, rather than us using the tools.
Is it possible to be careful and use it the way you describe in your thought experiment? Yes. Is it likely that people will be? No, and we seem to be seeing example after example of that every day.
If I'm arguing in good faith, it's both. We have a tool that uses us, a medium that shoves massive amounts of information at us but hinders gaining knowledge (which I'm going to define as the useful retention and application of that information, not just winning trivia night), and as a species we seem unable to stop letting ourselves be suckered by it.
In the same vein, Postman also argued that this sort of change is often both ongoing and inevitable, and that the only real debate is over what the true cost to our culture and society will be. He cited examples going back to Plato, if I remember correctly. So, as you put it: writing did it, then books, television, search engines, etc. And so much money has been spent on making this a thing that we're going to have to contend with it until it undeniably starts costing more than it's worth; and if that cost is cultural or societal rather than financial, it might never go away.
I don't pretend to speak for the man, but I think Postman would agree with you, and he thought it started in the 1860s with the telegraph.
No, it's about AI.
Those who funded the Austrian artist fully agree.
Well, it's just a pattern: people explain everything in the words most understandable to themselves and to others, without going into detail, because it's much easier that way. It's like saying: I hear the call of the water spirits.
I'm not good at explaining, but I'll try anyway. People shout: "These people are Nazis! They have no pity, they treat people like garbage, this is fascism!" In short, instead of saying "rich people don't think of us as human," they reach for the comparison to fascism: "these bastards don't think of us as people; that's fascism!" People compare things to fascism because the comparison feels understandable and fitting, and sometimes it genuinely is. Thanks to that one word, many readers immediately grasp how terrible these billionaires are.
Really appreciate you taking the time to write this out. People forgetting how to learn is my largest concern with AI, in addition to a dead internet theory scenario where almost nothing new is being created by people.
What you articulated about the first concern really did leave me with more hope for the future than I had previously. One of the best comments I’ve read on this platform.
Sorry to see some of the replies making tired political quips instead of critiquing your actual points head on.