The Oatmeal had a wonderful post about AI art last year that captures so many of my own feelings around this: A cartoonist's review of AI art
Fuck AI
"We did it, Patrick! We made a technological breakthrough!"
A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.
AI, in this case, refers to LLMs, GPT technology, and anything listed as "AI" meant to increase market valuations.
Wow, that is beautiful. So yeah, he and I mostly agree. I would say that AI probably should be heavily restricted, because right now it's putting the entire economy in a really precarious position, and it was also developed through extensive copyright infringement. But yeah, that's a great take.
It's not the same when scientists use AI as a tool to create new materials, vaccines, or genetic treatments, solving problems that would traditionally take many years and millions of data points, as when a dumb user gets even dumber by substituting an AI app for their own creativity and intelligence. AI can be a useful tool for certain tasks, not a substitute for human capabilities. That's the problem with AI: not the AI as such, but its abuse, turning it into hype, harvesting user data for big corporations, manipulating and controlling decisions. The correct use of AI needs human intelligence.
I would agree, yeah.
I think a big driving force is that people who are drawn to generative AI are more likely to be mediocre at a thing, as well as demoralised by the effort required to improve.
I can sympathise with that drive, at least. After all, I'm a pretty mediocre writer. I desperately wish I could be better, but I am so far away from where I'd like to be that it feels hopeless sometimes.
Sometimes I wish that I believed that it actually was hopeless, because then I could just give up on trying rather than having to bear the pain of practicing my way out of mediocrity. However, I care more about improving than I do about my discomfort, and so I keep going with the XP grind.
A big thing that keeps me going is that I have seen the power of practice. I'm still far from where I'd like to be (and no doubt when I reach that point, my ambition will have grown along with my skill such that I still won't be satisfied), but I'm able to look back on my efforts of the last few years and see real progress.
That's why I find people who use generative AI to be quite tragic — they're like alternate timeline versions of myself. It's more comfortable to believe that the reason you're not good at things is because there are people who are Good at it, and people who are Bad at it. If it's a case of immutable categories of capability, then you have an excuse not to try. What's especially tragic is that when these demoralised novices use generative AI, that's often because they still have that drive to create inside themselves.
But man, it sucks to see, because I know that they will never find the satisfaction they crave in these tools. Sure, they might make something they're proud of, giving them a facsimile of fulfillment — but it won't compare to what they could be feeling. When I argue against generative AI, I'm not just being anti-AI, but pro-Art. Actually, no, it's more than that — I'm pro-passion. If they could cultivate the kind of vulnerability required to actually use and develop their inner passion, then I would treasure any piece of art or writing generated through that process. I might not enjoy the art itself, in its own right, but I don't need to, because what I love most about art is that it's a fundamentally human process, and so any creative work is best enjoyed with that context.
Ugh, it drives me mad. I just want to grab them by the shoulders and shake them, while yelling "PLEASE COME AND JOIN US. I GENUINELY WANT TO SEE WHAT PASSIONS DRIVE YOUR URGE TO CREATE. I KNOW IT HURTS TO BE MEDIOCRE, BUT YOUR PASSIONS ARE WORTH PERSISTING FOR. WE'VE ALL BEEN THERE, AND WE WANT YOU HERE WITH US SO THAT WE CAN HELP SUPPORT YOU". Alas, screaming at someone like this is not an effective evangelisation strategy — even if you tell them that we throw better parties, and that they're invited.
The only people around me (that I know of) who use AI are me and the company mechanic. I only ever use it as an easier 'image search' to find the source of manga/anime or similar things. He only uses it to figure out what brand/model machine he needs to work on, so he can find the manual PDFs.
I feel like we're using it the 'right way,' but also like we're not actually using the AI part, so...
Sounds like you're both using a search engine that has the word "AI" slapped on it.
Yeah, I use it through Gemini (only 'cause Pixel and the bottom button press make it easy) and he (unfortunately 🤮) uses Grok, but we (or at least I) only use it for image/object recognition. Really useful for that at least, although again, calling it AI might be a misnomer.
Thank goodness it's not just me.
It's the conundrum. "When I ask (random slop machine) it's so smart and gives me answers!!"
"Did you ask it things you already know?"
"No. But look at the answers!!"
People have no idea how much they're damaging their brains.
I've never met a real human person that loves AI. I've used it in very specific circumstances. I've met other people who've used it. But every one of those people shares some variation of my opinion: it's useful for very few things and trash for the 99% of other things.

I don't know who these lovers of AI are, but I bet it's the same handful of idiots who all run in one or two social circles, reinforcing each other's opinions on everything. If a person's ideas can't be challenged, or they surround themselves with people exactly like themselves, their minds are doomed to atrophy.

Humans are coded to save energy. Talent and skill take long, grueling effort. AI allows the lazy to phone it in, which allows the mediocre to cosplay as the talented.

But AI is a tool. For people with reading/writing difficulties it can bridge gaps that previously required much more effort and many more resources. That independence has value, but AI is not a replacement for the novelist. Anyone who says that the sun shines out of AI's ass, or the sun never sets on AI, or whatever BS they're spouting, is either a snake oil salesman or their mark. Neither should be given much oxygen.
There's definitely a correlation between the understanding/misunderstanding of what it does and the understanding/misunderstanding of what it's capable of.
If you understand even the basics of how an LLM works, you understand that it's not capable of much more than consistent mediocrity, because even with the best possible training data and no bad training data, its tendency to "hallucinate" comes down to how incomplete its training set is.
When it "hallucinates" it does so because it doesn't have a direct answer it can generate for whatever query is given. This is because humans can't generate a complete data set of everything that is without flaw.
So they are as average as we are when we don't attempt to better ourselves. The difference is that they can't better themselves, because they do not learn. And beyond that, we can try to better the training data, but they still lack the ability to understand anything, including context, which makes it a useless task. The combined manpower of humans working together might get pretty close to a training set that would raise an LLM above the average human, but the LLM can't maintain that trajectory by itself, and literally nothing else would get done.
LLMs are by definition "mediocre machines". They are a statistical approach, and the most common answer is far from the single best answer.
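A toy illustration of that "most common answer" behaviour, in Python. The word counts below are completely made up, and real models work over learned probabilities rather than raw counts, but greedy decoding really does just take the statistical winner:

```python
from collections import Counter

# Hypothetical next-word counts from a scraped corpus: after the word
# "great", the bland continuations vastly outnumber the interesting one.
continuations = Counter({"job": 50, "idea": 30, "gatsby": 2})

def greedy_next(counts: Counter) -> str:
    """Greedy decoding: always emit the single most frequent option."""
    return counts.most_common(1)[0][0]

print(greedy_next(continuations))  # prints "job" — the most common, not the best
```

The rarer, more interesting continuation never gets picked, no matter how many times you ask.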
Wow, yes. I think it goes both ways though: relying on the AI for the human part of your work (design, writing) makes you more stupid. But yes, my direct boss is an Elon fanboy and ChatGPT devotee, and his thinking is slow. He's not exactly stupid (there is stuff he's good at), but he doesn't quickly make connections, and it sure seems like it's related to the ChatGPT.
As an artist myself, when it comes to generated images, it's weird and uncanny with the mistakes it makes.
It's not the kinds of things a below average artist gets wrong, image gen gets things wrong in very specific ways. It aims for perfection on everything, but unlike a human, the algorithm has no understanding of what it's trying to make.
If a below average artist makes mistakes, I still have a pretty good idea what they're going for, because a human working on art has some real world understanding that every other human has, a big one being object persistence.
No matter how rudimentary one is at art, a human will always understand that things in the background are independent of things in the middle and foreground. AI's obsession with making everything symmetric and balanced always results in the most repulsive uncanny valley looking slop. About the only thing it comes close to getting right is abstract patterns, but anyone can do the same thing with 25-year-old software.
I had someone tell me it allowed them to "make the best app they've ever made". It was a bootstrap CRUD task app.
The thing that no one ever talks about in the software industry is how the majority of software developers are just barely good enough to get by.
I spent 10 years consulting and there are entire companies out there where nobody even knows what high quality code looks like.
LLMs are trained on all this, so they produce at the same level. Most developers don't know the difference between good code and code that merely works (but is low quality).
In a world where no one cares about the code, and only cares that the product works (badly), LLMs are perfect.
I write code that no one is going to look at, ever (yet it goes in production).
Oh, I can recognize good code from code that works... I'm just not skilled enough to produce the former. (Does that put me ahead of most people by default?)
To me, one of the best ways to close that gap is the book The Pragmatic Programmer. It's old, and if you ask me it's still as valuable as ever. It's not about any particular language; it's about how to write high-quality code in any language.
Wait until you meet the haters lmao
The same people who love AI are those that also loved the Magic 8-ball.
Listen, AI is a godsend for some people.
Take a totally fictional person who works in my IT department, Joe Foyle. He's always talking about things that he flat out doesn't understand, just oozing that "pay attention to me" and "I talk for the sake of me talking" vibe. For fucking years Joey boy here has been trying too hard to get noticed, often to the detriment of his own goals, and has refused to take constructive criticism/gentle guidance to dial that shit back.
He's upset that others are smarter than him while at the same time refusing to better himself with training, mentorship, and/or reading.
Joe fucking loves AI.
Now he can be the one (instead of the other principal engineers) talking about things that interest the C-suite. Now when Joe talks, people have to listen because AI is the future... holy shit, did Joe just become a goddamned futurist?
AI has to work because otherwise Mr. Foyle will be proven (again) to be full of shit. And so he pushes the narrative harder and harder that AI is the future and can do no wrong.
So in answer to your original question, OP - yeah, the biggest fan of AI that I (hypothetically) know is below average at doing things that he is supposed to be doing.
Generative AI has always been, and probably always will be, most attractive to the lazy and the cheap. And if a person is lazy and cheap then they don't care about making something good. When the choice is between effort and crap, they will always choose crap.
My older bro, who was in tech, just started using the slop machine like last year. He thinks it's great, and thinks the answer to anyone asking him a question is ChatGPT. Think for yourself! He also made an AI-generated picture from an actual PHOTO too.
Regarding code, 90% of software devs are sub-par because they have no idea how computers work. Or how to create code that is as efficient as possible and won't cause issues with 1000 users instead of 1 for example. Or code that does not contain security flaws.
Problem is, AI is not going to fix that for them.
Someone once said to me "If a creative project is of high quality, the longer you look at it the more details you'll notice. If it's bad, the longer you look at it, the more detail you notice is missing."
And I think that about sums it up. AI slop is programmed to be eye catching but can't produce much detail beyond that.
On the topic of details, AI also sucks ass at giving details meaning, or at implementing details in a way that makes sense.
One thing I noticed is that a lot of AI "art" is shiny and colorful, which looks appealing to people who know nothing about art. Specifically with DLSS5 I saw some people saying they liked it because of how shiny it made things. Have they never experienced reality before? Since when is everything super shiny? Also you really do not at all need AI to make a material more shiny or colorful. That can be done already very easily and it was the artists' choice to not do that because they know what they're doing.
One thing I noticed about DLSS5 is that it makes all the lighting look like default Unreal Engine lighting.
In my writing group, there's one guy who's very enthusiastic about AI; everybody else hates it.
The one who loves AI is a Libertarian who admits he's never read any book except Conan the Barbarian, which he thinks is the pinnacle of literature and whose writing is kind of bad and extremely political. (He wrote a short story about being fined for not using someone's correct pronouns because that was, and I quote, "The scariest thing I could imagine.")
I don't think it's at all a coincidence that the one guy who's enthusiastic about having Grok (of course he uses Grok) 'edit' all his writing is, pretty obviously, the worst writer in the group.
> the worst writer in the group.
Sounds to me like he's the worst person in the group.
I think it's like that Dunning-Kruger idea. People who are bad at things don't know they're bad, and are poor judges of quality.
So someone who's kind of bad at coding isn't going to know or understand what the LLM puts out, so they won't fix as many issues.
Also humans are lazy, and when presented with something that looks good at a glance, we don't really want to dive deeper.
I saw a PR from someone today at work. Guy's nice but I don't think he's much of a programmer. He asked copilot to fix a warning. It did, and generated a linter error. So he asked it to fix that. It did, but for whatever reason decided to delete an entire function call.
Unfortunately that part of the code has no unit tests, so he just pushed it up for review. I look at it and I'm like if that call is important, don't delete it. If it can be deleted, remove the now-unused code around it. We'll see what he says.
He probably spent more time fussing with copilot than it would have taken to do it right in the first place.
> I think it's like that Dunning-Kruger idea. People who are bad at things don't know they're bad, and are poor judges of quality.
The number of people who have shown me their really low-quality tattoo and believed it was really good work is way higher than the number of people with legit good body art showing it off.
People will buy a $75 printed decor canvas of a plant from Hobby Lobby before buying real artwork from a local artist.
It was these experiences that made me the pretentious art snob I am today.
Most people don't know art appreciation.
Not even the base level critique that anyone should be able to do.
I think it's like that dunning-kruger idea. People who are bad at things don't know they're bad, and are poor judges of quality.
Precede the line with a > to make a block quote.
Thank you. I couldn't remember what symbol it was.
And precede it with # if you want to yell!
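A quick example of both (this is standard Markdown, though flavours vary a little):

```markdown
> Precede a line with > and it renders as a block quote.

# PRECEDE IT WITH # AND IT RENDERS AS A BIG SHOUTY HEADING
```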
Just based on how AI is trained, it makes sense that it would be around average at best. Bad training data gets mixed in with the good, and a lot of the time the AI really wouldn't be able to tell which is which. A lot of objectively good-quality stuff goes totally unnoticed on the internet, while memes and shitposts get a ton of traffic.
So what I’m saying is we should continue to poison AI with shitposts
I think as far as software development goes, a large number of people are not creating innovative work. For most corporate jobs, AI slop or human slop is good enough. The ramifications of not writing it themselves and learning as they go are problematic: eventually we will not have people with enough experience to fill senior roles, where experience and intuition are a must. That being said, it's very understandable to use a tool that lets you cruise through a job that pays an 80k annual salary with benefits.