The greatest risk of AI in higher education isn’t cheating – it’s the erosion of learning itself
(theconversation.com)
Honestly, a lot of these predicted problems with AI are actually overly optimistic, because they assume AI can actually DO the work we're talking about, and the current state of AI very much cannot.
I sometimes think that articles like this are plants by the AI industry to create the narrative that their shit is even capable of causing this problem.
What people think AI can do is a problem, even (especially) when it cannot.
I would say that THIS is the biggest risk of AI. It's not what it does, it's what people believe it does. Especially people who aren't capable of actually assessing its performance.
Or C-level execs who are so out of touch with what their employees do that they're convinced it can and/or should replace one or all of an employee's job duties.
The problem is AI replacing essentially the busy work that historically brand-new workers tend to be assigned. AI, in its current state, absolutely can do a lot of this work and replace the need for junior employees in a lot of cases. The problem is, junior employees need this work in order to improve, so if we delegate all of it to AI, there will be no future senior employees to do the more advanced work that AI can't do.
I think it's actually just the perception that it's doing that, coming from people who are so far away from the work that they don't have any clue.
Like 300 years ago, if you wanted to be a sailor, maybe you started by rowing oars, or swabbing decks, or dealing with ropes and rigging or whatever. If you're a 19-year-old seaman in 2026, you're probably using a throttle controller that's 26 different systems disconnected from the actual mechanical work. No one says "oh, that guy's not a novice seaman" or "the boat literally drives itself, hurr".
If you were a graphic designer in 1970, you'd cut out and hand-lay a magazine page on a big glowing table, page by page, resetting the type so it fit, along with giant physical images, so you could design the page. In 2026, you'd use Illustrator and lay it out on your computer. No one says "oh, the magazines just lay themselves out" or "that's not how you lay out magazines". It's just done with the tools available.
What’s going on is very few people seem to understand the difference between a tool and a solution. A tool is a thing that does something. A hammer is a tool. A solution is something that solves a problem. A hammer is a tool but it is not a good solution for a crying baby. Applying random tools to non-problems does not generate solutions randomly, it just creates even more intractable problems.
It probably depends on your industry but it absolutely cannot even do entry level work in a lot of fields.
I feel like that goes without saying, yeah. AI can do a lot of a junior software engineer's job, but it's going to fail miserably at being a junior house cleaner.
I agree, and that's a great way of putting it. We're kneecapping ourselves collectively because enough individual companies are deprecating the junior dev experience. We'll see if it holds up when senior devs are in such short supply that companies have to pay them 4x the margin they saved on junior devs. I think they're hoping that the machine learning gets good enough to do senior dev work before the humans retire. Or else they're just line-go-up types.
LLMs do tend to be pretty good at textbook problems, because they've been trained on the textbooks. We have working students at $DAYJOB, who tell us that you can often get a flawless grade by handing in something AI-generated.
But then, yeah, you don't learn anything, and that will become a problem sooner or later, because none of the problems at work are textbook problems.
I do agree, it may be that the situation is exacerbated to the point where doomerism is essentially encouraged.
The thing is, there was an article where the students themselves didn't want AI on the mandated Chromebooks. It feels force-fed, and the fact is, nobody, no matter their age, wants garbage screwing up their workflow.
Honestly, a lot of these articles are repeating the same story. In the end, it all leads to a similar conclusion, and it isn't age-related.