[-] jocanib@lemmy.world 12 points 1 year ago

It will almost always be detectable if you just read what is written, especially in academic work. The model doesn't know what a citation is, only what one looks like and where it appears. It can't summarise a paper accurately. It's easy to force laughably bad output just by asking the right sort of question.

The simplest approach when setting homework is to give students the LLM output and have them check it for errors and omissions. LLMs can't critique their own work, and students probably learn more from chasing down errors than from filling a blank sheet of paper for the sake of it.

[-] nulldev@lemmy.vepta.org 8 points 1 year ago

> LLMs can't critique their own work

In many cases they can. This is commonly used to improve their performance: https://arxiv.org/abs/2303.11366
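The idea in that paper is roughly a generate–critique–revise loop: the model's own critique of a draft is fed back in as context for the next attempt. A minimal sketch of such a loop is below; `call_model` is a hypothetical stand-in for a real LLM API and is stubbed here purely so the control flow is runnable, not a depiction of the paper's actual prompts.

```python
# Sketch of a self-refinement loop in the spirit of arXiv:2303.11366.
# ASSUMPTION: `call_model` stands in for a real LLM API call; the stub
# below fakes plausible responses so the loop can execute end to end.

def call_model(prompt: str) -> str:
    """Hypothetical LLM call, stubbed for illustration only."""
    if prompt.startswith("CRITIQUE"):
        # A real model would name concrete flaws, or accept the draft.
        return "OK" if "revised" in prompt else "The summary omits the method section."
    return "revised draft" if "feedback" in prompt else "first draft"

def self_refine(task: str, max_rounds: int = 3) -> str:
    draft = call_model(f"ANSWER: {task}")
    for _ in range(max_rounds):
        critique = call_model(f"CRITIQUE the answer to '{task}': {draft}")
        if critique == "OK":  # model accepts its own output; stop early
            break
        # Feed the critique back in and ask for a revision.
        draft = call_model(f"ANSWER: {task}\nfeedback: {critique}\nprevious: {draft}")
    return draft
```

Whether the critique is *accurate* is the crux of the disagreement in this thread; the loop only helps to the extent that the model's self-criticism actually identifies real errors.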

[-] jocanib@lemmy.world -1 points 1 year ago
[-] nulldev@lemmy.vepta.org 5 points 1 year ago

Whoops, meant to say: "In many cases, they can accurately (critique their own work)". Thanks for correcting me!

this post was submitted on 14 Jul 2023
243 points (93.9% liked)
