top 6 comments
[-] Deceptichum@sh.itjust.works 23 points 3 months ago

People are biased against resumes that imply a disability. ChatGPT is just picking up on that fact and unknowingly copying it.

[-] boredtortoise@lemm.ee 11 points 3 months ago

We've always lived in a world where resume evaluation is unjust. That's all this is. A resume shouldn't be able to imply anything that can be used against you.

[-] lvxferre@mander.xyz 10 points 3 months ago

studies how generative AI can replicate and amplify real-world biases

Emphasis mine. That's a damn important factor, because deep "learning" models are prone to making human biases worse.

I'm not sure but I think that this is caused by two things:

  1. It'll spam the typical value unless explicitly asked otherwise, even if the typical value isn't actually that common.
  2. It might take co-dependent variables as if they were orthogonal, for the sake of weighting the output.
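Point 1 above can be sketched with a toy example. This is a hypothetical illustration, not the study's methodology: a model that always emits the modal outcome turns a skewed base rate into a totally one-sided output, which is one way a real-world bias gets amplified rather than merely copied.

```python
# Hypothetical sketch: a predictor that "spams the typical value"
# amplifies an existing imbalance instead of reproducing it.
from collections import Counter

# Assume (made-up numbers) 70% of past decisions favored resumes
# that don't disclose a disability.
history = ["nondisclosing"] * 70 + ["disclosing"] * 30

def predict(history):
    # Always emit the most common past outcome,
    # even though the minority outcome occurs 30% of the time.
    return Counter(history).most_common(1)[0][0]

predictions = [predict(history) for _ in range(100)]
print(Counter(predictions))  # Counter({'nondisclosing': 100})
# A 70/30 skew in the data becomes a 100/0 skew in the output.
```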
[-] SuperCub@sh.itjust.works 4 points 3 months ago

I'm curious what companies were using to screen applications/resumes before ChatGPT. Seems like they already had shitty software.

[-] kata1yst@sh.itjust.works 4 points 3 months ago

Yet again, sanitization and preparation of training inputs proves to be a much harder problem to solve than techbros think.
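One reason sanitization is hard, sketched below with made-up data: dropping the sensitive column isn't enough, because correlated "proxy" fields (here a hypothetical `gap_years`) can let a model reconstruct the protected attribute anyway.

```python
# Hypothetical sketch: naive sanitization by deleting the protected
# column, which fails because a proxy field remains.
rows = [
    {"disability": True,  "gap_years": 2, "hired": False},
    {"disability": True,  "gap_years": 3, "hired": False},
    {"disability": False, "gap_years": 0, "hired": True},
    {"disability": False, "gap_years": 0, "hired": True},
]

# "Sanitize" by removing the disability field from every row.
sanitized = [{k: v for k, v in r.items() if k != "disability"} for r in rows]

# The proxy still perfectly predicts the removed attribute, so a model
# trained on the "sanitized" data can learn the same biased rule.
leak = all((r["gap_years"] > 0) == orig["disability"]
           for r, orig in zip(sanitized, rows))
print(leak)  # True
```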

[-] andrew_bidlaw@sh.itjust.works 2 points 3 months ago

Let the underwhelming brain in a jar decide if your disability would make you less efficient at your work.

this post was submitted on 23 Jun 2024
38 points (95.2% liked)

Hacker News


A mirror of Hacker News' best submissions.

founded 1 year ago