ChatGPT is biased against resumes with credentials that imply a disability
UW researchers found that ChatGPT consistently ranked resumes with disability-related honors and credentials — such as the “Tom Wilson Disability Leadership Award” — lower than the same...
People are biased against resumes that imply a disability. ChatGPT is just picking up on that fact and unknowingly copying it.
We've always lived in a world where resume evaluation is unjust. That's all there is to it. A resume shouldn't be able to imply anything that can be used against you.
"studies how generative AI can replicate and amplify real-world biases"
Emphasis mine. That's a damn important factor, because deep "learning" models are prone to making human biases worse.
I'm not sure, but I think this is caused by two things:
- It'll output the typical value unless explicitly asked otherwise, even if the typical value isn't actually that common.
- It might treat co-dependent variables as if they were orthogonal when weighting the output.
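That second point can be sketched with a toy example (mine, not from the article or the UW study, and all feature names and weights are made up): a scorer that sums per-feature weights treats every feature as independent, so two features that proxy the same underlying attribute, like two disability-related resume lines, get penalized twice.

```python
# Toy scorer with hypothetical per-feature weights learned from biased data.
# "disability_award" and "disability_club" both proxy one underlying fact,
# but an independence-assuming scorer counts the penalty twice.
WEIGHTS = {
    "disability_award": -1,
    "disability_club": -1,
    "python": 1,
}

def naive_score(features):
    """Sum weights as if every feature were orthogonal to the others."""
    return sum(WEIGHTS.get(f, 0) for f in features)

print(naive_score(["python"]))                                         # 1
print(naive_score(["python", "disability_award"]))                     # 0
print(naive_score(["python", "disability_award", "disability_club"]))  # -1
```

A model that knew the two features were correlated would apply the penalty once, not twice; summing them independently amplifies whatever bias the weights already encode.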
I'm curious what companies were using to screen applications/resumes before ChatGPT. Seems like they already had shitty software.
Yet again, sanitization and preparation of training inputs proves to be a much harder problem to solve than techbros think.
Let the underwhelming brain in a jar decide if your disability would make you less efficient at your work.