That reminds me of the time, quite a few years ago, when Amazon tried to automate resume screening. They trained a machine learning model on anonymized resumes labeled with whether the candidate had been hired. When they inspected what the model was actually paying attention to, they found it had taught itself to reject women, penalizing resumes that contained the word "women's," for instance.
Another similar "shortcut" I've heard about came from a system that analyzed job performance: it concluded that the two strongest predictors of being a good hire were being named "Jared" and having played lacrosse in high school.
And these are the easy cases, the ones simple enough to figure out and that someone actually caught. If the bias is a more complicated combination of signals, it might never be spotted at all.
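To make the mechanism concrete, here's a toy sketch (assuming Python with NumPy and scikit-learn; none of the systems above were ever published, and every feature name here is made up): a plain logistic regression trained on biased historical decisions happily picks up an irrelevant proxy trait, and you only catch it because you know exactly which column to check.

```python
# Toy illustration of a model learning a proxy feature. This is NOT the
# Amazon pipeline (which was never made public); it's synthetic data where
# an irrelevant trait correlates with past hiring decisions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

skill = rng.normal(size=n)                 # genuinely job-relevant signal
played_lacrosse = rng.binomial(1, 0.2, n)  # irrelevant trait

# Historical labels: mostly driven by skill, but past reviewers also
# favored lacrosse players, so the bias leaks into the training data.
hired = (skill + 1.5 * played_lacrosse + rng.normal(scale=0.5, size=n)) > 1.0

X = np.column_stack([skill, played_lacrosse])
model = LogisticRegression().fit(X, hired)

# Inspecting the coefficients exposes this simple bias immediately...
for name, coef in zip(["skill", "played_lacrosse"], model.coef_[0]):
    print(f"{name:16s} {coef:+.2f}")
# ...but only because we knew which column to look at.
```

With a deep model over raw resume text there's no single coefficient to point at; the same proxy can be smeared across thousands of token weights, which is exactly why the more complicated biases may never be found.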