I wonder if Meta implemented a very basic "no porn keywords" filter. "Interracial" is quite a common keyword on porn websites; perhaps that's why the model doesn't pick up on the term well, or wasn't trained on images tagged with it?
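Purely to illustrate how crude such a filter could be (this is a guess at the mechanism, not Meta's actual code, and the blocklist terms are made up), a naive substring check on the raw prompt would already trip on the word:

    # Hypothetical naive prompt filter: reject any prompt containing a
    # blocklisted substring, regardless of context.
    BLOCKLIST = {"interracial", "nude", "explicit"}  # made-up example list

    def is_blocked(prompt: str) -> bool:
        lowered = prompt.lower()
        return any(term in lowered for term in BLOCKLIST)

    print(is_blocked("an interracial couple holding hands"))  # True
    print(is_blocked("a couple holding hands"))               # False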
I'm thinking it's more of a law-of-averages issue in the training data. Interracial images don't tend to have descriptions specifying that the content is interracial, or which individual is which race, but there are plenty of photos of individuals who are known to be of a specific race. So the model falls back on its stronger probabilistic confidence in race x OR y OR z and completely ignores the weak "AND" correlations.
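To make that concrete, here's a toy sketch with entirely made-up caption counts, just to show why the joint "AND" signal ends up so much weaker than either marginal when captions rarely label both races in one image:

    # Hypothetical caption counts: each race term is mentioned often on its
    # own, but captions explicitly labeling an interracial pair are rare.
    total_captions = 1_000_000
    mentions_a = 80_000        # captions mentioning race A
    mentions_b = 120_000       # captions mentioning race B
    mentions_both = 300        # captions labeling both races in one image

    p_a = mentions_a / total_captions              # 0.080
    p_b = mentions_b / total_captions              # 0.120
    p_joint = mentions_both / total_captions       # 0.00030
    p_if_independent = p_a * p_b                   # 0.00960

    print(f"P(A)={p_a:.3f}  P(B)={p_b:.3f}")
    print(f"P(both labeled together)={p_joint:.5f} "
          f"vs chance pairing {p_if_independent:.5f}")
    # The explicit joint labeling is even scarcer than chance pairing would
    # predict, so the model's strongest learned signal is "one race per
    # image" and the weak joint correlation gets washed out.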
You rtfa....
When asked by CNN to simply generate an image of an interracial couple, meanwhile, the Meta tool responded with: “This image can’t be generated. Please try something else.”