This just shows that AI sucks for getting accurate information. Even if it hadn't hallucinated Black people, it would've been just as wrong, just with white-skinned queens. Now the errors just happen to line up with the "current social freakout of conservatives".
It's "February", so when the user typed "Febu" the program discarded "February" as an option. Only the "ary" part is autocompleted in that image; the "Febu" part was typed manually. Although I can imagine that first R steadily getting dropped in some dialects of English, considering it isn't really pronounced in them.
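A minimal sketch of why the typo kills the suggestion, assuming a simple prefix-matching autocomplete (the month list and function here are illustrative, not any particular app's implementation):

```python
MONTHS = [
    "January", "February", "March", "April", "May", "June",
    "July", "August", "September", "October", "November", "December",
]

def complete(prefix: str) -> list[str]:
    """Return months whose spelling starts with the typed prefix."""
    p = prefix.lower()
    return [m for m in MONTHS if m.lower().startswith(p)]

# "Fe" still matches "February", but "Febu" diverges from "Febr"
# at the fourth letter, so the candidate list comes back empty.
print(complete("Fe"))    # ["February"]
print(complete("Febu"))  # []
```

Once the typed text stops being a prefix of the correct spelling, a matcher like this has nothing left to suggest, which is why "ary" could only have been completed from "Febr", not "Febu".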
Someone posted the version I saw below. In that one, the person has only typed "Fe". I haven't seen one where "Febu" was typed, but yeah, that would obviously throw it off.
It really does not. Even if you had a perfectly accurate model and asked it to "draw an English queen, but make it ethnically diverse", this would still happen.