The awkward workaround Google and Apple have implemented in their photo software highlights the difficulties tech companies face in advancing image analysis technology, as well as the inherent biases in AI — especially when it comes to recognizing dark-skinned faces.
AI only “knows” what it has been trained on. Since structural racism exists, racism will be present in how AI operates. That does not mean we will get an AI Hitler trying to kill Jews, but it might mean things like an AI drawing program defaulting to a white woman when asked to draw a generic woman. It could also mean that existing bias gets amplified — for example, an AI “pre-crime” program flagging Black neighborhoods as potential hotspots while ignoring similar white neighborhoods.
It's concerning that both Google and Apple have had to remove the term "gorilla" from their apps. Wonder if AI will become racist?
Too late.
It already does reflect biases in its training data, which is not sanitized from racism. :/
Oh, my friend, that ship has sailed.