A Google engineer has issued an apology after the automated Google Photos app labeled a photo of two black people as containing "gorillas." The issue began when Jacky Alciné found that the app consistently mislabeled pictures he had uploaded of himself and a friend.
He tweeted the issue at Google, and, to the company's credit, Google's chief social architect Yonatan Zunger responded quickly with an apology. Alciné, a computer programmer himself, stated in a separate tweet that while he understood how something like this could happen, he still wanted to know why.
According to The Verge, part of the explanation is that Google's photo app is both a learning app and a very new one. The app labels what it thinks is in photos, but it is still familiarizing itself with the shapes and shades of objects. This isn't the only problem the app has had labeling things, just the most glaring and inappropriate. And it is hardly a new problem.
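This is not how Google's actual pipeline works, but a toy sketch can illustrate why a still-learning classifier produces labels like this: a classifier always has a highest-scoring label, even when it is barely more confident in that label than in the alternatives, and a naïve app will print whatever comes out on top. The label set, scores, and threshold below are invented for illustration.

```python
import math

def softmax(scores):
    """Convert raw classifier scores into probabilities."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

LABELS = ["person", "dog", "horse"]  # hypothetical label set

def label_photo(scores, threshold=0.75):
    """Return the top label, or abstain when confidence is low.

    Refusing to label below a confidence threshold is one simple
    mitigation for a model that hasn't seen enough examples yet.
    """
    probs = softmax(scores)
    best = max(range(len(probs)), key=lambda i: probs[i])
    if probs[best] < threshold:
        return "unlabeled"
    return LABELS[best]

# Confident prediction: the top score dominates, so a label is emitted.
print(label_photo([4.0, 0.5, 0.2]))  # → person

# Uncertain prediction: without a threshold the app would still emit
# the top label, however wrong; with one, it stays silent.
print(label_photo([1.0, 0.9, 1.1]))  # → unlabeled
```

The point of the sketch is that "mislabeling" is not a bug in any single line of code: it is the default behavior of a system that must always pick *something* unless it is explicitly allowed to say "I don't know."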
Even before the age of digital photography, Kodak film was notorious for its bias against properly capturing dark skin. Point-and-shoot cameras with facial-recognition software would ask whether Asian subjects had blinked. And now, in an age of nearly unlimited images being categorized by algorithms, we are inevitably going to see mislabeling that offends and embarrasses.
This isn't the place to get into the deeper issues of unintentional bias and privilege that played a part in creating these problems, but to me the incident is clearly an example of the difficulty and danger that come with our increasingly automated world.
A benign example from my own life is Google's Chrome browser, which was recently updated to try to categorize new bookmarks automatically. For me, it was unable to tell the difference between artists' homepages and the actual art they were creating. These sorts of things happen all the time: Apple's Siri voice-recognition software has had trouble deciphering Scottish accents, and Google Photos has also mislabeled dogs as horses.
As the world, by necessity, becomes more automated and reliant on algorithmic sorting, we're going to see more of these issues crop up. At a certain point we should be able to predict when they will happen, before our occasional embarrassments become a more permanent shame.