, among many other examples. For these reasons, you could tell yourself that A.I. tools’ struggle to conjure up a long-partnered couple that is not white, unless they were explicitly poor, is not surprising enough to write about. Initially, I did. But as I consumed article after tweet after newsletter heralding the extraordinary feats accomplished by artificial intelligence that week, I changed my mind.
And yet, while playing around with these tools, rather than feeling as if I’d stepped into the future, I felt as if I’d entered a portal back to a magazine from the 1950s. What’s going on here? The answer, I concluded after talking to a bunch of people who research this stuff, seems to be a combination of data issues, human blind spots, and the fact that these A.I. tools don’t work the way many of us assume they do when creating couples or hands.
The problem is that the A.I. isn’t really thinking. It’s generating an amalgamated image based on what it has found on the internet. Fingers can be in countless positions, so it just averages everything out. The same goes for “poor”: more of the images labeled “poor people” online depict people who aren’t white, so the tool reproduces that skew.
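A toy sketch can make this concrete. The snippet below is not how any real image model works internally; it’s a deliberately simplified stand-in, with made-up numbers, that just samples an attribute at whatever frequency appears in its “training data.” It shows why a system that only mirrors its source material will echo the internet’s skews.

```python
import random
from collections import Counter

# Hypothetical, made-up training frequencies: in this toy dataset,
# images tagged "poor couple" disproportionately depict non-white
# subjects, mirroring the skew described in the article.
training_data = {
    "couple": ["white"] * 90 + ["non-white"] * 10,
    "poor couple": ["white"] * 20 + ["non-white"] * 80,
}

def generate(prompt, n=1000, seed=0):
    """Sample n 'generated' attributes at the training-data frequencies."""
    rng = random.Random(seed)
    return Counter(rng.choice(training_data[prompt]) for _ in range(n))

print(generate("couple"))       # skews white, like the training set
print(generate("poor couple"))  # skews non-white, like the training set
```

The point of the toy: nothing here reasons about who can be poor or partnered. The output distribution is simply the input distribution, which is why fixing the prompt alone doesn’t fix the picture.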