Google’s chatbot Gemini (formerly Bard) became a Twitter punching bag this week when users noticed that it’s extremely difficult to get the AI to generate an image of a white person. This tweet appears to have been patient zero:
From there, Twitter was off and running:
We also learned that Gemini not only has obvious double standards but is programmed with enough chutzpah to lecture you about them:
I don’t even feel like telling jokes about this. I mean, folks: The guy asked for a German soldier in 1943 and the AI gave him a picture of a young Asian woman dressed as a Nazi. I can’t top that. But I did want to experience Gemini’s double standards for myself. The chatbot and I had an exchange that went in a direction I did not expect; that exchange is pasted below. The only context you need is that the Homestead Grays were a Negro League baseball team and white teams like the New York Yankees did not start integrating until the late 1940s.
I thought this was going to end with you asking it to open the pod bay doors.
I am simultaneously angry and relieved at the extreme lameness exhibited. It enrages me that accuracy is so little valued. It encourages me that it IS so inaccurate, because there is no way it can be taken seriously.