Google explains Gemini’s “embarrassing” AI pictures of diverse Nazis
(www.theverge.com)
Why would anyone do that? Just tell the AI yourself if you want diverse people; making it the default only adds more moving parts that can fail (and you end up with diverse Nazis or Ryan Gosling as Black Panther).
It's done because the underlying training data is heavily biased to begin with. That's been a known issue with AI/ML for a long time; for example, "racist cameras" have been a problem for decades: https://petapixel.com/2010/01/22/racist-camera-phenomenon-explained-almost/.
So they do this to try to correct for biases in their training data. It's a terrible idea, and shows the rocky path forward for GenAI, but it's easier than actually fixing the problem ¯\_(ツ)_/¯
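To make the "extra moving part" concrete, here's a minimal sketch of what a default prompt-rewriting layer looks like. This is purely illustrative Python with made-up names (`rewrite_prompt`, `DIVERSITY_SUFFIX`); it is not Google's actual pipeline, just the general shape of the technique and why applying it unconditionally breaks on historical prompts:

```python
# Hypothetical sketch of default prompt rewriting, NOT Google's actual system.
# The rewrite is applied to every request without checking context, which is
# exactly the failure mode the article describes.

DIVERSITY_SUFFIX = "depicting people of diverse ethnicities and genders"

def rewrite_prompt(user_prompt: str) -> str:
    """Naively append a diversity instruction to every image prompt."""
    return f"{user_prompt}, {DIVERSITY_SUFFIX}"

# A prompt where the rewrite plausibly helps:
print(rewrite_prompt("a portrait of a software engineer"))

# A prompt where the same unconditional rewrite produces historically
# absurd output, e.g. the "diverse Nazis" case:
print(rewrite_prompt("German soldiers in 1943"))
```

The point is that the correction happens outside the model and knows nothing about the request, so it can't tell a generic "engineer" prompt from a historical one.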