A new artificial intelligence (AI) product created by Google and being touted for its ability to write news articles comes on the heels of the technology company’s embarrassing mix-up of two Black celebrities, prompting concerns about the tool’s ability to be discerning when it comes to race and diversity.
Genesis, the name of the AI product, is still in its beta phase, but that hasn’t stopped Google from marketing it to mainstream media companies. The New York Times reported on Thursday that it was among the news outlets recently approached by Google about Genesis.
A spokesperson told the New York Times that Google is “in the earliest stages of exploring ideas to potentially provide A.I.-enabled tools to help their journalists with their work,” but cautioned it’s not meant to be a replacement for journalists “reporting, creating and fact-checking their articles.”
But Jeff Jarvis, director of the Tow-Knight Center for Entrepreneurial Journalism at the Craig Newmark Graduate School of Journalism at the City University of New York, told the Times he had some concerns about Genesis handling content that dealt with “cultural” issues.
“If this technology can deliver factual information reliably, journalists should use the tool,” Jarvis said before offering the flip side of that scenario.
“If, on the other hand, it is misused by journalists and news organizations on topics that require nuance and cultural understanding, then it could damage the credibility not only of the tool, but of the news organizations that use it,” Jarvis added.
All one has to do is point to Google’s viral photo error earlier this month when search results for legendary soul singer Luther Vandross produced photos of gangsta rapper Master P. Despite Google being alerted to the mix-up, the problem wasn’t resolved for more than a day.
In that instance, a different spokesperson for Google told NewsOne that the error wasn’t its fault.
“We source images for Knowledge Panels from a range of sources, including licensed image providers,” the spokesperson said in a statement. “In this case, the image we received was unfortunately mislabeled. The provider has updated the image metadata, and our systems now reflect that update.”
Of course, if the Google search engine as we know it is susceptible to such avoidable and preventable errors, particularly when it comes to confusing one Black person for another – harking back to the offensive racist trope that all Black people look alike – then what similar pitfalls lie ahead for Genesis?
Never mind the fact that Google is part of Silicon Valley, a notoriously and disproportionately white place of employment where diversity issues have long gone unaddressed.
Google boasted in its 2023 Diversity Annual Report that it had met its "Racial Equity Commitment of increasing leadership representation of Black+, Latinx+, and Native American+ Googlers by 30%." However, a closer look at those numbers revealed that Black employees make up only 5.6% of Google's workforce. That means that if Black people were even involved in the creation and fine-tuning of Genesis, chances are they were few and far between, even though their cultural input is obviously needed to help avoid mix-ups like the Vandross-Master P flub.
Case in point: AI systems rely heavily on vast datasets to learn patterns and make predictions. Unfortunately, historical data is often plagued with biases and has elements of systemic racism baked in. AI systems also require the help of humans, who can introduce their own biases into AI algorithms and programming.
If the data used to train AI algorithms disproportionately represents negative stereotypes or discriminatory practices, the resulting models can perpetuate and amplify those biases. This is dangerous because it can create the perfect breeding ground for anti-Blackness, leading to unfair treatment and discrimination against Black individuals in various domains, such as criminal justice, employment and lending.
Or, as we saw with Google, it can result in confusing one Black person for another even though they don’t look alike or share a similar background or, really, have much in common at all.
Master P summed it up best with some words of advice for Google after the photo mix-up.
“Stop letting AI run your [expletive] company…I don’t look like no [expletive] Luther Vandross,” Master P told TMZ. “This is why humans aren’t replaceable!”