Google Gemini: Former employee, tech leaders suggest what went wrong with the AI chatbot (2024)


Nobody is going to use Google’s woke chatbot unless they ‘fix’ its bias: Jessica Melugin

Jessica Melugin, Competitive Enterprise Institute Director of Center for Technology and Innovation, reacts to Google’s ‘woke’ artificial intelligence tool on ‘The Big Money Show.’

Backlash to the Google Gemini artificial intelligence (AI) is prompting responses from tech leaders, including a former Google software engineer and a tech entrepreneur working closely with one of Google's startup programs.

Brett Farmiloe founded Featured.com, an AI startup in the Google for Startups Cloud Program and a Level 4 partner with OpenAI through Microsoft's Startup Program.

In the three months his company has worked with Google, it has been introduced to several AI models, including Bison, Unicorn, Gemini, Gemini Pro and Gemini Ultra. Each model, he said, has come with the same "internal cheerleading and hype," but none has proven to match where OpenAI was nine months ago.

As a result, Farmiloe and his colleagues have been unable to adopt Google's AI models into their workflows. Though he expects this to change within six months, GPT-3.5 and GPT-4 still serve their workflows best.

OHIO SENATOR DEMANDS GOOGLE 'BREAKUP' AMID GEMINI DEBACLE: 'ONE OF THE MOST DANGEROUS COMPANIES IN THE WORLD'


A former Google employee told Fox News Digital that Gemini may have had "critical oversights in execution." (Tobias Schwarz/AFP/Jonathan Raa/NurPhoto/David Paul Morris/Getty Images / Getty Images)

"What's happening at Google internally is that they're simply trying to catch up with Microsoft and OpenAI in the AI race. For the first time in decades, they're second best...at best," Farmiloe told Fox News Digital.

He suggested that Google is "trying to close the gap" with releases, PR, hype, and developer involvement but has been unable to close the technical gap despite putting all its engineering focus into AI.

He noted that the company approaches AI in three "buckets": text, visual and audio. While Google likely feels its text offering is in a good place, according to Farmiloe, the visual category is still in its earliest innings.

"As an AI startup in their Cloud Program, we didn't even have access to their visual tools. That's how early they are...even developers with access don't have access. I suspect that Sora's release put even more pressure on Google to make progress in the visual category, and the technology likely wasn't ready for public use," he added.

Google halted Gemini's image generation feature last week after users on social media flagged that it was creating inaccurate historical images that sometimes replaced White people with images of Black, Native American and Asian people.

Google CEO Sundar Pichai told employees last Tuesday the company is working "around the clock" to fix Gemini's bias, calling some of the images generated by the model "completely unacceptable."

GOOGLE'S ‘ECHO CHAMBER’ WORKPLACE CLOUDING ITS IMPARTIALITY: FORMER EMPLOYEE


FILE PHOTO: The Google logo and the words "AI Artificial Intelligence" are seen in this illustration taken May 4, 2023. (REUTERS/Dado Ruvic/Illustration/File Photo / Reuters Photos)

Farmiloe surmised that, given the immense pressure on Google to match OpenAI, it likely rushed the release of its image generation feature, resulting in a "hard and fast" feedback loop.

"What I will say is that when Google releases new models and tools for developers to use, they've been great at saying that the model may not be ready to be used in production. That step seems to have been skipped in this instance, unfortunately," he said.

Former Google product marketing manager Garrett Yamasaki told Fox News Digital that although Gemini's initiative to promote diversity may have been "well-intentioned," it is clear the company encountered "critical oversights in execution."

Yamasaki, who spent many years working as a software engineer for Google before founding WeLoveDoodles, said it can be challenging to balance AI's representation without tipping into bias either by omission or overcompensation. While the "diversity-friendly AI" could have been a step toward addressing historical underrepresentation in digital media, Yamasaki said the backlash reveals the "fine line" between promoting diversity and inadvertently creating a "new form" of bias.

"The implications of biased AI on society are profound, affecting not just image generation but decision-making processes in healthcare, law enforcement, and employment," he added. "The Gemini backlash serves as a cautionary tale about the societal impact of AI and the importance of designing these technologies with a nuanced understanding of human diversity."

Yamasaki suggested that, internally, Google is likely reassessing its algorithms and the ethical frameworks guiding AI development. He said Google is probably engaged in "intense discussions" on refining Gemini's AI to depict human diversity in a "more balanced and accurate manner."

EXCLUSIVE: MONTANA AG CLAIMS GOOGLE GEMINI HAS ‘POLITICAL BIAS,’ MAY HAVE VIOLATED THE LAW IN LETTER TO CEO


The Google AI logo is being displayed on a smartphone with Gemini in the background in this photo illustration, taken in Brussels, Belgium, on February 8, 2024. (Jonathan Raa/NurPhoto via Getty Images / Getty Images)

Yamasaki also predicted that the situation might push Google and the tech industry towards "deeper collaboration" with "diverse communities," including experts in ethics, sociology, and cultural studies to inform AI development processes.

"What went wrong in this rollout points to a broader issue in AI development: the need for more inclusive datasets and ethical guidelines that prioritize fairness and representation," he said. "Moving forward, tech companies must engage in open dialogues with diverse communities and stakeholders to ensure AI technologies serve and reflect the richness of human diversity, avoiding biases that could perpetuate inequalities."

The controversy surrounding Google Gemini has also reignited concerns about bias in AI in the larger tech world.

LexisNexis Risk Solutions Global Chief Information Security Officer Flavio Villanustre said large language models (LLMs), on which generative AI products are based, can amplify implicit or explicit bias found in the corpus material they are trained on.

"Because these models are not deterministic but probabilistic, it is very difficult to eliminate bias through algorithmic means or by restricting or filtering the training material. Bias can be exhibited only under conditions depending on prompts and context, so it's not easy to identify every possible scenario that could bias a given response," he told Fox News Digital.

The potential implications for society, Villanustre said, are significant and can range from a "slightly inappropriate response" to an outcome that could break existing anti-discrimination laws.

GOOGLE GEMINI BACKLASH EXPOSES COMMENTS FROM EMPLOYEES ON TRUMP, ‘ANTIRACISM’ AND ‘WHITE PRIVILEGE’


Gemini's senior director of product management at Google previously issued an apology after the AI refused to provide images of White people. (Betul Abali/Anadolu via Getty Images / Getty Images)

For example, he said using these models to make hiring decisions raises the possibility of discriminating against certain groups or individuals based on ethnicity, gender, race, age and other factors. These models could also make incorrect decisions related to state benefits eligibility, loan rates, and college admissions.

"If these issues are not concerning enough, we are starting to see a more pervasive use of these models in medical applications for diagnostic and therapeutic purposes. If, due to bias, a model incorrectly assesses the condition of a patient or the appropriate treatment, it could lead to life altering consequences," he said.

But UST Chief AI Architect Adnan Masood said open AI models like Google's Gemma are "pivotal" for advancing the field because they enable a democratized approach to innovation. This, in turn, can accelerate the pace of discovery and application across many areas of study.


Masood said he believes open-source models can also promote transparency and ethical AI development by allowing broad scrutiny and understanding of the models' workings and biases, scrutiny of the sort that Gemini is now facing.

Masood, who also works as a regional director for Microsoft, said he believes Gemma will enable developers and researchers to innovate and build applications with powerful, efficient and scalable AI models.

"By providing open models like Gemma, Google aims to foster a collaborative ecosystem where the broader community can contribute to the advancement of AI, ensuring responsible development and deployment," he said. "This initiative reflects Google's commitment to open science and technology, encouraging widespread use and exploration of AI capabilities while addressing the need for models that are accessible on a variety of hardware configurations, from high-end GPUs to more modest CPUs."

Google did not return Fox News Digital's request for comment.
