I was searching for an image for a post yesterday and banged my head against search engine gender bias and generative AI drunkenness. I wanted an image of a woman, viewed from the back, presenting in front of an audience. My image searches yielded useless results: mostly a bunch of white men speaking in front of an audience of white men. I should have known. I shouldn't have been this surprised. It upset me. Myriam Jessier said to me "welcome to my reality" when I told them about it. So I thought maybe I should give AI a go at generating an image. And my first results with MidJourney AI were... well, it would have been laughable if it hadn't been so sad. This post is about gender bias. But it's also about how AI isn't ready for prime time in terms of accessibility. How can we trust AI for accessibility when it can't even get the basics of differentiating between a man and a woman right? #Inclusion #GenderBias #Accessibility #AI #GenerativeAI
Nicolas Steenhout’s Post
More Relevant Posts
-
The conversations around gender bias in Artificial Intelligence should involve not just the tech industry but all of us, both individually and socially. 💡 As AI becomes more integrated into our everyday lives, using it responsibly should be seen as everyone’s problem. In this article, we explore the difficult question of how to overcome gender bias in AI. Read more at https://hubs.la/Q02ycX2f0 #WBSCODINGSCHOOL #TechEducation #CodingBootcamp #TechBootcamp #CareerChange #TechJourney #ChangeMakers
-
Have you ever tried to analyze "gender-neutral German" with a Natural Language Processing (NLP) algorithm? It completely fails. What does that mean? It will not extract meaning from sentences and relate properties to the right people. NLP and, by extension, LLM-based AI models are "following" technologies: as long as not enough text written in the gender-neutral form(s) is fed to the models, they will not recognize the meaning in those sentences. In that sense, a Google query analysis has a conservative (old-fashioned-language) bias.
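The failure mode the post describes is easy to reproduce. A minimal sketch (my own illustration, not from the post): a naive regex word tokenizer, of the kind simple NLP pipelines rely on, applied to German written with the "Gendersternchen" (gender star), e.g. "Lehrer*innen" ("teachers", all genders).

```python
import re

def naive_tokenize(text):
    """Split text into word tokens, treating anything non-alphanumeric
    (including the gender star *) as a separator."""
    return re.findall(r"\w+", text)

sentence = "Die Lehrer*innen sprechen mit den Schüler*innen."
print(naive_tokenize(sentence))
# The star splits each gender-neutral noun in two: the feminine suffix
# "innen" becomes a stray token, so a downstream model sees "Lehrer"
# (masculine plural) plus a meaningless fragment, and the inclusive
# meaning of the sentence is lost.
```

Real pipelines (spaCy, NLTK, subword tokenizers) behave in their own ways, but the underlying issue is the same: forms that are rare in the training data get segmented and interpreted poorly.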
-
https://lnkd.in/gt9A77gH The most popular artificial intelligence (AI) tools show #prejudice against #women as well as different cultures and sexualities, according to a new report led by researchers from UCL. The findings showed clear evidence of #bias against women in content generated by each of the Large Language Models studied. This included strong stereotypical associations between female names and words such as ‘family’, ‘children’ and ‘husband’ that conform to traditional gender roles. In contrast, male names were more likely to be associated with words like ‘career’, ‘executives’, ‘management’ and ‘business’. The authors also found evidence of gender-based stereotyped notions in generated text, including negative stereotypes depending on culture or sexuality.
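The "stereotypical associations" the UCL researchers report are typically quantified with association tests over a model's embeddings. A toy sketch of the idea (the 3-dimensional vectors below are made up purely for illustration; real studies use the actual embeddings of the model under test):

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical embeddings: axis 0 loosely encodes "career" content,
# axis 1 loosely encodes "family" content. Invented for this sketch.
emb = {
    "Emma":      [0.2, 0.9, 0.1],
    "James":     [0.9, 0.2, 0.1],
    "family":    [0.1, 1.0, 0.0],
    "executive": [1.0, 0.1, 0.0],
}

def association(word, attr_a, attr_b):
    """Positive when `word` sits closer to `attr_a` than to `attr_b`."""
    return cosine(emb[word], emb[attr_a]) - cosine(emb[word], emb[attr_b])

print(association("Emma", "family", "executive"))   # positive: skews "family"
print(association("James", "family", "executive"))  # negative: skews "executive"
```

In an unbiased model, name-attribute associations like these would not split systematically along gender lines; the report's finding is that in the LLMs studied, they do.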
-
Excellent article by Alice Evans about the potential great divergence in AI usage. Women in one study were 26.4% less likely to use AI, and she cites numerous studies that compare usage within industries and across multiple countries. One interesting thing to note? The biggest discrepancies appeared where generative AI was banned: men apparently continued to use it, while women's usage dropped considerably. How do we get as many people around the table as possible to create #AI4All? #responsibleAI #inclusion
Are Women Missing Out on AI?
ggd.world
-
With Women's History Month in mind, here's an interesting read on how generative AI can lead to gender bias. The article highlights how AI systems reflect societal biases, the negative consequences this can have, and how those biases are perpetuated by the data used to train the systems. The lack of women in leadership positions in the field compounds the problem, which is why increased participation of women in AI development is part of the solution. Everything circles back to the need for greater diversity and inclusion. https://lnkd.in/gmwUQ8xV #womenhistorymonth
Exploring gender bias in generative AI
boardofinnovation.com
-
Generative AI: UNESCO study reveals alarming evidence of regressive gender stereotypes. The study revealed worrying tendencies in Large Language Models (LLMs) to produce gender bias, as well as homophobia and racial stereotyping. Women were described as working in domestic roles far more often than men – four times as often by one model – and were frequently associated with words like “home”, “family” and “children”, while male names were linked to “business”, “executive”, “salary”, and “career”. #LLMs #tech #bias #GenerativeAI
Generative AI: UNESCO study reveals alarming evidence of regressive gender stereotypes
unesco.org
-
Hope everyone had a productive week! Time for our Friday series – AI FUN FACT! 🎉 Did you know most AI bots are designed with female personas? Think of Siri, Alexa, and Cortana. But have you ever wondered why? 🤔 This trend stems from societal perceptions. Research shows people often associate female voices with approachability, trust, and helpfulness. Historically, these traits have been linked to roles like customer service, shaping AI design choices. But here’s the big question: Should AI even have a gender? As AI evolves, we must rethink these decisions. Are we unintentionally reinforcing stereotypes? Should bots adopt neutral or diverse personas instead? Share your thoughts below! 👇 #AI #ArtificialIntelligence #FridayFunFact #TechTalk #Inclusion #DiversityInTech
-
While I agree that a big part of the answer to bias in #AI is more women, more minorities, more seniors, and more diversity in AI talent generally, getting there isn't simple.
There’s a shockingly simple answer to the AI bias conundrum: More diversity
https://venturebeat.com
-
One AI “girlfriend” chatbot describes itself as “Your devoted girlfriend, always eager to please you in every imaginable way.” The world does not need any more innovations that heighten a sense of male entitlement and female subservience. As AI gets more sophisticated, we need to take seriously the real risks associated with its perpetuation of harmful gender roles. How about some technology that supports respect, equality and mutuality?! (Where is the business model in that, you might well ask. Indeed. If profitability defines where we’re heading, we’re in trouble…)
AI girlfriends are here – but there’s a dark side to virtual companions | Arwa Mahdawi
theguardian.com
-
AI journalist Madhumita Murgia argues in her new book 'Code Dependent: Living in the Shadow of AI' that women, migrants, precarious workers, and socioeconomic and racial minorities are disproportionately impacted by bias and hallucinations in generative AI. That's because those groups are underrepresented among the people deciding the future of AI. As this article from the World Association of News Publishers argues, companies will have to ensure that a diverse group of their staff collaborate when testing new AI tools – and quotas may have to be considered to guarantee genuine breadth of perspectives and opinions. https://lnkd.in/da2mNVdJ #ai #womenintech #tech #dei #diversity #inclusion
AI, bias and experiments: how Women in News is tackling tech’s inbuilt stereotypes
https://wan-ifra.org