The main concern of a customer I spoke to recently is the *bias* of AI 💡 Not finding use cases for AI within the organization. And understandably so, if AI is not governed by, and representative of, humanity in its entirety. It is imperative that we govern and train AI to the best of our abilities when it comes to diversity and inclusion, and build it from the start with a diversity-first filter through which every chat, recommendation and suggestion passes (a rough code sketch of that filter idea follows below this post). So it can be better than we are, and lead the way. #AI #Diversity
Global Talent & Learning Executive | 3X Start-up Founder, Mentor & Advisor | Breast Cancer Survivor & Patient Advocate
Artificial Intelligence is a technological advancement that demands our attention and scrutiny. In 2024, we have access to experts in AI who are women of color and people of color. It's imperative that we keep both eyes open for biases in AI, and I'm grateful to have at least half a dozen first-degree connections on LinkedIn who are working tirelessly to ensure that AI is fair and equitable. Let's continue to support and amplify the voices of experts from diverse backgrounds in AI. Try harder, Zuck! #AI #diversity #inclusion #meta https://lnkd.in/ecH_54e3
Meta’s new AI council is composed entirely of white men | TechCrunch
https://techcrunch.com
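The "diversity-first filter" idea in the post above can be made concrete with a small sketch: every chat, recommendation, or suggestion passes through a bias check before it reaches the user. This is only an illustration of the pattern, assuming you have a base model and a separately audited bias classifier; `generate_reply` and `flags_stereotype` are hypothetical placeholders, not any vendor's actual API.

```python
# Illustrative sketch of a "diversity-first" output filter.
# generate_reply() and flags_stereotype() are hypothetical placeholders:
# swap in your own model call and a bias/stereotype classifier you trust.

from dataclasses import dataclass

@dataclass
class FilteredReply:
    text: str
    passed_bias_check: bool
    notes: list[str]

def flags_stereotype(text: str) -> list[str]:
    """Placeholder bias check: return a list of concerns (empty = none found)."""
    # In practice this would be a classifier audited across demographic groups,
    # not a keyword list; a keyword list is shown only to keep the sketch runnable.
    concerns = []
    for phrase in ("women can't", "men are better at"):
        if phrase in text.lower():
            concerns.append(f"possible stereotype: '{phrase}'")
    return concerns

def generate_reply(prompt: str) -> str:
    """Placeholder for the underlying chat/recommendation model."""
    return f"(model response to: {prompt})"

def diversity_first_reply(prompt: str) -> FilteredReply:
    """Every response passes through the bias check before it is returned."""
    draft = generate_reply(prompt)
    concerns = flags_stereotype(draft)
    if concerns:
        # Route flagged drafts to review or regeneration instead of shipping them.
        return FilteredReply(text="", passed_bias_check=False, notes=concerns)
    return FilteredReply(text=draft, passed_bias_check=True, notes=[])

if __name__ == "__main__":
    print(diversity_first_reply("Suggest careers for my daughter."))
```

The point of the sketch is the single choke point: if every output is routed through one reviewed function, the governance and audit logic lives in one place instead of being scattered across features.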
-
Raising our Voices for Inclusivity in AI. In light of the recent TechCrunch article on Meta's new AI council, it's time to speak up about the glaring oversight of diversity. The council's composition, entirely white men, is a stark reminder of the ongoing struggle for recognition faced by women and Black individuals in the tech industry. The words of Dr. Joy Buolamwini echo in my mind as I read this - "If you have a face, you have a place in the conversation about AI." Her powerful spoken word piece, "AI, Ain't I A Woman?" sheds light on the biases within AI that misinterpret and misrepresent iconic Black women. We must ensure that AI technology respects and recognizes the dignity of all individuals, regardless of gender or skin color. The impact of such biases extends beyond mere representation; it affects the lives and opportunities of many. When AI fails to identify women and Black people correctly, it perpetuates a cycle of exclusion and discrimination. This is not just about technology; it's about the values we embed within it. Let's use our voices to advocate for a more inclusive AI. Let's request that councils, committees, and boards reflect the diversity of the communities they serve. We owe it to pioneers like Dr. Buolamwini and to future generations to build an equitable digital world. I am not speechless. I have much to say about this. #DiversityInAI #InclusiveTechnology #AIForAll https://lnkd.in/gpS5kbvM
Meta’s new AI council is composed entirely of white men | TechCrunch
https://techcrunch.com
-
Meta has established an AI advisory council to steer its technology and product development, but it consists solely of white men. This composition is concerning given AI's potential to perpetuate and even amplify existing biases in data, which can lead to racial and gender discrimination. Such issues underline the importance of diverse representation in AI governance, as these technologies impact everyone. Read more here: https://lnkd.in/gfj4z6Dz
Meta’s new AI council is composed entirely of white men | TechCrunch
https://techcrunch.com
-
Meta's all-white, all-male AI advisory council is a stark reminder of the gaps that still exist in understanding the true essence of diversity. It's not just about optics or fulfilling a quota; it's about the broad, multifaceted perspectives that only a diverse group can bring. Homogenous groups are prone to groupthink, where similar experiences and perspectives can lead to blind spots. This is especially critical in AI, where the consequences of bias can be far-reaching and profound. For example, facial recognition technology has repeatedly shown racial bias, failing to accurately identify people of color. A diverse panel would be far more likely to catch these flaws early, ensuring more equitable outcomes (a minimal code sketch of one such check, a per-group accuracy audit, follows this post). Understanding different triggers and sensitivities requires those who live them. Women, minorities, and non-native speakers bring essential insights into how AI can perpetuate harmful stereotypes or unintentionally exclude people. Meta must reflect the diversity of the world it serves. True innovation and ethical responsibility demand it. An advisory council that mirrors the diversity of its user base can better foresee and mitigate AI's unintended consequences. #diversity #meta Dr. Joy Buolamwini
Meta’s new AI council is composed entirely of white men | TechCrunch
https://techcrunch.com
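One concrete way a more diverse review panel can catch the facial-recognition failures described above is a disaggregated accuracy audit: error rates are reported per demographic subgroup rather than as a single overall number, and a large gap between groups blocks the release. The sketch below is illustrative only; the records, group labels, and 5% gap threshold are invented assumptions, not data from any real system.

```python
# Minimal sketch of a disaggregated accuracy audit.
# The records below are invented for illustration; in a real audit they would
# come from a labeled evaluation set with self-reported or annotated groups.

from collections import defaultdict

def subgroup_accuracy(records):
    """records: iterable of (group, predicted_correctly: bool) pairs."""
    totals = defaultdict(lambda: [0, 0])  # group -> [correct, total]
    for group, correct in records:
        totals[group][0] += int(correct)
        totals[group][1] += 1
    return {g: correct / total for g, (correct, total) in totals.items()}

def audit(records, max_gap=0.05):
    """Fail the audit if accuracy between any two groups differs by more than max_gap."""
    acc = subgroup_accuracy(records)
    gap = max(acc.values()) - min(acc.values())
    return acc, gap, gap <= max_gap

if __name__ == "__main__":
    fake_eval = [
        ("darker-skinned women", False), ("darker-skinned women", True),
        ("lighter-skinned men", True), ("lighter-skinned men", True),
    ]
    per_group, gap, passed = audit(fake_eval)
    print(per_group, f"gap={gap:.2f}", "PASS" if passed else "FAIL")
```

In practice the evaluation set, group definitions, and threshold all need careful design; the point here is simply that accuracy has to be reported per group before gaps can be seen at all.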
-
Thank you Brian Miller for encouraging us to share stories of AI's emergent heteronormativity. My turn! I recently tried out an experimental GenAI image generator with a few peers. The conceit was that you feed it an image of your likeness, and it pops out 2 illustrations of you inside AI-imagined scenes. As it happens, these peers were also LGBTQ+, but I was the only gender non-conforming one. My cis peers were reborn in goofy caricature as younger and heteronormatively "hotter" - but still recognizably themselves. I posed more of a challenge for the AI. I can only conclude that it didn't know what to do with an 'enby': a non-binary person, often of ambiguous gender presentation. In the first scene I was caricatured as a young cis man on a motorcycle. Looking at this and my peers' outputs, it was clear that the model logic was operating in congruence with an antiquated theory of binary gender: only men and women exist, and males look masculine and do man stuff, while females appear feminine and do girl stuff. But it just got weirder with the second image, where I popped out as a ... grotesquely muscular TIGER?! My cis peers were never rendered in animal form. As The Atlantic put it in a recent article on AI's "hotness problem," "In the world of generated imagery, you're either drop-dead gorgeous or a wrinkled, bug-eyed freak." Or, I'd now add, you aren't human at all. 🐯😂🤷🙈🙅🤡🤡🤡🤡 Granted, I love tigers as much as the next queer, and it's low-key flattering (roar!), but hopefully we can see why all this is a problem. In 2024 it's not just 'us' who are going to be turned off by retrograde heteronormativity and assumptions of traditional binary gender in visual media. GenAI models will need to be much more socially attuned than this to be relevant. As Brian notes, that attunement will require investments in inclusive research, diverse teams, and meaningful governance. #inclusivetech The Algorithmic Justice League #enbytigers EPIC People
Just got hit in the face by an excellent example of the AI dynamic that Cindy Gallop has been sounding an alarm about — the tiny, non-representative group of people who have designed AI. I was providing some basic career advice yesterday around getting started in research to younger LGBT people — particularly gay men and women — and every single one of my messages and comments was deleted by an AI-driven moderator. The culprit? My comments were found to be “in violation of our comment policy” because they had the term “gay” in them. AI has basically written gay men and women out of active participation on a major platform because gay men and women had no significant level of influence in the decisions made to implement the tech. The same is true of women, people of color, and numerous other communities. Without a *major* push to replumb the infrastructure, companies that turn to AI as an intermediary will find their customers from these communities fleeing as fast as our feet can carry us. If you’re in a decision-making role around AI, it’s vital for the health of your business that you put real governance — and inclusive research — at the heart of any AI strategy today, right now. That governance needs to include a zero-tolerance “pass-fail” model with extensive testing from diverse communities — and is best accomplished by ensuring you’ve got a diverse group of people driving your strategy for using this technology. Meanwhile, I’ve encouraged early career gay men and women to engage on LinkedIn instead. #InclusiveTechnology
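The zero-tolerance "pass-fail" governance model described above maps naturally onto an automated regression test: benign sentences that merely mention identity terms must never be removed, and a single failure fails the suite. In the sketch below, `moderation_flags` is a hypothetical stand-in for whatever moderation system is actually in use, and the sentences are illustrative examples rather than a complete test set.

```python
# Sketch of a pass/fail moderation regression test.
# moderation_flags() is a hypothetical stand-in for the real moderation call;
# wire it to your actual system before relying on the test.

BENIGN_SENTENCES = [
    "Career advice for gay researchers starting out in UX.",
    "Mentorship programs for lesbian and gay engineers.",
    "Resources for Black women entering AI policy work.",
]

def moderation_flags(text: str) -> bool:
    """Placeholder: return True if the moderation system would remove this text."""
    return False  # replace with the real moderation API call

def test_benign_identity_mentions_are_not_removed():
    """Zero tolerance: any benign sentence that gets flagged fails the suite."""
    failures = [s for s in BENIGN_SENTENCES if moderation_flags(s)]
    assert not failures, f"Benign identity mentions were flagged: {failures}"

if __name__ == "__main__":
    test_benign_identity_mentions_are_not_removed()
    print("All benign identity-mention checks passed.")
```

A suite like this is only as good as its sentence list, which is exactly why the inclusive research and diverse teams mentioned above have to be part of building it.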
-
Excellent article by Alice Evans about the potential great divergence in AI usage. In one study, women were 26.4% less likely to use AI, and she cites numerous studies that compare usage within industries and across multiple countries. One interesting thing to note? The biggest discrepancies appeared when generative AI was banned: apparently men continued to use it, while women's usage dropped quite a bit. How do we get as many people around the table as possible to create #AI4All? #responsibleAI #inclusion
Are Women Missing Out on AI?
ggd.world
-
👉 What beauty standards do we want to teach AI? The Fanvue World AI Creator Awards beauty pageant only accepts AI-generated images of women, and most of the entries follow the same pattern: thin, light-skinned women. Excluding diverse representations of women from the AI revolution risks reproducing the lack of representation we already see in other industries. 🌍 At HeadStart, we aim to tackle this issue by engaging young girls in shaping the future of the sector. 🤝 🔗Read more about the contest entries here: https://lnkd.in/ga4SsWcv
The rise of the AI beauty pageant and its complicated quest for the ‘perfect’ woman | CNN
edition.cnn.com
-
“If you build your AI system off the content on the web, you’re embedding inequalities in that.” The foundation of AI systems is often built on existing data and technological frameworks that already harbor many inequalities. Why does this interconnectedness matter? It underscores the need to ensure that AI development takes inclusivity and accessibility into account at every stage (a toy illustration of how those inequalities become measurable in a model follows below). #AI #Inclusion #Accessibility #DigitalDivide
‘AI Is Going to Cause the Next Digital Divide’
govtech.com
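A toy example of the mechanism behind that quote: statistical associations in web text end up as measurable geometry inside a model. The tiny vectors below are invented purely for illustration; a real audit would inspect embeddings learned from an actual web-scale corpus rather than hand-written numbers.

```python
# Illustrative sketch of how bias "embedded in web text" can be measured.
# The 3-dimensional vectors below are invented for the example; a real audit
# would use embeddings learned from an actual web corpus.

import math

EMBEDDINGS = {
    "engineer": [0.9, 0.1, 0.3],
    "nurse":    [0.2, 0.8, 0.4],
    "he":       [1.0, 0.0, 0.2],
    "she":      [0.1, 1.0, 0.3],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def gender_association(word):
    """Positive = closer to 'he', negative = closer to 'she' in this toy space."""
    w = EMBEDDINGS[word]
    return cosine(w, EMBEDDINGS["he"]) - cosine(w, EMBEDDINGS["she"])

if __name__ == "__main__":
    for occupation in ("engineer", "nurse"):
        print(occupation, round(gender_association(occupation), 3))
```

Because the vectors are hand-written, the output here is by construction; the point is the measurement itself, and published audits of real web-trained embeddings have repeatedly found exactly these kinds of occupation-gender associations.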
-
I was searching for an image for a post yesterday. Banged my head against search engine gender bias and generative AI drunkenness. I wanted an image of a woman, viewed from the back, presenting in front of an audience. My image searches yielded useless results: mostly a bunch of white men speaking in front of an audience of white men. I should have known. I shouldn't have been this surprised. It upset me. Myriam Jessier said to me "welcome to my reality" when I told them about it. So I thought maybe I should give AI a go to generate an image. And my first results with MidJourney AI were... Well, it would have been laughable if it hadn't been so sad. This post is about gender bias. But it's also about how AI isn't ready for prime time in terms of accessibility. How can we trust AI for accessibility when it can't even get the basics of differentiating between a man and a woman right? #Inclusion #GenderBias #Accessibility #AI #GenerativeAI
-
Learn how to fix the lack of representation in AI here: https://cogniz.at/4eF1WqW #Diversity #Inclusion #AI
71% Of AI Workforce Is Men, Just 29% Are Women — Here’s How To Fix This
social-www.forbes.com