
Reasons why the AI world is sexist and solutions for making it more open to women & girls



When we look around us at the “rapid growth and adoption” of technology using some form of Artificial Intelligence, do we think of it in gendered terms? No. It is given to us in a neutral form, not specially designated for girls or boys, women or men. Take, for instance, the AI behind GPS, Google Maps, or search engines from Qwant[1] to Swisscows[2], Startpage[3] to Bing, not to mention the ubiquitous predictive analytics, Deep Learning algorithms, and Machine Learning tracking our every move via our smartphones.


The actual results of applied ML are indeed gendered: they will tell hundreds, if not thousands, of advertisers, supposedly anonymously, whether you are male or female and what you buy (your infamous “preferences”), based on your sex, age, demographics, education, location, and past consumer habits. That is the so-called “oil” captured within our volunteered data; it is the fossil fuel of information mined in this Web 2.0 iteration of our internet age.


And now 5G and Web 3.0 are emerging with 3D avatars, hybrid metaverses, and more “immersive experiences”. Arguably, this could shift the crude oil of e-commerce and screen-based Human-Machine Interactions into the realm of pure gold. We are entering a highly gendered internet age in which your gender matters even more to the owners of digital spaces, who use AI-powered algorithms to collect data about you in minute detail and keep using it even after you are dead, e.g. the active “legacy” accounts of Facebook/Meta users.


We are beginning an era where the advertiser might be the avatar walking next to you in your adopted male/female/trans/non-binary digital outfit or costume, whether human, animal, monster, or fantasy creature, in a bespoke metaverse created by brands and virtual Digital Out Of Home (DOOH) advertising. These metaverses are no longer confined to screen interactions on our phones and personal devices. Nor will they be online only. Instead, Human-Machine Interactions will occur in a hybrid world involving 3D holographic avatars interacting with us in an Extended Reality, a mix of AR, VR, and sensory immersive experiences.


What is Web 3.0, you may ask? Well, if you check Wikipedia, you will find entries about this latest evolution of the internet, the Singularity, AI, Mixed Reality, and metaverses. I choose the crowd-sourced “authority” of Wikipedia deliberately to make a relevant point: the entries cite predominantly men. It is nearly impossible to find a woman quoted in any of these Wikis.


Wikipedia is notorious for its gender track record: around 90% of its editors are male, less than 20% of its content is about women, and even fewer entries are written by women on any subject. As a former Wiki editor myself, I have indeed seen that fewer than 10% of the volunteers are female, and just 17.8% of its biographies are about women[4]. This is the crux of the problem with our use of ubiquitous AI in all its current forms. The writing of AI’s history about itself is done by the emerging authorities, the gatekeepers who stop women and girls from entering this realm as authors, authorities, and actual creators of AI.


Unsurprisingly, industry statistics show that the number of women in the AI workforce is shrinking. Women’s presence has not increased since the thaw of the AI Winter[5] in the 1980s. Various data on “women in AI” reflect the exclusion of women from the AI-powered workplace and from the largely Developed World centres where cloud-based AI services and emerging technologies like self-driving cars, XR gaming, and metaverses are being developed.


The three core problems I have identified from personal and professional experience have to do with (a) publishing, (b) citations, and (c) reviews. Academia, with its predominance of men over women in positions relevant to AI development and, more importantly, to its critique and history writing, feeds into all three problems.


Publishing contracts. Most books on AI are written by men. It isn’t a mysterious accident or unknowable chance who gets worthwhile, powerful publishing contracts and commissions. The editors of major publishing houses research whom they approach to write for them; if you pitch unsolicited, they will put your ideas through a rigorous selection process, whether they are a small independent press or a large multinational conglomerate. So either very few women are approached by editors or initiate this process themselves, or an equal number go through it and are then “de-selected” by the publishing houses. Who has this important data? We need it to create a solution to this fundamental problem.


The solution? Change publishing to create an equal pool of women authors to counteract the predominantly male authorship that dominates all views and analysis of this ever-important subject in emerging technologies.


Citations. The content of these mostly male-authored books on AI refers to other male authors. In my experience, you are lucky if you scan the pages of any chapter and see a ratio of 1 woman quoted as an expert to every 9 men cited as experts on AI and related subjects. Again, we are missing granular analysis of this, but an easy way to check is to go to your local library or physical bookshop. Stand in front of the IT shelves where the AI books are. Pick up any tome at random and you will find proof of the above: male author(s), men cited throughout the text. Repeat online with a virtual bookshop: visit Amazon bookstores in any language or Google “AI books” and you will see the gendered head count, with an 80/20 ratio against women.


The solution? Change the citational bias: editors should challenge their authors when they submit a manuscript with overwhelmingly male sources, i.e. quoting mostly texts written by men (this applies to both female and male authors). If authors are obliged to quote women writers, then we will no longer be sidelined (at best) or ignored (at worst). ‘Women writers’ will become a redundant term, as there will only be ‘authors’ writing about AI, with an evident and implied gender balance.


Reviews. Publicity-wise, check out the reviews quoted on the first pages and the back cover, even on the front cover, of any of these books. The gender ratio is often 90/10: men promoting and praising other men, usually men of very high status holding unquestionable positions of power in science, the arts, politics, technology, and business. The majority of books I’m reading for my forthcoming book carry exclusively male reviews of mostly male authors, as mainstream women authors are so scant. Those women authors also need to be published by presses of the same reach, stature, and reputation as those that publish men on the subject of AI.


The solution? Change the reviewer bias: newspaper and magazine editors need to commission more work by female reviewers, women critics, and commentators on tech. If media publishers don’t publish reviews written by women, we will continue to have a very skewed view of the world in terms of technology and, indeed, many other subjects. Academics have an important role to play, as they are uniquely placed to influence the press and industry journals as to who is in their stable of writers and reviewers.


If you want to do something immediately about this, here is what you can do:


In Education & Work: challenge all-male reading material. When your corporate librarians order e-books or subscribe to business book collections, demand equal representation of female and male authors on technology subjects, which will invariably include the “hot topic” of Artificial Intelligence.


In Government & Institutions like museums and galleries: if you are a politician, official, administrator, or decision-maker of any kind, be aware of the gender biases above. You are in a position to make a difference! Your choice of keynote speakers, conference attendees, workshop presenters, and trainers matters. If you keep hiring mostly men, and citing men as authorities more often than not, then you are sending a public, powerful signal that the world of AI is created and assessed by men, with a few token women as bystanders. There is a broad and deep pool of active, high-status women experts in AI to choose from, enough to change the visible participation of women in this crucially strategic and historic aspect of public life.


In our everyday life: the next time you use AI, think about its gendered creation. Did the content and structure of the computer game come from an all-male production team? Did your smartphone’s algorithms launch from a mostly male team with a consensus, unchallenged view of the world of shopping, consumerism, ethics, and fair play? Where are the gendered perspectives on these applications at the ideation, concept, execution, and deployment stages? Become a discerning user of AI and don’t let it use you.


 

Dr. phil. Tania Peitzker is an Adjunct Professor of metaverses and cognitive interfaces at the University of Silicon Valley and will be teaching MOOCs on diversity based on her forthcoming SpringerNature book (2023).

She has been in the AI business for several decades as a developer of NLP algorithms and their use cases for Mixed Reality and Conversational Commerce. Tania has written on Conversational AI for New York’s Business Expert Press and San Francisco’s VentureBeat, in addition to giving many webinars and workshops on the interrelated topics of the Singularity, Web 3.0, and emerging technologies for Smart Cities. Please visit her portfolio, www.taniapeitzker.expert, which reflects her interdisciplinary origins in Humanities, Law, Philosophy, and Emerging Technologies criticism. Tania has various personal and professional roots in Australia, Germany, France, the UK, and Switzerland.


Dr. phil. Tania Peitzker shares her thoughts on gender diversity in AI - as part of the 60 Leaders on AI initiative.



Cover photo by John Schaidler on Unsplash
