August 13, 2025
The report highlights the gender discrimination inherent in artificial intelligence (AI) algorithms used to screen professional resumes (CVs) when selecting new employees, which makes women less likely than men to gain employment.
Menna Fathy remains optimistic as she continues to look for a suitable job. No matter how difficult this is, she believes that smart applications can boost her chances as she applies and competes for jobs that come up.
At the same time, Menna is clear about the struggle she has had in trying to find a job: “I’ve put in 75 job applications, but no one has gotten back to me. I don't know why. It's like a black hole.”
Menna is not alone in this situation. 88 percent of employers believe they are losing highly qualified candidates, who are filtered out of the recruitment process by applicant tracking system (ATS) software because their CVs do not match its criteria and keywords. As a result, 75 percent of CVs are rejected and deleted from the database without ever being seen by the recruitment team.
Even though employers know that they are missing out on the best candidates, 70 percent of large companies, 20 percent of small and medium-sized companies, and 75 percent of recruiters currently use ATS software or some other AI recruitment tool to review job applications.
Contrary to the widespread belief that technology is human-friendly, an ATS relies entirely on the information it is given. The challenge is no longer about automating jobs or saving money: if someone is not fairly represented in the bulk of the inputted data, AI will not treat them impartially, because according to the programming model and the pre-defined criteria of its algorithmic classifications they effectively “don’t exist”.
Elisabeth Kelan, Professor of Leadership and Organisation at Essex Business School and Director of the Cranfield International Centre, interviewed 69 experts in AI development to determine their position on using technology at work, including in recruitment. They openly acknowledged that AI systems could produce algorithms biased against women, automatically ranking female applicants lower, because the systems had been trained on CVs that did not include women.
Machine learning works by recognising past patterns in data in order to predict current and future outcomes. Algorithms are not in themselves biased, but when men account for around 90 percent of the data entered, women are effectively invisible to the program.
Ahmed Awad, an expert in AI and in the development of machine learning programs, confirms that bias exists in the screening of CVs. It arises through the data used to train the model, which may not contain equal information for both sexes, or may provide more or fewer details about one group than another. As a result, from the outset of the machine learning process, pattern analysis, the search for common features, and predictions become weaker for one category and stronger for the other.
Bias takes many forms, Awad explains. It can prioritise a particular sex, age group, or ethnicity. But this does not mean that the data is wrong, only that it is not balanced, because it follows the narratives of the trainers who instruct the software.
For example, if the training data teaches the program that men have moustaches, then anyone who does not have a moustache is classified as belonging to the other category. AI reinforces existing biases through repetition, because it derives its language models and material from information that has already been published.
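The mechanism Awad describes can be illustrated with a deliberately simplified sketch. The training data, wording, and scoring function below are invented for illustration: a naive keyword model "trained" on a pool of successful CVs where one group's phrasing dominates will inevitably score CVs written in the minority group's phrasing lower, even though no one programmed it to discriminate.

```python
from collections import Counter

# Hypothetical training set: the wording of the majority group dominates
# the "hired" CVs, so the model's learned vocabulary skews toward it.
hired_cvs = (
    ["led engineering team", "captain of chess club", "executed project"] * 9
    + ["women in tech mentor"]
)

# "Training": count how often each word appears in a successful CV.
vocab = Counter(word for cv in hired_cvs for word in cv.split())

def score(cv: str) -> int:
    """Score a CV by how familiar its words are from past hires."""
    return sum(vocab[word] for word in cv.split())

# A CV using the majority group's wording scores far higher than one
# using wording the model has rarely seen, purely due to data imbalance.
print(score("led project team"))
print(score("women in tech outreach"))
```

The point of the sketch is that the bias lives in the frequency counts, not in any explicit rule: rebalancing the training pool would change the scores without touching the code.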
Moataz Osman, a digital program development expert at Go Media, points out that potential discrimination and bias in AI and ATS systems can be detected through algorithmic auditing techniques, which analyse the system's output and check whether there are any unjustified disparities between different groups.
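One common form of the output check Osman describes is to compare the rate at which a screening system advances candidates from different groups. The sketch below is a minimal illustration, not Osman's own method; the group labels and pass/fail outcomes are invented, and the 0.8 threshold is the widely used "four-fifths rule" heuristic for flagging disparate impact.

```python
def selection_rate(outcomes):
    """Fraction of candidates the system advanced (1 = advanced, 0 = rejected)."""
    return sum(outcomes) / len(outcomes)

def disparate_impact_ratio(group_a, group_b):
    """Ratio of the lower selection rate to the higher one.
    Ratios below 0.8 are commonly flagged for review (the 'four-fifths rule')."""
    a, b = selection_rate(group_a), selection_rate(group_b)
    lo, hi = min(a, b), max(a, b)
    return lo / hi if hi else 1.0

# Invented audit data: 1 = CV passed the ATS screen, 0 = rejected.
men = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]      # 80% pass rate
women = [1, 0, 0, 1, 0, 0, 1, 0, 0, 0]    # 30% pass rate

ratio = disparate_impact_ratio(men, women)
print(f"selection-rate ratio: {ratio:.2f}")
```

An auditor would run this kind of comparison over the system's real decision logs; a ratio this far below 0.8 would justify a deeper investigation into why one group's CVs fail the screen.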
Dr. Mahmoud Khaled, technical expert and lecturer at the Faculty of Information Technology at Sinai University, agrees that the accuracy of these systems depends on the quality and variety of the data and on whether biases are removed through human feedback. To ensure gender justice in the use of technology, legislation and regulation are needed, as well as education and training of employees on how to deal with screening programs and evaluate them on an ongoing basis.
Tasnime Koura, who works as a specialist in data analytics programs on a grant from ALX Africa, says that, based on her experience of an AI development and training project, the predominance of men in the workforce is a clear indication of bias. Masculine forms of language are used inadvertently, along with language that does not focus on feelings or human traits, which is typical of a patriarchal society. These characteristics therefore become ingrained within the programs themselves, as well as in the preferences of their developers and reviewers.
Koura proposes that AI developers and trainers establish a profession of ethical auditing and review as a means to reduce bias and discrimination and to ensure that all issues are addressed responsibly and fairly, as is the case with Arabic transcription software, given its reliance on foreign sources.
Tasnime Koura adds that there is no such thing as “coincidence” or “luck” online, as algorithms follow you everywhere, not only recording your preferences but predicting what you want and how you interact.
Turning to specialists who work on improving search engines, Abdul Hasseb Tariq, a media buyer, says that platforms such as LinkedIn provide advertisers with detailed targeting options, allowing them to access a user’s professional network and use their data to target job advertisements at the right professional audience.
Tariq says algorithms can play an important role in delivering job adverts, or in skewing them by gender or ethnicity, because of algorithmic optimisation concealed within the platform, which operates without the advertiser's consent. Audience segments are selected based on the advertiser's profile and on analysis of the market and competitor data.
According to UNESCO, women are 25 percent less likely than men to know how to use digital technology. Women account for only 12 percent of AI researchers and a mere six percent of software developers. When it comes to technical roles in computer science, programming, data science, and software engineering, men hold 75 percent of the jobs.
Menna, meanwhile, is still looking for a job. “Maybe I made a mistake on my CV. I don't have any negative or even positive experiences I can talk about, because everything stops at the application stage.” But she says she will keep going until she gets an opportunity one day.
This report was published in Arabic in Raseef 22 | Alyaoum 24 | Mada News | Muwatin | Al-aalem Al-jadeed | Yemen Future | Maghress