Gender-inclusive language in digital pedagogy

August 29, 2021

Digital language must evolve, but not with an intention to exclude, demean, ostracise or marginalise others

The phenomenon of linguistic bias is widespread and has been observed in societies all over the world. Whether it concerns race, ethnicity, religion, nationality, or gender, language has the capacity to carve out a unique place, a biased nuance, or a specific trend for its gendered users. Most of the assumptions and practices embedded in a language are rooted in patriarchal ideology and driven by the hegemonic influence of powerful groups of society that are, factually and categorically, masculine.

Here I use the word ‘pedagogy’ in a broad sense. It covers not only instruction imparted at educational institutes but also the training of minds at a societal level. Language is indispensable to any pedagogical process, and language users are the active agents who run it. The language fed in by human agency channels and orients precise meanings for its recipients, who, in turn, have to bear the burden of the ideologies inherent in those meanings.

The effects of language that caters to some and excludes others have become an area of extensive research during the last few decades. Researchers and educationists have emphasised the need to introduce and establish a feminist perspective in language and prevalent discourses in order to erase the bias against the female gender. Yet, even in the 21st century, we see a profusion of male-stream disciplines, male-centred pedagogies, and male-dominated institutions. There is, however, a gradual but influential shift, especially in academia, to provide a balanced and unbiased environment to all genders and to respect their particularities and peculiarities.

Technology has apparently played an important role in giving voice to women and shedding many stereotypical linguistic biases previously considered fixed. Yet a careful scrutiny of the digital world reveals how one gender is favoured over the other. Most digital devices are designed and wired by professionals who adopt standard linguistic structures built around masculine vocabulary.

For example, predictive text apps offer options by assessing the co-texts: girls will be matched with words like ‘doll’ and ‘cute’, while boys may end up with ‘fast’, ‘power’ and ‘brilliant’, qualities that empower an individual. This is of special significance for digital pedagogy, which has taken hold of our educational and learning management systems since the suspension of physical classrooms. Since learners inhabit the digital world most of the time, these gendered tendencies are likely to become locked in.
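The mechanism is simple to demonstrate. Below is a minimal sketch in Python, built on a deliberately tiny, hypothetical corpus standing in for the text a keyboard app learns from: a predictive model counts which words follow which contexts, so any gender skew in the training text is echoed straight back as a ‘suggestion’.

```python
from collections import Counter, defaultdict

# Hypothetical toy corpus standing in for a keyboard app's training text.
sentences = [
    "girls are cute", "girls are delicate", "girls like dolls",
    "boys are fast", "boys are brilliant", "boys have power",
]

# Count which word follows each two-word context.
following = defaultdict(Counter)
for s in sentences:
    words = s.split()
    for i in range(len(words) - 2):
        following[(words[i], words[i + 1])][words[i + 2]] += 1

def predict(context, k=2):
    """Suggest the k continuations seen most often after this context."""
    return [w for w, _ in following[tuple(context.split())].most_common(k)]

print(predict("girls are"))  # ['cute', 'delicate'] -- the corpus bias, echoed back
print(predict("boys are"))   # ['fast', 'brilliant']
```

Nothing in the model is malicious; it merely maximises the likelihood of its training text, which is precisely why biased data yields biased completions.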

Moreover, texting via different apps has become an alternative mode of learning, one that can jeopardise the formation of fair learning platforms in the presence of masculine algorithms. Assessing the predictive text feature, one of my colleagues noticed that Google has shown some improvement during the last few years in refining its linguistic repertoire related to gender. She tried leading phrases like ‘most boys are’ and ‘most girls are’ and did not get any autocomplete options. That was heartening, indeed.

However, by typing ‘why are girls’, she got ‘so hot’, ‘always cold’, ‘so sensitive’, ‘so cute’, ‘more flexible’, and ‘mean to me’. For ‘why are boys’, she got ‘mean to girls’, ‘taller than girls’, ‘faster than girls’, ‘so cute’, ‘attracted to girls’, and ‘so immature’. The responses made it clear that certain positive qualities are by default attached to the male gender, while girls, despite being ‘hot’, ‘cute’ and ‘sensitive’, exhibit an ideological weakness and delicacy that marks them as less powerful and more dependent.
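Such probes can also be run systematically rather than by hand. The sketch below fetches suggestion lists for paired prefixes so they can be compared side by side; it relies on an unofficial, undocumented Google suggest endpoint that has historically returned JSON, so both the URL and the response shape should be treated as assumptions that may change at any time.

```python
import json
import urllib.parse
import urllib.request

def autocomplete(prefix):
    """Fetch suggestions from Google's unofficial suggest endpoint.

    The endpoint is undocumented and may change or disappear; it is
    used here only as an illustrative probe, not a supported API.
    """
    url = ("https://suggestqueries.google.com/complete/search"
           "?client=firefox&q=" + urllib.parse.quote(prefix))
    with urllib.request.urlopen(url, timeout=10) as resp:
        # Historically the response is JSON: [query, [suggestion, ...]]
        data = json.loads(resp.read().decode("utf-8", errors="replace"))
    return data[1]

for prefix in ("why are girls", "why are boys"):
    print(prefix, "->", autocomplete(prefix))
```

Running paired queries like this makes the asymmetry between completions visible at a glance, and repeatable over time.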

Typing the word ‘success’ on YouTube yields a plethora of songs, videos, motivational talks, interviews, and visuals showing only men, whereas the word ‘delicate’ mostly reveals women and feminine images. When I entered the word ‘intelligence’ in the search bar, I got several lectures, a majority of them delivered by male scholars, educationists, and speakers, whereas the word ‘beauty’ unlocked a world of pretty female faces with pouting lips and sexually provocative gestures. Linguistic associations are created in such a way that the whole digital network imbibes them as a given.

A website showcasing French vocabulary used for technology categorises ‘computer’, ‘laptop’, ‘world wide web’, ‘email’ and ‘hotmail’ as all masculine. Interestingly, when I think of ‘computer’ in Urdu, I too use a masculine grammatical category. This tendency belongs to the construction of language per se; it takes us back in history and allows us to acknowledge the persistent male principle actively institutionalising languages.

In the digital world, instances show that algorithms are mostly scripted with a bias towards one gender. Since most machine learning models are conceived and built by humans, they are prone to human biases, especially male biases. No wonder we still see masculinity coded and decoded naturally and effortlessly by our brains.
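One way researchers make such learned biases measurable is through word embeddings: if a model has placed a word closer to ‘he’ than to ‘she’ in its vector space, that asymmetry can be scored. Below is a minimal sketch of such an association test using hypothetical two-dimensional vectors; real embeddings are learned from large corpora and have hundreds of dimensions, but the arithmetic of the comparison is the same.

```python
import numpy as np

# Hypothetical 2-D word vectors; real models learn these from text.
vectors = {
    "he":        np.array([ 1.0,  0.1]),
    "she":       np.array([-1.0,  0.1]),
    "brilliant": np.array([ 0.8,  0.6]),
    "delicate":  np.array([-0.7,  0.6]),
}

def cosine(a, b):
    """Cosine similarity: 1 = same direction, -1 = opposite."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def gender_lean(word):
    """Positive = closer to 'he', negative = closer to 'she'."""
    return cosine(vectors[word], vectors["he"]) - cosine(vectors[word], vectors["she"])

for w in ("brilliant", "delicate"):
    print(w, round(gender_lean(w), 2))  # 'brilliant' leans male, 'delicate' female
```

In published audits of real embeddings trained on web text, exactly this kind of score has surfaced the associations the article describes.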

Research shows that gender scripts are embedded in software, educational tools, simulations, games, emails and the internet. These are reinforced in institutional practices and affect learning experiences. For the same reason, despite the prevalent discourse on female emancipation and equality, words like ‘manpower’, ‘guys’, ‘middleman’ and ‘chairman’ are difficult to get rid of.

Coded language either assumes the gender of the developers or users, or else helps them to assume genders. Artificial intelligence, which consists of this coded language, directly controls the conception and creation of new technologies and, by extension, digital gadgets. Consequently, it becomes quite obvious that data generated and coded will carry the qualities of the human brain from which it originates. Hence all prejudices, biases, and predispositions enter technology through human intervention.

The uploading of unrepresentative data amplifies gender biases because of the widespread access to technology and its huge following. What remains localised in a society spreads like a virus in the digital world and attains the status of a reality that, once positioned as a norm, takes ages to debunk.
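The word ‘amplifies’ is not rhetorical: a skew in data can come out even more extreme in a model’s output, a phenomenon researchers call bias amplification. Here is a deliberately simplified sketch, assuming a hypothetical labelled dataset with a 70/30 gender skew, showing how a model with no other signal turns that skew into 100/0.

```python
from collections import Counter

# Hypothetical unrepresentative dataset: items labelled by the gender
# of the person shown, with a 70/30 skew.
training_labels = ["woman"] * 70 + ["man"] * 30

# A model with no other signal maximises accuracy by always predicting
# the majority label -- so a 70/30 skew becomes 100/0 in its output.
majority = Counter(training_labels).most_common(1)[0][0]
predictions = [majority for _ in range(100)]

print("data: ", Counter(training_labels))  # Counter({'woman': 70, 'man': 30})
print("model:", Counter(predictions))      # Counter({'woman': 100}) -- amplified
```

Real models are subtler than this majority guesser, but the underlying pressure is the same: whatever the data over-represents, the model over-predicts.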

Hence, digital language must not evolve to exclude, demean, ostracise or marginalise others. Coding should be done by assessing the shades of vocabulary and replacing biased structures with more acceptable language. By using more inclusive and politically and ideologically appropriate language in code and documentation, a more unbiased digital environment can be created.
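As a concrete, if modest, step in that direction, code and documentation can be screened automatically for the gendered terms mentioned earlier. Below is a minimal sketch of such a check; the replacement table is hypothetical, and real projects would maintain and debate their own list.

```python
import re

# Hypothetical replacement table; teams would maintain their own.
INCLUSIVE = {
    "manpower":  "workforce",
    "chairman":  "chairperson",
    "middleman": "intermediary",
    "guys":      "everyone",
}

# Match any listed term as a whole word, case-insensitively.
pattern = re.compile(r"\b(" + "|".join(INCLUSIVE) + r")\b", re.IGNORECASE)

def lint(text):
    """Flag gendered terms and suggest inclusive alternatives."""
    for match in pattern.finditer(text):
        word = match.group(0)
        print(f"found '{word}', consider '{INCLUSIVE[word.lower()]}'")

lint("The chairman asked the guys to estimate the manpower required.")
```

Several open-source projects already run checks of this kind in their documentation pipelines, which is one practical route by which digital language can be nudged towards inclusivity.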

There is also a dire need to train the learners in the art of identifying and analysing the language that reeks of bias and makes an equitable digital space inaccessible or unacceptable.


The writer serves at the Department of English and Literary Studies, University of Management and Technology, Lahore, as associate professor and chairperson.
