Virtual assistants have made our lives more productive, efficient, and less complicated. From forecasting the weather to setting our reminders, they assist us with virtually everything and have become almost inescapable. Life without them seems nearly impossible now. But have we ever stopped to ask why most of these assistants sound female by default? Why do we so consistently associate them with female identities? Does this have anything to do with the gender stereotypes and social stigmas prevalent around us?
Google Assistant, Siri, Alexa, and most other popular virtual assistants have defaulted to female voices and, in many cases, been given female names. This points to a deep-rooted social attitude that assigns acts of service to women. From ancient times, women were expected to take care of everyone in the family and to manage household chores alongside several additional responsibilities with equal efficiency. It is alarming that even in this century such stereotypes persist so strongly among us, in one way or another. The typical portrayal of women as ‘natural caregivers’, capable of handling innumerable tasks acquiescently, still finds a place in our society. Attributing female identities to virtual assistants only reinforces such biases.
At the same time, there are technological reasons behind this choice. Because the majority of these systems are trained on female voices, they struggle to recognize other voices. Moreover, women’s voices are perceived as clearer and better pitched, and thus able to reach a wider audience. Recent market research showed that the majority of customers around the globe prefer a female voice for their virtual assistants, finding it more reliable and easier to understand. On a psychological level, many of us unknowingly and habitually associate women’s voices with care and responsibility. Earlier, the United Nations asked companies worldwide to make the voices of their virtual assistants gender-neutral, to avoid promoting such societal paradigms. Apple’s virtual assistant Siri stopped defaulting to a female voice and was made available in four different voices this year: a small step for one company, but a giant leap toward equality.
Companies must give their users the opportunity to choose the voice of their assistants rather than setting one as the default. It is apparent that such stereotypes cast women as obliging, ready to obey blunt voice commands without question. Feminizing virtual assistants is an archetype of how gender inequalities trickle deep into every field. It is a pressing priority to dismantle the pervasive gender biases and stereotypes entrenched in our society, which are enormous obstructions on our path to gender equality. Let’s smash the glass ceilings and rise up for an “equal” tomorrow.