
AI is replicating conceptions of gender roles that are being dismantled in the real world

There is nothing inherently empowering or sexist about technology. It just reflects the values of its creators.

A recent article in The Guardian estimated that the sex tech industry, which is less than a decade old, is already worth $30 billion. This figure is expected to grow exponentially as the industry gears up to unveil hyper-realistic female sex robots customised for men. This has two main implications: first, considerable money, time and effort are dedicated to modelling machine behaviour that caters to male preferences by objectifying the female form; second, the technology driving these innovations is, in most cases, designed by male coders.

The gender equation is reinforced in another manner. While the lines of code are written by men, the artificial intelligence (AI) they produce is often female. The fact that Siri, Alexa, Amelia, Amy and Cortana are all designed as hyperintelligent yet servile female chatbots is not coincidental.

On the other hand, women’s participation in certain media fora (and, therefore, the data generated by women) is markedly low. A 2015 paper by ORF’s Sydney Anderson (‘India’s Gender Digital Divide: Women and Politics on Twitter’) found women’s voices to be “significantly under-represented” in online political conversations.

It is not surprising, then, that when Microsoft released the ‘millennial’ chatbot Tay in March 2016, she quickly adapted to her male-dominated ecosystem and started using racist slurs and sexually offensive language on Twitter.

As coders and consumers of technology are largely male, they are crafting algorithms that absorb existing gender and racial prejudices.

AI is replicating the same conceptions of gender roles that are being dismantled in the real world. For instance, Apple’s Siri, Microsoft’s Cortana and Amazon’s Alexa are essentially modelled after efficient and subservient secretaries. This seemingly innocuous assignment of female characteristics to AI personalities has dangerous implications. These chatbots reportedly receive sexually charged messages on a regular basis. More damaging still is the fact that they are programmed to respond deferentially or even play along with such suggestions. Essentially, sexual harassment that has now been made illegal in physical workplaces is normalised by AI.

Voices of disembodied, supportive AI tend to be female because both men and women find them less threatening. This comfort in issuing orders to a female voice is inherently problematic, something tech companies have now acknowledged. Companies are investing in developing male and genderless bots, and when someone asks Cortana, “Are you a girl?”, she reportedly replies, “No. But I’m awesome like a girl.” Similarly, Alexa has been described as a ‘self-identified feminist’.

While feminist female chatbots are encouraging, they can hardly solve the sexism built into AI by design. In 2015, Carnegie Mellon University researchers found that Google’s search engine was less likely to show ads for highly paid jobs to women than to men. A 2016 study discovered that data-mining algorithms associated words like philosopher, captain, warrior and boss with maleness, while the top results for ‘she’ were homemaker, nurse and receptionist.
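
The word-association finding can be reproduced in spirit with off-the-shelf word embeddings. The sketch below is purely illustrative and is not the method of the study cited above; it assumes the gensim library and its pretrained ‘glove-wiki-gigaword-100’ model, and simply compares how close each occupation word sits to ‘he’ versus ‘she’ in the embedding space.

# Illustrative sketch only: probing gender associations in pretrained
# word embeddings (assumes gensim and its downloadable GloVe model).
import gensim.downloader as api

model = api.load("glove-wiki-gigaword-100")  # pretrained GloVe word vectors

occupations = ["philosopher", "captain", "warrior", "boss",
               "homemaker", "nurse", "receptionist"]

for word in occupations:
    # A positive gap means the word sits closer to "he" than to "she".
    gap = model.similarity(word, "he") - model.similarity(word, "she")
    leaning = "male" if gap > 0 else "female"
    print(f"{word:>12}: similarity gap (he - she) = {gap:+.3f} -> {leaning}-leaning")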

As AI grows in influence and gender biases continue seeping through algorithms, existing inequalities will be exacerbated.

In India, for instance, the legal sector is gradually embracing AI, which is expected to improve speed and efficiency by automating tasks such as document drafting, legal research and due diligence. Similarly, news-writing bots are already at work in journalism.

In both cases, AI will autonomously generate output by identifying story angles through algorithms with ‘built-in’ criteria. At a time when the portrayal of sexual violence in traditional news media is already under scrutiny, it is important to ask how male-hegemonic data sets will shape future news stories and court coverage of sexual assault and other topics requiring greater gender sensitivity. Since only 29% of internet users and 28% of mobile phone owners in India are women, improving access to basic information and communication technology services and infrastructure remains critical.

There is nothing inherently empowering or sexist about technology. It just reflects the values of its creators.

This commentary originally appeared in The Economic Times.

Vidisha Mishra & Samir Saran
27 June 2017
