
Can AI promote inequality in the workplace?

02 Nov 18

Artificial intelligence (AI) is bringing incredible changes to the workplace, yet, left unchecked, it can amplify and reinforce gender bias. How is it possible for machines to be sexist?

While it may sound odd that AI could be gender biased, there’s plenty of evidence to show that it’s happening when organisations don’t take the right steps.

Given the spotlight on gender equality through #MeToo and New Zealand's push to lift gender parity in the workplace, we think it's important to uncover why this bias in AI occurs and what to do about it.

At Accenture, we have set the global goal to have a gender-balanced workforce by 2025.

We know that diversity in the workplace leads to better outcomes, and it's time for business to step up, address this, and create a truly equal workplace.

New Zealand workplaces still suffer from gender inequality. The median wage for women is just under $38,000 and the gender pay gap is 9.4%, according to Statistics New Zealand.

With this in mind, it's worth software developers being fully versed in the potential biases that can be embedded in AI, and in how to prevent them.

Showing prejudice towards others does not require a high level of cognitive ability and could easily be exhibited by artificially intelligent machines, recent research suggests.

Computer science and psychology experts from Cardiff University and MIT have shown that groups of autonomous machines could demonstrate prejudice by simply identifying, copying and learning this behaviour from one another.

There are plenty of examples of gender prejudice in AI. Last year, media reported on translation websites converting the names of occupations from Turkish and Finnish – both gender-neutral languages – to English: the tools assigned male pronouns to professions such as police officer, engineer and leader, and female pronouns to jobs such as secretary, nanny and nurse.

Another study shows how image-recognition software can amplify the gender biases present in the images used to train it.

Two large image collections used for research purposes – including one supported by Microsoft and Facebook – were found to display predictable gender biases in photos of everyday scenes such as sport and cooking.

Images of shopping and washing were linked to women, while coaching and shooting were tied to men.
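The amplification effect can be illustrated with a small sketch. This is not the study's actual method or data – the activity labels, counts and `female_share` helper below are hypothetical, chosen only to show how a model's predictions can end up more skewed than its training annotations:

```python
from collections import Counter

# Hypothetical training annotations: (activity, gender of person shown).
# The training set already leans female for "cooking" scenes.
train = [("cooking", "woman")] * 66 + [("cooking", "man")] * 34

# Hypothetical model predictions on the same images after training:
# the model leans female even more strongly than the data warrants.
preds = [("cooking", "woman")] * 84 + [("cooking", "man")] * 16

def female_share(pairs, activity):
    """Fraction of a given activity's examples labelled 'woman'."""
    c = Counter(gender for act, gender in pairs if act == activity)
    return c["woman"] / (c["woman"] + c["man"])

train_bias = female_share(train, "cooking")  # 0.66 in the annotations
pred_bias = female_share(preds, "cooking")   # 0.84 in the predictions
```

Here the skew grows from 66% to 84% – the model hasn't just learned the correlation in its training data, it has exaggerated it.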

Another way to find gender bias in machines is to develop algorithms that can find it.

Scientists at Boston University have been working with Microsoft on word embeddings – numerical representations of words that serve as a kind of dictionary for AI programs.

They have found that even word embeddings trained on Google News articles exhibit female/male gender stereotypes to a disturbing extent, according to their paper.

They’ve combed through hundreds of billions of words from public data, keeping legitimate correlations (man is to king as woman is to queen) and altering ones that are biased (man is to computer programmer as woman is to homemaker), to create an unbiased public data set.
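The core of that debiasing idea can be sketched in a few lines. The tiny four-dimensional vectors below are hypothetical stand-ins for real embeddings (which typically have hundreds of dimensions), and estimating the gender direction from a single word pair is a simplification of the researchers' approach, but the projection step is the same in spirit:

```python
import numpy as np

# Toy word vectors (hypothetical; real embeddings such as those trained
# on Google News articles are far higher-dimensional).
vecs = {
    "man":        np.array([ 1.0, 0.2, 0.1, 0.0]),
    "woman":      np.array([-1.0, 0.2, 0.1, 0.0]),
    "programmer": np.array([ 0.6, 0.8, 0.0, 0.1]),
}

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Estimate a "gender direction" as the difference man - woman.
g = vecs["man"] - vecs["woman"]
g = g / np.linalg.norm(g)

def debias(v):
    """Remove the component of v that lies along the gender direction."""
    return v - (v @ g) * g

# Before debiasing, "programmer" leans toward the male end of the axis;
# after projecting out g, it is neutral along that axis by construction.
before = cosine(vecs["programmer"], g)
after = cosine(debias(vecs["programmer"]), g)
```

The published method is more careful – it identifies a gender subspace from many word pairs and deliberately preserves legitimate gendered words like king and queen – but the mechanism of subtracting the biased component is the essence of it.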

It's important to get the foundations of AI right now, to ensure that it doesn't further widen the gender divide in the workplace and instead supports the movement toward an equal one.

Article by Justin Gray, country MD of Accenture NZ
