Could artificial intelligence be a brand’s diversity saviour?
I’ve argued that big data, in general, is often misguided and that our industry’s unconscious bias, combined with a reliance on data measured in a dysfunctional way, could be doing more harm than good. That bias flows through the workflow into the creative work, out to our customers and on into wider society. But what if this technology could be used to avoid the unconscious bias that many people in our industry still harbour?
Artificial intelligence, when trained and programmed by people, can absorb their bias and perpetuate it.
We’ve learned from ProPublica’s investigation of a risk-assessment system used in US criminal sentencing, designed to predict the likelihood of criminals re-offending, that AI can amplify existing racial bias: black defendants were almost twice as likely as white defendants to be wrongly flagged as future re-offenders. The lesson is that past data can’t be relied on when bias is baked into it.
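The disparity described above is usually measured by comparing false-positive rates between groups: how often each group is flagged high risk despite not re-offending. Here is a minimal sketch of such an audit, using entirely made-up records (not the real data) to illustrate the calculation:

```python
# Hypothetical bias audit: compare false-positive rates across two groups.
# The records below are illustrative only, not real risk-assessment data.
records = [
    # (group, flagged_high_risk, actually_reoffended)
    ("A", True,  False), ("A", True,  False), ("A", False, False),
    ("A", True,  True),  ("A", False, False),
    ("B", True,  False), ("B", False, False), ("B", False, False),
    ("B", True,  True),  ("B", False, False),
]

def false_positive_rate(rows):
    """Share of people who did NOT re-offend but were flagged high risk."""
    negatives = [r for r in rows if not r[2]]   # did not re-offend
    flagged = [r for r in negatives if r[1]]    # but were flagged anyway
    return len(flagged) / len(negatives)

fpr = {g: false_positive_rate([r for r in records if r[0] == g])
       for g in ("A", "B")}
print(fpr)  # in this toy data, group A's rate is double group B's
```

A model can score well on overall accuracy while one group quietly bears double the false-positive burden, which is exactly why overall metrics hide this kind of bias.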
This is an important wake-up call for marketing as AI adoption in our industry accelerates.
Let’s look at Google’s demonstration of its AI Assistant technology making appointments and handling the everyday tasks humans do. The demo shows the assistant booking a hairdressing appointment, complete with the human-esque ‘um’s and ‘ah’s a real caller might make, so that it passes undetected as a robot (let’s park the ethics of this for another conversation!).
Most people view this video and are simply amazed at how smart the technology is, but miss the subtle yet biased language used. I won’t spoil it yet; watch the video and see for yourself:
It is pretty likely that a lot of men won’t have a clue what’s wrong with this demonstration. A lot of women too. When you really dig into it, it is pretty evident that the AI script was trained by a man.
The truth is that no woman would call a hairdressing salon and ask for a ‘Women’s Haircut’. She would likely be calling a women’s salon in the first place, so mentioning gender at all is redundant; instead she would ask for specifics such as ‘I’d like a cut and colour’, ‘I’d like a cut and a half head of foils’, or ‘I need a cut and blow dry’.
Bias of all kinds is very subtle, and many of us don’t even notice, and simply keep perpetuating it in our data and in our creative that we produce.
The interesting opportunity for us to think about is that once a problem is identified in AI scripting it can be corrected, and it stays corrected once and for all. Humans forget; in a new situation we fail to apply the lesson; and between individuals we are wildly inconsistent. Unconscious bias, to the point of brand and cultural damage, keeps sneaking out into society via the marketing industry. We’re not reliable.
The issue for the creative industry to think about is this: unless the people in the creative and technology departments can remedy their own bias and stop putting unhealthy thinking into the creative chain (and into what we expose the customer to), it is quite possible we’ll program creatives out of the process sooner rather than later.
If we look at the average profile of the advertising and media industry, we’re a population of 27-year-old males. This is not aligned with the real customer out there, who has a diverse age, race and cultural make-up, and a range of abilities and thinking styles.
If you look at the staff photos of many creative agencies they are as if cast for sameness. Many agencies think culture is about the way their people look as a collective, and not the way they think.
We’ve seen the commoditisation of good ideas and strategic thinking over the last decade and resorted to more tactical activity that is short term focused. This next phase of reducing creativity even further using automation could be the fast-track to remedy our industry’s problem with diversity and inclusion.
So, what to do?
- We need to stop thinking that diversity and inclusion is an HR problem and think of it as a strategic misalignment with our customers
- There needs to be more robust checks and measures in the creative and technology pipeline to be sure that subtle bias is removed from our process
- Research coding guidelines for bias correction that help to address minority bias*
- Creatives who want to remain relevant for longer need to educate themselves on the impact of their own bias on their work
- Stress test past data to be sure no bias is skewing the historical results, and stop perpetuating it moving forward
- Question everything and stop working on auto-pilot
- Marketers need to be vigilant and stop assuming that their agency is thinking about this; agencies are likely well behind your business on this topic
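On the research point, one concrete direction is the decoupled-classifiers approach from the paper referenced in the footnote below: instead of fitting one shared model whose decisions are dominated by the majority group, fit a simple model per group so each group’s data determines its own decision rule. The sketch below illustrates the idea with a made-up per-group score threshold; the data, groups and thresholds are all invented for illustration, not taken from the paper:

```python
# Sketch of the decoupled-classifier idea: learn a separate decision
# rule (here, just a score threshold) for each group, so the majority
# group's data cannot skew the minority group's decisions.
# All scores and labels below are made-up illustrations.
data = {
    "group_a": [(0.2, 0), (0.4, 0), (0.6, 1), (0.8, 1)],  # (score, label)
    "group_b": [(0.1, 0), (0.3, 1), (0.5, 1), (0.7, 1)],
}

def best_threshold(examples):
    """Pick the threshold that maximises accuracy for this group alone."""
    best, best_acc = 0.0, -1.0
    for t in sorted(s for s, _ in examples):
        acc = sum((s >= t) == bool(y) for s, y in examples) / len(examples)
        if acc > best_acc:
            best, best_acc = t, acc
    return best

# Each group gets its own threshold rather than one global compromise.
thresholds = {g: best_threshold(rows) for g, rows in data.items()}
print(thresholds)
```

Note how the two groups end up with different thresholds: a single shared cut-off would be wrong for at least one of them, which is the intuition behind decoupling.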
Let’s ensure we remain relevant and able to protect the brands that trust us to understand their real customer. Let’s protect the incredible creative talent in our industry before it is pushed out by more reliable technology, at the expense of great ideas that impact business for good.
*Decoupled classifiers for fair and efficient machine learning by Cynthia Dwork, Nicole Immorlica, Adam Tauman Kalai, and Max Leiserson.
Sources: AI racial bias: https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing
Agency statistics via The Communications Council 2018.