Technology Will Save Us… is the name of a brand of tech construction kits that has bucked a rather dismal trend to achieve a massive following among pre-teen girls.
But can technology – in the form of machine learning – save us from gender- and race-biased advertising? This was the subject of a fascinating discussion hosted by Lobster.media (vendors of ‘bias-free’ image search) last night. So far, the evidence doesn’t look promising. Google Photos’ algorithm has tagged black men as gorillas, and a man was tagged as a woman simply because he was in a kitchen.
But the speakers were convinced that ML would rise above this, learn where it’s going wrong and help advertisers create less culturally biased campaigns.
Personally I’m pessimistic, and not just because I’m a white middle-class male creative director. I’m pessimistic for three reasons:
First, rejection of prejudice is largely an emotional decision. We reject biased ‘data’ because we’re offended by it. How will algorithms know which data to reject? Or, if they’re programmed to monitor public opinion on these matters, whose opinion should they accept? (I’m going to make a prejudiced claim here and state that bigots produce the most data.)
Second, algorithms have a vested interest in going along with the cultural status quo. Most of them will earn their crust by producing granularly personalised emails and social posts for each target customer, based on their browsing and social behaviour, and unavoidably, their conscious and unconscious prejudices.
Third, in these days of uncritical data-worship, algorithms – and many marketers – will tend to gravitate towards whichever statistics are most, um, black and white. If the data says that images are shared more often when they feature certain people, in certain poses, doing certain activities, that’s what we’ll see.
ML will have all kinds of exciting uses, but it doesn’t look like it’ll be a recipe for diversity. Please tell me I’m wrong.
More reasons for gloom:
When I typed ‘family’ into the bias-free Lobster photo search engine, it gave me happy white families, happy Asian families, and happy shack-dwelling Black families.