Brands, Don’t Be Evil


“All of us who professionally use the mass media are the shapers of society. We can vulgarize that society. We can brutalize it. Or we can help lift it onto a higher level.” – Bill Bernbach

Technologists argue that algorithms are the future – a machine intelligence that works in our best interests, learning our preferences and improving our lives. These invisible hands shape your life in countless imperceptible ways. Algorithms determine the content of your news feed, your musical choices, and your search results. Trust the algorithm – the siren song of Silicon Valley – it knows you better than you know yourself…

Dark side of data

In recent years a body of opposition has been growing. People are questioning the impact of algorithms on their lives, their jobs, their children. Are they creating informational bubbles? Are they making us more isolated? Are they ripping us off? It is timely to consider whether they are really acting in our best interests. It would be naive to think there is no dark side to all this. Algorithms are tools of control and influence, and they play an increasing role in media, commerce, politics and the military. US tech groups already face fierce scrutiny of their impact on society – from politicians, consumer advocates, activist investors and their own former employees.

The cyberwarfare of our imaginations involved shadowy teams of hackers infiltrating our water and power systems. The reality was even more extraordinary. The Russians bought adverts on Facebook to disrupt the democratic process of an American election. They created and magnified social divisions by distorting the truth. "America, we have a problem," said Representative Jackie Speier, a California Democrat who sits on the House Intelligence Committee. "We basically have the brightest minds of our tech community here and Russia was able to weaponize your platforms to divide us, to dupe us and to discredit democracy."

Facebook owes me money

The largely automated and algorithmically governed content machines need to accept their responsibilities as media companies and publishers. "They cannot masquerade as technology companies," as Martin Sorrell has stated. Most of the social media platforms are almost entirely monetised through advertising revenues: Facebook, for example, earned approximately $40 billion in 2017. Advertisers care deeply about the context in which their brands appear. The risk of reputational damage is constant and hard to control. Earlier this year some of the largest advertisers discovered that their brands were placed amongst far-right propaganda and other forms of extremism. Less dramatic, but equally problematic, are the advertising contexts created by YouTube stars such as Logan Paul. Paul has 15 million subscribers and was reportedly pulling down over a million dollars a month in ad revenue. In the last couple of months, he has caused outrage with his tasteless and disrespectful footage of a suicide victim in Japan, encouraged his young audience to take the Tide Pod challenge, and, in one recent video, took a fish out of his pond to jokingly give it CPR and then tasered a dead rat.

Is 1984 here to stay?

The response from the biggest spenders such as Unilever and P&G has been unequivocal. Unilever CMO Keith Weed is crystal clear that there is a problem and that it needs urgent action from businesses like Facebook and Google: "We cannot continue to prop up a digital supply chain…which at times is little better than a swamp in terms of its transparency." P&G has followed suit, cutting $100 million in digital spending because, in the words of chief brand officer Marc Pritchard, they "couldn't be assured that our ads would not appear next to bad content like a terrorist video." As he put it: "We simply will not accept or take the chance that our ads are associated with violence, bigotry or hate."

For the first time in years brands have a point of significant leverage with the big media tech platforms, and the malaise is spreading throughout the industry. Advertisers are seizing the opportunity to reduce their media costs and to force the big platforms to invest in systems that police the content they carry. Ironically, the solution is human rather than algorithmic: 10,000 employees are being hired to police the content on YouTube's preferred channels. The fight against the big platforms is daunting because they control 80% of digital advertising revenues. There may be interesting and unintended consequences. If budget is being pulled from digital, where will brands buy the reach that TV audiences can no longer supply?

Brand, don’t be evil to your fans

The relationship between brands, people, content creators, advertisers, publishers and media tech is at a crossroads. There is an erosion of trust at every level of society which must be checked, and there is a need in our industry to find communication solutions that are healthy and sustainable. Some big principles are being tested. Mark Zuckerberg wrote that Facebook had a responsibility to help people, and asked: "Are we building the world we all want?" Time will tell. As Bill Bernbach wrote, "A principle is not a principle until it costs you money."

There is an unwritten contract between brands and their customers, and it is based on trust. Brands cannot afford to sponsor hateful content on platforms that create social and individual harm. They must be part of the solution and stop funding the problem. In this small way they have a duty of care: to place their brands in contexts that are responsibly managed and that contribute positively to society. Everyone who works in marketing is implicated in this – we have an unavoidable responsibility to use the powerful tools of branding and communication in ways we can be proud of. This is not a new idea, but the urgency is greater than ever.
