Hot on the heels of extraordinary customer service failures in the US airline industry comes news of even bigger brands attracting reputational damage. Facebook, the social media giant, now boasts 1.86 billion monthly users, with over 150 million in Africa. More than 65% of these users are on the platform every single day, mostly from their mobile devices. Google’s video platform YouTube holds the attention of over a billion users worldwide, who watch hundreds of millions of hours of content every day.
When you create communities this large, you are also bound to attract the darker elements of society. Therein lies the root of reputational damage to Facebook and Google. Increasingly they contain posts that range from highly influential ‘fake news’ to downright incitement to mass murder courtesy of politico-religious extremists. And on continents where people have more time on their hands than we do in Africa, the old saying has come true. The devil has made work for idle hands, in the proliferation of the most appalling child pornography, sado-masochistic torture and (Lord help us) even cannibalistic communities.
Young people, disconnected from reality by their excessive time on screens, are even being incited to kill themselves. Police in Europe are trying to combat a ‘challenge’ called Blue Whale, which encourages suicide by setting members daily tasks towards that goal over a period of 50 days. The inventor of the game was arrested in Russia last year and said this about his victims: “…they died happy. I gave them what they do not have in real life: comprehension, communication and warmth.”
Facebook and Google are both under increasing pressure to do something about dangerous content. But even though their reputations are being tarnished, they seem strangely reluctant to act like brand leaders.
Initial criticism came from advertisers, dismayed to find their advertising appearing alongside extremist content. Commercial pressure alerted governments – whose primary role is the protection of the people they represent. In the UK, a parliamentary sub-committee has been examining ‘fake news’ and other aspects of unregulated social media content. Facebook this week removed thousands of bogus accounts and began a national advertising campaign to “help consumers spot fake news”. But much of the advice is fatuous: “If no other news source is reporting the story it may indicate that the story is fake.”
Paul Armstrong, author of ‘Disruptive Technologies’, says: “There’s a lot more Facebook could be doing. They are in complete control of the algorithms … and could limit sources.”
As Kenya swings into full election mode, the region should keep its eyes open for the darker side of the social media phenomenon.
Chris Harrison leads The Brand Inside