If there is one thing Covid-19 has taught us, it is that when a virus enters the community, the only way to stop the spread is to break the chain of transmission. If an infected person interacts with 10 different people in a day, each of whom then interacts with 10 other people, the virus spreads exponentially. If that infected person stays home and interacts with no one, the virus stays contained.
Now think of social media and its trove of hateful, bigoted content and outright misinformation as a similar virus. Wouldn’t it make sense to break the chain so this transmission of hatred can be contained?
This week in The Global Tiller, we look closely at the #OneClickSafer campaign that tries to do just that — to make Facebook and other social media safer, and less of an instrument that literally kills people across the world. We look at the revelations made by the Facebook whistleblower and Facebook’s tendency to exploit the Global South.
#OneClickSafer is a public campaign launched by the Center for Humane Technology to pressure Facebook to implement platform changes supported by its own research that prioritise people over profits. It aims to address the issue of "frictionless sharing", which is essentially the one-click share button available on every post on Facebook. When it is so easy to share something, people tend to become thoughtless and sharing becomes a knee-jerk reaction.
Added to this ease of sharing is the fact that our Facebook news feeds are dominated by whatever the algorithms believe will generate the most engagement, which most often tends to be content that makes us angry, sad or emotional in some way.
"If a thing’s been shared 20 times in a row it’s 10x or more likely to contain nudity, violence, hate speech, misinformation than a thing that has not been shared at all." — Wall Street Journal Facebook Files
The end result is Facebook’s complicity in Myanmar’s attacks on minorities, its services being used to spread religious hatred in India, its failure to take down a fake video that led to an army clampdown against secessionists in Cameroon, and other similar catastrophic events.
Facebook’s own employees have become increasingly uncomfortable with the unchecked power the social media company has over the law and order situation in many countries of the Global South, especially since its safety protocols and internal detection mechanisms for harmful content leave a lot to be desired when it comes to non-English content.
One of those former employees recently released internal company documents to the Wall Street Journal that confirmed what many already suspected: that Facebook knew its platforms were causing ethnic conflict, violence and mental health issues yet it chose profits over people.
There is a whole range of interventions needed on many different levels to solve a lot of these issues. For example, Facebook should not have the ability to buy out any company that threatens its market share, like it did with Instagram and WhatsApp. Mark Zuckerberg and other Facebook executives should not be able to get away with insider trading ahead of a public scandal, like they did before the Cambridge Analytica revelations.
But these are interventions that only American lawmakers can be hoped to carry out. As for the rest of the world, countries in the Global South have so far been unsuccessful in holding Facebook accountable, as we can see from the case of The Gambia, which sought critical information from the company to charge Myanmar with atrocities against its Rohingya population in 2016 and 2017, only to be denied the request.
Until the US Congress acts, there is one simple thing that Facebook can do itself that will bring monumental change in the platform’s ability to proliferate hate: it can get rid of "frictionless sharing" — remove the 'share' button on any post after two levels of sharing. Anyone who wants to share something after this will have to make the extra effort of copying and pasting the link. This simple change, according to Facebook’s own Integrity Team researchers, is more effective than the billions of dollars spent on trying to find and remove harmful content.
One can hope this campaign pushes the company to make the right call and, until that happens, let’s be more thoughtful when we hit 'share'.
Until next week, take care and stay safe.
Hira - Editor - The Global Tiller
Your guide to Glasgow
Continuing our ongoing coverage of the upcoming COP26 Summit in Glasgow, here are some Pacific voices on what the summit should achieve:
…and now what?
It’s just a click away…making Facebook safer, more welcoming and more able to foster positive behaviours.
And if it is that simple (all things considered), why haven’t they done it yet? What is more important than making sure Facebook works positively for everyone? The answer to this question comes from the company’s own former employee: “Until the incentives change, Facebook will not change,” said Frances Haugen, the Facebook whistleblower, during her testimony to the UK Parliament.
I believe her. And if my recent encounter with the company is any proof, even advertising revenue from small businesses is not enough incentive for them to care. Unless someone forces them to do good, they won’t do it on their own. Unless someone convinces Mark Zuckerberg (who still owns 55% of Facebook’s voting shares and, therefore, has overwhelming control of the company) to change, nothing will. No matter how many statements they issue insisting otherwise.
Is it because they’re not smart? Or because they don’t have the ability to improve the platform? As we know now, the answer to both is no. The solution is simple and their own people told them so. It’s just that they don’t want to. It’s just that, like many organisations, they have values but no ethics.
Facebook makes a big deal of its high moral principles: its marketing strategies are built around them, and it invokes them in its defence when called to testify. But it doesn’t live by them.
Values have to be implemented through ethics, by building the behaviours (ethos) and actions required to make sure our values are mirrored by what we do, what we say and how we make decisions. It’s easy to have values; everyone does. But it’s much harder to have ethics, as it requires living by what you believe is right, every single day. It’s easier to say you care about road safety than to respect the speed limit when you’re all alone on the road. It’s easier to say you care about climate change than to sell your SUV and take the bus.
And that’s exactly what Frances Haugen told the UK Parliament that Facebook will have to be forced to do: to live by the values it professes to embody — "build community and bring the world closer together…share ideas, offer support…"
As a species, we have built institutions designed to help us be virtuous, to stick to our words and our values, but we struggle to do it. Is it because the frameworks we’ve built are designed to let us do what we want until we get punished for crossing the line? Shouldn’t we question the frameworks that manage our existence through punishment rather than through intrinsic motivation to do what is right? Isn’t it time to act on what we think is right without waiting for others to show the way or force the way?
Isn’t that ultimately the core element of leadership: doing what you think should be done, taking ownership of the situation without waiting to be told what to do? Despite his power and influence, it seems that Zuckerberg fails to show any leadership capacity, as he takes neither charge nor ownership of the situation. Unless his real goal all along has been to make profits, society be damned? And if that’s the case, should we take part in creating value for this model?
Philippe - Founder - Pacific Ventury