Failures in Complex Software Systems
a case study.
This is an exercise from Session 1 of the AI Alignment Course: a case study on "Disinformation on Facebook and the Rohingya genocide".
For context, watch this video.
Prompts
What was the purpose and use-case of the software system?
(2011) Myanmar was emerging from the choke-hold of military rule, opening up to the world at an ever-increasing pace. People were starting to use mobile phones, which had been rare just a few years earlier.
These phones, preloaded with Facebook, let people get onto the internet with little effort. This was supposed to connect the people of the country to one another and to the outside world.
Facebook wanted to spread its services and earn more ad revenue from the platform's growing use in the country. For many users, Facebook became the primary source of information, and that in itself was a major cause for concern...
What were the faults, and how severe were the consequences?
Buddhist extremists and purveyors of hate speech used the platform to spread hatred against the Rohingya Muslims, and Facebook's algorithm amplified it.
Because Facebook did little to no moderation of this activity, the posts spread like wildfire, and extremists exploited people's biases to feed them misinformation. As a consequence, there are now almost a million Rohingya refugees whose country and countrymen have failed them.
Facebook did not directly cause this ethnic cleansing; rather, it proved to be the catalyst in the reaction.
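That "wildfire" spread can be made concrete with a toy branching model (a hypothetical sketch with made-up numbers, not a model of Facebook's actual traffic): if each viewer passes an unmoderated post on to more than one other person on average, its reach grows exponentially, while even modest friction that pushes that rate below one makes it fizzle out.

```python
def total_views(seed_views: int, reshare_rate: float, generations: int) -> int:
    """Toy branching model: each viewer reshares to `reshare_rate` others on average."""
    total, current = seed_views, seed_views
    for _ in range(generations):
        current = int(current * reshare_rate)
        total += current
    return total

# Unmoderated spread behaves like a contagion once the rate exceeds 1...
print(total_views(seed_views=100, reshare_rate=1.8, generations=10))  # ~80,000 views
# ...while a rate below 1 (e.g. due to moderation friction) dies out quickly.
print(total_views(seed_views=100, reshare_rate=0.8, generations=10))  # ~450 views
```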
What other factors contributed to the incident?
- Availability Bias: hateful speech and misinformation appeared on Facebook so often that, through sheer repetition and availability, people started believing the lies.
- Neglect from Facebook HQ: because the company neglected to moderate the platform in Myanmar and focused instead on supposed "growth", the availability bias went unchecked.
This means that misalignment can be caused by (human) inaction.
- Lack of Fact-checking in the Algorithm: how can you classify misinformation if you never check for it? The algorithm generally does not; it cares about engagement and "trends" (see the toy sketch after this list).
- Morality in Humans: sounds like a good topic to discuss over an evening cup of tea.
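To make the fact-checking point concrete, here is a toy sketch in Python (purely hypothetical; this is not Facebook's actual ranking code). An engagement-only score has no notion of truth, so the most inflammatory post wins; a fact-checking signal can only matter if it enters the objective, for example as a heavy demotion.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    flagged: bool  # has a (hypothetical) fact-checking pipeline flagged it?

def engagement_score(post: Post) -> float:
    """Toy engagement-only objective: nothing here asks whether the post is true."""
    return post.likes + 3 * post.shares

def moderated_score(post: Post) -> float:
    """Same objective, but flagged misinformation is heavily demoted."""
    score = engagement_score(post)
    return score * 0.01 if post.flagged else score

feed = [
    Post("Local weather update", likes=40, shares=2, flagged=False),
    Post("Inflammatory rumor about a minority group", likes=900, shares=400, flagged=True),
    Post("Fact-checked news report", likes=120, shares=15, flagged=False),
]

# Ranked by engagement alone, the rumor sits at the top of the feed.
print([p.text for p in sorted(feed, key=engagement_score, reverse=True)])

# With the misinformation penalty in the objective, it sinks to the bottom.
print([p.text for p in sorted(feed, key=moderated_score, reverse=True)])
```

The point is not that a multiplier fixes anything; it is that veracity has to appear somewhere in the objective before the system can act on it.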
How did the incentives of the people involved affect the emergence of the fault?
Myanmar was becoming a more "open" country, which created a demand for social connection and networking, and Facebook set out to supply exactly that.
It looked like a win-win situation: Facebook got growth, and the people of Myanmar got unmoderated discussion and news, content that had the capacity to ignite a country-wide fire.
And people wanted to connect. International bodies, too, ignored everything happening inside the country and focused instead on its good-looking exterior.
Result? A failed algorithm driven by primitive social instincts.
More such examples
- Anti-vaccine conspiracies
- WhatsApp "university": rumors and unscientific notions forwarded mainly to confirm people's preconceived notions (although this is arguably more of a social phenomenon than a technical one).
- Misinformation during elections
- Pandemic misinformation: consider all the false information that circulated during COVID-19.
What do we learn?
Technology is to us what fire was to Homo erectus: helpful, but very, very dangerous if not handled carefully.