
Failures in Complex Software Systems

a case study.

#technical #people #rationality 2 min read

[Hero image generated by deepai.org]

This is an exercise for Session 1 of the AI Alignment Course: a case study on "Disinformation on Facebook and the Rohingya genocide".

For context, watch this video.


Prompts

What was the purpose and use-case of the software system?

Around 2011, Myanmar was emerging from the choke-hold of military rule, opening up to the world at ever-increasing speed. People started using mobile phones, which had been rare just a few years earlier.

These phones, preloaded with Facebook, let people connect to the internet with little effort. This was supposed to connect the people of the country to one another and to the outside world.

Facebook wanted to spread its services, gaining more ads and revenue as use in the country increased. Facebook became the prime source of information, and that in itself was a major cause for concern.

What were the faults, and how severe were the consequences?

Buddhist extremists and purveyors of hate speech used the platform to spread hatred of the Rohingya Muslims. And Facebook's algorithm amplified it.

Because Facebook did little to no moderation of these activities, the posts spread like wildfire. That let the extremists exploit people's biases and feed them misinformation. As a consequence, almost a million Rohingya are now refugees, failed by their country and their people.

This ethnic cleansing was not directly caused by Facebook; Facebook instead proved to be the catalyst in the reaction.
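To make the failure mode concrete, here is a toy sketch of purely engagement-driven ranking. It is not Facebook's actual system; the `Post` fields and the weights are made up for illustration. The point is that the objective contains no notion of truth or harm, so whatever provokes the strongest reactions rises to the top.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    shares: int
    comments: int

def engagement_score(post: Post) -> float:
    # Hypothetical weights: shares and comments count more than likes
    # because they push the post into more feeds.
    return post.likes + 3 * post.comments + 5 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # Sort purely by engagement. Nothing here asks whether the content
    # is true, hateful, or dangerous.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Local festival photos", likes=120, shares=2, comments=10),
    Post("Inflammatory rumor about a minority", likes=80, shares=60, comments=90),
])
print(feed[0].text)  # the rumor wins: it generates the most engagement
```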

What other factors contributed to the incident?

How did the incentives of the people involved affect the emergence of the fault?

Myanmar was becoming a more "open" country, which created a demand for social connection and networking. And Facebook tried to provide just that.

It was a sort of win-win situation: Facebook got growth, and the people of Myanmar got unmoderated discussions and news, the kind with the capacity to ignite a country-wide fire.

And people wanted to connect. International bodies, too, ignored everything that was happening inside the country and focused instead on its good-looking exterior.

The result? A failed algorithm driven by primitive social instincts.

More such examples

  1. Anti-vaccine conspiracies
  2. WhatsApp "university": rumors and unscientific notions, forwarded only to confirm people's preconceived notions (although it is more of a social phenomenon than a technical one?)
  3. Misinformation during elections
  4. Pandemic misinformation: consider all the false information that circulated during COVID-19.

What do we learn?

Technology is to us what fire was to Homo erectus: helpful, but very, very dangerous if not handled carefully.

