When social media platforms were invented, no one knew the extent of the adverse effects they could have, including influencing democracies and amplifying hate speech.
This article first appeared in Digital Edge, The Edge Malaysia Weekly on August 30, 2021 - September 5, 2021
Facebook is huge, but it wants to be bigger.
In June, it closed above US$1 trillion (RM4.24 trillion) in market capitalisation for the first time, according to CNBC. It is the fifth US company to hit this milestone, alongside Apple, Amazon.com, Microsoft Corp and Alphabet’s Google.
Facebook founder and CEO Mark Zuckerberg has set his sights on the next stage of his company’s expansion, beyond the social media space and into the metaverse. In a quarterly report for investors, Zuckerberg shared his excitement about “building the next computing platform coming together to start to bring the vision of the metaverse to life”.
Essentially, the metaverse is a more embodied concept of the internet, one that would rely heavily on the augmented and virtual reality technologies being developed at Facebook Reality Labs. The team believes these sci-fi-esque innovations “will become as universal and essential as smartphones and personal computers are today”.
But before conversations can start about what the future would look like in 3D, certain problems in Facebook’s current 2D situation need to be addressed.
Digital Edge speaks to student activists A and N about the situation in Myanmar, which exposes glitches in the algorithms Facebook designed to eliminate hate speech and nefarious activity on its platform.
Only their initials will be used to protect their identities, as both are coordinators of Students for Liberty Myanmar, a network of pro-liberty students.
In February, thousands of people took part in a street protest in Yangon against the military’s coup d’etat. Following the military’s crackdown on the protestors, about 300 participants were detained and beaten, among them A himself.
A was arrested by the military and spent a total of 22 days in prison. During that time, he was also transferred to the infamous Insein Prison, which has a poor human rights record.
He does not know the real reason for his release but says there were several factors at play. “The most important one is that all the people detained were university students, among them children of high-ranking military officials. So, they did not want to keep us there any longer.”
Facebook has admitted in the past that it played a role in inciting violence during the military’s genocidal campaign against the Rohingya Muslim minority, and has said in numerous statements that it will continuously improve the community standards of its platforms.
A recent report by rights group Global Witness has found, however, that there are still large failures in Facebook’s content moderation systems.
The research suggests that, “despite protestations and public statements to the contrary, Facebook is not only failing to remove content that breaches its own rules, but its algorithms are actively promoting such content”.
The rights group found “dozens of instances of military propaganda, much of which also appears to be misinformation”. Essentially, the social media platform was promoting military content at a time when large numbers of people were being killed.
This propaganda included content that incited and supported violence against civilians, glorified the suffering or humiliation of others, and spread misinformation that could lead to physical harm.
A says: “The ironic thing is, [Facebook] did develop its software, but not correctly. Whenever we used the word ‘Tayoke’, which means Burmese Chinese people in the Burmese language, we get banned for a month. It’s not even a provocative word.
“Meanwhile, I don’t think anyone who supports the military and the killings of the protestors got banned. At one point, I got really worried about the situation: Was Facebook biased in favour of the army?”
State-owned media was banned, A continues, but some pro-military pages and fake news accounts are still up. “Extremist posts like ‘kill them all, shoot them all’ are still there. I don’t know what’s wrong with the algorithm.”
The situation is made more dire considering how influential Facebook is in Myanmar.
The platform is synonymous with the internet, N says. “Soldiers would use Facebook to follow prominent protestors and track their location based on their photos. We have to be very careful of what we write and post. Soldiers can sue and detain whomever they want. A lot of celebrities have been arrested simply because they post photos of themselves protesting.”
Almost half the country’s population is estimated to use Facebook, the Global Witness report adds. “Mobile phones come pre-loaded with Facebook and many businesses do not have a website, only a Facebook page,” the report continues.
Because of the platform’s wide-scale use, Facebook is often used simply to gather information. “Since public news outlets are banned, we use Facebook to know where the protests are happening, how many people died in the protest and so on. We use other social media alternatives to spread our message,” says N.
The activists say they use Twitter for its worldwide trends features and chat service Telegram to share ready-made content that can be easily reposted. “Sometimes we also use secret conversations — on Signal or Messenger — that are deleted when the time is up. We always make sure to use virtual private networks (VPNs) whenever we are online.”
Sometimes, protestors would also use photos to confuse the military and find out whether they were being tracked. “Protestors would post photos of themselves in different towns, where the surrounding areas would be visible, without saying the actual location. But since it’s an old photo from a different place, it will allow them to know whether they are being tracked if the military goes to that location.”
In cases of internet shutdowns or nationwide electricity cut-offs, Bluetooth-based communication would be used. There have also been instances where spyware was installed on the phones of arrested protestors. “In some cases, they can even recover deleted photos. It may seem like there’s no way to communicate safely, but we try to do what we can,” says N.
Among other things, the rights group recommends legislation to regulate Big Tech companies, including “their use of secret algorithms that can spread disinformation and foment violence”.
The amplification of hate in Myanmar is one of Facebook’s many issues.
Another example is the debate over Facebook’s role in the Delhi riots, in which India’s Supreme Court made reference to the platform’s role in Myanmar and Sri Lanka. The court also referenced Facebook’s role in the 2016 US presidential election as well as the Brexit referendum.
That episode, commonly referred to as the Cambridge Analytica scandal, involved a data firm improperly accessing the data of millions of Facebook users and using it for targeted ads.
No one foresaw, when social media platforms were invented, the full extent of the adverse effects they could have, from influencing democracies to amplifying hate speech. What is clear, however, is that these platforms need to do more to mitigate harm.
As recently as early this month, Facebook was reported to have banned the personal accounts of academics researching ad transparency and misinformation on the platform.
“Facebook suspended my account and the accounts of several people associated with Cybersecurity for Democracy, our team at New York University,” tweeted Laura Edelson, a researcher involved in the project.
Over the last several years, the team has used its access to “uncover systemic flaws in the Facebook Ad Library, identify misinformation in political ads including [those] sowing distrust in our election system, and to study Facebook’s apparent amplification of partisan misinformation”, Edelson adds.
By suspending the accounts, she says, “Facebook has effectively ended all this work. It has also effectively cut off access to more than two dozen other researchers and journalists who get access to Facebook data through our project, including our work measuring vaccine misinformation … The work our team does to make data about disinformation on Facebook transparent is vital to a healthy internet and a healthy democracy”.