My news feed versus your news feed

The reality of how disinformation is created and dispensed through digital platforms is complex and rooted in the depths of human behaviour

Of the approximately 7.7 billion people on earth, around 3.4 billion are active social media users, according to a report on global digital trends published by Hootsuite and We Are Social. In Pakistan alone, there are over 37 million active social media users, accounting for 18 per cent of the population. While the world is unarguably more connected through digital media platforms than ever before, modern-day definitions of facts, truth, verification and information have also gained what some may consider an alarming abstraction. This leads to confusion about what can be categorised as verifiable truth and what falls within the realm of disinformation. While it is easy to blame the algorithm-based business models of platforms such as Facebook and Twitter for overloading users with information in order to generate ad revenue, the ground reality of how disinformation is created and dispensed through digital platforms is complex, multi-faceted and rooted in the depths of human behaviour.

What digital platforms have done is increase the volume and speed at which all information is disseminated. From thought-provoking human-interest stories and news to hate speech, DIY tutorials and puppy videos, these social websites act as bottomless vats churning out information at the speed of light. How then do we identify, understand and counter the spread of disinformation?

The first step is accepting that while we may blame social media for enabling the mass dissemination of fake news, it is created by human intent to spread lies and untruths. According to Badar Khushnood, co-founder of Bramerz, who has previously worked as a consultant for Google, Facebook and Twitter, "Social media is like a knife, you can use it as a tool for cutting fruit and vegetables or you can use it as a weapon for murder. It is nothing more than a medium whose use is solely at the user’s discretion". Usama Khilji, director of Bolo Bhi - a digital rights advocacy organisation - states that "disinformation and fake news are essentially a human issue, one that predates social media and has been prevalent on all types of media."

Hence, it is at the intersection of human propensity for propaganda and a digital platform’s ability to share information that fragments of fake news create a false narrative, which constructs an inaccurate context out of which uninformed human experiences and opinions are shaped. According to the Computational Propaganda Research Project at the Oxford Internet Institute, "the lies, the junk, the misinformation of traditional propaganda are widespread online."

Disinformation may be categorised into three identifiable forms: misrepresentation, fake news and alternative facts. Misrepresentation occurs when most of the facts remain true but an essential detail is false. After the 2012 Rawalpindi riots, a picture of three dead bodies in a mosque circulated on social media; while more than three deaths were reported in the incident, the photo was actually from a blast in Kabul. The photo did not contradict the reported reality, but it misrepresented the bodies as victims of the riots. Fake news, on the other hand, is completely ungrounded in fact. According to Sadaf Khan, co-founder of Media Matters for Democracy, an example of this is fake accounts that mimic the official Twitter handle of Inter-Services Public Relations (ISPR) and spread falsehoods about security alerts through forwarded messages. While it is true that ISPR does communicate matters of public safety through social media, the alerts circulated through these fake accounts are not genuine.

Most complicated of these categories is that of alternative facts. In a hyper-connected world, one people’s widely held beliefs may be directly opposed to another people’s broadly accepted set of facts. When companies such as Facebook and Google began operations around the world, they chose to adopt the law of the land for the people of that land. However, as Khushnood explains, "one nation’s freedom fighter is another nation’s terrorist" and "the onus of propelling freedom of speech compels multi-national platforms to allow various narratives to exist, provided they do not infringe the human dignity and respect of any individual". This is easier said than done. For example, when Facebook gained a physical and legal presence in India, the Indian government insisted it show Jammu and Kashmir not as disputed territory but as a part of India, whereas Pakistan maintained that the map should represent the UN-declared disputed territory and the Line of Control. In the absence of a supra-regulatory authority that could resolve the issue, Facebook now shows two different maps of the South Asian region to users in Pakistan and India, each based on their version of the story. This brings to light the human difficulty of agreeing on a common set of facts, and how history is representative of which side is telling the story.

For Khushnood, this is not an issue for digital platforms to resolve. "Facebook does not create any content, it is not advocating for a certain point of view to be accepted. It is merely providing a platform through which users democratically exchange and publish their views." Khan also notes that the issue is a lack of media literacy. "People can’t decipher contradictions presented as information, hence we must go back to the fundamentals: difference of opinions exists, and that’s okay."

While falsehood and the desire to claim access to the ‘only’ truth clearly originate within human nature, the way that digital platforms may encourage them is, knowingly or unknowingly, built into the algorithms of these sites. Khilji refers to this as "the notorious echo chamber effect", where one consumes the information one is most likely to agree with.

Sadaf Khan explains that "the more pages, posts and tweets you like or follow the more the host platform knows about you, your preferences, and political affiliations. In order to keep you engaged, the algorithm is most likely to show you what you would like to see, thereby creating a custom echo-system for you. For example, if you are a PTI supporter then posts hailing Imran Khan as the saviour of the Ummah will appear on your newsfeed, and if you are a PML-N supporter you are more likely to be presented with posts championing Maryam Nawaz as the saviour of democracy."

When a piece of disinformation gains momentum within a certain echo-system, it is quite likely to be accepted as the truth because like-minded people want it to be true. Khilji warns that such a spread of misinformation can have disastrous effects. "Due to low literacy rates, and even lower media literacy levels, the Pakistani public, like in any emerging economy, has a complex relationship with information on social media. There is a fascination with the easy availability of information, but also a penchant to believe whatever information one receives on WhatsApp and social media."

However, he goes on to say that the interactive nature of these platforms and their ability to propagate counter information is good news. "In order for fake news to stop being so damaging, there need to be more opportunities for counter speech."


The writer is a staff member and can be reached at syedamehrm@gmail.com
