This conference coverage report was produced for the Media and Mass Atrocity roundtable event hosted at Carleton University in 2017.
___
Millions of Africans are connecting to the internet on mobile phones, often using slow networks with limited data. Social media sites such as Twitter, which uses relatively little bandwidth, and Facebook, which has introduced streamlined sites in Africa, offer a new source of news. Social media is popular among African citizens, independent NGOs, political parties and opposition groups, and it can be highly politicized.
As panellist and University of Copenhagen professor Mette Mortensen mused during the Social Media — The New Media session at Carleton University’s Media and Mass Atrocity roundtable event, social media has been applauded for its democratic promise. However, skeptics also warn it can be used as a weapon for anti-democratic forces. Panellist and Digital Intelligence Lab researcher Nick Monaco went on to explain to the audience of students, international journalists and academics that even governments are using “patriotic trolling” tactics to advance their own ends.
Monaco presented his research on state-sponsored trolling, or patriotic trolling, to the roundtable during the Dec. 3 session. He defined patriotic trolling as “the use of targeted, state-sponsored online hate and harassment campaigns leveraged to silence and intimidate individuals.”
He and his research partner have been documenting patriotic trolling qualitatively by drawing on ethnography and history, conducting a literature review and interviewing victims around the world. Quantitatively, they are also conducting a social network analysis, relying in particular on eigenvector centrality to determine which social media users are most influential.
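Eigenvector centrality is a standard network measure rather than anything specific to Monaco’s study: an account scores highly when the accounts connected to it score highly themselves. As a rough, self-contained sketch (the interaction data and account names below are invented for illustration, not drawn from the researchers’ dataset), it can be computed with the networkx library in Python:

```python
# Minimal sketch, not the researchers' actual pipeline: eigenvector centrality
# on a small, made-up interaction network using networkx.
import networkx as nx

# Hypothetical edges: an edge means two accounts interacted (retweets, mentions).
edges = [
    ("journalist", "troll_1"), ("journalist", "troll_2"),
    ("troll_1", "bot_a"), ("troll_1", "bot_b"),
    ("troll_2", "bot_c"), ("troll_2", "bot_d"),
    ("troll_1", "troll_2"),
]
G = nx.Graph(edges)

# An account scores highly when it is connected to other high-scoring accounts,
# which is how the method surfaces the most influential users in a network.
centrality = nx.eigenvector_centrality(G)

# Rank accounts from most to least influential.
for account, score in sorted(centrality.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{account}: {score:.3f}")
```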
The pair have documented patterns of patriotic trolling in countries including Azerbaijan, Bahrain, Ecuador, Turkey, the Philippines, Venezuela, and, to some extent, the U.S.
Monaco said a common pattern is the use of social bots, computer programs that can be used to amplify attack campaigns.
He said another common patriotic trolling pattern is the use of memes, cartoons and disfigured or doctored images. Although memes and the like may commonly be perceived as innocuous, Monaco warned that images used and abused for patriotic trolling are “virulent and disturbing.” For example, he said National Review reporter David French was recently attacked by extreme alt-right trolls, who sent him images of his daughter in a gas chamber.
Perhaps the most salient pattern of all is the similar profile of targets, said Monaco. He listed journalists, politicians, business people and public servants as frequent targets, and added that targets are overwhelmingly women, “so there is a misogynistic, sexist aspect to these attacks.”
Other patterns identified include fake news stories, libelous accusations, vitriolic rape and death threats, and the weaponization of state surveillance systems.
“We see this as a new form of human rights abuse, and seek to develop an initial attribution framework and policy recommendations through which to hold states accountable and start conversation on solutions.” — Text from Monaco’s presentation slide deck
Monaco said state-sponsored campaigns in Rwanda show signs of patriotic trolling, such as the “doxing” of Diane Rwigara. Rwigara was the only female presidential challenger to the incumbent Paul Kagame in the August elections, and days after she announced her candidacy, nude photos of her were leaked online. Rwigara was later disqualified from the race, accused of electoral fraud. She has since publicly stated that the claims against her are false and that the nude photos were doctored.
Monaco also noted an admission from the presidential office in Rwanda that a harassing account was being run from inside. The account user was relieved of duty, “but this is still a revealing admission nonetheless,” he said, adding that his fellow panellist and Globe and Mail correspondent Geoffrey York can attest to receiving harassment in Rwanda.
“If you want to stop someone from reporting about a corruption scandal or a case of political wrongdoing, you could try to intimidate me, let’s say, by sending an anonymous threat to me on email,” said York. “I might just ignore it or report you to the police, but if you target me on social media, it’s much more difficult to ignore. Media and politicians have large audiences, often large communities of followers, so it’s much more intimidating and more difficult to ignore, and more damaging.”
Looking forward, Monaco said “we can expect some more sophistication in the trolling attacks that are occurring in Rwanda.” Rather than censoring opposing voices through outright repression, he said, content generation may itself be used as a censorial weapon.
York added that the scandal surrounding the British multinational PR firm Bell Pottinger offered an example of social media being used to fuel racial hatred. He said the company was hired in South Africa by the wealthy Gupta family and President Jacob Zuma’s son to create a campaign against “white monopoly capital,” effectively deflecting attention from their own corruption.
York did, however, end on a more upbeat note, explaining that, on the flip side, the use of social media eventually helped expose the campaign and Bell Pottinger was brought to justice.
During the question and answer portion of the panel, observer Jean-Paul Nyilinkwaya of PAGE-Rwanda, a Montreal-based group for relatives and friends of genocide victims, asked Monaco and York for data about the abuse of social media by non-governmental actors.
While Monaco and York did not have numbers on hand, York said, “You’re absolutely right that all sides of the debate use fake news.”
Observer Claire Whalen also posed a question to the panellists, pointing out that the president of the United States retweets propaganda.
Monaco said the patterns he and his research partner are seeing “are not only happening in autocratic regimes, they’re happening in democracies,” and added that he thinks their research will contribute to solving social media problems in the United States.
Appropriately, the panellists also fielded a question from Twitter. Matthew Bennett, editor of The Spain Report, asked how to get people to pay more attention to neutral truth and reporting.
@MediaAtrocity If there are two (or more) players using propaganda, trolls and fake news in a conflict, how do you get readers or viewers to want to pay more attention to a more neutral truth and reporting?
— Matthew Bennett (@matthewbennett) December 3, 2017
Monaco responded that a preliminary solution is to ensure bots are transparent online. On Twitter, he said, that would be a “super feasible, easy thing to do” as far as infrastructure is concerned.
“I think if you make bots transparent and you see that bots are either harassing you or promoting a message, I think that helps people pay more attention,” he said. “Insofar as people can see that this is a machine promoting a message versus a real human promoting a message.”
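Bot transparency of the kind Monaco describes would be a platform-level change, not something his research itself implements. As a loose, purely hypothetical sketch of what a visible label could look like, assuming accounts carried a self-declared “automated” flag (the flag and names below are invented for illustration, not an existing Twitter API):

```python
# Hypothetical illustration of bot labelling: if a platform exposed a
# self-declared "automated" flag on accounts, clients could surface it
# next to each post. All field names here are invented for the example.
from dataclasses import dataclass

@dataclass
class Account:
    handle: str
    is_automated: bool  # hypothetical platform-provided transparency flag

@dataclass
class Post:
    author: Account
    text: str

def render(post: Post) -> str:
    """Prefix posts from self-declared bots with a visible label."""
    label = "[BOT] " if post.author.is_automated else ""
    return f"{label}@{post.author.handle}: {post.text}"

feed = [
    Post(Account("newsdesk", False), "Election results expected tonight."),
    Post(Account("amplifier_01", True), "RT: Election results expected tonight."),
]
for post in feed:
    print(render(post))
```

The point of the sketch is simply that, once such a flag exists, showing it to readers is a small client-side change, consistent with Monaco’s view that the infrastructure side is “super feasible.”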