
a hand tapping the YouTube play button on a mobile phone in a dark room

Conspiracy theories thrive on YouTube, new study finds

5 October 2022
YouTube urged to do more to prevent misinformation
A new study by social media researchers at the University of Sydney and QUT has found conspiracy theories are thriving on YouTube despite the platform's efforts to harden posting rules and guidelines.

The study, published in the Harvard Kennedy School Misinformation Review, a global leader in misinformation research, examined YouTube comments on Covid-19 news videos featuring American business magnate and philanthropist Bill Gates and found that conspiracy theories dominated.

The comments covered topics such as Bill Gates' hidden agenda, his role in vaccine development and distribution, his body language, his connection to convicted sex offender Jeffrey Epstein, 5G network harms, and ideas around Gates controlling people through human microchipping and the 'mark of the beast'.

The results suggest that during the Covid-19 pandemic, YouTube's comments feature, just like the anonymous message boards 4chan and 8kun, may have played an underrated role in the growth and circulation of conspiracy theories.

The findings support previous studies that argue misinformation is a collective, socially produced phenomenon.


Dr Joanne Gray, digital cultures

Dr Joanne Gray, a University of Sydney researcher on digital platform policy and governance, said: "We found that the process of developing a conspiracy theory is quite social. People come together and socially 'join the dots' or share new pieces of information that they use to build conspiratorial narratives. The social media platforms' current approaches to content moderation (which are often automated) are not good at detecting this kind of social conspiracy theorising."

Co-authors of the study include Lan Ha and Dr Timothy Graham from Queensland University of Technology.

Conspiracy theories on YouTube


Conversational strategies used in sample comments on YouTube.


During the Covid-19 pandemic, YouTube introduced new policies and guidelines aimed at limiting the spread of medical misinformation about the virus on the platform.

But the study found the comments feature remains relatively unmoderated and has low barriers to entry for posting publicly, with many posts violating the platform's rules, for example, comments that proposed vaccines are used for mass sterilisation or to insert microchips into recipients.

The researchers studied a dataset of 38,564 YouTube comments drawn from three Covid-19-related videos posted by the news media organisations Fox News, Vox, and China Global Television Network. Each video featured Bill Gates and, at the time of data extraction, had between 13,000 and 14,500 comments posted between April 5, 2020, and March 2, 2021.

Through topic modelling and qualitative content analysis, the study found the comments for each video to be heavily dominated by conspiratorial statements.


22 topics and the percentage of affiliated comments.


Some comments were considered "borderline content," which YouTube defines as content that "brushes up against" but does not cross the lines set by its rules.

Examples of borderline content include comments that raise doubts about Bill Gates's motives in vaccine development and distribution and the suggestion that he seeks to take control in a "new world order." These comments implied or linked to theories about using vaccines to control or track large populations of people.

YouTube comment moderation

The researchers said the platform should consider design and policy changes that respond to the conversational strategies used by conspiracy theorists, to prevent similar outcomes for future high-stakes public interest matters.

Three common conversational strategies include: strengthening a conspiracy theory ("joining the dots of disparate information"), discrediting an authority ("casting doubt") and defending a conspiracy theory. These comments can be amplified when readers "like" the comment.

"YouTube almost completely lacks the community-led or human moderation features that are needed to detect these kinds of strategies," said Dr Gray.

The researchers said that for YouTube to address this problem adequately, it must both attend to the conversational strategies that evade automated detection systems and redesign the space to provide users with the tools they need to self-moderate effectively.

News publishers and YouTube


The average numbers of likes and replies within each topic.


The study urges YouTube to develop best-practice content moderation guidelines for news publishers that outline the strategies used by conspiracy theorists that are invisible to automated moderation. In addition, news publishers could turn off comments on high-stakes public interest videos to ensure they do not exacerbate the circulation of conspiracy theories.

"A major implication of our study is that YouTube needs to redesign the space to provide social moderation infrastructure," said Dr Gray. "Otherwise, the discursive strategies of conspiracy theorists will continue to evade detection systems, pose insurmountable challenges for content creators, and play into the hands of content producers who benefit from and/or encourage such activity."


Declaration: The authors declare no potential conflicts of interest with respect to the research, authorship, and/or publication of this article. YouTube data provided courtesy of Google's YouTube Data API. Dr Timothy Graham is the recipient of an Australian Research Council DECRA Fellowship (project number DE220101435). Top Photo: Adobe Stock Images

Media Contact

Elissa Blake

Media Adviser
