From QAnon to crisis actors, stolen elections to Ivermectin, fake news and conspiracy theories thrive in our connected, online media. It seems the more outlandish or absurd the premise, the quicker it spreads, with serious real-world consequences.
Scott DeJong wants to understand why online falsehoods go viral and how they shape our relationships with social media — and each other. The communication studies PhD student created a board game that pits conspiracy theorists and their online troll amplifiers against moderators and educators. DeJong says Lizards & Lies, available as a free download, is designed as a model to help us grasp the movement of false content and the frustrations those fighting against it can experience.
As he explains it, the game explores the widely held supposition that the spread of conspiracy theories and misinformation online constitutes a new front in a global war on truth. It's a precept he feels is oversimplified, one that skims past the vital role human users play, willingly or not, in propagating fake news.
“I wanted to see if these analogies function when we simulate them within a game,” he says. “A game functions as a rule system, where boundaries can be set and we can explore what happens within those spaces.”
Pushing lies and pushing back
Lizards & Lies is a three-round, two- or four-player asymmetrical game, meaning that each player has a distinct role. The gameplay revolves around the run-up to a theoretical election, with one side spreading conspiracies (specifically, that reptilians walk among us and that birds are spy drones) and the other doing its best to dispel them. Players can take on the role of conspiracy theorist, Edgelord (a troll who amplifies the theory), fact-checker or digital literacy educator. The two sides battle to see whether the conspiracists can push their lies across the networks or whether their opponents can stop them.
“The characters are designed so that they reflect challenges in the real world,” DeJong explains. “Conspiracy theorists are trying to build up little pockets of supporters and using them to proliferate out. The Edgelords are less focused on the conspiracy and more on where users are vulnerable; they are looking to get them to react to their content.” At the same time, he notes, educators will find that education takes time, while the moderators are overworked: the amount of content they must deal with is endless.
“Their goals are meant to reflect the challenges that everyday people outside the game have by representing them inside of it.”
Players collect points at the end of each round; the totals are tabulated after the final round to determine which side won. Coordinated teamplay can increase the chances of victory, he adds.
DeJong says the game is supposed to be fun, but also educational. A research project that grew out of classwork, Lizards & Lies works to clarify the idea that social media operates as an ecosystem, with many different actors and many different spaces, each influencing the others. An idea that appears on Reddit, for instance, can quickly migrate to TikTok, Instagram or Facebook.
“The game looks at how the existing analogies failed to demonstrate how fake news and misinformation proliferate,” he says. “It’s not just an algorithm on a platform that’s responsible for spreading lies. It’s this ecosystem that users are a part of, where what we say and do and what we scroll past or choose to like or share all influence what is going to be popular in these spaces.”
Watch a video of Scott DeJong explaining his research: https://www.youtube.com/shorts/qjy3GzRRsA4
— By Patrick Lejtenyi
— Concordia University