There are two ways to try to understand the impact of content moderation and the algorithms that enforce it: listening to what the platforms say, and asking creators themselves. In Tyler's case, TikTok apologized and blamed an automatic filter that was set up to flag words associated with hate speech but apparently could not understand context.
Brooke Erin Duffy, an associate professor at Cornell University, teamed up with graduate student Colten Meisner to interview 30 creators on TikTok, Instagram, Twitch, YouTube, and Twitter around the time Tyler's video went viral. They wanted to know how creators, especially those from marginalized groups, navigate the algorithms and moderation practices of the platforms they use.
What they found: Creators invest a great deal of work in understanding the algorithms that shape their experiences and relationships on these platforms. Because many creators work across multiple platforms, they have to learn the hidden rules of each one. Some adapt their entire approach to producing and promoting content in response to the algorithmic and moderation biases they encounter.
Below is our conversation with Duffy about her forthcoming research (edited and condensed for clarity).
Creators have long argued that algorithms and moderation practices affect their visibility on the platforms that made them famous. What surprised you most while doing these interviews?
We had the sense that creators' experiences are shaped by their understanding of the algorithm, but after doing the interviews, we began to see how deeply [this impact] runs through their daily lives and work … the amount of time, energy, and attention they devote to learning these algorithms and investing in them. They have this kind of critical awareness that these algorithms are understood to be uneven. Despite that, they still pour all this energy into the hope of understanding them. It really draws attention to the unequal nature of the creator economy.
How often do creators think about the possibility of being censored, or of their content failing to reach their audience because of suppression practices or algorithmic moderation?
I think it fundamentally structures the content creation process and the content promotion process as well. These algorithms change at will; there is no transparency. In many cases, there is no direct communication from the platforms. And that fundamentally affects not just your experience, but your income.