Last summer, a TikTok creator named Ziggi Tyler posted a video calling attention to a disturbing issue he had discovered in the app’s Creator Marketplace, a feature that connects creators with brands interested in paying for sponsored content.
Tyler found that his Marketplace profile would not accept phrases like “Black Lives Matter” and “supporting Black excellence,” while terms like “supporting white excellence” and “white supremacy” were permitted.
If you frequent TikTok, you’ve probably seen creators describe similar incidents. To understand how content moderation and the algorithms that enforce it affect creators, you can take the platforms at their word, or you can ask the creators themselves.
In Tyler’s case, TikTok issued an apology and blamed an automatic filter that had been set up to flag words associated with hate speech but, the company said, could not understand context.
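TikTok has not published how that filter works, but the failure mode it described is easy to reproduce with a naive keyword-combination check. The sketch below is a purely hypothetical illustration: the word pairs in `FLAGGED_PAIRS` are assumptions invented for this example, not TikTok’s actual rules.

```python
# Hypothetical sketch of a context-blind keyword-combination filter.
# The pairs below are invented for illustration; TikTok's real rules are not public.
FLAGGED_PAIRS = {
    ("black", "lives"),       # assumed over-broad rule
    ("black", "excellence"),  # assumed over-broad rule
}

def is_flagged(text: str) -> bool:
    """Flag text if any configured word pair co-occurs, with no notion of context."""
    words = set(text.lower().split())
    return any(a in words and b in words for a, b in FLAGGED_PAIRS)

print(is_flagged("supporting Black excellence"))  # True  -- blocked
print(is_flagged("supporting white excellence"))  # False -- permitted
print(is_flagged("white supremacy"))              # False -- missed entirely
```

A filter like this produces exactly the asymmetry Tyler observed: outcomes depend entirely on which terms happen to be on the list, not on what the profile actually says.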
Around the time Tyler’s video went viral, Brooke Erin Duffy, an associate professor at Cornell University, and graduate student Colten Meisner conducted 30 interviews with creators on TikTok, Instagram, Twitch, YouTube, and Twitter. They wanted to learn how creators, especially those from marginalized groups, navigate the platforms’ algorithms and moderation policies.
They discovered that creators put in a lot of effort to understand the algorithms that shape their experiences and relationships on these platforms.
Because many creators work across multiple platforms, they must learn the nuances of each. In response to algorithmic and moderation biases, some creators change their entire approach to creating and promoting content.
These algorithms are arbitrary and opaque, and in many instances the platforms offer no direct communication. That has a profound impact on creators’ income as well as their experience.
Creators in the study said they interact with their communities both online and offline, trading knowledge about how to game the algorithm, what is safe to say, and what might get a post reported.
These are significant forms of collective organization. They may not resemble traditional organized labor, but they are nonetheless effective ways for creators to band together and push back, at least in part, against top-down power structures.
Part of what makes shadow banning so potent is its ambiguity: because it is impossible to conclusively establish whether any particular user has been shadow-banned, speculation flourishes. But whether or not shadow banning exists, it is worth taking seriously that some creators behave as though their visibility is being restricted as punishment.
Platforms tout the benefits for creators all over their websites, claiming that if you are talented and have the right content, you can connect with audiences and earn a great deal of money.
Creators drive substantial revenue to these platforms through data and user engagement, yet they have little influence over content moderation policies or how unevenly those policies are applied.
Radical transparency may sound idealistic, but creators should have more say in decisions that significantly affect their businesses.