Using code words is nothing new. The military and banks have long used them for security, keeping critical information out of the wrong hands, which is a good thing. But what about using code words to stop an algorithm from banning certain topics? Is it a good idea to use coded terms on social media that only certain groups can understand?
TikTokers repurpose common or newly coined words as code terms to get around the algorithm so that TikTok does not suppress their videos, which is understandable. Political bias on social media is widespread: platforms censor users who discuss certain topics, such as COVID-19 and political issues. One organization even promoted AI tools to censor online speech, silencing discussion or debate of those topics. This growing censorship, combined with uncertainty about which network will ban which videos or tweets, can have a negative impact.
Then again, users also employ code words to hide what they are actually talking about, either to fit in with a particular group or to voice their opinions only to those meant to hear them. People are more likely to join a discussion or share their views with those who agree with them, whether in person or online. They keep others out for fear of rejection or disapproval. Peer approval and social status on social media also affect adolescents' mental health.
Taking that into consideration, is it a good idea to use a secret language online? Most likely not, especially when the posts involve explicit or suicidal content. Coded language makes harmful content difficult to detect, which can have serious consequences, and it is unsafe because it may attract predators seeking to victimize vulnerable users. Additionally, changing the definition of a widely used word confuses people. For instance, "racist" means believing that a particular group (based on skin color or ethnicity) is superior or inferior to others. Yet the term has lost its meaning and logic: an anti-racism training program offered to Google employees claimed that even 3-month-old infants are racist, and others have claimed that math itself is racist.
Source:
The ‘algospeak’ code words TikTokkers use to post about sex, self-harm