Pizza slices, cupcakes and carrots are just a few emojis that anti-vaccine activists are using to speak in code and continue to spread misinformation about COVID-19 on Facebook.
Bloomberg reported that Facebook moderators failed to remove posts shared in anti-vaccine groups and pages that would normally be considered violating content, were it not for the coded language. One group Bloomberg reviewed, called "Died Suddenly," is a meeting place for anti-vaccine activists who claim to mourn loved ones who died after receiving vaccines, which they refer to as "eating the cake."
Facebook owner Meta told Bloomberg that it has removed over 27 million pieces of content for violating its COVID-19 misinformation policy, describing enforcement as an ongoing process, but it declined to tell Ars whether posts relying on emojis and coded language were considered to be in violation of that policy.
According to Facebook’s Community Standards, the company says it will “remove misinformation during public health emergencies,” such as the pandemic, “when public health authorities conclude that the information is false and likely to directly contribute to the risk of imminent physical harm.” Pages or groups risk removal if they violate Facebook’s rules or if they “instruct or encourage users to use code words when discussing vaccines or COVID-19 to evade our detection.”
However, the policy remains vague regarding the everyday use of emojis and code words. The only policy Facebook appears to have on the books directly addressing the use of emojis as coded language falls under its community standards on sexual solicitation. So while anti-vaccine users’ emoji language may go unmoderated, anyone using “context-specific and commonly sexual emojis or emoji strings” actually risks having their posts deleted if moderators determine they are using emojis to request or offer sex.
In total, Bloomberg reviewed six anti-vaccine groups created last year in which Facebook users employ emojis like peaches and apples to suggest that people they know have been harmed by vaccines. Meta’s apparent failure to moderate this emoji language suggests that blocking coded language is not currently a priority.
Last year, when the BBC discovered that anti-vaccine groups were using carrot emojis to hide misinformation about the COVID-19 vaccine, Meta promptly removed the identified groups. However, the BBC reported that the same groups soon reappeared, and more recently Bloomberg reported that some of the groups it followed appeared to change names frequently, perhaps to avoid detection.
Beyond anti-vaccine activists, many other online users have relied on emojis to create coded language and avoid detection, not just by moderators but also by law enforcement.
The United States Drug Enforcement Administration publishes on its website a guide to emojis used to reference drug deals, including a candy bar for Xanax, a crown to designate a dealer, and a rocket to indicate high potency. This summer, The Atlantic reported that online users were increasingly relying on emojis to mask abortion discussions as access to abortion became more restricted. Much like social media platforms, law enforcement and courts also struggle with how to weigh emoji language as evidence when investigating violations, scholars have suggested.
Last month, Forbes reported on the effectiveness of using emojis and other coded “algospeak” to evade content moderation, suggesting that people make better moderators than algorithms for this type of content. However, if social platforms’ policies, like Meta’s, don’t explicitly treat emojis as violating content, which they apparently rarely do, it probably won’t matter whether the review is done by a human or a machine. Coded language likely won’t be considered a violation until platforms specifically incorporate it into their policies, as Meta has done for sexual solicitation but has yet to do for COVID-19 misinformation.