Big Tech & Body Image: Is Enough Being Done?

In September, TikTok announced that it will ban advertisements for weight loss supplements and fasting apps. The video-sharing social media platform also pledged to restrict advertisements that “promote a harmful and negative body image.” The policy change is a response to backlash from users over advertisements that promoted dangerous fasting diets to young girls; more than a third of TikTok’s users are under the age of fifteen.

Studies conducted over the past decade show that increased use of social media is strongly linked to declining mental health among US teenagers and young adults, a decline that has in turn been linked to drastic increases in teen suicide rates. Between 2009 and 2017, teen suicide rates increased 60% for ages 14 to 17, 47% for ages 12 to 13, and 46% for ages 18 to 21. The more time teenagers spend on social media, the more insecure and anxious they feel over time, and the long-term effects have the potential to devastate both their mental and physical health. Teenage girls are especially vulnerable to developing body insecurities and eating disorders as a result of the pressure to fit the standards of beauty portrayed on social media.

TikTok is not the first social media company to come under fire for promoting dangerous products and lifestyles to young adults for its own profit. For years, Instagram was notorious for promoting questionable weight loss products targeted at teens, such as appetite suppressant lollipops and fruity weight loss teas. Although Instagram now bans the promotion of such diet supplements, the policy change came only after its involvement in a federal case against the “fit tea” brand Teami. The Federal Trade Commission filed a complaint against Teami for making unsubstantiated claims about the tea’s health benefits and promoting the tea on Instagram without any scientific evidence to support those claims. Teami was fined one million dollars.

Although banning harmful content on TikTok is undoubtedly a step in the right direction, these policy changes alone may not have the desired effect given the company’s limited ability to enforce content regulation. Instagram attempted to strengthen its guidelines by limiting the types of diet products permitted for promotion and introducing minimum user age requirements for diet product ads. However, Instagram’s policies have thus far failed to serve their intended purpose. Scam diet companies like Teami have exploited loopholes in the new guidelines by substituting blatantly problematic dieting phrases with more insidious words that serve the same purpose. For example, companies easily rebrand the same diet products as promoting “wellness” rather than “weight loss” without violating Instagram’s newest guidelines. TikTok will likely face similar challenges as it attempts to bar dangerous diet companies from marketing on its platform.

The ease with which companies manipulate community guidelines raises the question of whether Big Tech truly cares about user safety at all. Were these guideline changes a genuine effort to fix past mistakes, or a forced performance to save face in light of a scandal? Worse yet is the question of whether these tech titans even have the ability to control the content on their platforms at all. Now that the power of social media is widely recognized, it is time to consider the moral and ethical responsibility these companies bear to protect their users and, ultimately, the well-being of the next generation.