On Thursday, YouTube announced that it would restrict teenagers’ exposure to videos that promote or idealize a specific fitness level or physical appearance.
The safeguard first rolled out in the United States last year and is now being extended to teenagers worldwide.
The announcement comes in response to criticism YouTube has faced in recent years over its potential to harm teenagers by exposing them to content that could promote eating disorders.
YouTube will restrict access to content that compares physical attributes and idealizes specific body types, fitness levels, and weights. It will also limit the visibility of videos that depict “social aggression” in the form of intimidation and non-contact fights.
The Google-owned platform acknowledges that a single video of this kind may not be harmful on its own, but that repeated exposure could become problematic for teenagers. To mitigate this, YouTube will limit how often videos on these subjects are recommended.
These safeguards are needed because YouTube’s recommendations are driven by what users frequently watch and interact with, which means teenagers could otherwise be repeatedly served content that complies with the platform’s policies but is harmful in aggregate.
Dr. Garth Graham, YouTube’s global head of health, said that as teens develop thoughts about who they are and their own standards for themselves, repeated consumption of content featuring idealized standards can begin to shape an unrealistic internal standard and lead some to form negative beliefs about themselves.
Thursday’s announcement came one day after YouTube introduced a new tool that lets parents link their accounts to their teen’s account to get insights into the teen’s activity on the platform. Once the accounts are linked, parents will receive notifications about their teen’s channel activity, including the number of uploads and subscriptions.
The tool builds on YouTube’s existing parental controls, which let parents set up supervised accounts for children under the age of consent for online services, which is 13 in the United States. Other social media platforms, including Snapchat, Instagram, Facebook, and TikTok, offer similar supervised accounts that connect teen users with their parents.