TikTok tests age restrictions on certain content
In an effort to strengthen safety features for minors, TikTok is beginning to test ways to age-restrict certain kinds of content in its app.
Everything is still in the early stages, but the idea is to shield younger viewers from adult content and other potentially harmful videos, according to the company.
TikTok, like other apps, has recently come under fire for how it handles user safety, particularly for children and younger teens. Last fall, the company's vice president of public policy hinted during a congressional hearing that the app was working on new ways of gating age-restricted content. The company has now revealed a few more details about what that might look like.
TikTok is working on features that would classify content based on "content maturity zones and thematic comfort zones," according to Tracy Elizabeth, TikTok's global issue policy lead.
"Once the system is completely operational, stuff that we've recognized as containing publicly mature issues might be restricted to teens," Tracy added.
Elizabeth said the company is still in the "innovation phase" and did not elaborate on how it will judge a video's maturity level, but she expects the system will eventually resemble the ratings used for movies, television, and video games.
"We're aware that there's family content, teen stuff, and adult content."
She noted that parents will also be able to adjust these settings for their children through TikTok's Family Pairing feature.
Keeping mature content away from younger users
TikTok is also developing a feature that will let creators specify whether their videos are intended for adult or younger audiences. This could help TikTok's recommendation system more accurately keep mature content out of younger viewers' feeds.
Although TikTok is currently running a small age-restriction test, the features may not be widely available for some time. They are still taking shape, according to Elizabeth. "We haven't determined how much of this we'll do yet," she explained.
Alongside these new features, TikTok has released an updated set of community guidelines. Under the new rules, suicide hoaxes and dangerous challenges get their own section, in an effort to make the policies more visible.
The company is also broadening the types of content it prohibits under its eating disorder policy. Under the new guidelines, videos that promote "disordered eating," such as compulsive calorie counting, short-term fasting, excessive exercise, and other under-recognized signs of a potential problem, will be banned.
The app's rules have also been updated to explicitly prohibit deadnaming, misgendering, misogyny, and content that supports or promotes conversion therapy programs.
The company says it has removed this type of content in the past, although those rules were not previously spelled out in its public community guidelines.