TikTok feeds can sometimes feel like the Wild West, where users can stumble into strange and occasionally risky content. TikTok is now moving to rein those feeds in. In a blog post Wednesday afternoon, the company announced a new content rating system called “Content Levels,” which it intends to begin rolling out in a first version “in the coming weeks.” TikTok said on February 1 that it was exploring ways to restrict feeds based on age; Content Levels is the first detail on what that could look like.
New Content Rating System on TikTok
App users will also gain more control over their feeds, including the ability to filter out videos containing selected hashtags. While the company says the system resembles the ratings used in the television, film, and gaming industries, it won’t immediately display ratings on individual videos; the sorting and filtering will happen in the background. “A maturity score will be assigned to the video to help prevent those under 18 from viewing it across the TikTok experience,” the company announced.
“When we find that a video contains complicated or mature themes, like fictional scenes that are too intense or frightening for young viewers, it will be scored accordingly. We’ve put a priority on improving teens’ safety first. In the coming months we will be adding new features to offer more complete filters to our users.”
TikTok made clear that its upcoming content moderation system is still in the early stages. “We also recognize that what we’re attempting to accomplish is complex, and we may make mistakes,” the company said. In the meantime, while comprehensive top-down age-based filtering is still being built, users can already restrict their own feeds: hashtags and phrases that appear in the “For You” or “Following” feeds can be filtered out, making scrolling more curated than before. The platform said these changes, along with additional efforts to diversify recommended videos, will arrive over the next few weeks.
The platform already has content policies in place. Based on user reports and the work of employees charged with reviewing posts, it prohibits specific types of video. Two former TikTok moderators brought a suit against the company in March, claiming that their work removing violent or inappropriate footage from TikTok had traumatized them. According to the suit, moderators aren’t adequately protected or offered mental health care by TikTok. That could be bad news for TikTok’s planned expansion of moderation.
Parents who claim their children were harmed or even killed as a result of TikTok content are also taking action against the app. The parent of a 10-year-old girl sued the company in late May, claiming her child died of asphyxiation after attempting the “Blackout Challenge,” which was popularized on the app. A number of parents filed similar lawsuits this month. Parents’ lawsuits based on accusations of social media addiction may become viable in California under new legislation.