The team were told to flag content from users who appeared to have autism, Down's syndrome or facial disfigurements. The company said that the policy has since been updated. "Early on, in response to an increase in bullying on the app, we implemented a blunt and temporary policy," the statement said. "This was never designed to be a long-term solution, and while the intention was good, it became clear that the approach was wrong," it continued.
And in some cases, vulnerable users whose content attracted a few thousand views would end up being listed as "not recommended." That meant the videos would no longer be chosen by the app's algorithms to appear on the main "For You" feed of public uploads. Users on the app typically share short-form videos under a minute in length, meaning moderators would likely have had very little time to determine whether a user was disabled.
A separate U.S. team is responsible for more recent moderation and data policies. It is led out of California and creates tailored rules for the American market.