In an update to its community standards page, the world's largest online social network gave users more guidance on why, for example, it might take down a post that featured sexual violence and exploitation, hate speech, criminal activity or bullying.

The Menlo Park, California-based company said it isn't changing how it regulates the content of posts, and that while some of the guidance for users is new, "it is consistent with how we've applied our standards in the past."

"People from different backgrounds may have different ideas about what's appropriate to share — a video posted as a joke by one person might be upsetting to someone else, but it may not violate our standard," wrote Monika Bickert, head of global policy management, and Chris Sonderby, deputy general counsel, in the post.