Snap’s Latest Transparency Report
At Snap, our goal is to design products and build technology that nurtures and supports real friendships in a healthy, safe and fun environment. We are constantly working to improve the ways we do that — from our policies and Community Guidelines, to our tools for preventing, detecting and enforcing against harmful content, as well as initiatives that help educate and empower our community.
We are committed to providing more transparency about the prevalence of content that violates our guidelines, how we enforce our policies, how we respond to law enforcement and government requests for information, and where we seek to provide more insight in the future. We publish transparency reports twice a year to provide insight into these efforts, and are also committed to making these reports more comprehensive and helpful to the many stakeholders who care deeply about online safety and transparency.
Today we’re releasing our transparency report for the second half of 2020, covering the period of July 1 - December 31 of that year; you can read it in full here. As with our previous reports, it shares data about violations globally during this period; the number of content reports we received and enforced against across specific categories of violations; how we responded to requests from law enforcement and governments; and our enforcements broken down by country.
As part of our ongoing efforts to improve transparency, this report also includes several new elements. For the first time, we are sharing our Violative View Rate (VVR), which is the proportion of all Snaps (or views) that contained content violating our guidelines. During this period, our VVR was 0.08 percent, meaning that out of every 10,000 views of content on Snap, eight contained content that violated our guidelines. On average, more than five billion Snaps are created using our Snapchat camera every day. During the second half of 2020, we enforced against 5,543,281 pieces of content globally that violated our guidelines.
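To make the VVR arithmetic above concrete, here is a minimal sketch. The 0.08 percent figure comes from the report itself; the function name and the raw view counts below are illustrative assumptions, chosen only to show how the proportion works out.

```python
# Worked check of the Violative View Rate (VVR) arithmetic.
# Note: the counts passed in are hypothetical, used only to
# illustrate the 8-in-10,000 proportion cited in the report.

def violative_view_rate(violative_views: int, total_views: int) -> float:
    """Return the proportion of views that contained violating content."""
    return violative_views / total_views

# 8 violating views out of every 10,000 views corresponds to 0.08%.
rate = violative_view_rate(8, 10_000)
print(f"{rate:.2%}")  # 0.08%
```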
Additionally, our report shares new insights into our enforcement against false information globally, an effort that was especially important as the world continued to battle both a global pandemic and efforts to undermine democratic institutions. During this time frame, we took action against 5,841 pieces of content and accounts for violations of our guidelines prohibiting the spread of misinformation and conspiracy theories that can cause harm.
We have always believed that when it comes to harmful content, it isn’t enough just to think about policies and enforcement; platforms also need to think about their fundamental architecture and product design. Across our app, Snapchat limits virality, which removes both the incentives for harmful and sensationalized content and the opportunities to organize around it. Our report shares more details about our product design decisions and our work to promote factual news and information to Snapchatters.
Going forward, we are focused on providing greater insight in future reports, such as breaking out subcategories of violating content. We are constantly evaluating how we can strengthen our comprehensive efforts to combat harmful content and bad actors, and are grateful to the many security and safety partners who are always helping us improve.