July 1, 2019 – December 31, 2019
Released: May 27, 2020
Updated: September 24, 2020
Snapchat Transparency Reports are released twice a year. These reports provide important insight into the volume and nature of governmental requests for Snapchatters' account information and other legal notifications.
Since November 15, 2015, our policy has been to notify Snapchatters when we receive legal process seeking their account information, with exceptions for cases where we are legally prohibited from doing so, or when we believe there are exceptional circumstances (like child exploitation or an imminent risk of death or bodily injury).
As technology and platforms have evolved, so too has the practice of providing important information to the public. Starting with this Transparency Report, we are providing insights into the volume and nature of accounts reported on Snapchat for violations of our Terms of Service or Community Guidelines.
We believe these disclosures provide our community with useful information on the volume and types of content reported and enforced on Snapchat. This knowledge will assist us in creating effective solutions to address harmful content.
United States Criminal Legal Requests
Requests for User Information pursuant to U.S. legal process.
| Category | Requests | Account Identifiers | Percentage of requests where some data was produced |
| --- | --- | --- | --- |
International Government Information Requests
Requests for User Information from government entities outside the United States.
| Country | Emergency Requests | Account Identifiers for Emergency Requests | Percentage of emergency requests where some data was produced | Other Information Requests | Account Identifiers for Other Requests | Percentage of other information requests where some data was produced |
| --- | --- | --- | --- | --- | --- | --- |
| United Arab Emirates | 8 | 10 | 38% | 0 | 0 | 0% |
* “Account Identifiers” reflects the number of identifiers (e.g., username, email address, phone number) specified by law enforcement in legal process when requesting user information. A single piece of legal process may include more than one identifier, and in some instances multiple identifiers may point to the same account. Where a single identifier is specified in multiple requests, each instance is counted.
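To make the counting rule in this footnote concrete, the sketch below tallies identifiers the same way: every identifier listed in every piece of legal process is counted, including repeats across requests and multiple identifiers that point to one account. The request records and identifiers are hypothetical and for illustration only, not Snap's actual tooling or data.

```python
# Hypothetical request records illustrating the counting rule in the footnote above.
requests = [
    {"id": "R-1", "identifiers": ["user_a", "user_a@example.com"]},  # two identifiers, one account
    {"id": "R-2", "identifiers": ["user_a"]},                        # same identifier again, counted again
    {"id": "R-3", "identifiers": ["+1-555-0100"]},
]

request_count = len(requests)                                    # "Requests" column
identifier_count = sum(len(r["identifiers"]) for r in requests)  # "Account Identifiers" column

print(request_count)     # 3
print(identifier_count)  # 4 -- multi-identifier requests and repeats across requests all count
```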
United States National Security Requests
Requests for User Information pursuant to national security legal process.
| National Security | Requests | Account Identifiers* |
| --- | --- | --- |
| NSLs and FISA Orders/Directives | 0-249 | 1250-1499 |
Government Content Removal Requests
This category identifies demands by a government entity to remove content that would otherwise be permissible under our Terms of Service or Community Guidelines.
| Removal Requests | Percentage of requests where some content was removed |
| --- | --- |
Note: Although we do not formally track cases in which we remove policy-violating content in response to a request from a governmental entity, we believe such cases are extremely rare. When we believe it is necessary to restrict content that is deemed unlawful in a particular country but does not otherwise violate our policies, we seek to restrict access to it geographically where possible, rather than remove it globally.
This category identifies demands by a government entity to remove content that violates our Terms of Service or Community Guidelines.
| Country | Number of requests | Number of posts removed or restricted, or accounts suspended |
| --- | --- | --- |
Copyrighted Content Takedown Notices (DMCA)
This category reflects any valid takedown notices we received under the Digital Millennium Copyright Act.
| DMCA Takedown Notices |
| --- |

| DMCA Counter-Notices | Percentage of requests where some content was reinstated |
| --- | --- |
Account / Content Violations
We enforced against 3,788,227 pieces of content globally for violations of our Community Guidelines, which amounts to less than 0.012% of total Story postings. Our teams take action on such violations, whether by removing content, deleting accounts, reporting information to the National Center for Missing & Exploited Children (NCMEC), or escalating to law enforcement. In the vast majority of cases, we enforce against content within two hours of receiving an in-app report.
| Reason | Content Reports* | Content Enforced | Unique Accounts Enforced |
| --- | --- | --- | --- |
| Harassment and Bullying | 918,902 | 221,246 | 185,815 |
| Sexually Explicit Content | 5,428,455 | 2,930,946 | 747,797 |
*Content Reports reflect alleged violations reported through our in-app reporting product.
| Region | Content Reports* | Content Enforced | Unique Accounts Enforced |
| --- | --- | --- | --- |
| Rest of World | 3,177,170 | 1,120,306 | 258,407 |
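As a quick arithmetic check on the headline figure above, the sketch below uses only numbers quoted in this report to show what "less than 0.012% of total Story postings" implies about posting volume; the result is an implied floor, not a figure published here.

```python
# Back-of-the-envelope check using only figures quoted above.
content_enforced = 3_788_227   # pieces of content enforced globally
rate_ceiling = 0.012 / 100     # "less than 0.012% of total Story postings"

# Implied floor on total Story postings for the reporting period:
implied_min_postings = content_enforced / rate_ceiling
print(f"{implied_min_postings:,.0f}")  # roughly 31.6 billion Story postings
```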
Child Sexual Abuse Materials (CSAM) Takedown
The exploitation of any member of our community, especially minors, is absolutely unacceptable and criminal. Preventing, detecting, and eliminating abuse on our platform is a top priority, and we work hard to combat this type of illegal activity, informed by our partnerships with NCMEC, law enforcement, and the trusted experts who make up Snap’s Safety Advisory Board. In addition to taking action on reported content, we use proactive detection methods to stop the spread of CSAM before it occurs. Of the total accounts enforced against for Community Guidelines violations, 2.51% were removed for CSAM takedown.
| Rest of World | 11,766 |
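The report does not describe how its proactive detection works. As a purely illustrative sketch of one widely used industry approach, the snippet below shows hash-list matching against a database of known CSAM fingerprints (such as those maintained with NCMEC); the fingerprint function and hash set are stand-ins, and nothing here reflects Snap's actual systems.

```python
import hashlib

# Hypothetical set of known-bad fingerprints; in practice such lists come from
# NCMEC and industry partners, and matches are escalated and reported.
KNOWN_BAD_HASHES = {
    "3f1d...placeholder...",
}

def media_fingerprint(data: bytes) -> str:
    """Stand-in fingerprint; production systems use robust perceptual hashes
    (e.g., PhotoDNA-style) rather than an exact cryptographic hash."""
    return hashlib.sha256(data).hexdigest()

def screen_upload(data: bytes) -> bool:
    """Return True if the upload matches a known fingerprint and should be
    blocked and escalated for human review and reporting."""
    return media_fingerprint(data) in KNOWN_BAD_HASHES
```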
Transparency Report Archives
- January 1, 2020 - June 30, 2020
- July 1, 2019 - December 31, 2019
- January 1, 2019 - June 30, 2019
- July 1, 2018 - December 31, 2018
- January 1, 2018 - June 30, 2018
- July 1, 2017 - December 31, 2017
- January 1, 2017 - June 30, 2017
- July 1, 2016 - December 31, 2016
- January 1, 2016 - June 30, 2016
- July 1, 2015 - December 31, 2015
- January 1, 2015 - June 30, 2015
- November 1, 2014 - February 28, 2015