Transparency Report

July 1, 2019 – December 31, 2019

Released: May 27, 2020

Updated: September 24, 2020

Snapchat Transparency Reports are released twice a year. These reports provide important insight into the volume and nature of governmental requests for Snapchatters' account information and other legal notifications.

Since November 15, 2015, our policy has been to notify Snapchatters when we receive legal process seeking their account information, with exceptions for cases where we are legally prohibited from doing so, or when we believe there are exceptional circumstances (like child exploitation or an imminent risk of death or bodily injury).

As technology and platforms have evolved, so too has the practice of providing important information to the public. Starting with this Transparency Report, we are providing insights into the volume and nature of accounts reported on Snapchat for violations of our Terms of Service or Community Guidelines.

We believe these disclosures provide our community with useful information on the volume and types of content reported and enforced on Snapchat. This knowledge will assist us in creating effective solutions to address harmful content.

For more information about how we handle law enforcement data requests, please take a look at our Law Enforcement Guide, Privacy Policy, and Terms of Service.

United States Criminal Legal Requests

Requests for User Information pursuant to U.S. legal process.

| Category | Requests | Account Identifiers | Percentage of requests where some data was produced |
| --- | --- | --- | --- |
| Total | 11,903 | 19,214 | 78% |
| Subpoena | 2,398 | 4,812 | 75% |
| PRTT | 92 | 141 | 85% |
| Court Order | 206 | 475 | 82% |
| Search Warrant | 7,628 | 11,452 | 81% |
| EDR | 1,403 | 1,668 | 67% |
| Wiretap Order | 17 | 35 | 82% |
| Summons | 159 | 631 | 86% |

International Government Information Requests

Requests for User Information from government entities outside the United States.

| Country | Emergency Requests | Account Identifiers for Emergency Requests | Percentage of emergency requests where some data was produced | Other Information Requests | Account Identifiers for Other Requests | Percentage of other information requests where some data was produced |
| --- | --- | --- | --- | --- | --- | --- |
| Total | 775 | 924 | 64% | 1,196 | 1,732 | 36% |
| Argentina | 0 | 0 | 0% | 1 | 2 | 0% |
| Australia | 20 | 26 | 30% | 33 | 57 | 6% |
| Austria | 1 | 1 | 100% | 7 | 7 | 0% |
| Bahrain | 1 | 1 | 0% | 0 | 0 | 0% |
| Belgium | 4 | 4 | 100% | 29 | 36 | 0% |
| Canada | 197 | 236 | 71% | 29 | 70 | 59% |
| Denmark | 2 | 2 | 50% | 38 | 57 | 0% |
| Estonia | 0 | 0 | 0% | 1 | 1 | 0% |
| Finland | 3 | 4 | 33% | 3 | 1 | 0% |
| France | 66 | 87 | 52% | 94 | 107 | 49% |
| Germany | 96 | 107 | 63% | 149 | 197 | 1% |
| Greece | 0 | 0 | 0% | 2 | 2 | 0% |
| Hungary | 0 | 0 | 0% | 1 | 1 | 0% |
| Iceland | 2 | 2 | 100% | 0 | 0 | 0% |
| India | 4 | 5 | 50% | 39 | 54 | 0% |
| Ireland | 4 | 5 | 50% | 3 | 6 | 0% |
| Israel | 6 | 7 | 50% | 0 | 0 | 0% |
| Italy | 0 | 0 | 0% | 1 | 1 | 0% |
| Jordan | 1 | 1 | 0% | 5 | 5 | 0% |
| Macedonia | 0 | 0 | 0% | 1 | 1 | 0% |
| Malaysia | 0 | 0 | 0% | 1 | 1 | 0% |
| Maldives | 0 | 0 | 0% | 1 | 1 | 0% |
| Malta | 0 | 0 | 0% | 2 | 2 | 0% |
| Mexico | 0 | 0 | 0% | 1 | 2 | 0% |
| Netherlands | 21 | 26 | 76% | 2 | 2 | 0% |
| New Zealand | 0 | 0 | 0% | 5 | 9 | 0% |
| Norway | 9 | 7 | 44% | 55 | 66 | 0% |
| Pakistan | 0 | 0 | 0% | 1 | 1 | 0% |
| Poland | 3 | 5 | 33% | 11 | 19 | 0% |
| Qatar | 7 | 7 | 43% | 2 | 0 | 0% |
| Romania | 0 | 0 | 0% | 2 | 3 | 0% |
| Singapore | 0 | 0 | 0% | 2 | 2 | 0% |
| Slovenia | 0 | 0 | 0% | 1 | 1 | 0% |
| Spain | 0 | 0 | 0% | 1 | 1 | 0% |
| Sweden | 6 | 10 | 33% | 31 | 55 | 0% |
| Switzerland | 10 | 13 | 60% | 17 | 30 | 0% |
| Turkey | 0 | 0 | 0% | 1 | 1 | 0% |
| United Arab Emirates | 8 | 10 | 38% | 0 | 0 | 0% |
| United Kingdom | 304 | 358 | 68% | 613 | 919 | 60% |

* “Account Identifiers” reflects the number of identifiers (e.g. username, email address, phone number, etc.) specified by law enforcement in legal process when requesting user information. Some legal process may include more than one identifier. In some instances, multiple identifiers may identify a single account. In instances where a single identifier is specified in multiple requests, each instance is included.
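The counting rules above can be illustrated with a short sketch. The data and structures below are hypothetical, not Snap's internal format; the point is only to show how "Requests" and "Account Identifiers" diverge when a request lists multiple identifiers, or the same identifier appears in multiple requests:

```python
# Illustrative tally of "Requests" vs. "Account Identifiers" under the
# counting rules described in the footnote above. All data is hypothetical.

# Each legal request specifies one or more identifiers
# (e.g. username, email address, phone number).
requests = [
    ["user_a"],                   # one identifier
    ["user_b", "b@example.com"],  # two identifiers, possibly the same account
    ["user_a"],                   # repeat identifier -> counted again
]

num_requests = len(requests)
# Every identifier instance is counted, including repeats across requests.
num_identifiers = sum(len(ids) for ids in requests)

print(num_requests)     # 3
print(num_identifiers)  # 4
```

This is why the "Account Identifiers" column can exceed the "Requests" column throughout the tables above.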

United States National Security Requests

Requests for User Information pursuant to national security legal process.

| National Security | Requests | Account Identifiers* |
| --- | --- | --- |
| NSLs and FISA Orders/Directives | 0-249 | 1250-1499 |

Government Content Removal Requests

This category identifies demands by a government entity to remove content that would otherwise be permissible under our Terms of Service or Community Guidelines.

| Removal Requests | Percentage of requests where some content was removed |
| --- | --- |
| 0 | N/A |

Note: We do not formally track instances in which we remove content that violates our policies following a request from a governmental entity, but we believe such instances are extremely rare. When we believe it is necessary to restrict content that is deemed unlawful in a particular country but does not otherwise violate our policies, we seek to restrict access to it geographically when possible, rather than remove it globally.

This category identifies demands by a government entity to remove content that violates our Terms of Service or Community Guidelines.

| Country | Number of requests | Number of posts removed or restricted, or number of accounts suspended |
| --- | --- | --- |
| Australia | 42 | 55 |
| France | 46 | 67 |
| Iraq | 2 | 2 |
| New Zealand | 19 | 29 |
| Qatar | 1 | 1 |
| United Kingdom | 17 | 20 |

Copyrighted Content Takedown Notices (DMCA)

This category reflects any valid takedown notices we received under the Digital Millennium Copyright Act.

| DMCA Takedown Notices |
| --- |
| 57 |

| DMCA Counter-Notices | Percentage of requests where some content was reinstated |
| --- | --- |
| 0 | 0% |

Account / Content Violations

We enforced against 3,788,227 pieces of content globally for violations of our Community Guidelines, which amounts to less than 0.012% of total Story postings. Our teams take action on such violations, whether that means removing content, deleting accounts, reporting information to the National Center for Missing & Exploited Children (NCMEC), or escalating to law enforcement. In the vast majority of cases, we enforce against content within two hours of receiving an in-app report.

| Reason | Content Reports* | Content Enforced | Unique Accounts Enforced |
| --- | --- | --- | --- |
| Harassment and Bullying | 918,902 | 221,246 | 185,815 |
| Hate Speech | 181,789 | 46,936 | 41,381 |
| Impersonation | 1,272,934 | 29,972 | 28,101 |
| Regulated Goods | 467,822 | 248,581 | 140,583 |
| Sexually Explicit Content | 5,428,455 | 2,930,946 | 747,797 |
| Spam | 579,767 | 63,917 | 34,574 |
| Threats/Violence/Harm | 1,056,437 | 246,629 | 176,912 |
| Total | 9,906,106 | 3,788,227 | 1,355,163 |

*Content Reports reflect alleged violations reported via our in-app reporting product.

| Region | Content Reports* | Content Enforced | Unique Accounts Enforced |
| --- | --- | --- | --- |
| Europe | 2,471,867 | 879,788 | 366,609 |
| North America | 4,257,069 | 1,788,133 | 730,147 |
| Rest of World | 3,177,170 | 1,120,306 | 258,407 |
| Total | 9,906,106 | 3,788,227 | 1,355,163 |

Child Sexual Abuse Materials (CSAM) Takedown

The exploitation of any member of our community, especially minors, is absolutely unacceptable and criminal. Preventing, detecting, and eliminating abuse on our platform is a top priority, and we work hard to combat this type of illegal activity, informed by our partnerships with NCMEC, law enforcement, and the trusted experts who make up Snap's Safety Advisory Board. In addition to taking action on reported content, we use proactive detection methods to stop the spread of CSAM before it occurs. Of the total accounts enforced against for Community Guidelines violations, we removed 2.51% for CSAM takedown.

| Region | Accounts Deleted |
| --- | --- |
| Europe | 10,667 |
| North America | 12,397 |
| Rest of World | 11,766 |
| Total | 34,830 |

Transparency Report Archives