Safer Internet Day 2022: Your report matters!

Today is international Safer Internet Day (SID), an annual event dedicated to people coming together around the world to make the internet safer and healthier for everyone, especially young people. SID 2022 marks 19 straight years of Safer Internet Day celebrations, and the world is again rallying around the theme, “Together for a better internet.”

At Snap, we’re taking this opportunity to highlight the benefits and importance of letting us know when you see something on Snapchat that may be of concern to you. Snapchat is about sharing and communicating with close friends, and we want everyone to feel safe, confident and comfortable sending Snaps and Chats. Still, there may be times when people share content or behave in ways that conflict with our Community Guidelines.

When it comes to staying safe online, everyone has a role to play, and we want all Snapchatters to know that reporting abusive or harmful content and behavior – so that we can address it – improves the community experience for everyone. In fact, this is one of the most important things Snapchatters can do to help keep the platform free of bad actors and harmful content.

Reporting reluctance

Research shows young people may be unwilling to report content or behaviors for a variety of reasons. Some of these may be rooted in social dynamics, but platforms can also do a better job of debunking certain myths about reporting to foster comfort in contacting us. For example, in November 2021, we learned that just over a third of young people surveyed (34%) said they worry what their friends will think if they take action against bad behavior on social media. In addition, nearly two in five (39%) said they feel pressure not to act when someone they personally know behaves badly. These findings come from Managing the Narrative: Young People’s Use of Online Safety Tools, conducted by Harris Insights and Analytics for the Family Online Safety Institute (FOSI) and sponsored by Snap.

The FOSI research polled several cohorts of teens, aged 13 to 17, and young adults, aged 18 to 24, in the U.S. In addition to the quantitative components, the survey sought participants’ general views on reporting and other topics. One comment from an 18-year-old summed up a number of young people’s perspectives: “I guess I didn’t think the offense was extreme enough to report.”

Fast Facts about reporting on Snapchat

The FOSI findings suggest possible misconceptions about the importance of reporting to platforms and services in general. For Snapchatters, we hope to help clear those up with this handful of Fast Facts about our current reporting processes and procedures. 

  • What to report:  In the conversations and Stories portions of Snapchat, you can report photos, videos and accounts; in the more public Discover and Spotlight sections, you can report content. 

  • How to report:  Reporting photos and videos can be done directly in the Snapchat app (just press and hold on the content); you can also report content and accounts via our Support Site (simply complete a short webform).  

  • Reporting is confidential:  We don’t tell Snapchatters who reported them.

  • Reports are vital:  To improve the experiences of Snapchatters, reports are reviewed and actioned by our safety teams, which operate around the clock and around the globe. In most instances, our teams action reports within two hours. 

  • Enforcement can vary:  Depending on the type of Community Guidelines or Terms of Service violation, enforcement actions can range from a warning, up to and including account deletion. (No action is taken when an account is found not to have violated Snapchat’s Community Guidelines or Terms of Service.) 

We’re always looking for ways to improve, and we welcome your feedback and input. Feel free to share your thoughts with us using our Support Site webform.

To commemorate Safer Internet Day 2022, we suggest all Snapchatters review our Community Guidelines and Terms of Service to brush up on acceptable content and conduct. We’ve also created a new reporting Fact Sheet that includes a helpful FAQ, and we updated a recent “Safety Snapshot” episode on reporting. Safety Snapshot is a Discover channel that Snapchatters can subscribe to for fun and informative safety- and privacy-related content. For some added enjoyment to mark SID 2022, check out our new global filter, and look for additional improvements to our in-app reporting features in the coming months.    

New resource for parents 

Finally, we want to highlight a new resource we’re offering for parents and caregivers. In collaboration with our partners at MindUp: The Goldie Hawn Foundation, we’re pleased to share a new digital parenting course, “Digital Well-Being Basics,” which takes parents and caregivers through a series of modules about supporting and empowering healthy digital habits among teens. 

We look forward to sharing more of our new safety and digital well-being work in the coming months. In the meantime, consider doing at least one thing this Safer Internet Day to help keep yourself and others safe. Making a personal pledge to report would be a great start! 

- Jacqueline Beauchere, Global Head of Platform Safety

Data Privacy Day: Supporting the Privacy and Wellbeing of Snapchatters

Today marks Data Privacy Day, a global effort to raise awareness about the importance of respecting and safeguarding privacy. Privacy has always been central to Snapchat’s primary use case and mission.

We first built our app to help people connect with their real friends and feel comfortable expressing themselves authentically – without feeling pressure to curate a perfect image or measure themselves against others. We wanted to reflect the natural dynamics between friends in real life, where trust and privacy are essential to their relationships.

We designed Snapchat with fundamental privacy features baked into the app’s architecture, to help our community develop that trust with their real-life friends, and support their safety and wellbeing:

  • We focus on connecting people who were already friends in real life and require that, by default, two Snapchatters opt in to being friends in order to communicate.

  • We designed communications to delete by default to reflect the way people talk to their friends in real life, where they don’t keep a record of every single conversation for public consumption.

  • New features go through an intensive privacy- and safety-by-design product development process, where our in-house privacy experts work closely with our product and engineering teams to vet the privacy impacts.

We’re also constantly exploring what more we can do to help protect the privacy and safety of our community, including how to further educate them about online risks. To help us continue to do that, we recently commissioned global research to better understand how young people think about their online privacy. Among other things, the findings confirmed that almost 70% of participants said privacy makes them feel more comfortable expressing themselves online, and 59% of users say privacy and data security concerns impact their willingness to share on online platforms. You can read more of our findings here.

We feel a deep responsibility to help our community develop strong online privacy habits – and want to reach Snapchatters where they are through in-app education and resources. 

We regularly remind our community to enable two-factor authentication and use strong passwords -- two important safeguards against account breaches -- and today we’re launching new content on our Discover platform with tips on creating unique account credentials and setting up two-factor authentication.
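For readers curious about the mechanics, here is a minimal sketch of how a time-based one-time password (TOTP) -- the rotating six-digit code produced by common authenticator apps -- is derived from a shared secret. This is the standard RFC 6238 construction shown purely for illustration; it is not Snap's implementation.

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive the current time-based one-time password (RFC 6238)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval           # 30-second time step
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % 10**digits
    return str(code).zfill(digits)

# The server and the authenticator app share the secret, so both derive the
# same short-lived code; a stolen password alone is no longer enough.
print(totp("JBSWY3DPEHPK3PXP"))  # demo secret, prints a 6-digit code
```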

We are also launching new privacy-focused creative tools, including our first-ever privacy-themed Bitmoji, stickers developed with the International Association of Privacy Professionals (IAPP), and a new Lens, created in partnership with the Future of Privacy Forum, that shares helpful privacy tips.

In the coming months, we will continue to leverage our research findings to inform additional in-app privacy tools for our community.  

Meet Our Head of Global Platform Safety

Hello, Snapchat community! My name is Jacqueline Beauchere and I joined Snap last Fall as the company’s first Global Head of Platform Safety. 

My role focuses on enhancing Snap’s overall approach to safety, including creating new programs and initiatives to help raise awareness of online risks; advising on internal policies, product tools and features; and listening to and engaging with external audiences – all to help support the safety and digital well-being of the Snapchat community. 

Since my role involves helping safety advocates, parents, educators and other key stakeholders understand how Snapchat works, and soliciting their feedback, I thought it might be useful to share some of my initial learnings about the app; what surprised me; and some helpful tips, if you or someone close to you is an avid Snapchatter.

 Initial Learnings – Snapchat and Safety 

After more than 20 years working in online safety at Microsoft, I’ve seen significant change in the risk landscape. In the early 2000s, issues like spam and phishing highlighted the need for awareness-raising to help educate consumers and minimize socially engineered risks. The advent of social media platforms – and people’s ability to post publicly – increased the need for built-in safety features and content moderation to help minimize exposure to illegal and potentially more harmful content and activity.  

Ten years ago, Snapchat came onto the scene. I knew the company and the app were “different,” but until I actually started working here, I didn’t realize just how different they are. From inception, Snapchat was designed to help people communicate with their real friends – meaning people they know “in real life” – rather than amassing large numbers of known (or unknown) followers. Snapchat is built around the camera. In fact, for non-first-generation Snapchatters (like me), the app’s very interface can be a bit mystifying because it opens directly to a camera and not a content feed like traditional social media platforms. 

There’s far more that goes into Snapchat’s design than one might expect, and that considered approach stems from the tremendous value the company places on safety and privacy. Safety is part of the company’s DNA and is baked into its mission: to empower people to express themselves, live in the moment, learn about the world and have fun together. Unless people feel safe, they won’t be comfortable expressing themselves freely when connecting with friends.

The belief that technology should be built to reflect real-life human behaviors and dynamics is a driving force at Snap. It’s also vital from a safety perspective. For example, by default, not just anyone can contact you on Snapchat; two people need to affirmatively accept each other as friends before they can begin communicating directly, similar to the way friends interact in real life.

Snap applies privacy-by-design principles when developing new features and was one of the first platforms to endorse and embrace safety-by-design, meaning safety is considered in the design phase of our features – no retro-fitting or bolting on safety machinery after the fact. How a product or feature might be misused or abused from a safety perspective is considered, appropriately so, at the earliest stages of development.  

What Surprised Me – Some Context Behind Some Key Features 

Given my time in online safety and working across industry, I’d heard some concerns about Snapchat. Below are a handful of examples and what I’ve learned over the past few months. 

Content that Deletes by Default 

Snapchat is probably most known for one of its earliest innovations: content that deletes by default. Like others, I made my own assumptions about this feature and, as it turns out, it’s quite different from what I’d first presumed. More to the point, it reflects the real-life-friends dynamic.

Snapchat’s approach is rooted in human-centered design. In real life, conversations between and among friends aren’t saved, transcribed or recorded in perpetuity. Most of us are more at ease and can be our most authentic selves when we know we won’t be judged for every word we say or every piece of content we create. 

One misperception I’ve heard is that Snapchat’s delete-by-default approach makes it impossible to access evidence of illegal behavior for criminal investigations. This is incorrect. Snap has the ability to, and does, preserve content existing in an account when law enforcement sends us a lawful preservation request. For more information about how Snaps and Chats are deleted, see this article.

Strangers Finding Teens

A natural concern for any parent when it comes to online interactions is how strangers might find their teens. Again, Snapchat is designed for communications between and among real friends; it doesn’t facilitate connections with unfamiliar people the way some social media platforms do. Because the app was built for communicating with people we already know, by design, it’s difficult for strangers to find and contact specific individuals. Generally, people who are communicating on Snapchat have already accepted each other as friends. In addition, Snap has added protections to make it even more difficult for strangers to find minors, like banning public profiles for those under 18. Snapchat only allows minors to surface in friend-suggestion lists (Quick Add) or Search results if they have friends in common.
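To make those rules concrete, here is a minimal sketch of the kind of discoverability check they describe. The names are hypothetical and the mutual-friend threshold is an assumption (Snap doesn't publish the exact number); this is not Snap's actual code.

```python
from dataclasses import dataclass, field

@dataclass
class User:
    user_id: str
    age: int
    friends: set = field(default_factory=set)

MIN_MUTUAL_FRIENDS = 2  # assumed threshold, for illustration only

def has_public_profile(user: User) -> bool:
    return user.age >= 18  # public profiles are banned for minors

def discoverable_by(candidate: User, viewer: User) -> bool:
    """Can `viewer` see `candidate` in Quick Add or Search results?"""
    if candidate.age >= 18:
        return True  # adults are not subject to the mutual-friend gate
    mutual = candidate.friends & viewer.friends
    return len(mutual) >= MIN_MUTUAL_FRIENDS  # minors need friends in common
```

Under rules like these, a stranger with no friends in common never sees the minor surfaced at all.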

A newer tool we want parents and caregivers to be aware of is Friend Check-Up, which prompts Snapchatters to review their friend lists to confirm those included are still people they want to be in contact with. Those you no longer want to communicate with can easily be removed. 

Snap Map and Location-Sharing

Along those same lines, I’ve heard concerns about the Snap Map – a personalized map that allows Snapchatters to share their location with friends, and to find locally relevant places and events, like restaurants and shows. By default, location-settings on Snap Map are set to private (Ghost Mode) for all Snapchatters. Snapchatters have the option of sharing their location, but they can do so only with others whom they’ve already accepted as friends – and they can make location-sharing decisions specific to each friend. It’s not an “all-or-nothing” approach to sharing one’s location with friends. Another Snap Map plus for safety and privacy: If people haven’t used Snapchat for several hours, they’re no longer visible to their friends on the map.  

Most importantly from a safety perspective, there’s no ability for a Snapchatter to share their location on the Map with someone they’re not friends with, and Snapchatters have full control over which friends they share their location with -- or whether they share it at all.
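Taken together, the Snap Map rules described above amount to a small set of checks. The sketch below restates them in code -- illustrative only, with an assumed inactivity window standing in for "several hours," not Snap's implementation:

```python
import time

VISIBILITY_WINDOW = 8 * 3600  # assumed "several hours" cutoff, in seconds

def visible_on_map(owner: dict, viewer_id: str, now: float) -> bool:
    if owner["ghost_mode"]:
        return False                          # private (Ghost Mode) by default
    if viewer_id not in owner["friends"]:
        return False                          # never visible to non-friends
    if viewer_id not in owner["share_with"]:
        return False                          # location sharing is per-friend
    return now - owner["last_active"] < VISIBILITY_WINDOW  # hidden after inactivity

owner = {
    "ghost_mode": False,            # this Snapchatter opted out of the default
    "friends": {"amy", "ben"},
    "share_with": {"amy"},          # chose to share with amy only
    "last_active": time.time(),
}
print(visible_on_map(owner, "amy", time.time()))  # True
print(visible_on_map(owner, "ben", time.time()))  # False: a friend, but not opted in
```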

Harmful Content

Early on, the company made a deliberate decision to treat private communications between friends, and public content available to wider audiences, differently. In the more public parts of Snapchat, where material is likely to be seen by a larger audience, content is curated or pre-moderated to prevent potentially harmful material from “going viral.” Two parts of Snapchat fall into this category: Discover, which includes content from vetted media publishers and content creators, and Spotlight, where Snapchatters share their own entertaining content with the larger community.

On Spotlight, all content is first reviewed with automated tools, then undergoes an extra layer of human moderation before it is eligible to be seen by more than a couple dozen people. This helps to ensure the content complies with Snapchat’s policies and guidelines, and helps to mitigate risks that may have been missed by automoderation. By seeking to control virality, Snap lessens the appeal of publicly posting illegal or potentially harmful content, which, in turn, leads to significantly lower levels of exposure to hate speech, self-harm and violent extremist material, to name a few examples, as compared with other social media platforms.
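A rough sketch of that two-stage gate follows, assuming the 25-viewer threshold Snap has cited elsewhere; the field names are hypothetical, not the actual pipeline:

```python
HUMAN_REVIEW_THRESHOLD = 25  # distribution allowed before a human pass is required

def can_distribute(post: dict, current_views: int) -> bool:
    if not post["passed_automated_review"]:
        return False                      # blocked at the machine-review stage
    if current_views < HUMAN_REVIEW_THRESHOLD:
        return True                       # limited distribution while review is pending
    return post["passed_human_review"]    # wider reach requires human approval
```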

Exposure to Drugs

Snapchat is one of many online platforms that drug dealers are abusing globally and, if you’ve seen any media coverage of parents and family members who’ve lost children to a fentanyl-laced counterfeit pill, you can appreciate how heartbreaking and terrifying this situation can be. We certainly do, and our hearts go out to those who’ve lost loved ones to this frightening epidemic.

Over the past year, Snap has been aggressively and comprehensively tackling the fentanyl and drug-related content issue in three key ways:

  • Developing and deploying new technology to detect drug-related activity on Snapchat to, in turn, identify and remove drug dealers who abuse the platform;

  • Reinforcing and taking steps to bolster our support for law enforcement investigations, so authorities can quickly bring perpetrators to justice; and

  • Raising awareness of the dangers of fentanyl with Snapchatters via public service announcements and educational content directly in the app. (You can learn more about all of these efforts here.)

We’re determined to make Snapchat a hostile environment for drug-related activity and will continue to expand on this work in the coming months. In the meantime, it’s important for parents, caregivers and young people to understand the pervasive threat of potentially fatal fake drugs that has spread across online platforms, and to talk with family and friends about the dangers and how to stay safe.

Snap has much planned on the safety and privacy fronts in 2022, including launching new research and safety features, as well as creating new resources and programs to inform and empower our community to adopt safer and healthier digital practices. Here’s to the start of a productive New Year, chock-full of learning, engagement, safety and fun!   

- Jacqueline Beauchere, Global Head of Platform Safety

Expanding our Work to Combat the Fentanyl Epidemic

Late last year, the CDC announced that more than 100,000 people died from drug overdoses in the US over a 12-month period -- with fentanyl being a major driver of this spike. This staggering data hits home – we recognize the horrible human toll that the opioid epidemic is taking across the country, and the impact that fentanyl and adulterated drugs (often masked as counterfeit prescription pills) are having on young people and their families in particular. We also know that drug dealers are constantly searching for ways to exploit messaging and social media apps, including trying to find new ways to abuse Snapchat and our community, to conduct their illegal and deadly commerce.

Our position on this has always been clear: we have absolutely zero tolerance for drug dealing on Snapchat. We are continuing to develop new measures to keep our community safe on Snapchat, and have made significant operational improvements over the past year toward our goal of eradicating drug dealers from our platform. Moreover, although Snapchat is just one of many communications platforms that drug dealers seek to abuse in order to distribute illicit substances, we still have a unique opportunity to use our voice, technology and resources to help address this scourge, which threatens the lives of our community members.

In October, we shared updates on the progress we have been making to crack down on drug-related activity and to promote broader public awareness about the threat of illicit drugs. We take a holistic approach that includes deploying tools that proactively detect drug-related content, working with law enforcement to support their investigations, and providing in-app information and support to Snapchatters who search for drug-related terms through a new education portal, Heads Up. 

Today, we’re expanding on this work in several ways. First, we will be welcoming two new partners to our Heads Up portal to provide important in-app resources to Snapchatters: Community Anti-Drug Coalitions of America (CADCA), a nonprofit organization that is committed to creating safe, healthy and drug-free communities; and Truth Initiative, a nonprofit dedicated to achieving a culture where all young people reject smoking, vaping and nicotine. Through its proven-effective and nationally recognized truth public education campaign, Truth Initiative has provided content addressing the youth vaping and opioid epidemics, issues it has taken on in recent years. In the coming days, we will also release the next episode of our special Good Luck America series focused on fentanyl, which is featured on our Discover content platform.

Second, we’re sharing updates on the progress we’ve made in proactively detecting drug-related content and more aggressively shutting down dealers. Over the past year:

  • We have increased our proactive detection rates by 390% -- an increase of 50% since our last public update in October.

  • 88% of drug-related content we uncover is now proactively detected by our machine learning and artificial intelligence technology, with the remainder reported by our community. This is an increase of 33% since our previous update (see the arithmetic check after this list). When we find drug dealing activity, we promptly ban the account, use technology to block the offender from creating new accounts on Snapchat, and in some cases proactively refer the account to law enforcement for investigation.

  • We have grown our law enforcement operations team by 74%. While we’ve always cooperated with law enforcement investigations by preserving and disclosing data in response to valid requests, this increased capacity helped us significantly improve our response times to law enforcement inquiries by 85% over the past year, and we continue to improve these capabilities. You can learn more about our investments in our law enforcement work here.
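Because percentage increases layered on percentage figures can be confusing, the quick check below reconciles the two growth claims above with the figures from the October update (a 260% proactive-detection increase, with nearly two-thirds of drug-related content detected proactively):

```python
# October update vs. today, as quoted in these posts.
prev_increase, current_increase = 2.60, 3.90   # proactive-detection growth rates
print(current_increase / prev_increase - 1)    # -> 0.5, the "increase of 50%"

prev_share, current_share = 0.66, 0.88         # share detected proactively
print(current_share / prev_share - 1)          # -> ~0.33, the "increase of 33%"
```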

Since this fall, we have also seen another important indicator of progress: a decline in community-reported content related to drug sales. In September, over 23% of drug-related reports from Snapchatters contained content specifically related to sales; as a result of our proactive detection work, we have driven that down to 16% as of this month -- a relative decline of 31% in the share of drug-related reports involving sales. We will keep working to get this number as low as possible.

Additionally, we continue to work with experts to regularly update the list of slang and drug-related terms we block from being visible in Snapchat. This constant, ongoing effort not only prevents Snapchatters from getting Search results for those terms, but also proactively surfaces the expert educational resources in our Heads Up tool.

Third, we’re continuing to make our underlying products safer for minors. As a platform built for close friends, we designed Snapchat to make it difficult for strangers to find and connect with minors. For example, Snapchatters cannot see each other’s friend lists, we don’t allow browsable public profiles for anyone under 18 and, by default, you cannot receive a message from someone who isn’t already your friend. While we know that drug dealers seek to connect with potential customers on platforms outside of Snapchat, we want to do everything we can to keep minors from being discovered on Snapchat by people who may be engaging in illegal or harmful behavior. 

We recently added a new safeguard to Quick Add, our friend suggestion feature, to further protect 13- to 17-year-olds. In order to be discoverable in Quick Add by someone else, users under 18 will need to have a certain number of friends in common with that person -- further ensuring it is a friend they know in real life.

In the coming months, we will be sharing more details about the new parental tools we are developing, with the goal of giving parents more insight into who their teens are talking to on Snapchat, while still respecting their privacy. 

And we will continue to build on this critical work, with additional partnerships and operational improvements underway.

Investing in and Expanding our Law Enforcement Operations

When we first launched this blog, we explained that one of our goals was to do a better job of talking to the many stakeholders who care deeply about the health and wellbeing of our community -- parents and other family members, educators and mentors, safety advocates, and law enforcement.  In this post, we wanted to provide information about steps we’ve taken to facilitate better communications with the law enforcement community.

Law enforcement agencies at every level are crucial partners in our efforts to combat illegal or harmful activity on our platform. As part of our ongoing work to keep our community safe, we have an in-house Law Enforcement Operations team dedicated to reviewing and responding to law enforcement requests for data related to their investigations. For example:

  • While content on Snapchat is ephemeral, designed to reflect the nature of real life conversations between friends, we have long offered law enforcement agencies the ability to, consistent with applicable laws, preserve available account information and content for law enforcement in response to valid legal requests. 

  • We have always proactively escalated to law enforcement authorities any content that could involve imminent threats to life. 

  • Once we have received a valid legal request for Snapchat account records, we respond in compliance with applicable laws and privacy requirements.

Over the past year, we have been investing in growing this team and continuing to improve its capabilities for responding promptly to valid law enforcement requests. The team has expanded by 74%, with many new team members joining across all levels, including some from careers as prosecutors and law enforcement officials with experience in youth safety. As a result of these investments, we have been able to significantly improve our response times for law enforcement investigations by 85% year-over-year. In the case of emergency disclosure requests -- some of the most critical requests, which involve the imminent danger of death or serious bodily injury -- our 24/7 team usually responds within 30 minutes. To learn more about the types and volume of law enforcement requests Snap receives, we publish a Transparency Report every six months to provide the public with these important insights. You can read our latest report, covering the first half of 2021, here.

Recognizing that Snapchat is built differently from traditional social media platforms, and that many members of law enforcement may not be as familiar with how our products work and what capabilities we have for supporting their work, one of our top priorities is to provide more -- and ongoing -- educational resources to help this community better learn how our services and processes work. We recently took two important steps forward as part of this larger focus.

First, we welcomed Rahul Gupta to serve as our first Head of Law Enforcement Outreach. Rahul joined Snap after a distinguished career as a local prosecutor in California, with an expertise in cybercrime, social media, and digital evidence. In this new role, Rahul will develop a global law enforcement outreach program to raise awareness about Snap’s policies for responding to legal data requests. He will also build relationships and seek regular feedback from law enforcement agencies as we continue to identify areas for improvement. 

Second, in October, we held our first-ever Snap Law Enforcement Summit to help build stronger connections and explain our services to U.S. law enforcement officials. More than 1,700 law enforcement officials from federal, state and local agencies participated. 

To help measure how useful our inaugural event was and identify areas of opportunity, we surveyed our attendees before and after the Summit. Prior to the Summit, we found that:

  • Only 27% of those surveyed were familiar with Snapchat’s safety measures;

  • 88% wanted to learn what kind of data Snapchat can provide in support of their investigations; and

  • 72% wanted to know how best to work with Snapchat.

After the Summit:

  • 86% of attendees said they had a better understanding of our work with law enforcement;

  • 85% said they had a better understanding of the process to submit legal requests for data; and

  • 78% said they would want to attend future Snap law enforcement summits.

We are deeply grateful for all of those who attended, and in light of their feedback, are pleased to share that we will be making our Snap Law Enforcement Summit an annual event in the U.S. We are also planning to expand our outreach to law enforcement agencies in certain countries outside the U.S.

Our long-term goal is to have a world-class Law Enforcement Operations team -- and we know we have to continue to make meaningful improvements to get there. We hope our inaugural Summit was the start of an important dialogue with law enforcement stakeholders about how we can continue to build on the progress we’re seeing -- and help keep Snapchatters safe. 

Our Transparency Report for the First Half of 2021

Today, we’re releasing our transparency report for the first half of 2021, which covers the period of January 1 - June 30 of this year. As with recent reports, this installment shares data about violations of our Community Guidelines globally during the period; the number of content reports we received and enforced against across specific categories of violations; how we responded to requests from law enforcement and governments; our enforcements broken down by country; the Violative View Rate of Snapchat content; and incidences of false information on the platform. 

We’re adding several updates to our reporting this period, including reporting our median turnaround time in minutes rather than hours, to provide more detail about our operational practices and efficacy.

Every day, on average, more than five billion Snaps are created using our Snapchat camera. From January 1 - June 30, 2021, we enforced against 6,629,165 pieces of content globally that violated our Guidelines. During this period, our Violative View Rate (VVR) was 0.10 percent, which means that out of every 10,000 views of content on Snap, 10 contained content that violated our Guidelines. Additionally, we significantly improved our time responding to reports of violations, in particular for sexually explicit content, harassment and bullying, illegal and counterfeit drugs, and other regulated goods.
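As a quick check on what that rate means in practice, the VVR is simply violative views divided by total views:

```python
# Violative View Rate (VVR) as reported: 0.10 percent of all content views.
vvr = 0.10 / 100
views = 10_000
print(vvr * views)  # -> 10.0 violative views per 10,000 views, matching the text
```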

Our Work To Combat Child Sexual Abuse Material 

The safety of our community is a top priority. As a platform built for communicating with real friends, we intentionally designed Snapchat to make it harder for strangers to find young people. For example, Snapchatters cannot see each other’s friend lists, and by default, cannot receive a message from someone who isn’t already a friend.

We have zero tolerance for abuse directed at any member of our community, especially minors, which is illegal, unacceptable and prohibited by our Community Guidelines. We work diligently to combat these violations by evolving our capabilities to prevent, detect and eradicate abuse on our platform, including Child Sexual Abuse Material (CSAM) and other types of child sexually exploitative content.

Our Trust and Safety teams use proactive detection tools, such as PhotoDNA and Child Sexual Abuse Imagery (CSAI) Match technology, to identify known illegal images and videos of CSAM and report them to the National Center for Missing and Exploited Children (NCMEC). NCMEC then, in turn, coordinates with domestic or international law enforcement.
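At a high level, these tools work by matching media against a database of hashes of known illegal material. The toy sketch below shows only the flow, using an exact cryptographic hash; real systems such as PhotoDNA use perceptual hashes that survive resizing and re-encoding, and the escalation hook here is a hypothetical placeholder, not Snap's code:

```python
import hashlib

KNOWN_ILLEGAL_HASHES: set[str] = set()  # in practice, supplied by NCMEC and industry partners

def report_to_ncmec(digest: str) -> None:
    # Placeholder: in production this would file a CyberTipline report.
    print(f"escalating match {digest[:12]}... for NCMEC reporting")

def check_media(media_bytes: bytes) -> bool:
    """Return True (and escalate) if the media matches a known-bad hash."""
    digest = hashlib.sha256(media_bytes).hexdigest()
    if digest in KNOWN_ILLEGAL_HASHES:
        report_to_ncmec(digest)
        return True
    return False
```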

In the first half of 2021, 5.43 percent of the total number of accounts we enforced against globally contained CSAM. We proactively detected and actioned 70 percent of these CSAM violations. This increased proactive detection capability, combined with a rise in CSAM-spreading coordinated spam attacks, resulted in a notable increase in this category for this reporting period.

We have continued to expand our partnerships with safety experts as well as our in-app features to help educate Snapchatters about the risks of contact with strangers and how to use in-app reporting to alert our Trust and Safety teams to any type of concern or abuse. Additionally, we have continued to add partners to our trusted flagger program, which provides vetted safety experts with a confidential channel to report emergency escalations, such as an imminent threat to life or a case involving CSAM. We work closely with these partners to provide safety education, wellness resources, and other reporting guidance so they can help support the Snapchat community. 

Our Approach to the Spread of False Information

The period of time this transparency report covers further underscores how critical it is to ensure that the public has access to accurate and credible information. We regularly assess and invest in new means of protecting our community of Snapchatters from the spread of false information related to democratic processes, public health, and COVID-19.

In the first half of 2021, globally, we enforced against a combined total of 2,597 accounts and pieces of content for violations of our false information guidelines, almost half the number of violations from the previous reporting period. Since content on Discover and Spotlight is proactively moderated to prevent distribution of violating content at scale, the majority of these violations came from private Snaps and Stories, and most were made known to us through our own active moderation efforts as well as reports from Snapchatters.

We have always believed that when it comes to harmful content, it isn’t enough just to think about policies and enforcement — platforms need to consider their fundamental architecture and product design. From the beginning, Snapchat was built differently than traditional social media platforms, to support our primary use case of talking with close friends — rather than an open newsfeed where anyone has the right to distribute anything to anyone. Snapchat’s very design limits virality, which removes incentives for content that appeals to people’s worst instincts, thereby limiting concerns associated with the spread of illegal and harmful content.

This approach also carries into our work to prevent the spread of extremist content. During the reporting period, we removed five accounts for violations of our prohibition of terrorist and extremist content, a slight decrease from the last reporting cycle. At Snap, we’re regularly monitoring developments in this space, seeking to mitigate any potential vectors for abuse on our platform. Both our platform architecture and the design of our Group Chat functionality help to limit the spread of harmful content and opportunities to organize. We offer Group Chats, but they are limited in size, are not recommended by algorithms, and are not discoverable on our platform for anyone not a member of that particular Group. 
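As a rough illustration of the Group Chat constraints just described, consider the sketch below; the size cap is an assumed placeholder (Snap doesn't state the limit here), and the structures are hypothetical rather than Snap's actual code:

```python
MAX_GROUP_SIZE = 64  # assumed cap for illustration; the real limit isn't stated here

def can_join(group_members: set, user: str, invited_by: str) -> bool:
    if invited_by not in group_members:
        return False                            # joining requires a member's invitation
    return len(group_members) < MAX_GROUP_SIZE  # groups stay deliberately small

def can_view(group_members: set, user: str) -> bool:
    return user in group_members                # invisible to non-members, never recommended
```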

During this period, we continued to proactively promote factual public safety information about COVID-19 to our community, including through coverage provided by our Discover editorial partners, through public service announcements (PSAs), as well as Q&As with public health officials, agencies and medical experts, and through creative tools, such as Augmented Reality Lenses and filters — all designed to remind Snapchatters of expert public health guidance. Earlier this year, as vaccines became available for young people in the U.S., we launched a new initiative with the White House to help Snapchatters answer common questions and, in July, we teamed up with the UK’s National Health Service on a similar effort. 

Going forward, we are committed to continuing to make our transparency reports more comprehensive and helpful to the many stakeholders that care deeply about online safety, transparency and multi-sector accountability. We are constantly evaluating how we can strengthen our comprehensive efforts to combat harmful content and bad actors, and are grateful to the many security and safety partners and collaborators that regularly help us to improve.

Empowering Snapchatters to Speak Out and Play a Part in Designing Our—and Their—Future

Today, as part of the Knight Foundation’s virtual symposium Lessons from the First Internet Ages, Snap’s CEO Evan Spiegel published an essay on the technology we are building to make it easier for young people to vote, educate themselves about the issues they care about, and even run for local office to make a difference in their communities through our Run for Office Mini.

You can read Evan’s full essay below, which was originally published by the Knight Foundation here.

***

My co-founder Bobby Murphy and I met at Stanford University a little over a decade ago. I was a freshman studying product design and Bobby was a junior working on a degree in mathematical and computational science. Our first project together was Future Freshman, which we believed would forever change the way high schoolers applied to college. We were wrong, and it ended up a total failure, but we learned something important—we loved working together. 

Shortly after, we started working on what would eventually become Snapchat. At the time, most social media platforms were well established, but they didn’t really provide a space for our friends to authentically express themselves. We wanted to build something to help people communicate their full range of human emotions with their friends—not just what appears to be pretty or picture-perfect. So, we designed Snapchat differently than other social media platforms at the time: our app opened to a camera that helped people talk to their close friends, instead of a newsfeed that invited people to broadcast content more widely.

Looking back to our early days when few understood our app, we could never have imagined how large the Snapchat community would eventually become. Today, over 500 million people around the world use Snapchat each month. While our business has evolved, one thing that hasn’t changed is our desire to solve problems for our community. This determination, alongside our team’s curiosity and creativity, has led to some of our most successful innovations—including our core feature of ephemerality, Stories, and augmented reality. 

We also believe one of the most powerful forms of self-expression is exercising the right to vote and—especially for our community members in the United States—participating in American democracy. This passion, combined with our problem-solving mindset, is why we are so focused on building technology to make it easier for young people to vote, educate themselves about the issues they care about, hold public officials accountable and even run for office. 

Snapchatters have always been eager to get involved and help make a difference in their communities, but our democratic processes haven’t evolved to meet the needs of younger voters. Civic engagement has not caught up with the way young people get involved with the causes that matter most to them—through their phones and with their close friends. For young first-time voters—who typically learn about voting on college campuses, or don’t attend college and therefore don’t benefit from the civic infrastructure many campuses provide—reaching them where they are is more important and more challenging than ever. During the 2020 election, when many in-person voter engagement efforts were disrupted due to the COVID-19 pandemic, we were shown just how impactful mobile-first experiences can be. 

Snapchat reaches 90 percent of people ages 13–24 in the United States, giving us a meaningful opportunity to provide this age group with a civic on-ramp that makes it easier to participate in our democracy. Since 2016, we have built several mobile tools to remove technological barriers and help Snapchatters through every stage of the voting process—including voter registration, voter education and voter participation. In recent election cycles, we partnered with TurboVote and BallotReady to help Snapchatters register to vote, view their sample ballot and look up their polling place—and then encourage their friends to do the same. We rolled out a voter guide connecting Snapchatters with resources from the NAACP, ACLU, When We All Vote, the Lawyers’ Committee for Civil Rights Under Law, Latino Community Foundation and APIAVote. 

This work has been encouraging: In 2020 alone, our team helped over 1.2 million Snapchatters register to vote. According to data from Tufts University’s Center for Information & Research on Civic Learning and Engagement (CIRCLE), of those Snapchatters we helped register in 2020, half were first-time voters and more than 80 percent were under the age of thirty. 

But we also know that inspiring the next generation of leaders needs to be an always-on effort—not just for high-profile elections. So, we developed a feature that prompts Snapchatters to register to vote on their eighteenth birthday. More broadly, our voter engagement tools are available year-round, and our hope is that they help lay the groundwork for a lifetime of self-expression through civic engagement. 

Looking ahead, we continue to innovate based on the feedback we receive from Snapchatters. After the 2020 presidential election, we heard from Snapchatters who were disappointed with the lack of candidates running on issues they care about. It makes sense. Representation matters, but for many young people, running for office seems unapproachable, confusing and financially unrealistic. According to the National Conference of State Legislatures (NCSL), legislators from the baby boomer generation have a disproportionate influence in America’s legislatures, with nearly twice as many members as their overall share of the US population. As a consequence, the gap between those who govern and the next generation of Americans they represent keeps getting wider. Moreover, according to the Pipeline Initiative, over half of candidates didn’t think about running until they were recruited or encouraged by a trusted peer.

We want to do our part to make it easier for Snapchatters to make a difference in their local communities on the issues they care most about by running for office. Recently, we launched a new feature in Snapchat to help young people learn about upcoming electoral races in their community—and to nominate friends they want to see in leadership. Snapchatters can explore local opportunities sorted by various policy issues, see what each position entails and create a centralized campaign dashboard that includes a “checklist” of all the elements the candidate needs to accomplish before successfully running for public office. We’ve initially partnered with a bipartisan group of ten candidate recruitment organizations that work with potential candidates to give them the resources they need to get started, including leadership workshops and campaign training. Through encouragement from friends and training from these partner organizations, we see this as a fun and impactful way for Snapchatters to step into leadership and have their voices heard.

Every day on our app, we see the Snapchat Generation show incredible passion, creativity and innovation that’s helping make the world a better place. We will continue to do our part to help remove the barriers that have historically kept young people from showing up to vote, and we are committed to empowering future generations to speak out and play a part in designing our—and their—future. 

Senate Congressional Testimony — Our Approach to Safety, Privacy and Wellbeing

Today, our VP of Global Public Policy, Jennifer Stout, joined other tech platforms in testifying before the Senate Commerce Committee’s Subcommittee on Consumer Protection, Product Safety and Data Security about Snap’s approach to protecting young people on our platform. 

We were grateful for the opportunity to explain to the Subcommittee how we intentionally built Snapchat differently from traditional social media platforms, how we work to build safety and privacy directly into the design of our platform and products, and where we need to continue to improve to better protect the wellbeing of our community. We have always believed that we have a moral responsibility to put our community’s interests first — and that all tech companies must take responsibility and actively protect the communities they serve.

We welcome the Subcommittee’s ongoing efforts to investigate these critical issues — and you can read Jennifer’s full opening statement below. A PDF of the full testimony is available here.

****

Testimony of Jennifer Stout, Vice President of Global Public Policy, Snap Inc.

Introduction

Chairman Blumenthal, Ranking Member Blackburn, and members of the Subcommittee, thank you for the opportunity to appear before you today. My name is Jennifer Stout and I serve as the Vice President of Global Public Policy at Snap Inc., the parent company of Snapchat. It’s an honor and privilege to be back in the Senate 23 years after first getting my start in public service as a Senate staffer, this time in a much different capacity — to speak about Snap’s approach to privacy and safety, especially as it relates to our youngest community members. I have been in this role for nearly five years, after spending almost two decades in public service, more than half of which was spent in Congress. I have tremendous respect for this institution and the work you and your staff are doing to ensure that tech platforms give our youth safe and healthy online experiences.

To understand Snap’s approach to protecting young people on our platform, it’s helpful to start at the beginning. Snapchat’s founders were part of the first generation to grow up with social media. Like many of their peers, they saw that while social media was capable of making a positive impact, it also had certain features that negatively impacted their friendships. These platforms encouraged people to publicly broadcast their thoughts and feelings, permanently. Our founders saw how people were constantly measuring themselves against others through “likes” and comments, trying to present a version of themselves through perfectly curated images, and carefully scripting their content because of social pressure. Social media also evolved to feature an endless feed of unvetted content, exposing people to a flood of viral, misleading, and harmful content. 

Snapchat was built as an antidote to social media. In fact, we describe ourselves as a camera company. Snapchat’s architecture was intentionally designed to empower people to express a full range of experiences and emotions with their real friends, not just the pretty and perfect moments. In the formative years of our company, there were three major ways our team pioneered new inventions to prioritize online privacy and safety. 

First, we decided to have Snapchat open to a camera instead of a feed of content. This created a blank canvas for friends to visually communicate with each other in a way that is more immersive and creative than sending text messages. 

Second, we embraced strong privacy principles, data minimization, and the idea of ephemerality, making images delete-by-default. This allowed people to genuinely express themselves in the same way they would if they were just hanging out at a park with their friends. Social media may have normalized having a permanent record of conversations online, but in real life, friends don’t break out their tape recorder to document every single conversation for public consumption or permanent retention. 

Third, we focused on connecting people who were already friends in real life by requiring that, by default, both Snapchatters opt in to being friends in order to communicate. Because in real life, friendships are mutual. It’s not one person following the other, or random strangers entering our lives without permission or invitation.

A Responsible Evolution

Since those early days, we have worked to continue evolving responsibly. Understanding the potential negative effects of social media, we made proactive choices to ensure that all of our future products reflected those early values. 

We didn’t need to reinvent the wheel to do that. Our team was able to learn from history when confronting the challenges posed by new technology. As Snapchat evolved over time, we were influenced by existing regulatory frameworks that govern broadcast and telecommunications when developing the parts of our app where users could share content that has the potential to reach a large audience. For instance, when you talk to your friends on the phone, you have a high expectation of privacy, whereas if you are a public broadcaster with the potential to influence the minds and opinions of many, you are subject to different standards and regulatory requirements. 

That dichotomy helped us to develop rules for the more public portions of Snapchat that are inspired by broadcast regulations. These rules protect our audience and differentiate us from other platforms. For example, Discover, our closed content platform where Snapchatters get their news and entertainment, exclusively features content from either professional media publishers who partner with us, or from artists, creators, and athletes who we choose to work with. All of these content providers have to abide by our Community Guidelines, which apply to all of the content on our platform. But Discover publisher partners also must abide by our Publisher Guidelines, which include requiring that content be fact-checked and accurate, and age-gated when appropriate. And for individual creators featured in Discover, our human moderation teams review their Stories before we allow them to be promoted on the platform. While we use algorithms to feature content based on individual interests, they are applied to a limited and vetted pool of content, which is a different approach from other platforms.

On Spotlight, where creators can submit creative and entertaining videos to share with the broader Snapchat community, all content is first reviewed automatically by artificial intelligence before gaining any distribution, and then human-reviewed and moderated before it can be viewed by more than 25 people. This is done to ensure that we reduce the risk of spreading misinformation, hate speech, or other potentially harmful content.

We don’t always get it right the first time, which is why we redesign parts of Snapchat when they aren’t living up to our values. That’s what happened in 2017 when we discovered that one of our products, Stories, was making Snapchatters feel like they had to compete with celebrities and influencers for attention because content from celebrities and friends was combined in the same user interface. As a result of that observation, we decided to separate “social” content created by friends from “media” content created by celebrities to help reduce social comparison on our platform. This redesign negatively impacted our user growth in the short term, but it was the right thing to do for our community.

Protecting Young People on Snapchat

Our mission — to empower people to express themselves, live in the moment, learn about the world, and have fun together — informed Snapchat’s fundamental architecture. Adhering to this mission has enabled us to create a platform that reflects human nature and fosters real friendships. It continues to influence our design processes and principles, our policies and practices, and the resources and tools we provide to our community. And it undergirds our constant efforts to improve how we address the inherent risks and challenges associated with serving a large online community. 

A huge part of living up to our mission has been building and maintaining trust with our community and partners, as well as parents, lawmakers, and safety experts. Those relationships have been built through the deliberate, consistent decisions we have made to put privacy and safety at the heart of our product design process. 

For example, we have adopted responsible design principles that consider the privacy and safety of new products and features right from the beginning of the development process. And we've made those principles come to life through rigorous processes. Every new feature in Snapchat goes through a defined privacy and safety review, conducted by teams that span Snap — including designers, data scientists, engineers, product managers, product counsel, policy leads, and privacy engineers — long before it sees the light of day.

While more than 80% of our community in the United States is 18 or older, we have spent a tremendous amount of time and resources to protect teenagers. We’ve made thoughtful and intentional choices to apply additional privacy and safety policies and design principles to help keep teenagers safe. That includes:

  • Taking into account the unique sensitivities and considerations of minors when we design products. That’s why we intentionally make it harder for strangers to find minors by banning public profiles for people under 18 and are rolling out a feature to limit the discoverability of minors in Quick Add (friend suggestions). And why we have long deployed age-gating tools to prevent minors from viewing age-regulated content and ads. 

  • Empowering Snapchatters by providing consistent and easy-to-use controls like turning location sharing off by default and offering streamlined in-app reporting for users to report concerning content or behaviors to our Trust and Safety teams. Once reported, most content is actioned in under 2 hours to minimize the potential for harm. 

  • Working to develop tools that will give parents more oversight without sacrificing privacy — including plans to provide parents the ability to view their teen's friends, manage their privacy and location settings, and see who they're talking to.

  • Investing in educational programs and initiatives that support the safety and mental health of our community — like Friend Check Up and Here for You. Friend Check Up prompts Snapchatters to review who they are friends with and make sure the list is made up of people they know and still want to be connected with. Here for You provides support to users who may be experiencing mental health or emotional crises by providing tools and resources from experts.

  • Preventing underage use. We make no effort — and have no plans — to market to children, and individuals under the age of 13 are not permitted to create Snapchat accounts. When registering for an account, individuals are required to provide their date of birth, and the registration process fails if a user inputs an age under 13. We have also implemented a new safeguard that prevents Snapchat users between 13 and 17 with existing accounts from updating their birthday to an age of 18 or above. Specifically, if a minor attempts to change their year of birth to an age over 18, we will prevent the change as a way to ensure that users are not accessing age-inappropriate content within Snapchat. (A minimal sketch of this logic follows this list.)
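To make the two safeguards in that last bullet concrete, here is a minimal sketch under stated assumptions -- hypothetical helper names, illustrating the described behavior rather than Snap's actual code:

```python
from datetime import date

MIN_AGE = 13

def age_on(birth: date, today: date) -> int:
    """Whole years elapsed since `birth` as of `today`."""
    had_birthday = (today.month, today.day) >= (birth.month, birth.day)
    return today.year - birth.year - (0 if had_birthday else 1)

def can_register(birth: date, today: date) -> bool:
    return age_on(birth, today) >= MIN_AGE   # under-13 registration fails

def can_update_birthday(current_birth: date, new_birth: date, today: date) -> bool:
    currently_minor = age_on(current_birth, today) < 18
    would_become_adult = age_on(new_birth, today) >= 18
    if currently_minor and would_become_adult:
        return False                         # blocked: a 13-17 account can't jump to 18+
    return age_on(new_birth, today) >= MIN_AGE
```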

Conclusion and Looking Ahead

We're always striving for new ways to keep our community safe, and we have more work left to do. We know that online safety is a shared responsibility, spanning a host of sectors and actors. We are committed to doing our part in concert with safety partners including our Safety Advisory Board, technology industry peers, government, and civil society. From technology-focused and awareness-raising initiatives, to research and best practice sharing, we are actively engaged with organizations dedicated to protecting minors online. We also know that there are many complex problems and technical challenges across our industry, including age verification of minors, and we remain committed to working with partners and policymakers to identify robust industry-wide solutions.         

Protecting the wellbeing of Snapchatters is something we approach with both humility and steadfast determination. Over 500 million people around the world use Snapchat every month, and while 95% of our users say Snapchat makes them feel happy, we have a moral responsibility to take their best interests into account in everything we do. That’s especially true as we innovate with augmented reality — which has the potential to positively contribute to the way we work, shop, learn, and communicate. We will apply those same founding values and principles as we continue to experiment with new technologies like the next generation of augmented reality.

As we look to the future, computing and technology will become increasingly integrated into our daily lives. We believe that regulation is necessary, but given the speed at which technology develops and the pace at which regulation can be implemented, regulation alone can’t get the job done. Technology companies must take responsibility and actively protect the communities they serve. 

If they don't, the government must act swiftly to hold them accountable. We fully support the Subcommittee’s efforts to investigate these issues and welcome a collaborative approach to problem solving that keeps our society safe. 

Thank you again for the opportunity to appear before you today and discuss these critical issues. I look forward to answering your questions.

How Snap is Responding to the Fentanyl Crisis

Drugs laced with fentanyl have contributed to an alarming increase in overdose deaths in the United States in recent years. Fentanyl is a potent opioid, deadly in quantities as small as one grain of sand. Drug dealers often use fentanyl to make counterfeit prescription pills, like Vicodin or Xanax, which, when ingested, can lead to death. 

We have heard devastating stories from families impacted by this crisis, including cases where fentanyl-laced counterfeit pills were purchased from drug dealers on Snapchat. We are determined to remove illegal drug sales from our platform, and we have been investing in proactive detection and collaboration with law enforcement to hold drug dealers accountable for the harm they are causing our community. 

We believe it is our responsibility to keep our community safe on Snapchat. Over the past year, we have made significant operational improvements to eradicate drug sales from our platform, and we are continually working to improve. Our work here is never done, but we want to communicate updates as we make progress so that our community can monitor that progress and hold us accountable.

Our most important investment over the past year has been in our Law Enforcement Operations, where we have grown the team that supports valid law enforcement requests to meaningfully improve how quickly we can respond. While we still have work to do, our response times across all types of law enforcement requests have improved 85% year over year, and in the case of emergency disclosure requests, our 24/7 team usually responds within 30 minutes.

We have significantly improved our proactive detection capabilities to remove drug dealers from our platform before they are able to harm our community. Our enforcement rates increased by 112% during the first half of 2021, and our proactive detection rates increased by 260%. Nearly two-thirds of drug-related content is detected proactively by our artificial intelligence systems, with the balance reported by our community and enforced by our team. We’ve also worked to improve our in-app reporting tools to make it easier and faster for our community to report drug-related content.

We will continue to work to strike the right balance between safety and privacy on our platform so that we can empower our community to express themselves without fear of harm. By design, Snapchatters control who can contact them and must opt in to new conversations with friends. If a member of our community reports inappropriate content, it is escalated to our Trust & Safety team so that we can take appropriate action. We are also working on new family safety tools that give parents more ways to partner with their teenagers to stay safe on Snapchat.

We also want to play a role in educating our community about the dangers of fentanyl. To inform our efforts, we commissioned research from Morning Consult to understand how young people perceive prescription drugs and fentanyl, and are sharing those findings here. We learned that teenagers are suffering from high levels of stress and anxiety, and some are turning to prescription drugs, used without a prescription, as a coping strategy. It was also clear from the research that many people either don’t know enough about fentanyl to assess the danger or believe fentanyl is less dangerous than heroin or cocaine. This lack of awareness can have devastating consequences when just one counterfeit pill laced with fentanyl can kill.

We have developed a new in-app education portal called Heads Up that distributes content from expert organizations such as Song for Charlie, Shatterproof, and the Substance Abuse and Mental Health Services Administration (SAMHSA), with additional resources from the Centers for Disease Control and Prevention to be added in the coming weeks. This means that if someone on Snapchat searches for drug-related keywords, Heads Up will surface relevant educational content designed to prevent harm to our community, as the sketch below illustrates.
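
For illustration only, here is a minimal sketch of how such keyword-triggered education might work. The keyword set, resource descriptions, and function names are hypothetical assumptions; Snap's actual matching logic and content are not public.

```python
# Hypothetical keyword set and resource list, for illustration only.
DRUG_RELATED_KEYWORDS = {"fentanyl", "oxycodone", "percocet", "xanax"}

HEADS_UP_RESOURCES = [
    "Song for Charlie: facts about counterfeit pills",
    "Shatterproof: addiction and recovery resources",
    "SAMHSA: substance-use support and helplines",
]

def search_results(query: str) -> list[str]:
    """Return Heads Up educational content for drug-related searches."""
    tokens = {token.strip(".,!?").lower() for token in query.split()}
    if tokens & DRUG_RELATED_KEYWORDS:
        # Intercept the query: show harm-prevention content first.
        return HEADS_UP_RESOURCES
    return []  # otherwise fall through to the ordinary search path

print(search_results("buy percocet"))  # -> the Heads Up resources
```

The design choice worth noting is interception at search time: rather than waiting for harmful content to surface and be reported, the lookup redirects risky intent toward expert resources before any contact with a dealer can occur.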

In partnership with Song for Charlie, we have developed a video advertising campaign that has already been viewed over 260 million times on Snapchat, and we are rolling out a new national filter that raises awareness of the dangers of fentanyl and counterfeit pills and directs Snapchatters to the new Heads Up educational portal. A new episode of Good Luck America, a Snap Original news show, will premiere soon, continuing a special edition series of episodes devoted to educating our community about the fentanyl crisis.

We hope that our ongoing operational improvements and educational efforts will help to keep our community safe from the devastating impacts of the fentanyl crisis. We are heartbroken that drugs have taken the lives of people in our community. We deeply appreciate the generosity and kindness of families who have come forward to share their stories, collaborate, and hold us accountable for making progress. We will work tirelessly to do better and do more to keep our community safe.

- Team Snap

Our Approach to Preventing the Spread of False Information

As the world continues to battle the latest developments of the COVID-19 pandemic, it is more important than ever to ensure the public has access to accurate, credible information. The rapid spread of false information can pose serious threats to our institutions and public health, and we believe we’re in a moment in which companies, organizations, and individuals should take stock of their efforts to help prevent it.

In that spirit, we thought it would be helpful to walk through our long-held approach to preventing the spread of false information on Snapchat and the ways we are working to improve.

Our approach has always started with the architecture of our platform. Snapchat was originally built to help people talk to their close friends, rather than provide the opportunity to broadcast messages across the app. And we have always felt a deep responsibility to make sure that the news and information our community sees on Snapchat is credible, from trusted and clear sources. 

These underlying principles have informed our product design and policy decisions as Snapchat has continued to evolve over the years. 

  • Across our app, we don’t give unvetted content the opportunity to ‘go viral.’ Snapchat does not offer an unmoderated open newsfeed where unvetted individuals or publishers can broadcast false information. Our content platform, Discover, only features content from vetted media publishers and content creators. Our entertainment platform, Spotlight, is proactively moderated before content can reach a large audience. We offer Group Chats, but they are limited in size, are not recommended by algorithms, and are not discoverable on our platform if you are not a member of that Group. (A simplified sketch of these distribution gates follows this list.)

  • Our guidelines have long prohibited the spread of false information. Both our Community Guidelines, which apply equally to all Snapchatters, and our content guidelines, which apply to our Discover partners, prohibit the spread of misinformation that can cause harm, including conspiracy theories, denial of tragic events, unsubstantiated medical claims, and content that undermines the integrity of civic processes. We regularly review and update our policies as new forms of misinformation become more prevalent: for example, ahead of the 2020 election, we updated our guidelines to make clear that manipulated media intended to mislead -- or deepfakes -- are prohibited.

  • Our approach to enforcing against content that includes false information is straightforward -- we don’t label it, we completely remove it. When we find content that violates our guidelines, our policy is to simply take it down, which immediately reduces the risk of it being shared more widely. 

  • We evaluate the safety and privacy impacts of all new features at the front end of the product development process -- which includes examining potential vectors for misuse. We have internal measures in place to evaluate a new feature's potential impact on the safety, privacy, and wellbeing of both individual Snapchatters and society -- and if we think it could become an avenue for bad actors to share false information, it doesn’t get released.

  • We use human review to fact-check all political and advocacy ads. As with all content on Snapchat, we prohibit false information and deceptive practices in our advertising. All political ads, including election-related and issue advocacy ads, must include a transparent “paid for” message that discloses the sponsoring organization. We use human review to fact-check all political ads, and we provide information about every ad that passes our review in our Political Ads library.

  • We are committed to increasing transparency into our efforts to combat false information. Our most recent Transparency Report, which covered the second half of 2020, included several new elements, among them data about our efforts to enforce against false information globally. During this period, we took action against 5,841 pieces of content and accounts for violations of our policies on false information -- and we plan to provide more detailed breakdowns of these violations in future reports.
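
As a rough illustration of the distribution gates described in the first bullet above, consider the following sketch. All names, the group-size cap, and the structure are hypothetical assumptions for illustration, not a description of Snap's actual systems.

```python
# Hypothetical sketch of content-distribution gates; names are illustrative.
VETTED_PUBLISHERS = {"publisher_a", "publisher_b"}  # assumed Discover allowlist
MAX_GROUP_CHAT_SIZE = 64  # assumed cap, for illustration only

def can_appear_on_discover(publisher_id: str) -> bool:
    """Discover only features vetted media publishers and creators."""
    return publisher_id in VETTED_PUBLISHERS

def can_reach_wide_audience_on_spotlight(passed_moderation: bool) -> bool:
    """Spotlight content is proactively moderated before broad distribution."""
    return passed_moderation

def can_join_group_chat(current_members: int) -> bool:
    """Group Chats are capped in size and never algorithmically recommended."""
    return current_members < MAX_GROUP_CHAT_SIZE
```

The common thread is that every path to a large audience passes through an explicit gate (vetting, pre-moderation, or a hard size cap), so there is no route by which unvetted content can be amplified to strangers.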

As we keep working to remove incentives for sharing false information, both through our product design choices and our policies, we’re also focused on partnering with experts to promote factual health and safety information. Since the beginning of the pandemic, we have worked closely with public health officials and agencies, including the World Health Organization and the Centers for Disease Control and Prevention, to publish regular safety updates, and our news partners around the world have produced constant coverage of the pandemic. Earlier this spring, as vaccines became available for young people in the U.S., we launched a new effort with the White House to help Snapchatters answer common questions, and in July, we teamed up with the UK’s National Health Service on a similar effort.

Doing our part to help our community stay safe and healthy is an ongoing priority for us, and we will continue to explore innovative approaches to reach Snapchatters where they are, while strengthening our efforts to protect Snapchat from the false information epidemic.