Learning more about our Child Safety Standards policy

Google Play takes the safety of children on our platform seriously and is committed to keeping our store free of child sexual abuse and exploitation. We require apps in the Social and Dating categories to comply with our Child Safety Standards policy.

Overview

The Child Safety Standards policy requires apps to:

  • publish standards against child sexual abuse and exploitation (CSAE); 
  • provide an in-app mechanism for user feedback; 
  • take appropriate action to address child sexual abuse material (CSAM); 
  • comply with child safety laws; and 
  • designate a point of contact for CSAM-related matters. 

Make sure to read the policy in full and ensure that you understand and comply with it. Developers who are not in compliance by the deadline may be subject to enforcement actions. For more guidance, check out Tech Coalition’s best practices for Combating Online CSEA.

Timeline information

We anticipate the following timeline for rollout of the Child Safety Standards policy. Note that this is subject to change; updates will be posted in this article. 

  • April 2024: We announced the new Play Child Safety Standards policy. 
  • December 2024: The declaration form becomes available for in-scope apps in Play Console. 
  • January 22, 2025: In-scope apps must comply with the new Play Child Safety Standards policy. Non-compliant apps may face additional enforcement actions in the future, such as removal from Google Play.

Frequently asked questions


What category of apps are in scope for this policy?

Apps in the Social and Dating categories are currently in scope for this policy.

How do I know if my app is considered part of the Social or Dating category?

A Social app is an app that declares itself as a "Social" app in Play Console, or lists itself within the "Social" category on Google Play.

A Dating app is an app that declares itself as a "Dating" app in Play Console, or lists itself within the "Dating" category on Google Play.

What if my app is not for kids or does not allow kid users? What if my app is just for adults? What if my app is age-gated?

The presence or absence of child users in your app is irrelevant to this policy. If your app meets the criteria above, then it is within the scope of this policy and must comply with its requirements.

How do you define CSAE?

CSAE refers to child sexual abuse and exploitation, including content or behavior that sexually exploits, abuses, or endangers children. This includes, for example, grooming a child for sexual exploitation, sextorting a child, trafficking of a child for sex, or otherwise sexually exploiting a child.

How do you define CSAM?

CSAM stands for child sexual abuse material. It is illegal, and our Terms of Service prohibit using Google products and services to store or share this content. CSAM consists of any visual depiction, including but not limited to photos, videos, and computer-generated imagery, involving the use of a minor engaging in sexually explicit conduct. For more information, visit the Transparency Report Help Center.

What are the requirements for the CSAE published standards?

The published standards should be a web resource that is globally accessible, so that any individual can learn about your policies and standards around CSAE. The web resource must: 

  • be functional (for example, load without error); 
  • be relevant in scope (for example, mention CSAE or child safety); and 
  • reference the app or developer name (that is, as it appears on your store listing on Google Play). 

You can publish these standards in many ways, such as through a help center, policy page, terms of service, community guidelines, or similar resource. We recommend using anchor links and laying out these standards clearly. You must provide a link to these published standards in Play Console.

What kinds of in-app mechanisms should my app have? Can users report through an email or form?

By in-app feedback mechanism, we mean any mechanism available within your app for users to communicate their concerns to you. You may choose your preferred in-app method so long as users can access it without leaving the app. This may include, but is not limited to, a comprehensive in-app user feedback experience, a support email, or a chat channel for reports. You must certify that you have an in-app mechanism in Play Console.

What does it mean to take “appropriate action” to address CSAM?

“Taking appropriate action to address CSAM” means acting in accordance with your published standards and relevant laws, for example by removing CSAM when you obtain actual knowledge of it in your app.

Our standards do not mandate a particular methodology, but we do expect developers to act in accordance with their stated policies, procedures, and applicable laws.

Do these standards align with global norms on child safety standards?

Yes. Our standards are inspired by the Tech Coalition Child Safety Standards. Please seek guidance from your legal team(s) or advisor(s) for regulatory compliance on CSAM and child safety matters.

Who should be my designated CSAM point of contact? Does it need to be a specific role?

Please provide a name and contact information for an individual who is ready and able to speak to your organization’s CSAM prevention practices and compliance with this policy should our team need to be in touch. Your CSAM point of contact can serve in a variety of positions or teams within your company. You must designate your point of contact in Play Console.
