Child Safety & CSAE Policy
Last updated: 2025-11-02
This page sets out Moment's publicly available standards against Child Sexual Abuse and Exploitation (CSAE) and our commitments to prevent, detect, remove, and report such material. These standards apply to all users and all content on Moment.
1. Zero‑tolerance statement
Moment has zero tolerance for any form of child sexual abuse or exploitation, including but not limited to possession, distribution, creation, solicitation, grooming, or viewing of Child Sexual Abuse Material (CSAM). Content or behavior that sexualizes minors is strictly prohibited and will be removed. Accounts that engage in CSAE will be suspended or terminated and reported to the appropriate authorities.
2. Definitions
- Child / Minor: Any person under the age of 18.
- CSAM: Content that depicts or represents sexual abuse of a child, including real, manipulated, or generated content that sexualizes minors.
- CSAE: Any action that facilitates or attempts to facilitate the sexual abuse or exploitation of a child, including grooming, sextortion, trafficking, or solicitation.
3. Prohibited content & behaviors
- Uploading, posting, sharing, or linking to CSAM of any kind.
- Sexualization of minors, including eroticized comments, emojis, fantasies, or requests involving minors.
- Grooming or attempting to obtain sexual content from a child; coercion or sextortion.
- Using Moment to advertise, coordinate, or facilitate the sexual exploitation of children.
- Impersonation of a minor to solicit sexual content or contact.
4. Age safety
Moment is intended for users aged 13 and older. If we learn that an account is used by a child under the applicable minimum age, we will take appropriate action, which may include removing the account.
5. Reporting CSAE concerns
Users can report concerning content or behavior in‑app using the Report option found on posts, comments, profiles, and messages. Reports are reviewed by our Trust & Safety team.
You can also email our designated child‑safety contact at [email protected].
6. Detection, review & enforcement
- Proactive detection: We may use a combination of user reports, automated signals, and industry hash‑matching technologies to detect suspected CSAM where lawful.
- Review: A trained team reviews reports and escalates potential CSAM immediately.
- Removal & account action: Confirmed CSAM is removed. Accounts involved in CSAE may be suspended or permanently terminated.
- Evidence preservation: We preserve minimal necessary information to comply with legal obligations and law‑enforcement requests.
7. Legal compliance & reporting to authorities
Where required by law, we report CSAM to relevant authorities (for example, hotlines within the INHOPE network or regional/national law‑enforcement). We cooperate with lawful requests and take steps to protect victims.
8. Privacy & data handling for reports
Information provided in CSAE reports is processed to investigate and enforce this policy and comply with legal obligations. We limit access to authorized personnel and retain data only as long as necessary for these purposes and applicable law. For general privacy practices, see our Privacy Policy.
9. Guidance & resources
If you believe a child is in immediate danger, contact your local emergency services. For additional resources and reporting options in your region, consult your national child protection hotline.
10. Designated child‑safety contact
Email: [email protected]
Phone: +260 974971026
11. Changes to these standards
We may update these standards from time to time. Material updates will be published on this page with a new "Last updated" date.
Developer: Moment — Publicly available standards against CSAE for Google Play compliance.