Child Safety
Child Sexual Abuse & Exploitation (CSAE) Policy
Effective Date: March 2026 · Last Updated: March 2026
Actuall has zero tolerance for child sexual abuse and exploitation (CSAE) in any form. This policy applies to all users, all content, and all interactions on the Actuall platform — including Circles, Moments, direct messages, and private groups.
1. Our Commitment
Actuall is built for India's Gen Z — a platform centred on authentic, real-time connection. The safety of every person on the platform, especially minors, is a non-negotiable foundation of everything we build.
We are committed to:
- Maintaining a platform free of content that sexually exploits or abuses children
- Proactively preventing grooming, exploitation, or any predatory behaviour targeting minors
- Cooperating fully with law enforcement and child protection organisations
- Continuously improving our detection, reporting, and response systems
2. Who This Policy Applies To
For the purposes of this policy, a minor is any person under the age of 18, consistent with Indian law and international child protection standards. This policy applies globally to all users of Actuall, regardless of location.
3. Prohibited Content and Behaviour
The following are strictly prohibited on Actuall and will result in immediate account termination and reporting to relevant authorities:
3.1 Prohibited Content
- Child Sexual Abuse Material (CSAM) — any visual, audio, or written content depicting the sexual exploitation or abuse of a minor
- Sexualised depictions of minors, including digitally generated or AI-generated content
- Content that normalises, glorifies, or encourages the sexual exploitation of children
- Any media that exposes a minor's private body parts in a sexual context
3.2 Prohibited Behaviour
- Grooming — using the platform to build manipulative relationships with minors for the purpose of sexual exploitation
- Soliciting sexual content, images, or communication from minors
- Sharing contact information (phone numbers, external links, location) with a minor in a predatory context
- Attempting to move a minor off the platform to a less-monitored channel for harmful purposes
- Any form of coercion, blackmail, or manipulation targeting a minor
- Sharing or distributing CSAM in any channel, including direct messages or private groups
Zero Tolerance: There are no exceptions to this policy. No context, claimed artistic intent, or fictional framing will exempt content that sexualises or exploits minors.
4. How We Detect and Prevent CSAE
4.1 Product Design as a Safety Layer
Actuall's core product design includes deliberate safeguards that reduce exploitation risk:
- No global search by username — users cannot be discovered and targeted by strangers through search
- Intentional connection only — users must share their username directly; connections cannot be made by unknown parties without consent
- Ephemeral spaces — Circles expire automatically, preventing persistent monitoring or contact by predators
- In-Circle protections — any user can mute or vote to remove another user; report and block are available on all content and profiles
4.2 Content Review
- All reported content is reviewed by our Trust & Safety team within a defined response window
- High-severity reports involving potential CSAM are escalated immediately and treated as a priority
- We use automated detection tools to match uploaded media against hashes of known CSAM, in alignment with industry standards (a simplified illustration of how hash matching works appears below)
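For illustration only, and not a description of Actuall's production systems: the sketch below shows the basic shape of known-hash matching, where each uploaded file's digest is checked against a vetted list of known-CSAM hashes. Every name here (the hash-list file, the function names) is hypothetical, and a real deployment would use perceptual hashes, which survive resizing and re-encoding, sourced from lists maintained by organisations such as NCMEC, rather than the exact SHA-256 digests shown.

```python
# Illustrative sketch only — not Actuall's actual detection pipeline.
# Assumes a vetted hash list (one hex digest per line) supplied by a
# recognised child-protection organisation; the file name is hypothetical.
import hashlib
from pathlib import Path

def load_known_hashes(path: str) -> set[str]:
    # One hex-encoded digest per line; blank lines are ignored.
    return {
        line.strip().lower()
        for line in Path(path).read_text().splitlines()
        if line.strip()
    }

def sha256_of_file(path: str) -> str:
    # Stream the file in chunks so large uploads are never fully in memory.
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_match(upload_path: str, known_hashes: set[str]) -> bool:
    # A match means this exact file appears on the vetted hash list,
    # which would trigger immediate escalation rather than ordinary review.
    return sha256_of_file(upload_path) in known_hashes
```

Exact cryptographic hashing is used here only to keep the sketch self-contained; it catches byte-identical files but misses re-encoded copies, which is why industry systems rely on perceptual hashing instead.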
4.3 Account Action
Any account found to have shared CSAM or engaged in grooming behaviour will be:
- Permanently banned immediately upon confirmation
- Reported to the Cyber Crime Cell and relevant national authorities
- Reported to the National Center for Missing & Exploited Children (NCMEC) CyberTipline where applicable
5. How to Report
If you encounter any content or behaviour that you believe exploits or endangers a child, please report it immediately using one of the following methods:
- In-app: Tap the three-dot menu on any post, Circle, message, or profile → Select "Report" → Choose "Child Safety"
- Email: safety@actuall.app — include as much detail as possible, including screenshots if safe to do so
Reports are handled confidentially. You will not be penalised for making a good-faith report.
6. Cooperation with Authorities
Actuall will cooperate fully and promptly with law enforcement and child protection authorities, including:
- Providing account data, content, and IP information when legally required
- Filing mandatory reports with NCMEC's CyberTipline when CSAM is identified, as required under applicable law
- Complying with the Information Technology Act, 2000 and the Protection of Children from Sexual Offences (POCSO) Act, 2012
7. No Underage Accounts
Actuall is intended for users aged 18 and above. We do not knowingly allow accounts for users under the age of 18. If we become aware that an account belongs to a minor, the account will be suspended pending verification and may be permanently removed.
If you believe a minor is using the platform, please report the account to safety@actuall.app.
8. Policy Updates
This policy will be reviewed and updated regularly as our platform grows and as legal requirements evolve. Material updates will be communicated to users through in-app notifications.