Bermuda Child Safety & CSAE Policy
This policy applies to the Bermuda app (developer: Mylos Inc.) and complies with
Google Play's Child Safety Standards Policy and Child Endangerment Policy,
as well as the Tech Coalition's Combating Online Child Sexual Exploitation and Abuse recommendations.
Bermuda applies a Zero Tolerance policy toward
Child Sexual Abuse and Exploitation (CSAE) and Child Sexual Abuse Material (CSAM).
1. Overview of Child Protection Policy
Bermuda is a service available only to adults aged 18 and over.
Children and adolescents under the age of 18 are not permitted to use this service,
and any content or conduct that targets or involves minors is strictly prohibited.
- We apply age verification at signup to confirm that users are adults.
- Accounts discovered to belong to users under 18 are immediately and permanently banned.
- Attempts to bypass the age restriction may result in additional sanctions and legal action.
- If any conduct endangering minors is identified, we respond immediately and report to the relevant authorities when necessary.
2. Published Standards on CSAE Prohibition
In accordance with Google Play's Child Safety Standards Policy, Bermuda explicitly prohibits the following content and conduct.
These standards are published together with our Terms of Service, Community Guidelines, and Privacy Policy,
and can be reviewed by anyone at any time through this page.
- Creation, storage, sharing, transmission, or redistribution of Child Sexual Abuse Material (CSAM)
- Sexual solicitation (grooming) targeting minors, including requests for sexual messages or imagery
- Child nudity or any content that sexualizes minors
- Sexual extortion (sextortion) targeting minors, including demands for money or content
- Attempting or facilitating offline meetings with minors
- Non-sexual child abuse and any other content that endangers child safety
3. CSAM Guidance and External Reporting
⚠ CSAM Notice
Bermuda applies a Zero Tolerance policy toward CSAM and CSAE.
If you encounter content or conduct that appears to be CSAM or CSAE,
please report it immediately through the in-app reporting feature or to the external organizations listed below.
- The content will be removed immediately and the associated account permanently suspended.
- Relevant data will be preserved to the extent permitted by law and may be provided to law enforcement.
- Confirmed CSAM is reported to the National Center for Missing & Exploited Children (NCMEC) and to the competent local authority.
- Such conduct constitutes a serious criminal offense in most jurisdictions.
※ Storing, sharing, or retransmitting CSAM is also unlawful and is strictly prohibited.
3-1. International Reporting Organizations
- National Center for Missing & Exploited Children (NCMEC): CyberTipline
- INHOPE: the international network of hotlines for reporting child sexual abuse material
- Internet Watch Foundation (IWF)
3-2. Country-Specific Reporting Organizations
The full list of country-specific organizations for reporting child sexual abuse is available at
Google Support - Child Sexual Abuse Reporting Organizations by Country.
3-3. Bermuda Customer Support
CSAM and CSAE reports can also be submitted directly to Bermuda's customer support team.
Email: childsafety@mylosinc.com
4. Prohibited Content and Conduct
1) Child Sexual Abuse Material (CSAM)
Any content that depicts or promotes the sexual exploitation of children is prohibited.
2) Child Sexual Solicitation (Grooming)
- Requesting sexual content from minors
- Demanding nude images or videos
- Initiating sexual conversations with a minor, or grooming a minor by cultivating an intimate relationship
3) Inappropriate Interactions
- Planning or attempting offline meetings with minors
- Sending sexual messages or imagery to minors
4) Sextortion
- Threatening to publicly disclose a minor's images or videos
- Demands for money or additional sexual content
5) Child Nudity
Nudity of minors or any content that sexualizes minors is prohibited.
6) Non-Sexual Child Abuse
All forms of child abuse, including physical and emotional violence, are prohibited.
5. In-app Reporting Mechanism
Users can report violations related to child safety without leaving the app.
Bermuda self-certifies that it operates an in-app reporting mechanism.
- A dedicated CSAE/CSAM category is provided in the report reasons.
- Reports are available across all areas of the service, including profiles, 1:1 chats, and video calls.
- Child safety reports are processed at the highest priority (P0).
- Reports can also be submitted directly by email to childsafety@mylosinc.com.
6. Child Protection Operations
- AI-based detection of image, video, and text content
- Facial-analysis-based age estimation monitoring
- CSAM detection for profile pictures and images transmitted in chat rooms
- Continuous monitoring based on user reports
- Immediate content removal and permanent account suspension upon confirmed violation
- Final determination by human reviewers and a formal appeal process
7. CSAM Response Procedure
- Detection via user reports or automated detection systems
- Urgent review by the Child Safety Operations team
- Immediate removal of the content and permanent suspension of the associated account
- Preservation of evidence data to the extent permitted by applicable law
- Reporting to NCMEC (CyberTipline) in the United States and to the competent local authority
- Cooperation with law enforcement requests following due legal process
- Policy and system improvements to prevent recurrence
8. Child Safety and CSAM Response Point of Contact
In accordance with Google Play's Child Safety Standards Policy, Bermuda has designated
a Child Safety and CSAM Response point of contact.
Google Play and relevant authorities may contact the individuals below directly.
[Responsible Officer]