Child Safety Standards

Last updated: March 28, 2026

Ami has zero tolerance for child sexual abuse and exploitation, including any content, behavior, communication, or activity that sexualizes, exploits, endangers, or harms minors.

Our standards

Users may not use Ami to:

  • create, upload, store, share, request, promote, or distribute child sexual abuse material;
  • groom, manipulate, solicit, exploit, or sexually extort minors;
  • seek sexual conversations, sexual images, meetings, or relationships involving minors;
  • impersonate minors or misrepresent age in order to exploit or contact minors;
  • use the platform in any way that facilitates child trafficking, abuse, or exploitation.

Any violation of these standards may result in immediate censoring or removal of content, account suspension, or a permanent ban.

Reporting and enforcement

Ami provides in-app tools that allow users to report profiles, conversations, and photos.

If we receive a report or otherwise become aware of potential child sexual abuse or exploitation, we may:

  • review the reported account, content, and related activity;
  • censor or remove violating content;
  • suspend or permanently ban the account;
  • conduct additional review where necessary.

Age restrictions

The Ami app published on Google Play is intended only for adults aged 18 and older. People under 18 are not permitted to use the app.

If we determine that an account belongs to an underage user, we may remove the account.

Moderation

To help keep Ami safe, we use a combination of:

  • AI photo moderation;
  • manual review;
  • account bans for violations;
  • video-based user verification.

Scope

These standards apply to all user-generated content and interactions on Ami, including profiles, photos, and chat messages.

Contact

For child safety concerns or reports, contact [email protected], or reach us via Instagram or Telegram.


Public page URL:

https://ami-app.me/blog/uk/child-safety