Evegram - Minor Safety Standards


Last updated: 20/01/2025

Evegram is committed to providing a safe environment for all users, especially minors. We implement the following measures to ensure safety and prevent any form of abuse:

Content Moderation

  • All user-generated content is screened by automated detection systems and reviewed by human moderators to detect inappropriate material, including explicit or offensive content.
  • Accounts that violate our policies are promptly suspended or removed.

Abuse Reporting

  • Users can report inappropriate profiles, content, or messages directly within the app. Reports are reviewed within 24 hours by a dedicated team.
  • When serious abuse, such as child sexual abuse material, is detected, we immediately report it to and cooperate with the appropriate authorities.

Collaboration with Authorities

  • Evegram reports cases of abuse to national or regional centers dedicated to combating the sexual exploitation of minors, such as the National Center for Missing & Exploited Children (NCMEC).

User and Team Training

  • Our moderators receive thorough training on how to identify harmful content and behavior.
  • We provide guidelines to users on how to keep their accounts secure and report issues effectively.

For more information, please contact our team at compliance@evegram.eu.