This is a remote position.

About the company: We are an early-stage social media company experiencing rapid user growth, with over 2 million users to date. We are hiring experienced Content Moderators to help ensure a safe, respectful, and well-managed platform.
About the role: This is a new role focused on reviewing and moderating user-generated content and supporting day-to-day community operations. Please note: This role involves reviewing user-generated content that may be disturbing, offensive, or emotionally challenging, including material related to violence, harassment, hate speech, or other sensitive topics.
Work Schedule & Commitment
Part-time: Approximately 4+ hours per day, with the possibility of full-time work after a probationary period
Location: Remote
Timezones: Open to MENA, North America, and EMEA (to ensure regional coverage)
Work Environment
  • High-volume review environment with performance metrics.
  • Exposure to sensitive or potentially disturbing content.
  • Structured onboarding and ongoing policy training.
  • Wellness and resilience resources provided.

Key Responsibilities
Content Review & Moderation
  • Review user-generated content including text posts, images, videos, and comments.
  • Identify and remove content that violates community guidelines (e.g., harassment, hate speech, misinformation, graphic content, spam).
  • Enforce platform policies consistently and fairly.
  • Escalate complex or high-risk cases to senior moderation or policy teams.

Policy Enforcement & Decision-Making

  • Interpret and apply content policies to real-world scenarios.
  • Document moderation decisions and maintain review accuracy.
  • Provide feedback to improve moderation processes and policy clarity.
Community Safety & Risk Monitoring
  • Identify emerging harmful trends or coordinated abuse patterns.
  • Help detect fraudulent accounts, bots, or suspicious behavior.
  • Support user reports and appeals processes.
Quality & Performance
  • Meet accuracy, speed, and quality benchmarks.
  • Participate in regular calibration sessions and training updates.
  • Maintain confidentiality when handling sensitive data.

Requirements

  • Experience in content moderation, Trust & Safety, customer support, or community management.
  • Familiarity with online safety standards and platform policies.
  • Experience working in fast-paced or high-volume environments.
  • Strong written and verbal English communication skills; multilingual abilities are a strong asset.
  • Ability to clearly understand and assess nuanced content.
  • Ability to make objective decisions using policy guidelines.
  • High attention to detail and consistency.
  • Ability to handle exposure to sensitive or disturbing content.
  • Strong time management and organizational skills.