Policies & Compliance


Content Moderation Policy

Last Updated: January 12, 2026

Purpose

This Content Moderation Policy establishes how we review, moderate, and enforce standards for user-generated content on Elevate for Humanity platforms. Our goal is to maintain a safe, respectful, and productive learning environment while respecting freedom of expression.

What We Moderate

Content Subject to Moderation

  • Discussion forum posts and comments
  • Course submissions and assignments
  • Profile information and photos
  • Messages and communications
  • Reviews and feedback
  • User-uploaded files and media
  • Any publicly visible content

Moderation Standards

Prohibited Content

Content will be removed if it contains:

  • Illegal Content: Violates laws or promotes illegal activities
  • Hate Speech: Attacks individuals or groups based on protected characteristics
  • Harassment: Bullying, threats, or targeted abuse
  • Violence: Graphic violence, gore, or threats of harm
  • Sexual Content: Pornography or sexually explicit material
  • Spam: Repetitive, unsolicited, or commercial content
  • Misinformation: Deliberately false or misleading information
  • Privacy Violations: Sharing others' private information

Restricted Content

Content may be flagged, hidden, or held for review if it includes:

  • Profanity or vulgar language
  • Controversial or sensitive topics
  • Political or religious content (if disruptive)
  • Off-topic discussions
  • Self-promotion or advertising
  • Copyrighted material without permission

Moderation Methods

Automated Moderation

We use automated systems to:

  • Filter spam and obvious violations
  • Detect prohibited keywords and patterns
  • Flag potentially problematic content for review
  • Block known malicious links or files
  • Identify duplicate or repetitive content
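As an illustration only, a simple pre-screening pass over submitted text might work along these lines. The patterns, labels, and function shown here are hypothetical examples, not our production filters, which are maintained privately and updated continuously:

```python
import re

# Hypothetical keyword patterns for illustration only.
BLOCK_PATTERNS = [r"\bfree money\b", r"\bclick here now\b"]  # obvious spam: removed automatically
FLAG_PATTERNS = [r"\bgiveaway\b"]                            # uncertain: queued for human review

def prescreen(text: str) -> str:
    """Return 'block', 'flag', or 'pass' for a piece of user content."""
    lowered = text.lower()
    if any(re.search(p, lowered) for p in BLOCK_PATTERNS):
        return "block"   # clear violation: filtered before publication
    if any(re.search(p, lowered) for p in FLAG_PATTERNS):
        return "flag"    # potentially problematic: routed to a moderator
    return "pass"        # no match: published normally

print(prescreen("Free money! Click here now"))  # block
print(prescreen("Join our giveaway today"))     # flag
print(prescreen("Great course, thank you!"))    # pass
```

Automated screening like this only handles clear-cut cases; anything ambiguous is passed to human moderators, as described below.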

Human Moderation

Trained moderators review:

  • Flagged content from automated systems
  • User-reported content
  • High-risk content types
  • Appeals and disputes
  • Context-dependent situations

Community Reporting

Users can report content through:

  • "Report" button on content
  • Email to moderation team
  • Contact form
  • Direct message to moderators

Moderation Process

  1. Detection: Content is flagged by automated systems, user reports, or proactive monitoring.
  2. Review: A moderator reviews the content against policy standards, considering context, intent, and severity.
  3. Action: Appropriate enforcement action is taken based on violation severity and user history.
  4. Notification: The user is notified of the action taken, the reason, and appeal rights (if applicable).
  5. Documentation: The action is logged for record-keeping and pattern analysis.
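The steps above can be sketched as a small case-tracking model. The class, function, and severity labels below are illustrative assumptions, not our internal tooling:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ModerationCase:
    content_id: str
    source: str                  # "automated", "user_report", or "proactive"
    action: str = "pending"
    log: list = field(default_factory=list)

def process(case: ModerationCase, violates_policy: bool, severity: str) -> ModerationCase:
    """Run one case through review, action, and documentation."""
    # Step 2 (Review): moderator judgment is captured as violates_policy/severity.
    if not violates_policy:
        case.action = "no_action"
    # Step 3 (Action): severity and history determine the enforcement level.
    elif severity == "low":
        case.action = "warning"
    else:
        case.action = "content_removed"
    # Steps 4-5 (Notification, Documentation): record what was done and when.
    case.log.append((datetime.now(timezone.utc).isoformat(), case.action))
    return case

case = process(ModerationCase("post-123", "user_report"), violates_policy=True, severity="low")
print(case.action)  # warning
```

In practice each step also triggers side effects (sending the notification, updating user history) that are omitted here for brevity.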

Enforcement Actions

Content-Level Actions

  • Content Removal: Deleted from platform
  • Content Hidden: Visible only to author and moderators
  • Content Edited: Problematic portions removed
  • Content Flagged: Warning label added

User-Level Actions

  • Warning: Notification of violation
  • Posting Restriction: Limited posting frequency or features
  • Temporary Suspension: 1-30 days unable to post
  • Permanent Ban: Indefinite loss of posting privileges
  • Account Termination: Complete account closure

Factors Considered

When determining enforcement action:

  • Severity of violation
  • Intent (accidental vs. deliberate)
  • User's history and prior violations
  • Harm caused or potential harm
  • Context and circumstances
  • User's response and cooperation

Appeals Process

How to Appeal

  1. Submit an appeal within 5 business days of the action
  2. Explain why you believe the action was incorrect
  3. Provide any relevant context or evidence
  4. A different moderator reviews your appeal
  5. A decision is rendered within 3-5 business days

Appeal Outcomes

  • Upheld: Original decision stands
  • Overturned: Content restored, action reversed
  • Modified: Lesser action applied
  • Escalated: Referred to senior moderators

Moderator Guidelines

Moderator Standards

Our moderators are trained to:

  • Apply policies consistently and fairly
  • Consider context and nuance
  • Respect user privacy and dignity
  • Avoid bias and conflicts of interest
  • Document decisions clearly
  • Escalate complex cases appropriately

Moderator Accountability

Moderators are subject to oversight and quality review. Users can report moderator misconduct through our contact form.

Transparency

Moderation Reports

We publish quarterly transparency reports including:

  • Number of content items reviewed
  • Types of violations found
  • Actions taken
  • Appeal statistics
  • Response times

Special Circumstances

Emergency Situations

Content involving immediate safety threats (violence, self-harm, child safety) is prioritized and may result in immediate action and law enforcement notification.

Legal Requests

We comply with valid legal requests for content removal or user information. See our Privacy Policy for details on legal disclosures.

Contact Moderation Team

For moderation questions, reports, or appeals:

  • Email: Contact Us
  • Phone: (317) 314-3757
  • Report Button: Available on all content
  • Response Time: 24-48 hours for most reports

Questions about our policies? Contact us through our contact form.