Content Moderation Policy
Last Updated: March 2026
RealScroll is committed to maintaining a safe, authentic, and respectful platform for users.
This policy explains how RealScroll reviews content, enforces rules, and applies moderation actions.
Moderation Overview
Content moderation on RealScroll may involve a combination of:
Automated systems
Machine learning tools
Third-party moderation services
User reports
Human review
These systems help identify content that may violate our Terms of Service or Community Guidelines.
However, RealScroll does not review all content uploaded to the platform before it is published.
Users are responsible for the content they upload and share.
Automated Detection Systems
RealScroll may use automated systems and third-party technologies to analyze uploaded media, including photos, videos, and audio.
These systems may detect signals related to:
Spam or bot activity
Explicit or graphic imagery
Potential manipulation or synthetic media
Policy violations
These systems may apply informational labels, blur content, restrict visibility, or flag content for review.
Automated analysis tools are not perfect and may occasionally produce inaccurate results.
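To illustrate how such a pipeline might work in general terms, the sketch below maps hypothetical detection signals to the actions named above, sending low-confidence detections to human review rather than acting on them automatically. All class names, signal categories, and thresholds are invented for illustration and do not describe RealScroll's actual systems.

```python
from dataclasses import dataclass, field
from enum import Enum, auto


class Signal(Enum):
    """Hypothetical signal categories an automated scan might emit."""
    SPAM = auto()
    EXPLICIT_IMAGERY = auto()
    SYNTHETIC_MEDIA = auto()
    POLICY_VIOLATION = auto()


class Action(Enum):
    """Moderation actions named in this policy."""
    NONE = auto()
    INFO_LABEL = auto()
    BLUR = auto()
    RESTRICT_VISIBILITY = auto()
    FLAG_FOR_REVIEW = auto()


@dataclass
class ScanResult:
    # Maps each detected signal to a confidence score in [0, 1].
    signals: dict[Signal, float] = field(default_factory=dict)


def route(result: ScanResult, threshold: float = 0.8) -> Action:
    """Map detected signals to a single action. Thresholds are illustrative."""
    conf = result.signals
    # Low-confidence detections go to human review rather than automatic
    # action, reflecting that automated tools can produce inaccurate results.
    if conf and max(conf.values()) < threshold:
        return Action.FLAG_FOR_REVIEW
    if conf.get(Signal.EXPLICIT_IMAGERY, 0.0) >= threshold:
        return Action.BLUR
    if conf.get(Signal.SYNTHETIC_MEDIA, 0.0) >= threshold:
        return Action.INFO_LABEL
    if conf.get(Signal.SPAM, 0.0) >= threshold:
        return Action.RESTRICT_VISIBILITY
    if conf.get(Signal.POLICY_VIOLATION, 0.0) >= threshold:
        return Action.FLAG_FOR_REVIEW
    return Action.NONE


if __name__ == "__main__":
    scan = ScanResult(signals={Signal.EXPLICIT_IMAGERY: 0.93})
    print(route(scan))  # Action.BLUR
```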
Explicit Content Handling
Content that contains graphic imagery, such as:
Open wounds
Heavy bleeding
Medical injuries
Other disturbing visuals
may be automatically marked as Explicit Content.
When this occurs, the content may be blurred, and users may be required to click a confirmation before viewing it.
This approach is designed to keep certain informational or contextual content available while protecting users from unexpected exposure to graphic material.
Content that violates platform rules may still be removed.
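As a general illustration of the click-through gate described above, the following sketch serves explicit-marked content blurred until the viewer confirms. The data shapes, field names, and prompt text are hypothetical and do not reflect RealScroll's implementation.

```python
from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    media_url: str
    explicit: bool  # set by automated detection or human review


def render_state(post: Post, viewer_confirmed: bool) -> dict:
    """Decide how a post is presented to a viewer.

    Explicit content stays blurred behind a confirmation prompt;
    everything else renders normally. Hypothetical shape only.
    """
    if post.explicit and not viewer_confirmed:
        return {
            "post_id": post.post_id,
            "blurred": True,
            "prompt": "This post may contain graphic content. View anyway?",
        }
    return {"post_id": post.post_id, "blurred": False, "media_url": post.media_url}


# The gate is per-viewer: the same post stays blurred until confirmed.
post = Post(post_id="p1", media_url="https://example.invalid/p1.jpg", explicit=True)
print(render_state(post, viewer_confirmed=False))  # blurred, with prompt
print(render_state(post, viewer_confirmed=True))   # full media
```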
Content Removal
RealScroll may remove or restrict content that violates our Terms of Service or Community Guidelines.
Examples include content that involves:
Harassment or abuse
Violence or threats
Illegal activity
Spam or manipulation
Copyright infringement
We may also remove content when required by law or legal request.
Visibility Restrictions
In some cases, content is not removed but instead has its visibility limited.
This may include:
Reduced feed distribution
Informational labels
Explicit content warnings
Temporary review states
These actions help reduce the spread of harmful or misleading content while allowing certain content to remain available where appropriate.
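One hypothetical way to model the difference between removal and limited visibility is shown below; the state names and ranking weights are invented for illustration and are not RealScroll's.

```python
from enum import Enum


class Visibility(Enum):
    """Hypothetical visibility states, from least to most restricted."""
    NORMAL = "normal"                  # full feed distribution
    REDUCED = "reduced"                # shown, but ranked lower in feeds
    LABELED = "labeled"                # shown with an informational label
    EXPLICIT_GATED = "explicit_gated"  # blurred behind a warning
    UNDER_REVIEW = "under_review"      # temporarily hidden pending review
    REMOVED = "removed"                # taken down entirely


def feed_weight(state: Visibility) -> float:
    """Illustrative ranking multiplier: restricted content remains
    available but reaches fewer people; removed content reaches none."""
    weights = {
        Visibility.NORMAL: 1.0,
        Visibility.REDUCED: 0.25,
        Visibility.LABELED: 1.0,
        Visibility.EXPLICIT_GATED: 0.5,
        Visibility.UNDER_REVIEW: 0.0,
        Visibility.REMOVED: 0.0,
    }
    return weights[state]


print(feed_weight(Visibility.REDUCED))  # 0.25
```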
Account Enforcement
Accounts that repeatedly violate platform rules may face enforcement actions including:
Temporary restrictions
Posting limits
Feature limitations
Account suspension
Permanent account removal
Severe violations may result in immediate account termination.
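The escalation ladder above could, in the abstract, be expressed as a simple mapping from violation history to action, with severe violations bypassing the ladder entirely. The strike thresholds below are invented for illustration.

```python
def enforcement_action(strike_count: int, severe: bool) -> str:
    """Map a violation history to an enforcement action, per the
    ladder above. Strike thresholds are invented for illustration."""
    if severe:
        return "permanent_removal"  # severe violations skip the ladder
    ladder = [
        (1, "temporary_restriction"),
        (2, "posting_limits"),
        (3, "feature_limitations"),
        (4, "account_suspension"),
    ]
    for strikes, action in ladder:
        if strike_count <= strikes:
            return action
    return "permanent_removal"


print(enforcement_action(strike_count=2, severe=False))  # posting_limits
print(enforcement_action(strike_count=1, severe=True))   # permanent_removal
```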
Reporting Content
Users can report content through the reporting tools provided on the platform.
Reports are evaluated by our moderation systems and may trigger automated action or manual review.
Submitting false reports or abusing reporting tools may result in restrictions.
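As a rough sketch of the report flow described above, a reported post might be handled automatically when detection confidence is high, escalated to human review when reports accumulate, and otherwise queued. All thresholds and field names here are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class Report:
    post_id: str
    reason: str       # e.g. "harassment", "spam"
    reporter_id: str


def triage(reports: list[Report], auto_conf: float) -> str:
    """Route a reported post: act automatically on high-confidence
    matches, escalate to human review when several users report it,
    otherwise queue it. Thresholds are illustrative only."""
    if auto_conf >= 0.95:
        return "automated_action"
    if len(reports) >= 3 or auto_conf >= 0.5:
        return "human_review"
    return "queued"


reports = [Report("p9", "harassment", f"u{i}") for i in range(4)]
print(triage(reports, auto_conf=0.4))  # human_review
```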
Appeals
If you believe content was incorrectly moderated, you may contact our support team to request review.
Moderation decisions are made to protect the safety and integrity of the platform.
Platform Liability
RealScroll acts as a platform for user-generated content.
We do not create, edit, or control most content uploaded by users.
Users are solely responsible for ensuring that their content complies with applicable laws and platform policies.