Acceptable Use Guidelines Enforcement Report

To view the latest version of our Enforcement Report, please visit https://safety.zoom.us/enforcement

Overview

At Zoom, we are committed to bringing happiness to our users while maintaining a culture of trust, safety, and respect. Our Acceptable Use Guidelines outline the types of content and behavior that are prohibited on our platform. Through these Guidelines, along with our Terms of Service, we strive to maintain a vibrant environment and minimize the risk of harm and disruption. For more detailed information about our approach to Trust and Safety and how we enforce our Acceptable Use Guidelines, please visit our Safety Center.

Zoom’s Acceptable Use Guidelines undergo updates as needed. Our Policy team and Appeals Panel conduct regular reviews of each standard to ensure alignment with our products, our users’ needs, and the efficiency with which our analysts apply the Guidelines. Additionally, we refine these guidelines through consistent engagement with third-party stakeholders, including civil society organizations.

After review and approval, we post the update in our Acceptable Use Guidelines change log, which you can access here.

This Acceptable Use Guidelines Enforcement Report presents data regarding reports processed by our Trust & Safety team within a specific month, rather than the total number of received reports. Furthermore, the Report details actions taken in response to violations of our Acceptable Use Guidelines, such as user blocks and user strikes. This information is updated monthly.

Please note that since reporters select the “issue type”, there may occasionally be discrepancies between the reported and actual issue types. 

For information on government requests, please refer to our separate Government Requests Transparency Report, available here.

Reports Actioned and Appeals Actioned

Our Review Process

When users report violations of our Acceptable Use Guidelines or Terms of Service, our Trust and Safety team investigates as quickly as possible and implements appropriate measures when necessary.

Our tiered review process begins with a team of analysts who evaluate various types of reports and flags. Reports are systematically sorted into queues based on issue or reporter type, and to maintain comprehensive expertise, team members routinely rotate between different queues. As reports are resolved, classification and resolution information feeds into a dashboard, providing essential data that enables us to identify trends, evaluate abuse-prevention tools, and enhance our processes based on usage patterns.
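For illustration only, the sketch below (in Python) models the kind of routing described above: each incoming report is placed into a review queue keyed by its issue and reporter type, and resolutions are tallied for a dashboard. The queue names, issue types, and helper functions are hypothetical and do not describe Zoom’s internal tooling.

    from collections import Counter, defaultdict
    from dataclasses import dataclass

    @dataclass
    class Report:
        report_id: str
        issue_type: str      # reporter-selected issue type, e.g. "harassment"
        reporter_type: str   # e.g. "user" or "trusted_flagger"

    # Hypothetical routing table: (issue type, reporter type) -> queue name.
    ROUTING = {
        ("csea", "user"): "priority",
        ("harassment", "user"): "general_abuse",
        ("spam", "trusted_flagger"): "trusted_flagger",
    }

    queues = defaultdict(list)   # queue name -> pending reports
    resolutions = Counter()      # resolution label -> count, feeding a dashboard

    def route(report: Report) -> str:
        """Place a report into a queue based on its issue and reporter type."""
        queue = ROUTING.get((report.issue_type, report.reporter_type), "general_abuse")
        queues[queue].append(report)
        return queue

    def resolve(report: Report, action: str) -> None:
        """Record the classification and resolution so trends can be reviewed."""
        resolutions[action] += 1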

Analysts escalate cases that are complex or require additional consideration to higher tiers. At the highest level, the Appeals Panel consists of members serving one-year terms, bringing diverse perspectives from various backgrounds, experience levels, tenures, and departments within Zoom.

Users who have been suspended from Zoom may appeal the action here. However, appeals related to certain issues, such as those involving confirmed CSEA or references to terrorism and violent extremism, are not eligible for consideration.

Definitions

Issue Type

Issue types are further defined in our Acceptable Use Guidelines.

Special Issues

CSEA

Zoom has a zero-tolerance policy on child sexual exploitation and abuse (CSEA) material. We permanently suspend accounts that transmit, display, store, share, or promote CSEA on our platform. Our trust form includes a dedicated section for users to report CSEA, and these cases are prioritized for review by our analysts. Additionally, we have tools that help us detect such content and, after human review, report it to the National Center for Missing & Exploited Children (NCMEC) through NCMEC’s CyberTipline. In this reporting period, we sent

CyberTipline reports to NCMEC.

Violent Extremist Groups and Glorification of Violence

In partnership with the Federal Bureau of Investigation (FBI), we utilize an API to report content related to violent extremist groups and/or depicting glorification of violence directly to the FBI’s National Threat Operations Center (NTOC). In this reporting period, we escalated

cases to the FBI’s NTOC.

To learn more about how we work to address abuse on our platform, please visit Zoom’s Safety Center.

Abuse Report Action Taken

Blocked User(s): The user was deactivated and/or blocked. They are prohibited from using Zoom unless they successfully appeal the decision.

Disable App: We removed an app from the Zoom Marketplace. The app is no longer searchable, new users cannot download it, and access for all current users has been revoked.

Dismissed: No action was taken. A report may be dismissed for various reasons, such as when it is determined to be accidental or false, when there is insufficient evidence to establish a violation, or when it is identified as a false positive. The specific definitions and examples for each dismissal reason are detailed below:

  • Accidental Report by User: The report was submitted unintentionally or contained information that required correction.
    • Example:
      • The report was inadvertently submitted or included an incorrect user. 
  • Action Not Listed: None of the available actions appropriately matched the type of report or unique information submitted.
    • Example:
      • The report was submitted, but the required action was performed in a system or tool that exists beyond the scope of the trust form.
  • False Positive: The report claims a violation but is subsequently determined to be inaccurate or contains information that conflicts with available evidence.
    • Example:
      • The submitted report includes images that were incorrectly flagged by internal detection systems.
  • Incorrect Form: The report was submitted through the wrong channel or report category.
    • Example:
      • The report was submitted using our abusive content or behavior form while trying to report an account as compromised.
  • Insufficient Evidence or Information Provided: The report lacks the necessary details, so no action can be taken.
    • Example:
      • A report submission contains minimal or inadequate information, which prevents conducting a thorough investigation. 
  • Internal Test: The report was submitted for internal testing purposes.
    • Example:
      • An automated report was generated to verify that newly-deployed code functioned as expected.
  • No Violation: The reported content and/or behavior does not violate Zoom’s Terms of Service or Acceptable Use Guidelines.
    • Example:
      • A report is submitted regarding someone joining a waiting room without any indication of an Acceptable Use Guidelines violation.
  • Not Actionable: No action could be taken on the report due to technical limitations with internal tooling or reporting systems.
    • Example:
      • A report was filed against an application that was authorized by one of the meeting’s participants.*
      • Technical limitations with internal tooling or reporting systems prevented any action from being taken.
  • Spam/Phishing: The report is not Trust and Safety related and points to external websites or links.
    • Example:
      • The report was submitted with a link to a host’s external website that is not relevant to Zoom.

*Note about applications: During a meeting, a participant can allow the use of an application built for the Zoom platform (known as a Marketplace app). Some Marketplace apps will appear as a participant in the meeting. If a Marketplace app accesses meeting or webinar content, such as video, audio, chat, and/or meeting files during a meeting, it will appear in the Active Apps Notifier in the upper left corner of the meeting window. Zoom developed the Active Apps Notifier to help you make informed decisions about how you use Zoom. During meetings where a Marketplace app has access to meeting content, you may mute your microphone, turn off your video, refrain from sending chats, or leave the meeting entirely. Zoom is committed to fostering a developer-friendly environment and enabling the creation of innovative applications on its platform.

If you encounter any problems or have any concerns about a Marketplace app, please don’t hesitate to let us know through our reporting function. For more information on reporting a Marketplace app, please read here.

Duplicate: Two or more reports about the same issue from the same reporter. In this instance, we consolidate duplicate reports and take action once.

Removed from Zoom Event Lobby: One or more Zoom Event Participants were removed from the event’s lobby.

Removed Message(s) (Lobby ZE): One or more messages were removed from a Zoom Event lobby.

Remove ZE Recording: One or more recordings were removed from a Zoom Events account.

Suspend App: A Zoom Marketplace app was suspended. We removed the listing from the Marketplace, so the app is no longer searchable and new users cannot download it; current users are still allowed to use the app.

Suspended Developer: The developer was deactivated and/or blocked from our Marketplace. They are prohibited from using Zoom’s Marketplace unless they successfully appeal the decision.

Suspended Event: An event was ended or prohibited from taking place.

Suspended Meeting: A Zoom Meeting was suspended or prevented from taking place.

Suspended User from ZE: One or more Zoom Events users were deactivated and/or blocked.

Suspended Whiteboard: A user’s Zoom Whiteboard function was suspended.

OnZoom/Zoom Events Host(s) Suspended: One or more OnZoom/Zoom Events hosts were deactivated and/or blocked.

Striked User(s): The user received a strike. Strikes expire after 180 days and do not affect the user’s ability to use the platform unless they accumulate. Depending on the reason for the strike, one or two additional strikes within the same 180-day period will lead to the user being blocked.
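As a minimal, purely illustrative sketch of the strike rule described above (not Zoom’s implementation), the logic of 180-day expiry and blocking once a threshold is reached within the same window could look like the following; the function names and the per-reason threshold passed in are hypothetical.

    from datetime import datetime, timedelta

    STRIKE_LIFETIME = timedelta(days=180)  # strikes expire after 180 days

    def active_strikes(strike_dates: list[datetime], now: datetime) -> int:
        """Count strikes issued within the last 180 days (unexpired strikes)."""
        return sum(1 for issued in strike_dates if now - issued < STRIKE_LIFETIME)

    def should_block(strike_dates: list[datetime], now: datetime, threshold: int) -> bool:
        """Block once active strikes reach the threshold for the violation reason
        (per the report, a total of two or three strikes depending on the reason)."""
        return active_strikes(strike_dates, now) >= threshold

    # Example: two strikes about 100 days apart with a threshold of two -> block.
    strikes = [datetime(2024, 1, 1), datetime(2024, 4, 10)]
    assert should_block(strikes, datetime(2024, 4, 11), threshold=2)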

Appeals Action Taken

Appeal Approved: Platform access was restored after the user acknowledged and agreed to follow Zoom's Acceptable Use Guidelines and Terms of Service.

Appeal Rejected: Access remained blocked, either because the user had previously been granted an appeal but repeated the original violation, or because their violation involved a non-appealable issue.

Dismissed: No action was required or taken. An appeal may be dismissed for various reasons, including where it is determined to be accidental or false, the account is already active, or an incorrect form was submitted. The specific definitions and examples for each dismissal reason are outlined below:

  • Accidental Report by User: The appeal was submitted unintentionally or contained information that required correction.
    • Example:
      • The appeal was inadvertently submitted or assigned to an incorrect user.
  • Already Active: The appeal was submitted for an account that was already active.
  • Inactive User: The appeal was submitted for a user who had been deactivated by their account admin, unrelated to any Trust and Safety action.
  • Incorrect Form: The appeal was submitted through the wrong channel or report category.
    • Example:
      • The appeal was submitted for an account while trying to appeal a device block.
  • Internal Test: The appeal was submitted for internal testing purposes.
    • Example:
      • An automated appeal was generated to verify that newly-deployed code functioned as expected.
  • Reason Not Listed: None of the available actions appropriately matched the specific information provided.
    • Example:
      • The appeal has been submitted, but the required action was performed in a system or tool that exists beyond the scope of a Trust and Safety form.
  • Spam/Phishing: The appeal is not Trust and Safety related and points to external websites or links.
    • Example:
      • The appeal was submitted with a link to a host’s external website that is not relevant to Zoom.

Duplicate: Two or more appeals from the same user. In this instance, we consolidate duplicate appeals and take action once.

Reporter Country

This geographical information is derived from the reporter’s IP address at the time of submission. It typically reflects the reporter’s actual location, unless they are connecting through a virtual private network (VPN) or proxy server.
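As a minimal sketch of the general technique (not a description of Zoom’s systems), a country can be resolved from an IP address using a geolocation database such as MaxMind’s GeoIP2 via the geoip2 Python library; the database path below is a placeholder, and, as noted above, a VPN or proxy can make the resolved country differ from the reporter’s actual location.

    import geoip2.database   # pip install geoip2; requires a GeoIP2/GeoLite2 database file
    import geoip2.errors

    def reporter_country(ip_address: str, db_path: str = "GeoLite2-Country.mmdb") -> str | None:
        """Return the ISO country code for an IP address, or None if it cannot be resolved."""
        reader = geoip2.database.Reader(db_path)
        try:
            return reader.country(ip_address).country.iso_code
        except geoip2.errors.AddressNotFoundError:
            return None
        finally:
            reader.close()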