Microsoft has released its latest Xbox Transparency Report, detailing what the company has been doing over the last year to combat toxicity, cheating, and account issues on its platforms. The report is a fascinating insight into how things are slowly changing thanks to both automated and report-driven moderation tools.
The full document, which Microsoft published for free online, details the company's approach to moderation through several factors, such as player choice, parental controls, and proactive and reactive moderation. Broadly, players have control over what they can see, while parental controls allow a parent or guardian to decide what their child sees; these are the more obvious, user-driven forms of moderation. The other two forms, however, are the most interesting aspects of the report.
Reactive moderation is when someone reports something, and the document details how the company responds to those reported issues on a case-by-case basis. The far more impressive part of Microsoft's moderation practices is how much its proactive tools have grown in capability. Proactive moderation tools operate on the backend, ensuring that objectionable content never reaches players at all.
During the first half of 2022, these moderation tools flagged millions of pieces of problematic material, and in 4.5 million cases the offending account was banned. The vast majority of those bans came from proactive moderation tools, meaning the system caught and removed the offending content, and often the account itself, before any player ever saw it.
Many of these enforcement actions were not about uploaded content at all: 4.33 million of them targeted cheating and inauthentic accounts. Problematic content still appeared in large numbers, though, with the system flagging over a million instances of banned profanity and over 800 thousand instances of sexual content, among other categories. In short, the system is working as intended.
This is further evidenced by the fact that player reporting, also known as reactive moderation, has been on the decline. Reported issues have decreased year over year as the proactive tools have become more advanced. Overall, this means the Xbox online space is becoming safer year after year.
The full report is worth reading if analytics are your thing.
Justin van Huyssteen (@LC_Lupus)
Senior Editor, NoobFeed