Mixer released an interesting new blog post this week in which the staff address digital safety on their platform regarding harmful content. In recent months there has been a wave of content that, to be blunt, would have been banned immediately had it been found on Twitch or YouTube. But because Mixer is a newer platform with its own growing pains, a lot of content you normally wouldn't see elsewhere has been cropping up in streams. To combat this, the company has released a new set of guidelines and changes coming to the community. We have some of the changes for you below, but the key to making them work will most likely be Microsoft following the lead of the other platforms and hiring staff to actively check for this content around the clock. We'll see if any of these new changes push the riskier content off Mixer moving forward.
- We adjusted the Mixer channel page to make it conspicuous and easy to report abuse right from the player window – for example, to report inappropriate content, including violent or extremist content, or violations of Mixer's Rules of User Conduct.
- We updated our Mixer Rules of User Conduct to expressly prohibit terrorist and violent extremist content.
- We're implementing a new Streamer Review system to improve validation and monitoring of new streamers on Mixer. The initial part of this system will go into place on August 29, and streamers will need to log in with their Microsoft account to enable additional screening of new streaming accounts. There will also be a 24-hour waiting period before a streamer can start their first camera-capable stream.
- We are creating a new Toxicity Screen system to give streamers information and control over interactions on their channel. This will be coming later this year and will enable new filtering options to give you control over chat participation, including more automated moderation tools and restricted chat modes.
- We recognize that community channel moderators are the unsung heroes that help to keep our streamers' channels safe and fun. They work tirelessly behind the scenes, removing content that doesn't adhere to a channel's unique guidelines, and removing people from channels when necessary due to continued poor behavior. For these channel moderators, we're working on a new Moderator Program that provides improved tools and options to support their work, while also publicly recognizing and rewarding their invaluable contributions to the community.
- We are also investing in new software capabilities and our professional moderation team to actively review broadcasts, chat and activity on the service, which will better position us to quickly take action to remove or remedy harmful content.
- Because digital safety affects the entire livestreaming industry, we are committed to sharing and learning alongside other platforms to help increase safety and confidence for all livestreamers and viewers.