

Mixer Addresses Digital Safety On Their Platform In New Post

Mixer released an interesting new blog post this week in which the staff address digital safety on their platform with regard to harmful content. In recent months, a new wave of content has appeared that, to be blunt, would have been banned immediately if it were found on Twitch or YouTube. But because Mixer is a newer platform with its own growing pains, a lot of content you normally wouldn't see elsewhere has been cropping up in streams. To combat this, the company has released a new post outlining guidelines and changes coming to the community. We have some of the changes for you below, but the key to making them work will most likely be Microsoft doing as the other platforms do: hiring a staff to check for this content and be active around the clock on its own platform. We'll see if any of these new changes push the riskier content off Mixer moving forward.


  1. We adjusted the Mixer channel page to make it conspicuous and easy to report abuse right from the player window – for example, to report inappropriate content, including violent or extremist content, or violations of Mixer's Rules of User Conduct.
  2. We updated our Mixer Rules of User Conduct to expressly prohibit terrorist and violent extremist content.
  3. We're implementing a new Streamer Review system to improve validation and monitoring of new streamers on Mixer. The initial part of this system will go into place on August 29, and streamers will need to log in with their Microsoft account to enable additional screening of new streaming accounts. There will also be a 24-hour waiting period before a streamer can start their first camera-capable stream.
  4. We are creating a new Toxicity Screen system to give streamers information and control over interactions on their channel. This will be coming later this year and will enable new filtering options to give you control over chat participation, including more automated moderation tools and restricted chat modes.
  5. We recognize that community channel moderators are the unsung heroes that help to keep our streamers' channels safe and fun. They work tirelessly behind the scenes, removing content that doesn't adhere to a channel's unique guidelines, and removing people from channels when necessary due to continued poor behavior. For these channel moderators, we're working on a new Moderator Program that provides improved tools and options to support their work, while also publicly recognizing and rewarding their invaluable contributions to the community.
  6. We are also investing in new software capabilities and our professional moderation team to actively review broadcasts, chat and activity on the service, which will better position us to quickly take action to remove or remedy harmful content.
  7. Because digital safety affects the entire livestreaming industry, we are committed to sharing and learning alongside other platforms to help increase safety and confidence for all livestreamers and viewers.



About Gavin Sheehan

Gavin is the current Games Editor for Bleeding Cool. He has been a lifelong geek who can chat with you about comics, television, video games, and even pro wrestling. He can also teach you how to play Star Trek chess, be your Mercy on Overwatch, recommend random cool music, and go rogue in D&D. He also enjoys hundreds of other geeky things that can't be covered in a single paragraph. Follow @TheGavinSheehan on Facebook, Twitter, Instagram, and Vero for random pictures and musings.