Read the text below.
YouTube is planning to launch a new feature that will ask users to reconsider posting potentially offensive comments. With the new feature, YouTube’s system will scan comments before they are posted. If the system flags a comment as offensive, a pop-up will prompt the user to think twice before posting it. The pop-up will include a link to YouTube’s community guidelines and a button that lets the user edit the comment.
YouTube clarified that the pop-up will not appear for all offensive comments, as the system may not be able to detect every single one. However, harmful comments that end up getting posted will be removed by YouTube if they are found to have violated the website’s guidelines.
Hate speech has been one of YouTube’s biggest problems. The company said that the number of offensive comments removed from videos daily has increased by 4600% since early 2019. In addition, YouTube took down over 54,000 channels between July and September last year due to hate speech. In a statement, YouTube said that this was the largest number of hate speech-related terminations it had ever carried out in a single quarter.
YouTube’s new update will also give content creators a better comment filtering system. It will hide potentially offensive comments so that creators can choose not to read them.
Aside from hate speech, the company also plans to combat other issues that affect creators, such as discrimination and content demonetization. Some YouTube uploaders have complained that the platform’s system discriminates against their videos by hiding them from viewers or preventing them from running paid ads. The company plans to address the issue by monitoring which uploaders are experiencing these problems. In line with this, content makers will be given the option to voluntarily provide personal information, such as their race and gender.