Intel to fight hate speech with its new AI-powered tool called Bleep; responds to criticism

Bleep uses AI to detect offensive words in the “incoming audio” and then “redact” them according to user preferences. Here is everything we know about Intel Bleep, the company’s AI-powered tool to take on hate speech.

Semiconductor giant Intel recently showcased a new AI-powered tool called Bleep during its GDC presentation, signaling that the company plans to take on hate speech in online gaming. Judging by the presentation, the focus is on hate speech in voice chat: Bleep uses AI to detect offending words in the “incoming audio” and then “redact” them as per user preferences. However, soon after the initial showcase, a large part of the Internet criticized the software’s implementation, prompting the company to issue a response outlining the thinking behind it. Here is everything we know so far about Intel Bleep.

Intel Bleep showcased to take on hate speech in multiplayer gaming

First, let’s talk about the actual implementation of Intel Bleep that invited the criticism. As shown in the demo, Bleep presents a screen full of sliders that let users control what the software filters. The categories include “Aggression”, “LGBTQ+ Hate”, “Misogyny”, “Racism and Xenophobia”, and “Ableism and Body-Shaming”, along with “Name-calling”, “Swearing”, “Sexually Explicit Language”, the “N-word”, and even “White nationalism”. The interface makes it clear that Intel wants users to have full control and be able to customize the experience: each slider offers the range options None, Some, Most, and All.
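To make the slider concept concrete, here is a minimal, purely hypothetical sketch of how per-category filter levels like those shown in the demo might be represented. Intel has not published how Bleep works internally; the category names below come from the demo UI, while FilterLevel, should_redact, and the confidence thresholds are invented for illustration only.

```python
# Hypothetical sketch (not Intel's actual implementation): per-category
# filter levels like the sliders shown in the Bleep demo.
from enum import Enum

class FilterLevel(Enum):
    NONE = 0   # bleep nothing in this category
    SOME = 1
    MOST = 2
    ALL = 3    # bleep everything detected in this category

# Example user preferences, one slider per category from the demo UI.
preferences = {
    "Aggression": FilterLevel.SOME,
    "LGBTQ+ Hate": FilterLevel.ALL,
    "Misogyny": FilterLevel.ALL,
    "Racism and Xenophobia": FilterLevel.ALL,
    "Ableism and Body-Shaming": FilterLevel.MOST,
    "Name-calling": FilterLevel.SOME,
    "Swearing": FilterLevel.NONE,
}

def should_redact(category: str, detection_confidence: float) -> bool:
    """Decide whether a detected utterance gets bleeped, based on the
    user's slider for that category. Thresholds are made up for illustration."""
    level = preferences.get(category, FilterLevel.NONE)
    thresholds = {
        FilterLevel.NONE: 1.1,   # never redact
        FilterLevel.SOME: 0.9,   # only very clear-cut detections
        FilterLevel.MOST: 0.6,
        FilterLevel.ALL: 0.0,    # redact anything flagged
    }
    return detection_confidence >= thresholds[level]
```

In a setup like this, moving a slider simply changes how aggressive the redaction is for that one category, which is the kind of “nuanced control” Intel says it is aiming for.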

The criticism of the sliders and Intel’s response

As you may have guessed, the issue lies in the slider mechanism: gamers don’t want to hear “Some”, “Most”, or “All” of the toxic hate speech during their gaming sessions. It is worth noting that Intel first talked about this concept back at GDC 2019 and is working with Spirit AI, an AI moderation specialist, to create the Bleep software.

After the criticism, Intel tried to explain the vision behind the software’s design and the slider mechanism. According to a report from Polygon, Marcus Kennedy, General Manager at Intel Gaming, said the company wants to put “nuanced control in the hands” of users. The sliders are meant to let gamers tweak the experience based on context or situation; some amount of “shit talk”, for instance, may be acceptable among friends. Intel stated that drawing the line between the different slider settings is “complicated”, and also noted that the software and the technology are not final, so we could see changes in the final version.