Analyze Toxicity with Perspective API

Author: Jigsaw (https://jigsaw.google.com)

Description: We’ve partnered with the Jigsaw team to build the Analyze Toxicity extension, which uses machine learning to classify the level of toxicity, threat, and profanity of user comments in Cloud Firestore. The extension is powered by Perspective API (https://perspectiveapi.com), which is trusted by platforms like The New York Times and Reddit to promote healthy dialogue online.
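
For context on what the extension does behind the scenes, the sketch below shows a direct call to the public Perspective API analyze endpoint, requesting the same three attributes (toxicity, threat, profanity) the extension reports. This is a minimal illustration based on the Perspective API documentation, not the extension's own source; the API key is a placeholder you would supply yourself.

```ts
// Minimal sketch: ask Perspective API to score a comment for the three
// attributes this extension covers. Requires a Perspective API key.
const API_KEY = "YOUR_PERSPECTIVE_API_KEY"; // placeholder
const ENDPOINT =
  `https://commentanalyzer.googleapis.com/v1alpha1/comments:analyze?key=${API_KEY}`;

interface AttributeScore {
  summaryScore: { value: number; type: string };
}

interface AnalyzeResponse {
  attributeScores: Record<string, AttributeScore>;
}

async function analyzeComment(text: string): Promise<AnalyzeResponse> {
  const res = await fetch(ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      comment: { text },
      // The attributes the Analyze Toxicity extension reports on.
      requestedAttributes: { TOXICITY: {}, THREAT: {}, PROFANITY: {} },
    }),
  });
  if (!res.ok) throw new Error(`Perspective API error: ${res.status}`);
  return (await res.json()) as AnalyzeResponse;
}

// Example: log the toxicity probability (0..1) for a comment.
analyzeComment("You are a great person!").then((result) => {
  console.log(result.attributeScores.TOXICITY.summaryScore.value);
});
```

When installed, the extension performs this analysis automatically for documents written to the Firestore collection you configure, so client code never needs to call Perspective API directly.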


🧩 Install this extension

To install this extension, visit the repository conversationai/firestore-perspective-toxicity.