
Results

Your input text and the model's predictions will appear below.

Text | Identity attack | Insult | Obscene | Severe toxicity | Sexually explicit | Threat | Toxic

Input

Type some text to evaluate its toxicity.
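A minimal sketch of this single-phrase path, assuming the demo is backed by the TensorFlow.js toxicity classifier (@tensorflow-models/toxicity); the package choice, the 0.9 confidence threshold, and the evaluatePhrase helper are illustrative assumptions, not details taken from this page.

```ts
import * as toxicity from '@tensorflow-models/toxicity';

const threshold = 0.9; // assumed minimum confidence for a label to count as a match

async function evaluatePhrase(phrase: string): Promise<void> {
  // Passing [] loads all seven labels shown in the results table above.
  const model = await toxicity.load(threshold, []);
  const predictions = await model.classify([phrase]);
  for (const prediction of predictions) {
    const { match, probabilities } = prediction.results[0];
    // probabilities[1] is the score for the label applying to the phrase.
    console.log(`${prediction.label}: ${probabilities[1].toFixed(3)} (match: ${match})`);
  }
}

evaluatePhrase('type some text here');
```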


Evaluate the toxicity of multiple phrases

Upload a plain-text file with one phrase per line; the model will evaluate each phrase separately.
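A sketch of this bulk path under the same assumption (TensorFlow.js toxicity classifier); the evaluateFile helper and the use of the browser File API are illustrative only.

```ts
import * as toxicity from '@tensorflow-models/toxicity';

async function evaluateFile(file: File): Promise<void> {
  // Read the uploaded file and treat each non-empty line as one phrase.
  const text = await file.text();
  const phrases = text
    .split('\n')
    .map((line) => line.trim())
    .filter((line) => line.length > 0);

  const model = await toxicity.load(0.9, []);
  const predictions = await model.classify(phrases);

  // predictions has one entry per label; results has one entry per phrase.
  for (const { label, results } of predictions) {
    results.forEach((result, i) => {
      console.log(`"${phrases[i]}" | ${label}: match=${result.match}`);
    });
  }
}
```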