Comment Toxicity Classifier | Built with ❤️ by Prasad

This is a demo of the toxicity classifier model, which classifies text according to whether it exhibits offensive attributes (e.g. profanity or sexual explicitness).
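As a rough sketch of how such a classifier might be invoked programmatically, the snippet below assumes the demo is backed by the TensorFlow.js toxicity model (`@tensorflow-models/toxicity`); the threshold value and the example sentence are illustrative only, not taken from this demo's source.

```ts
import * as toxicity from '@tensorflow-models/toxicity';

// Minimum prediction confidence: below this, an attribute's `match`
// is reported as null ("unsure") rather than true/false.
const threshold = 0.9;

async function classify(sentences: string[]): Promise<void> {
  // Load the pretrained model; with no label list it reports all
  // attributes (toxicity, insult, obscene, sexual_explicit, threat, ...).
  const model = await toxicity.load(threshold);

  // Each prediction covers one attribute across all input sentences.
  const predictions = await model.classify(sentences);
  for (const { label, results } of predictions) {
    results.forEach((r, i) => {
      console.log(`"${sentences[i]}" -> ${label}: match=${r.match}`);
    });
  }
}

classify(['You are a wonderful person.']);
```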

Enter text below and click 'Classify' to add it to the table.