NLP_ToxicComment

This project was conducted by two engineering students to validate an NLP course.

Toxic Comment Classification

This project is based on a Kaggle competition. Its purpose is to train a model that classifies a sentence into several categories (Toxic, Obscene, Threat, ...). It features a notebook in which the solution pipeline is developed.
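The actual pipeline lives in the notebook; as a rough illustration of the task, a minimal multi-label baseline might look like the sketch below. The file name train.csv, the TF-IDF plus logistic-regression approach, and the six-label set are assumptions based on the Kaggle Toxic Comment Classification Challenge, not taken from this repository.

```python
# Minimal multi-label baseline sketch (assumed setup, not the repo's pipeline):
# Kaggle "Toxic Comment Classification Challenge" train.csv with a
# comment_text column and six binary label columns.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsRestClassifier

LABELS = ["toxic", "severe_toxic", "obscene", "threat", "insult", "identity_hate"]

df = pd.read_csv("train.csv")  # hypothetical path to the Kaggle training file
X_train, X_val, y_train, y_val = train_test_split(
    df["comment_text"], df[LABELS], test_size=0.2, random_state=42
)

# TF-IDF features + one-vs-rest logistic regression: one binary classifier
# per label, since a comment can belong to several categories at once.
vectorizer = TfidfVectorizer(max_features=50_000, stop_words="english")
Xtr = vectorizer.fit_transform(X_train)
Xva = vectorizer.transform(X_val)

clf = OneVsRestClassifier(LogisticRegression(max_iter=1000))
clf.fit(Xtr, y_train)

# Per-label probabilities, scored with the competition's metric (mean ROC AUC).
probs = clf.predict_proba(Xva)
print("Mean ROC AUC:", roc_auc_score(y_val, probs, average="macro"))
```

One-vs-rest is a natural fit here because the labels are not mutually exclusive: a single comment can be both toxic and obscene, so each label gets its own independent binary decision.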
