This is a one-stop guide to the core concepts you need to know in Natural Language Processing.
Contents:
- Classical Machine Learning approach for NLP
- Text Preprocessing (Level 1) - Tokenization, Lemmatization, Stop Words, POS Tagging
- Text Preprocessing (Level 2) - Bag of Words, TF-IDF, Unigrams, Bigrams, n-grams
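The Level 1 and Level 2 steps above can be sketched in plain Python; a minimal sketch of tokenization, Bag of Words, and TF-IDF over a toy corpus (the corpus and all names here are illustrative assumptions, not a library API):

```python
import math
from collections import Counter

# Toy corpus (illustrative); a real pipeline would also lemmatize
# and remove stop words as part of the Level 1 preprocessing.
corpus = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "cats and dogs are friends",
]

def tokenize(text):
    # Level 1: the simplest possible tokenizer, whitespace splitting.
    return text.lower().split()

docs = [tokenize(d) for d in corpus]
vocab = sorted(set(w for d in docs for w in d))

def bag_of_words(doc):
    # Level 2: raw term counts over a fixed vocabulary (unigrams).
    counts = Counter(doc)
    return [counts[w] for w in vocab]

def tf_idf(doc):
    # tf = count / doc length; idf = log(N / df) over N documents.
    counts = Counter(doc)
    n_docs = len(docs)
    vec = []
    for w in vocab:
        tf = counts[w] / len(doc)
        df = sum(1 for d in docs if w in d)
        vec.append(tf * math.log(n_docs / df))
    return vec

bow = bag_of_words(docs[0])
tfidf = tf_idf(docs[0])
```

Words that appear in every document get an IDF of zero, which is exactly why TF-IDF downweights common terms that Bag of Words counts blindly.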
- Deep Learning for NLP
Using Word Embeddings:
- Understanding Recurrent Neural Networks, LSTM, GRU.
- Text Preprocessing - Gensim, Word2Vec, AvgWord2Vec.
- GloVe, fastText.
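The AvgWord2Vec idea above reduces to averaging word vectors into one fixed-size sentence vector. A minimal sketch, assuming a tiny hand-written embedding table standing in for a trained Word2Vec model (in practice you would load real vectors with Gensim; all values here are made up):

```python
# Toy 3-dimensional "embeddings" standing in for a trained
# Word2Vec model; the numbers are purely illustrative.
embeddings = {
    "movie": [0.9, 0.1, 0.0],
    "film":  [0.8, 0.2, 0.1],
    "great": [0.1, 0.9, 0.3],
    "bad":   [0.1, 0.1, 0.9],
}

def avg_word2vec(tokens, emb):
    # AvgWord2Vec: average the vectors of the words we have
    # embeddings for, yielding one fixed-size sentence vector.
    vecs = [emb[t] for t in tokens if t in emb]
    if not vecs:
        return [0.0] * len(next(iter(emb.values())))
    dim = len(vecs[0])
    return [sum(v[i] for v in vecs) / len(vecs) for i in range(dim)]

sentence_vec = avg_word2vec("a great movie".split(), embeddings)
```

Out-of-vocabulary words ("a" here) are simply skipped, which is the usual behavior when averaging Word2Vec vectors.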
Using Transformers:
- Bidirectional LSTM RNNs, Encoders and Decoders, Attention Models.
- Transformers.
- BERT.
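The core of the attention models and Transformers listed above is scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. A minimal pure-Python sketch with toy 2-dimensional queries, keys, and values (all numbers are illustrative assumptions):

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of scores.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V,
    # the operation at the heart of every Transformer layer.
    d_k = len(K[0])
    out = []
    for q in Q:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        weights = softmax(scores)
        # Each output row is a weighted average of the value rows.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# Two "tokens", each with a 2-dimensional representation (toy values).
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0]]
attn = scaled_dot_product_attention(Q, K, V)
```

Because the softmax weights sum to 1, each output row is a convex combination of the value rows: each query attends most to the key it aligns with, which is the mechanism BERT stacks in every layer.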