Cornerstone: seq2seq with attention (using a bidirectional LSTM)
- Encoder: multi-layer bidirectional RNN with LSTM cells
- Decoder: built with the Bahdanau (additive) attention model
- Inference loop was custom-built by hand
- Training corpus: research papers collected with arxivscraper
- Embeddings: ConceptNet Numberbatch (CN), similar to GloVe but arguably better (https://github.com/commonsense/conceptnet-numberbatch)
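To make the decoder bullet concrete, here is a minimal NumPy sketch of the core Bahdanau (additive) attention step: scoring each encoder state against the previous decoder state, softmaxing over time, and forming the context vector. The function name, parameter names, and shapes are illustrative assumptions, not the project's actual API.

```python
import numpy as np

def bahdanau_attention(decoder_state, encoder_outputs, W1, W2, v):
    """Additive (Bahdanau) attention -- a hypothetical minimal sketch.

    decoder_state:   (d,)    previous decoder hidden state s_{t-1}
    encoder_outputs: (T, h)  encoder hidden states (bidirectional -> h = 2 * units)
    W1: (h, a), W2: (d, a), v: (a,)  learned projections (a = attention dim)
    Returns (context, weights): context (h,), weights (T,) summing to 1.
    """
    # score_j = v . tanh(W1 h_j + W2 s_{t-1}) for every encoder time step j
    scores = np.tanh(encoder_outputs @ W1 + decoder_state @ W2) @ v  # (T,)
    # softmax over the T encoder time steps (shifted for numerical stability)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # context vector: attention-weighted sum of encoder outputs
    context = weights @ encoder_outputs  # (h,)
    return context, weights
```

At each decoding step the context vector is concatenated with the decoder input (or state) before producing the next token, which is what a hand-rolled inference loop would iterate.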