Train a bidirectional or normal LSTM recurrent neural network to generate text on a free GPU using any dataset. Just upload your text file and click run!
Internally, the code looks very similar; interestingly, even the same errors pop up when stress-testing it.
However, I can see that you added a Python notebook and some configuration options (which help quite a lot).
Is this more of a fork? What's the difference between this repository and textgenrnn?