
mos's People

Contributors

billy-inn, zihangdai


mos's Issues

questions about bptt

Hi author, I'm a newbie in NLP. I don't understand the use of "bptt" (args.bptt = 70) and some expressions related to it, such as the following:

bptt = args.bptt if np.random.random() < 0.95 else args.bptt / 2.
seq_len = max(5, int(np.random.normal(bptt, 5)))
seq_len = min(seq_len, args.bptt + args.max_seq_len_delta)
optimizer.param_groups[0]['lr'] = lr2 * seq_len / args.bptt

To my understanding, seq_len is the number of time steps the RNN is unrolled for. So what is the relationship between "bptt" and "seq_len"? Could you please help explain? Thanks
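
For readers hitting the same question: seq_len is the number of unrolled time steps actually used for a given batch, while bptt is the nominal window length it is sampled around. Below is a self-contained sketch of one reading of the quoted lines (variable names and the 40-step delta are illustrative, mirroring the snippet rather than the repo's exact code):

import numpy as np

def sample_seq_len(bptt=70, max_seq_len_delta=40, rng=np.random):
    # 95% of batches use a window around bptt; the other 5% use roughly
    # half that, so batch boundaries do not always fall at the same offsets.
    base = bptt if rng.random() < 0.95 else bptt / 2.
    # Gaussian jitter around the chosen base, clipped to [5, bptt + delta].
    seq_len = max(5, int(rng.normal(base, 5)))
    return min(seq_len, bptt + max_seq_len_delta)

seq_len = sample_seq_len()
# The learning rate is then rescaled in proportion to the window length,
# so shorter windows (loss summed over fewer steps) take smaller steps:
#   optimizer.param_groups[0]['lr'] = base_lr * seq_len / bptt
print(seq_len, "steps; lr scale =", seq_len / 70)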

TypeError: iteration over a 0-d tensor

Hello! I'm running into an issue with training the Penn Treebank model:
python main.py --data data/penn --dropouti 0.4 --dropoutl 0.29 --dropouth 0.225 --seed 28 --batch_size 12 --lr 20.0 --epoch 1000 --nhid 960 --nhidlast 620 --emsize 280 --n_experts 15 --save PTB --single_gpu

Traceback (most recent call last):
  File "main.py", line 260, in <module>
    train()
  File "main.py", line 202, in train
    hidden[s_id] = repackage_hidden(hidden[s_id])
  File "/mos/utils.py", line 12, in repackage_hidden
    return tuple(repackage_hidden(v) for v in h)
  File "/mos/utils.py", line 12, in <genexpr>
    return tuple(repackage_hidden(v) for v in h)
  File "/mos/utils.py", line 12, in repackage_hidden
    return tuple(repackage_hidden(v) for v in h)
  File "/mos/utils.py", line 12, in <genexpr>
    return tuple(repackage_hidden(v) for v in h)
  File "/mos/utils.py", line 12, in repackage_hidden
    return tuple(repackage_hidden(v) for v in h)
  File "/mos/utils.py", line 12, in <genexpr>
    return tuple(repackage_hidden(v) for v in h)
  File "/mos/utils.py", line 12, in repackage_hidden
    return tuple(repackage_hidden(v) for v in h)
  File "/mos/utils.py", line 12, in <genexpr>
    return tuple(repackage_hidden(v) for v in h)
  File "/mos/utils.py", line 12, in repackage_hidden
    return tuple(repackage_hidden(v) for v in h)
  File "/mos/utils.py", line 12, in <genexpr>
    return tuple(repackage_hidden(v) for v in h)
  File "/mos/utils.py", line 12, in repackage_hidden
    return tuple(repackage_hidden(v) for v in h)
  File "/usr/local/lib/python3.5/dist-packages/torch/tensor.py", line 360, in __iter__
    raise TypeError('iteration over a 0-d tensor')
TypeError: iteration over a 0-d tensor
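
This error is commonly seen when the code is run on PyTorch 0.4 or newer, where Tensor and Variable were merged: a check like type(h) == Variable in repackage_hidden no longer matches, so the function keeps iterating into the tensor until it hits a 0-dimensional element. A minimal sketch of the usual workaround, assuming utils.py follows the AWD-LSTM pattern (otherwise, pinning PyTorch to the version listed in the README avoids the issue):

import torch

def repackage_hidden(h):
    # Detach hidden states from their history; handles both a single tensor
    # and (nested) tuples of tensors such as (h, c) per layer.
    if isinstance(h, torch.Tensor):
        return h.detach()
    return tuple(repackage_hidden(v) for v in h)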

Pre-trained model?

Hi! Thanks for sharing this code base! Do you have a pre-trained model that we could use? We want to test the idea of incorporating a language model into our project first, so it would be great if we could utilize a pre-trained model for this purpose rather than train a new model from scratch.

Thanks so much!

Runtime Warning: RNN module weights are not part of single contiguous chunk of memory.

I am trying to train the model under Windows 10 with CUDA 9.2 (NVIDIA 1070 GPU).

PyTorch version 1.0.0

I get the following error when trying to train on wikitext-2:

C:\Users\vlad\Anaconda3\envs\py36\python.exe C:/Users/vlad/Documents/GitHub/mos/main.py --epochs 1000 --data data/wikitext-2 --save WT2 --dropouth 0.2 --seed 1882 --n_experts 15 --nhid 1150 --nhidlast 650 --emsize 300 --batch_size 15 --lr 15.0 --dropoutl 0.29 --small_batch_size 5 --max_seq_len_delta 20 --dropouti 0.55 --single_gpu
Experiment dir : WT2-20181223-005932
torch.Size([139241, 15])
torch.Size([21764, 10])
torch.Size([245569, 1])
Applying weight drop of 0.5 to weight_hh_l0
Applying weight drop of 0.5 to weight_hh_l0
Applying weight drop of 0.5 to weight_hh_l0
param size: 34909528
Args: Namespace(alpha=2, batch_size=15, beta=1, bptt=70, clip=0.25, continue_train=False, cuda=True, data='data/wikitext-2', dropout=0.4, dropoute=0.1, dropouth=0.2, dropouti=0.55, dropoutl=0.29, emsize=300, epochs=1000, log_interval=200, lr=15.0, max_seq_len_delta=20, model='LSTM', n_experts=15, nhid=1150, nhidlast=650, nlayers=3, nonmono=5, save='WT2-20181223-005932', seed=1882, single_gpu=True, small_batch_size=5, tied=True, wdecay=1.2e-06, wdrop=0.5)
Model total parameters: 34909528
C:\Users\vlad\Anaconda3\envs\py36\lib\site-packages\torch\nn\modules\rnn.py:179: RuntimeWarning: RNN module weights are not part of single contiguous chunk of memory. This means they need to be compacted at every call, possibly greatly increasing memory usage. To compact weights again call flatten_parameters().
  self.dropout, self.training, self.bidirectional, self.batch_first)
Traceback (most recent call last):
  File "C:/Users/vlad/Documents/GitHub/mos/main.py", line 261, in <module>
    train()
  File "C:/Users/vlad/Documents/GitHub/mos/main.py", line 205, in train
    log_prob, hidden[s_id], rnn_hs, dropped_rnn_hs = parallel_model(cur_data, hidden[s_id], return_h=True)
  File "C:\Users\vlad\Anaconda3\envs\py36\lib\site-packages\torch\nn\modules\module.py", line 489, in __call__
    result = self.forward(*input, **kwargs)
  File "C:\Users\vlad\Documents\GitHub\mos\model.py", line 84, in forward
    raw_output, new_h = rnn(raw_output, hidden[l])
  File "C:\Users\vlad\Anaconda3\envs\py36\lib\site-packages\torch\nn\modules\module.py", line 489, in __call__
    result = self.forward(*input, **kwargs)
  File "C:\Users\vlad\Documents\GitHub\mos\weight_drop.py", line 47, in forward
    return self.module.forward(*args)
  File "C:\Users\vlad\Anaconda3\envs\py36\lib\site-packages\torch\nn\modules\rnn.py", line 179, in forward
    self.dropout, self.training, self.bidirectional, self.batch_first)
RuntimeError: shape '[5290000, 1]' is invalid for input of size 4600

Performance discrepancy for torch 0.2.0 and 0.4.0

Hi Mr. Dai,

It seems the performance discrepancy has led to the slightly worse Penn Treebank numbers shown on the README page; I am able to reproduce that Penn Treebank result. However, for the WT2 benchmark, before finetuning I get 65.66/62.94, after finetuning 64.41/61.77, and with the default dynamic evaluation setting I get 43.34/41.49. I also observed a similar slight performance decrease (about 0.6) between PyTorch 0.4.0 and 0.2.0 in a follow-up work of MoS (ChengyueGongR/Frequency-Agnostic#2). I am wondering whether these results are in line with your benchmark on PyTorch 0.4.0. Thank you for your time!

NotImplementedError while running the testing code in model.py

I tried to run the script model.py, but a NotImplementedError occurs.
param size: 4426
Traceback (most recent call last):
  File "/Users/TONY/Downloads/mos-master/model.py", line 131, in <module>
    model(input, hidden)
  File "/Users/TONY/anaconda/lib/python3.6/site-packages/torch/nn/modules/module.py", line 491, in __call__
    result = self.forward(*input, **kwargs)
  File "/Users/TONY/Downloads/mos-master/model.py", line 71, in forward
    emb = embedded_dropout(self.encoder, input, dropout=self.dropoute if self.training else 0)
  File "/Users/TONY/Downloads/mos-master/embed_regularize.py", line 19, in embedded_dropout
    X = embed._backend.Embedding.apply(words, masked_embed_weight,
  File "/Users/TONY/anaconda/lib/python3.6/site-packages/torch/nn/backends/backend.py", line 10, in __getattr__
    raise NotImplementedError
NotImplementedError
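
This usually comes up on newer PyTorch versions, where the private embed._backend.Embedding hook used by embed_regularize.py was removed. A hedged sketch of the workaround commonly applied to AWD-LSTM-derived code, switching to the public functional API (a simplified stand-in for embedded_dropout, not the repo's exact function):

import torch
import torch.nn as nn
import torch.nn.functional as F

def embedded_dropout(embed: nn.Embedding, words: torch.Tensor, dropout: float = 0.1):
    # Drop entire embedding rows (word types) with probability `dropout`,
    # rescaling the surviving rows so the expected value is unchanged.
    if dropout:
        mask = embed.weight.new_empty((embed.weight.size(0), 1)).bernoulli_(1 - dropout) / (1 - dropout)
        masked_embed_weight = mask.expand_as(embed.weight) * embed.weight
    else:
        masked_embed_weight = embed.weight
    padding_idx = embed.padding_idx if embed.padding_idx is not None else -1
    # Public replacement for embed._backend.Embedding.apply(...)
    return F.embedding(words, masked_embed_weight, padding_idx, embed.max_norm,
                       embed.norm_type, embed.scale_grad_by_freq, embed.sparse)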

cannot reproduce result?

Hi,

When I try to reproduce the results, I find that the model converges after ~100 epochs and the valid ppl is 65.32, which is much higher than the published results. The only thing I changed is the number of experts (which I increased from 15 to 16), but I don't think that can explain the large difference between my result and yours.

My Python version is 3.6.7 and my PyTorch version is 0.4.0.

Regarding the training log: I trained for 300 epochs, but the last 200 epochs produce almost identical output to the 100th, so I didn't include them.

CUDA_VISIBLE_DEVICES=1 python3 main.py --data data/penn --dropouti 0.4 --dropoutl 0.29 --dropouth 0.225 --seed 28 --batch_size 12 --lr 20.0 --epoch 1000 --nhid 960 --nhidlast 620 --emsize 280 --n_experts 16 --save PTB --single_gpu

Experiment dir : PTB-20190520-055046
torch.Size([77465, 12])
torch.Size([7376, 10])
torch.Size([82430, 1])
ModuleList(
(0): LSTM(280, 960)
(1): LSTM(960, 960)
(2): LSTM(960, 620)
)
param size: 21675120
Args: Namespace(alpha=2, batch_size=12, beta=1, bptt=70, clip=0.25, continue_train=False, cuda=True, data='data/penn', dropout=0.4, dropoute=0.1, dropouth=0.225, dropouti=0.4, dropoutl=0.29, emsize=280, epochs=1000, log_interval=200, lr=20.0, max_seq_len_delta=40, model='LSTM', n_experts=16, nhid=960, nhidlast=620, nlayers=3, nonmono=5, save='PTB-20190520-055046', seed=28, single_gpu=True, small_batch_size=12, tied=True, wdecay=1.2e-06, wdrop=0.5)
Model total parameters: 21675120

| epoch 1 | 200/ 1106 batches | lr 20.00 | ms/batch 106.34 | loss 6.92 | ppl 1015.01
| epoch 1 | 400/ 1106 batches | lr 20.00 | ms/batch 105.66 | loss 6.49 | ppl 657.92
| epoch 1 | 600/ 1106 batches | lr 20.00 | ms/batch 106.05 | loss 6.18 | ppl 481.95
| epoch 1 | 800/ 1106 batches | lr 20.00 | ms/batch 104.37 | loss 6.01 | ppl 405.54
| epoch 1 | 1000/ 1106 batches | lr 20.00 | ms/batch 106.01 | loss 5.85 | ppl 347.65

| end of epoch 1 | time: 124.73s | valid loss 5.59 | valid ppl 267.77

Saving Normal!
| epoch 2 | 200/ 1106 batches | lr 20.00 | ms/batch 106.34 | loss 5.67 | ppl 290.82
| epoch 2 | 400/ 1106 batches | lr 20.00 | ms/batch 109.11 | loss 5.53 | ppl 252.54
| epoch 2 | 600/ 1106 batches | lr 20.00 | ms/batch 107.30 | loss 5.43 | ppl 229.14
| epoch 2 | 800/ 1106 batches | lr 20.00 | ms/batch 107.73 | loss 5.44 | ppl 230.31
| epoch 2 | 1000/ 1106 batches | lr 20.00 | ms/batch 106.33 | loss 5.38 | ppl 216.74

| end of epoch 2 | time: 126.23s | valid loss 5.21 | valid ppl 183.90

Saving Normal!
| epoch 3 | 200/ 1106 batches | lr 20.00 | ms/batch 106.41 | loss 5.32 | ppl 203.51
| epoch 3 | 400/ 1106 batches | lr 20.00 | ms/batch 106.44 | loss 5.21 | ppl 183.52
| epoch 3 | 600/ 1106 batches | lr 20.00 | ms/batch 109.32 | loss 5.17 | ppl 175.25
| epoch 3 | 800/ 1106 batches | lr 20.00 | ms/batch 108.57 | loss 5.19 | ppl 178.93
| epoch 3 | 1000/ 1106 batches | lr 20.00 | ms/batch 109.00 | loss 5.16 | ppl 174.74

| end of epoch 3 | time: 126.55s | valid loss 5.03 | valid ppl 152.53

Saving Normal!
| epoch 4 | 200/ 1106 batches | lr 20.00 | ms/batch 106.13 | loss 5.14 | ppl 169.92
| epoch 4 | 400/ 1106 batches | lr 20.00 | ms/batch 107.05 | loss 5.03 | ppl 152.43
| epoch 4 | 600/ 1106 batches | lr 20.00 | ms/batch 108.75 | loss 4.99 | ppl 147.03
| epoch 4 | 800/ 1106 batches | lr 20.00 | ms/batch 108.35 | loss 5.00 | ppl 148.85
| epoch 4 | 1000/ 1106 batches | lr 20.00 | ms/batch 107.50 | loss 5.01 | ppl 149.48

| end of epoch 4 | time: 126.22s | valid loss 4.89 | valid ppl 132.47

Saving Normal!
| epoch 5 | 200/ 1106 batches | lr 20.00 | ms/batch 108.48 | loss 4.98 | ppl 145.85
| epoch 5 | 400/ 1106 batches | lr 20.00 | ms/batch 108.12 | loss 4.87 | ppl 130.26
| epoch 5 | 600/ 1106 batches | lr 20.00 | ms/batch 106.91 | loss 4.84 | ppl 126.39
| epoch 5 | 800/ 1106 batches | lr 20.00 | ms/batch 107.36 | loss 4.87 | ppl 130.46
| epoch 5 | 1000/ 1106 batches | lr 20.00 | ms/batch 108.77 | loss 4.87 | ppl 130.37

| end of epoch 5 | time: 126.82s | valid loss 4.80 | valid ppl 121.11

Saving Normal!
| epoch 6 | 200/ 1106 batches | lr 20.00 | ms/batch 110.61 | loss 4.87 | ppl 129.69
| epoch 6 | 400/ 1106 batches | lr 20.00 | ms/batch 107.66 | loss 4.74 | ppl 114.68
| epoch 6 | 600/ 1106 batches | lr 20.00 | ms/batch 107.44 | loss 4.73 | ppl 113.10
| epoch 6 | 800/ 1106 batches | lr 20.00 | ms/batch 106.40 | loss 4.75 | ppl 115.72
| epoch 6 | 1000/ 1106 batches | lr 20.00 | ms/batch 107.34 | loss 4.77 | ppl 118.23

| end of epoch 6 | time: 127.32s | valid loss 4.70 | valid ppl 110.19

Saving Normal!
| epoch 7 | 200/ 1106 batches | lr 20.00 | ms/batch 108.23 | loss 4.76 | ppl 116.90
| epoch 7 | 400/ 1106 batches | lr 20.00 | ms/batch 108.03 | loss 4.65 | ppl 104.18
| epoch 7 | 600/ 1106 batches | lr 20.00 | ms/batch 107.71 | loss 4.63 | ppl 102.86
| epoch 7 | 800/ 1106 batches | lr 20.00 | ms/batch 108.75 | loss 4.67 | ppl 106.88
| epoch 7 | 1000/ 1106 batches | lr 20.00 | ms/batch 106.55 | loss 4.68 | ppl 107.82

| end of epoch 7 | time: 126.81s | valid loss 4.66 | valid ppl 105.25

Saving Normal!
| epoch 8 | 200/ 1106 batches | lr 20.00 | ms/batch 107.66 | loss 4.69 | ppl 109.08
| epoch 8 | 400/ 1106 batches | lr 20.00 | ms/batch 108.32 | loss 4.56 | ppl 95.22
| epoch 8 | 600/ 1106 batches | lr 20.00 | ms/batch 107.16 | loss 4.56 | ppl 95.40
| epoch 8 | 800/ 1106 batches | lr 20.00 | ms/batch 105.46 | loss 4.59 | ppl 98.41
| epoch 8 | 1000/ 1106 batches | lr 20.00 | ms/batch 107.50 | loss 4.62 | ppl 101.75

| end of epoch 8 | time: 127.05s | valid loss 4.60 | valid ppl 99.58

Saving Normal!
| epoch 9 | 200/ 1106 batches | lr 20.00 | ms/batch 107.12 | loss 4.62 | ppl 101.09
| epoch 9 | 400/ 1106 batches | lr 20.00 | ms/batch 105.97 | loss 4.49 | ppl 88.71
| epoch 9 | 600/ 1106 batches | lr 20.00 | ms/batch 107.97 | loss 4.49 | ppl 88.86
| epoch 9 | 800/ 1106 batches | lr 20.00 | ms/batch 108.20 | loss 4.54 | ppl 93.39
| epoch 9 | 1000/ 1106 batches | lr 20.00 | ms/batch 106.97 | loss 4.55 | ppl 94.88

| end of epoch 9 | time: 126.72s | valid loss 4.55 | valid ppl 94.72

Saving Normal!
| epoch 10 | 200/ 1106 batches | lr 20.00 | ms/batch 107.85 | loss 4.56 | ppl 95.52
| epoch 10 | 400/ 1106 batches | lr 20.00 | ms/batch 107.13 | loss 4.43 | ppl 83.79
| epoch 10 | 600/ 1106 batches | lr 20.00 | ms/batch 107.88 | loss 4.44 | ppl 84.57
| epoch 10 | 800/ 1106 batches | lr 20.00 | ms/batch 105.14 | loss 4.47 | ppl 87.03
| epoch 10 | 1000/ 1106 batches | lr 20.00 | ms/batch 107.69 | loss 4.50 | ppl 90.47

| end of epoch 10 | time: 126.94s | valid loss 4.53 | valid ppl 92.87

Saving Normal!
| epoch 11 | 200/ 1106 batches | lr 20.00 | ms/batch 108.39 | loss 4.50 | ppl 89.80
| epoch 11 | 400/ 1106 batches | lr 20.00 | ms/batch 108.64 | loss 4.39 | ppl 80.51
| epoch 11 | 600/ 1106 batches | lr 20.00 | ms/batch 107.13 | loss 4.38 | ppl 80.10
| epoch 11 | 800/ 1106 batches | lr 20.00 | ms/batch 108.57 | loss 4.42 | ppl 82.85
| epoch 11 | 1000/ 1106 batches | lr 20.00 | ms/batch 107.64 | loss 4.44 | ppl 85.14

| end of epoch 11 | time: 126.86s | valid loss 4.51 | valid ppl 91.06

Saving Normal!
| epoch 12 | 200/ 1106 batches | lr 20.00 | ms/batch 110.09 | loss 4.46 | ppl 86.88
| epoch 12 | 400/ 1106 batches | lr 20.00 | ms/batch 107.15 | loss 4.34 | ppl 76.66
| epoch 12 | 600/ 1106 batches | lr 20.00 | ms/batch 108.75 | loss 4.35 | ppl 77.28
| epoch 12 | 800/ 1106 batches | lr 20.00 | ms/batch 108.37 | loss 4.39 | ppl 80.29
| epoch 12 | 1000/ 1106 batches | lr 20.00 | ms/batch 106.04 | loss 4.40 | ppl 81.38

| end of epoch 12 | time: 127.21s | valid loss 4.48 | valid ppl 88.10

Saving Normal!
| epoch 13 | 200/ 1106 batches | lr 20.00 | ms/batch 108.32 | loss 4.41 | ppl 82.49
| epoch 13 | 400/ 1106 batches | lr 20.00 | ms/batch 107.79 | loss 4.30 | ppl 73.47
| epoch 13 | 600/ 1106 batches | lr 20.00 | ms/batch 105.57 | loss 4.31 | ppl 74.45
| epoch 13 | 800/ 1106 batches | lr 20.00 | ms/batch 106.75 | loss 4.33 | ppl 76.15
| epoch 13 | 1000/ 1106 batches | lr 20.00 | ms/batch 107.71 | loss 4.38 | ppl 79.87

| end of epoch 13 | time: 127.36s | valid loss 4.46 | valid ppl 86.44

Saving Normal!
| epoch 14 | 200/ 1106 batches | lr 20.00 | ms/batch 107.97 | loss 4.38 | ppl 79.46
| epoch 14 | 400/ 1106 batches | lr 20.00 | ms/batch 106.24 | loss 4.26 | ppl 70.64
| epoch 14 | 600/ 1106 batches | lr 20.00 | ms/batch 107.90 | loss 4.28 | ppl 71.94
| epoch 14 | 800/ 1106 batches | lr 20.00 | ms/batch 110.63 | loss 4.29 | ppl 72.92
| epoch 14 | 1000/ 1106 batches | lr 20.00 | ms/batch 108.50 | loss 4.33 | ppl 75.96

| end of epoch 14 | time: 127.10s | valid loss 4.44 | valid ppl 84.38

Saving Normal!
| epoch 15 | 200/ 1106 batches | lr 20.00 | ms/batch 110.35 | loss 4.35 | ppl 77.14
| epoch 15 | 400/ 1106 batches | lr 20.00 | ms/batch 105.78 | loss 4.22 | ppl 68.31
| epoch 15 | 600/ 1106 batches | lr 20.00 | ms/batch 107.58 | loss 4.23 | ppl 68.62
| epoch 15 | 800/ 1106 batches | lr 20.00 | ms/batch 107.25 | loss 4.25 | ppl 70.16
| epoch 15 | 1000/ 1106 batches | lr 20.00 | ms/batch 108.33 | loss 4.29 | ppl 73.26

| end of epoch 15 | time: 126.82s | valid loss 4.43 | valid ppl 83.77

Saving Normal!
| epoch 16 | 200/ 1106 batches | lr 20.00 | ms/batch 108.29 | loss 4.32 | ppl 75.01
| epoch 16 | 400/ 1106 batches | lr 20.00 | ms/batch 107.73 | loss 4.19 | ppl 66.19
| epoch 16 | 600/ 1106 batches | lr 20.00 | ms/batch 106.91 | loss 4.19 | ppl 66.31
| epoch 16 | 800/ 1106 batches | lr 20.00 | ms/batch 107.26 | loss 4.24 | ppl 69.66
| epoch 16 | 1000/ 1106 batches | lr 20.00 | ms/batch 107.52 | loss 4.26 | ppl 70.59

| end of epoch 16 | time: 126.77s | valid loss 4.41 | valid ppl 82.56

Saving Normal!
| epoch 17 | 200/ 1106 batches | lr 20.00 | ms/batch 107.30 | loss 4.28 | ppl 71.95
| epoch 17 | 400/ 1106 batches | lr 20.00 | ms/batch 108.92 | loss 4.15 | ppl 63.70
| epoch 17 | 600/ 1106 batches | lr 20.00 | ms/batch 108.59 | loss 4.17 | ppl 64.84
| epoch 17 | 800/ 1106 batches | lr 20.00 | ms/batch 107.11 | loss 4.21 | ppl 67.03
| epoch 17 | 1000/ 1106 batches | lr 20.00 | ms/batch 108.99 | loss 4.23 | ppl 68.82

| end of epoch 17 | time: 126.60s | valid loss 4.40 | valid ppl 81.68

Saving Normal!
| epoch 18 | 200/ 1106 batches | lr 20.00 | ms/batch 107.31 | loss 4.26 | ppl 70.81
| epoch 18 | 400/ 1106 batches | lr 20.00 | ms/batch 107.32 | loss 4.14 | ppl 62.80
| epoch 18 | 600/ 1106 batches | lr 20.00 | ms/batch 107.27 | loss 4.15 | ppl 63.22
| epoch 18 | 800/ 1106 batches | lr 20.00 | ms/batch 107.67 | loss 4.17 | ppl 64.77
| epoch 18 | 1000/ 1106 batches | lr 20.00 | ms/batch 107.78 | loss 4.20 | ppl 66.38

| end of epoch 18 | time: 126.32s | valid loss 4.40 | valid ppl 81.22

Saving Normal!
| epoch 19 | 200/ 1106 batches | lr 20.00 | ms/batch 109.86 | loss 4.24 | ppl 69.55
| epoch 19 | 400/ 1106 batches | lr 20.00 | ms/batch 107.61 | loss 4.10 | ppl 60.18
| epoch 19 | 600/ 1106 batches | lr 20.00 | ms/batch 106.67 | loss 4.12 | ppl 61.61
| epoch 19 | 800/ 1106 batches | lr 20.00 | ms/batch 108.29 | loss 4.15 | ppl 63.52
| epoch 19 | 1000/ 1106 batches | lr 20.00 | ms/batch 107.98 | loss 4.17 | ppl 64.81

| end of epoch 19 | time: 126.72s | valid loss 4.40 | valid ppl 81.59

| epoch 20 | 200/ 1106 batches | lr 20.00 | ms/batch 109.31 | loss 4.20 | ppl 66.46
| epoch 20 | 400/ 1106 batches | lr 20.00 | ms/batch 109.37 | loss 4.09 | ppl 59.53
| epoch 20 | 600/ 1106 batches | lr 20.00 | ms/batch 105.99 | loss 4.10 | ppl 60.28
| epoch 20 | 800/ 1106 batches | lr 20.00 | ms/batch 107.38 | loss 4.12 | ppl 61.81
| epoch 20 | 1000/ 1106 batches | lr 20.00 | ms/batch 108.50 | loss 4.16 | ppl 64.14

| end of epoch 20 | time: 127.60s | valid loss 4.38 | valid ppl 79.93

Saving Normal!
| epoch 21 | 200/ 1106 batches | lr 20.00 | ms/batch 108.98 | loss 4.18 | ppl 65.67
| epoch 21 | 400/ 1106 batches | lr 20.00 | ms/batch 108.49 | loss 4.06 | ppl 57.93
| epoch 21 | 600/ 1106 batches | lr 20.00 | ms/batch 108.39 | loss 4.07 | ppl 58.63
| epoch 21 | 800/ 1106 batches | lr 20.00 | ms/batch 107.40 | loss 4.10 | ppl 60.44
| epoch 21 | 1000/ 1106 batches | lr 20.00 | ms/batch 108.18 | loss 4.13 | ppl 61.98

| end of epoch 21 | time: 126.82s | valid loss 4.37 | valid ppl 79.15

Saving Normal!
| epoch 22 | 200/ 1106 batches | lr 20.00 | ms/batch 107.35 | loss 4.15 | ppl 63.52
| epoch 22 | 400/ 1106 batches | lr 20.00 | ms/batch 107.54 | loss 4.05 | ppl 57.21
| epoch 22 | 600/ 1106 batches | lr 20.00 | ms/batch 106.73 | loss 4.06 | ppl 58.12
| epoch 22 | 800/ 1106 batches | lr 20.00 | ms/batch 108.01 | loss 4.08 | ppl 59.21
| epoch 22 | 1000/ 1106 batches | lr 20.00 | ms/batch 107.72 | loss 4.12 | ppl 61.52

| end of epoch 22 | time: 127.15s | valid loss 4.37 | valid ppl 78.95

Saving Normal!
| epoch 23 | 200/ 1106 batches | lr 20.00 | ms/batch 108.04 | loss 4.14 | ppl 62.99
| epoch 23 | 400/ 1106 batches | lr 20.00 | ms/batch 107.54 | loss 4.02 | ppl 55.47
| epoch 23 | 600/ 1106 batches | lr 20.00 | ms/batch 106.67 | loss 4.04 | ppl 57.09
| epoch 23 | 800/ 1106 batches | lr 20.00 | ms/batch 107.55 | loss 4.07 | ppl 58.28
| epoch 23 | 1000/ 1106 batches | lr 20.00 | ms/batch 109.80 | loss 4.10 | ppl 60.09

| end of epoch 23 | time: 126.98s | valid loss 4.36 | valid ppl 78.62

Saving Normal!
| epoch 24 | 200/ 1106 batches | lr 20.00 | ms/batch 109.06 | loss 4.11 | ppl 61.02
| epoch 24 | 400/ 1106 batches | lr 20.00 | ms/batch 108.41 | loss 4.01 | ppl 55.20
| epoch 24 | 600/ 1106 batches | lr 20.00 | ms/batch 106.90 | loss 4.02 | ppl 55.89
| epoch 24 | 800/ 1106 batches | lr 20.00 | ms/batch 106.90 | loss 4.04 | ppl 56.70
| epoch 24 | 1000/ 1106 batches | lr 20.00 | ms/batch 108.93 | loss 4.07 | ppl 58.60

| end of epoch 24 | time: 127.07s | valid loss 4.35 | valid ppl 77.53

Saving Normal!
| epoch 25 | 200/ 1106 batches | lr 20.00 | ms/batch 109.17 | loss 4.10 | ppl 60.07
| epoch 25 | 400/ 1106 batches | lr 20.00 | ms/batch 108.30 | loss 3.99 | ppl 54.28
| epoch 25 | 600/ 1106 batches | lr 20.00 | ms/batch 107.53 | loss 3.98 | ppl 53.76
| epoch 25 | 800/ 1106 batches | lr 20.00 | ms/batch 107.03 | loss 4.02 | ppl 55.54
| epoch 25 | 1000/ 1106 batches | lr 20.00 | ms/batch 110.10 | loss 4.05 | ppl 57.62

| end of epoch 25 | time: 127.27s | valid loss 4.35 | valid ppl 77.44

Saving Normal!
| epoch 26 | 200/ 1106 batches | lr 20.00 | ms/batch 109.52 | loss 4.09 | ppl 59.98
| epoch 26 | 400/ 1106 batches | lr 20.00 | ms/batch 107.39 | loss 3.97 | ppl 53.22
| epoch 26 | 600/ 1106 batches | lr 20.00 | ms/batch 107.96 | loss 3.99 | ppl 53.94
| epoch 26 | 800/ 1106 batches | lr 20.00 | ms/batch 109.01 | loss 4.01 | ppl 55.15
| epoch 26 | 1000/ 1106 batches | lr 20.00 | ms/batch 106.62 | loss 4.04 | ppl 56.81

| end of epoch 26 | time: 127.08s | valid loss 4.35 | valid ppl 77.75

| epoch 27 | 200/ 1106 batches | lr 20.00 | ms/batch 108.88 | loss 4.06 | ppl 58.18
| epoch 27 | 400/ 1106 batches | lr 20.00 | ms/batch 108.70 | loss 3.95 | ppl 51.89
| epoch 27 | 600/ 1106 batches | lr 20.00 | ms/batch 108.18 | loss 3.98 | ppl 53.36
| epoch 27 | 800/ 1106 batches | lr 20.00 | ms/batch 109.59 | loss 3.99 | ppl 54.02
| epoch 27 | 1000/ 1106 batches | lr 20.00 | ms/batch 107.35 | loss 4.03 | ppl 56.08

| end of epoch 27 | time: 127.53s | valid loss 4.34 | valid ppl 76.74

Saving Normal!
| epoch 28 | 200/ 1106 batches | lr 20.00 | ms/batch 107.65 | loss 4.06 | ppl 57.81
| epoch 28 | 400/ 1106 batches | lr 20.00 | ms/batch 106.94 | loss 3.94 | ppl 51.41
| epoch 28 | 600/ 1106 batches | lr 20.00 | ms/batch 107.65 | loss 3.95 | ppl 52.00
| epoch 28 | 800/ 1106 batches | lr 20.00 | ms/batch 107.85 | loss 3.99 | ppl 53.91
| epoch 28 | 1000/ 1106 batches | lr 20.00 | ms/batch 109.11 | loss 4.01 | ppl 54.96

| end of epoch 28 | time: 127.09s | valid loss 4.34 | valid ppl 76.40

Saving Normal!
| epoch 29 | 200/ 1106 batches | lr 20.00 | ms/batch 107.82 | loss 4.05 | ppl 57.28
| epoch 29 | 400/ 1106 batches | lr 20.00 | ms/batch 109.19 | loss 3.92 | ppl 50.40
| epoch 29 | 600/ 1106 batches | lr 20.00 | ms/batch 108.92 | loss 3.92 | ppl 50.58
| epoch 29 | 800/ 1106 batches | lr 20.00 | ms/batch 107.38 | loss 3.96 | ppl 52.57
| epoch 29 | 1000/ 1106 batches | lr 20.00 | ms/batch 107.71 | loss 3.99 | ppl 54.07

| end of epoch 29 | time: 127.12s | valid loss 4.34 | valid ppl 76.52

| epoch 30 | 200/ 1106 batches | lr 20.00 | ms/batch 108.50 | loss 4.02 | ppl 55.65
| epoch 30 | 400/ 1106 batches | lr 20.00 | ms/batch 110.07 | loss 3.90 | ppl 49.44
| epoch 30 | 600/ 1106 batches | lr 20.00 | ms/batch 108.16 | loss 3.92 | ppl 50.26
| epoch 30 | 800/ 1106 batches | lr 20.00 | ms/batch 107.02 | loss 3.96 | ppl 52.26
| epoch 30 | 1000/ 1106 batches | lr 20.00 | ms/batch 109.22 | loss 3.98 | ppl 53.61

| end of epoch 30 | time: 126.97s | valid loss 4.34 | valid ppl 76.73

| epoch 31 | 200/ 1106 batches | lr 20.00 | ms/batch 109.65 | loss 4.02 | ppl 55.58
| epoch 31 | 400/ 1106 batches | lr 20.00 | ms/batch 106.74 | loss 3.88 | ppl 48.29
| epoch 31 | 600/ 1106 batches | lr 20.00 | ms/batch 107.52 | loss 3.93 | ppl 50.65
| epoch 31 | 800/ 1106 batches | lr 20.00 | ms/batch 107.96 | loss 3.95 | ppl 52.14
| epoch 31 | 1000/ 1106 batches | lr 20.00 | ms/batch 107.81 | loss 3.96 | ppl 52.72

| end of epoch 31 | time: 126.83s | valid loss 4.35 | valid ppl 77.28

| epoch 32 | 200/ 1106 batches | lr 20.00 | ms/batch 107.41 | loss 4.00 | ppl 54.76
| epoch 32 | 400/ 1106 batches | lr 20.00 | ms/batch 108.92 | loss 3.87 | ppl 48.12
| epoch 32 | 600/ 1106 batches | lr 20.00 | ms/batch 106.48 | loss 3.90 | ppl 49.48
| epoch 32 | 800/ 1106 batches | lr 20.00 | ms/batch 106.34 | loss 3.91 | ppl 49.74
| epoch 32 | 1000/ 1106 batches | lr 20.00 | ms/batch 109.71 | loss 3.96 | ppl 52.66

| end of epoch 32 | time: 127.73s | valid loss 4.34 | valid ppl 76.91

| epoch 33 | 200/ 1106 batches | lr 20.00 | ms/batch 108.08 | loss 3.97 | ppl 53.03
| epoch 33 | 400/ 1106 batches | lr 20.00 | ms/batch 108.11 | loss 3.87 | ppl 47.84
| epoch 33 | 600/ 1106 batches | lr 20.00 | ms/batch 109.68 | loss 3.88 | ppl 48.43
| epoch 33 | 800/ 1106 batches | lr 20.00 | ms/batch 107.60 | loss 3.91 | ppl 49.83
| epoch 33 | 1000/ 1106 batches | lr 20.00 | ms/batch 106.91 | loss 3.95 | ppl 51.76

| end of epoch 33 | time: 126.99s | valid loss 4.34 | valid ppl 76.64

| epoch 34 | 200/ 1106 batches | lr 20.00 | ms/batch 107.60 | loss 3.96 | ppl 52.49
| epoch 34 | 400/ 1106 batches | lr 20.00 | ms/batch 109.23 | loss 3.86 | ppl 47.26
| epoch 34 | 600/ 1106 batches | lr 20.00 | ms/batch 109.36 | loss 3.88 | ppl 48.20
| epoch 34 | 800/ 1106 batches | lr 20.00 | ms/batch 107.91 | loss 3.90 | ppl 49.34
| epoch 34 | 1000/ 1106 batches | lr 20.00 | ms/batch 110.00 | loss 3.92 | ppl 50.54

| end of epoch 34 | time: 127.73s | valid loss 4.34 | valid ppl 76.54

Switching!
| epoch 35 | 200/ 1106 batches | lr 20.00 | ms/batch 111.66 | loss 3.96 | ppl 52.65
| epoch 35 | 400/ 1106 batches | lr 20.00 | ms/batch 110.60 | loss 3.84 | ppl 46.35
| epoch 35 | 600/ 1106 batches | lr 20.00 | ms/batch 110.78 | loss 3.85 | ppl 46.89
| epoch 35 | 800/ 1106 batches | lr 20.00 | ms/batch 108.84 | loss 3.90 | ppl 49.21
| epoch 35 | 1000/ 1106 batches | lr 20.00 | ms/batch 110.05 | loss 3.92 | ppl 50.46
/root/model.py:85: UserWarning: RNN module weights are not part of single contiguous chunk of memory. This means they need to be compacted at every call, possibly greatly increasing memory usage. To compact weights again call flatten_parameters().
raw_output, new_h = rnn(raw_output, hidden[l])

| end of epoch 35 | time: 129.69s | valid loss 4.23 | valid ppl 68.44

Saving Averaged!
| epoch 36 | 200/ 1106 batches | lr 20.00 | ms/batch 109.41 | loss 3.94 | ppl 51.62
| epoch 36 | 400/ 1106 batches | lr 20.00 | ms/batch 110.57 | loss 3.83 | ppl 45.95
| epoch 36 | 600/ 1106 batches | lr 20.00 | ms/batch 110.09 | loss 3.85 | ppl 47.05
| epoch 36 | 800/ 1106 batches | lr 20.00 | ms/batch 110.61 | loss 3.88 | ppl 48.44
| epoch 36 | 1000/ 1106 batches | lr 20.00 | ms/batch 110.18 | loss 3.93 | ppl 50.70

| end of epoch 36 | time: 130.18s | valid loss 4.22 | valid ppl 67.79

Saving Averaged!
| epoch 37 | 200/ 1106 batches | lr 20.00 | ms/batch 110.96 | loss 3.93 | ppl 51.10
| epoch 37 | 400/ 1106 batches | lr 20.00 | ms/batch 111.73 | loss 3.84 | ppl 46.52
| epoch 37 | 600/ 1106 batches | lr 20.00 | ms/batch 110.79 | loss 3.85 | ppl 46.95
| epoch 37 | 800/ 1106 batches | lr 20.00 | ms/batch 111.91 | loss 3.88 | ppl 48.41
| epoch 37 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.37 | loss 3.91 | ppl 49.82

| end of epoch 37 | time: 130.69s | valid loss 4.21 | valid ppl 67.45

Saving Averaged!
| epoch 38 | 200/ 1106 batches | lr 20.00 | ms/batch 109.26 | loss 3.92 | ppl 50.37
| epoch 38 | 400/ 1106 batches | lr 20.00 | ms/batch 109.33 | loss 3.81 | ppl 45.00
| epoch 38 | 600/ 1106 batches | lr 20.00 | ms/batch 112.15 | loss 3.83 | ppl 45.95
| epoch 38 | 800/ 1106 batches | lr 20.00 | ms/batch 111.61 | loss 3.86 | ppl 47.47
| epoch 38 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.49 | loss 3.89 | ppl 49.12

| end of epoch 38 | time: 130.61s | valid loss 4.21 | valid ppl 67.18

Saving Averaged!
| epoch 39 | 200/ 1106 batches | lr 20.00 | ms/batch 111.70 | loss 3.93 | ppl 50.73
| epoch 39 | 400/ 1106 batches | lr 20.00 | ms/batch 111.86 | loss 3.81 | ppl 45.06
| epoch 39 | 600/ 1106 batches | lr 20.00 | ms/batch 110.45 | loss 3.82 | ppl 45.45
| epoch 39 | 800/ 1106 batches | lr 20.00 | ms/batch 113.52 | loss 3.86 | ppl 47.48
| epoch 39 | 1000/ 1106 batches | lr 20.00 | ms/batch 110.96 | loss 3.88 | ppl 48.60

| end of epoch 39 | time: 130.71s | valid loss 4.20 | valid ppl 66.99

Saving Averaged!
| epoch 40 | 200/ 1106 batches | lr 20.00 | ms/batch 112.65 | loss 3.91 | ppl 49.96
| epoch 40 | 400/ 1106 batches | lr 20.00 | ms/batch 111.44 | loss 3.80 | ppl 44.65
| epoch 40 | 600/ 1106 batches | lr 20.00 | ms/batch 112.57 | loss 3.81 | ppl 45.30
| epoch 40 | 800/ 1106 batches | lr 20.00 | ms/batch 111.35 | loss 3.83 | ppl 45.97
| epoch 40 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.31 | loss 3.88 | ppl 48.31

| end of epoch 40 | time: 131.38s | valid loss 4.20 | valid ppl 66.82

Saving Averaged!
| epoch 41 | 200/ 1106 batches | lr 20.00 | ms/batch 110.90 | loss 3.90 | ppl 49.18
| epoch 41 | 400/ 1106 batches | lr 20.00 | ms/batch 114.95 | loss 3.80 | ppl 44.60
| epoch 41 | 600/ 1106 batches | lr 20.00 | ms/batch 110.67 | loss 3.79 | ppl 44.41
| epoch 41 | 800/ 1106 batches | lr 20.00 | ms/batch 110.74 | loss 3.83 | ppl 45.84
| epoch 41 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.57 | loss 3.86 | ppl 47.65

| end of epoch 41 | time: 131.34s | valid loss 4.20 | valid ppl 66.70

Saving Averaged!
| epoch 42 | 200/ 1106 batches | lr 20.00 | ms/batch 112.11 | loss 3.88 | ppl 48.65
| epoch 42 | 400/ 1106 batches | lr 20.00 | ms/batch 112.13 | loss 3.78 | ppl 43.89
| epoch 42 | 600/ 1106 batches | lr 20.00 | ms/batch 112.22 | loss 3.79 | ppl 44.42
| epoch 42 | 800/ 1106 batches | lr 20.00 | ms/batch 111.15 | loss 3.82 | ppl 45.82
| epoch 42 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.65 | loss 3.86 | ppl 47.27

| end of epoch 42 | time: 131.00s | valid loss 4.20 | valid ppl 66.60

Saving Averaged!
| epoch 43 | 200/ 1106 batches | lr 20.00 | ms/batch 111.02 | loss 3.88 | ppl 48.19
| epoch 43 | 400/ 1106 batches | lr 20.00 | ms/batch 111.16 | loss 3.76 | ppl 42.83
| epoch 43 | 600/ 1106 batches | lr 20.00 | ms/batch 111.64 | loss 3.79 | ppl 44.09
| epoch 43 | 800/ 1106 batches | lr 20.00 | ms/batch 114.07 | loss 3.81 | ppl 45.33
| epoch 43 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.05 | loss 3.84 | ppl 46.40

| end of epoch 43 | time: 131.63s | valid loss 4.20 | valid ppl 66.51

Saving Averaged!
| epoch 44 | 200/ 1106 batches | lr 20.00 | ms/batch 110.83 | loss 3.87 | ppl 48.17
| epoch 44 | 400/ 1106 batches | lr 20.00 | ms/batch 110.88 | loss 3.75 | ppl 42.61
| epoch 44 | 600/ 1106 batches | lr 20.00 | ms/batch 112.35 | loss 3.79 | ppl 44.23
| epoch 44 | 800/ 1106 batches | lr 20.00 | ms/batch 111.84 | loss 3.82 | ppl 45.53
| epoch 44 | 1000/ 1106 batches | lr 20.00 | ms/batch 112.13 | loss 3.84 | ppl 46.51

| end of epoch 44 | time: 131.22s | valid loss 4.20 | valid ppl 66.41

Saving Averaged!
| epoch 45 | 200/ 1106 batches | lr 20.00 | ms/batch 113.02 | loss 3.87 | ppl 47.90
| epoch 45 | 400/ 1106 batches | lr 20.00 | ms/batch 110.75 | loss 3.76 | ppl 42.95
| epoch 45 | 600/ 1106 batches | lr 20.00 | ms/batch 108.98 | loss 3.75 | ppl 42.72
| epoch 45 | 800/ 1106 batches | lr 20.00 | ms/batch 111.09 | loss 3.79 | ppl 44.15
| epoch 45 | 1000/ 1106 batches | lr 20.00 | ms/batch 112.63 | loss 3.82 | ppl 45.76

| end of epoch 45 | time: 130.78s | valid loss 4.19 | valid ppl 66.31

Saving Averaged!
| epoch 46 | 200/ 1106 batches | lr 20.00 | ms/batch 113.03 | loss 3.86 | ppl 47.39
| epoch 46 | 400/ 1106 batches | lr 20.00 | ms/batch 110.15 | loss 3.75 | ppl 42.33
| epoch 46 | 600/ 1106 batches | lr 20.00 | ms/batch 111.82 | loss 3.76 | ppl 43.10
| epoch 46 | 800/ 1106 batches | lr 20.00 | ms/batch 110.42 | loss 3.78 | ppl 43.94
| epoch 46 | 1000/ 1106 batches | lr 20.00 | ms/batch 113.16 | loss 3.82 | ppl 45.46

| end of epoch 46 | time: 131.03s | valid loss 4.19 | valid ppl 66.22

Saving Averaged!
| epoch 47 | 200/ 1106 batches | lr 20.00 | ms/batch 110.95 | loss 3.84 | ppl 46.39
| epoch 47 | 400/ 1106 batches | lr 20.00 | ms/batch 111.54 | loss 3.73 | ppl 41.59
| epoch 47 | 600/ 1106 batches | lr 20.00 | ms/batch 110.20 | loss 3.75 | ppl 42.34
| epoch 47 | 800/ 1106 batches | lr 20.00 | ms/batch 113.73 | loss 3.78 | ppl 43.95
| epoch 47 | 1000/ 1106 batches | lr 20.00 | ms/batch 112.80 | loss 3.80 | ppl 44.92

| end of epoch 47 | time: 130.80s | valid loss 4.19 | valid ppl 66.16

Saving Averaged!
| epoch 48 | 200/ 1106 batches | lr 20.00 | ms/batch 113.88 | loss 3.84 | ppl 46.62
| epoch 48 | 400/ 1106 batches | lr 20.00 | ms/batch 111.08 | loss 3.73 | ppl 41.79
| epoch 48 | 600/ 1106 batches | lr 20.00 | ms/batch 110.69 | loss 3.76 | ppl 42.80
| epoch 48 | 800/ 1106 batches | lr 20.00 | ms/batch 112.65 | loss 3.77 | ppl 43.37
| epoch 48 | 1000/ 1106 batches | lr 20.00 | ms/batch 112.66 | loss 3.80 | ppl 44.81

| end of epoch 48 | time: 131.30s | valid loss 4.19 | valid ppl 66.11

Saving Averaged!
| epoch 49 | 200/ 1106 batches | lr 20.00 | ms/batch 112.19 | loss 3.83 | ppl 46.07
| epoch 49 | 400/ 1106 batches | lr 20.00 | ms/batch 111.67 | loss 3.72 | ppl 41.27
| epoch 49 | 600/ 1106 batches | lr 20.00 | ms/batch 111.66 | loss 3.74 | ppl 42.12
| epoch 49 | 800/ 1106 batches | lr 20.00 | ms/batch 111.92 | loss 3.76 | ppl 42.93
| epoch 49 | 1000/ 1106 batches | lr 20.00 | ms/batch 110.30 | loss 3.78 | ppl 43.90

| end of epoch 49 | time: 130.84s | valid loss 4.19 | valid ppl 66.06

Saving Averaged!
| epoch 50 | 200/ 1106 batches | lr 20.00 | ms/batch 112.05 | loss 3.82 | ppl 45.56
| epoch 50 | 400/ 1106 batches | lr 20.00 | ms/batch 109.52 | loss 3.71 | ppl 40.79
| epoch 50 | 600/ 1106 batches | lr 20.00 | ms/batch 112.56 | loss 3.73 | ppl 41.83
| epoch 50 | 800/ 1106 batches | lr 20.00 | ms/batch 112.31 | loss 3.75 | ppl 42.56
| epoch 50 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.26 | loss 3.81 | ppl 45.11

| end of epoch 50 | time: 131.99s | valid loss 4.19 | valid ppl 66.01

Saving Averaged!
| epoch 51 | 200/ 1106 batches | lr 20.00 | ms/batch 111.97 | loss 3.81 | ppl 45.36
| epoch 51 | 400/ 1106 batches | lr 20.00 | ms/batch 110.76 | loss 3.70 | ppl 40.61
| epoch 51 | 600/ 1106 batches | lr 20.00 | ms/batch 110.32 | loss 3.72 | ppl 41.24
| epoch 51 | 800/ 1106 batches | lr 20.00 | ms/batch 111.41 | loss 3.75 | ppl 42.73
| epoch 51 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.41 | loss 3.79 | ppl 44.09

| end of epoch 51 | time: 131.37s | valid loss 4.19 | valid ppl 65.95

Saving Averaged!
| epoch 52 | 200/ 1106 batches | lr 20.00 | ms/batch 112.89 | loss 3.82 | ppl 45.39
| epoch 52 | 400/ 1106 batches | lr 20.00 | ms/batch 110.37 | loss 3.69 | ppl 39.91
| epoch 52 | 600/ 1106 batches | lr 20.00 | ms/batch 111.23 | loss 3.72 | ppl 41.13
| epoch 52 | 800/ 1106 batches | lr 20.00 | ms/batch 113.92 | loss 3.76 | ppl 42.82
| epoch 52 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.41 | loss 3.78 | ppl 43.66

| end of epoch 52 | time: 130.84s | valid loss 4.19 | valid ppl 65.90

Saving Averaged!
| epoch 53 | 200/ 1106 batches | lr 20.00 | ms/batch 108.60 | loss 3.79 | ppl 44.06
| epoch 53 | 400/ 1106 batches | lr 20.00 | ms/batch 111.69 | loss 3.68 | ppl 39.81
| epoch 53 | 600/ 1106 batches | lr 20.00 | ms/batch 111.81 | loss 3.71 | ppl 40.94
| epoch 53 | 800/ 1106 batches | lr 20.00 | ms/batch 112.97 | loss 3.73 | ppl 41.55
| epoch 53 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.22 | loss 3.76 | ppl 42.99

| end of epoch 53 | time: 130.73s | valid loss 4.19 | valid ppl 65.86

Saving Averaged!
| epoch 54 | 200/ 1106 batches | lr 20.00 | ms/batch 110.81 | loss 3.80 | ppl 44.48
| epoch 54 | 400/ 1106 batches | lr 20.00 | ms/batch 111.58 | loss 3.68 | ppl 39.60
| epoch 54 | 600/ 1106 batches | lr 20.00 | ms/batch 110.40 | loss 3.69 | ppl 40.06
| epoch 54 | 800/ 1106 batches | lr 20.00 | ms/batch 113.08 | loss 3.73 | ppl 41.57
| epoch 54 | 1000/ 1106 batches | lr 20.00 | ms/batch 110.01 | loss 3.75 | ppl 42.58

| end of epoch 54 | time: 130.96s | valid loss 4.19 | valid ppl 65.82

Saving Averaged!
| epoch 55 | 200/ 1106 batches | lr 20.00 | ms/batch 110.72 | loss 3.79 | ppl 44.12
| epoch 55 | 400/ 1106 batches | lr 20.00 | ms/batch 110.21 | loss 3.67 | ppl 39.39
| epoch 55 | 600/ 1106 batches | lr 20.00 | ms/batch 112.75 | loss 3.70 | ppl 40.33
| epoch 55 | 800/ 1106 batches | lr 20.00 | ms/batch 111.63 | loss 3.72 | ppl 41.26
| epoch 55 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.66 | loss 3.75 | ppl 42.56

| end of epoch 55 | time: 130.91s | valid loss 4.19 | valid ppl 65.78

Saving Averaged!
| epoch 56 | 200/ 1106 batches | lr 20.00 | ms/batch 112.06 | loss 3.77 | ppl 43.59
| epoch 56 | 400/ 1106 batches | lr 20.00 | ms/batch 109.95 | loss 3.66 | ppl 38.92
| epoch 56 | 600/ 1106 batches | lr 20.00 | ms/batch 111.30 | loss 3.69 | ppl 40.14
| epoch 56 | 800/ 1106 batches | lr 20.00 | ms/batch 113.68 | loss 3.71 | ppl 40.85
| epoch 56 | 1000/ 1106 batches | lr 20.00 | ms/batch 112.22 | loss 3.74 | ppl 42.21

| end of epoch 56 | time: 130.58s | valid loss 4.19 | valid ppl 65.75

Saving Averaged!
| epoch 57 | 200/ 1106 batches | lr 20.00 | ms/batch 112.40 | loss 3.77 | ppl 43.21
| epoch 57 | 400/ 1106 batches | lr 20.00 | ms/batch 112.68 | loss 3.65 | ppl 38.47
| epoch 57 | 600/ 1106 batches | lr 20.00 | ms/batch 110.29 | loss 3.67 | ppl 39.36
| epoch 57 | 800/ 1106 batches | lr 20.00 | ms/batch 111.27 | loss 3.71 | ppl 40.78
| epoch 57 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.95 | loss 3.73 | ppl 41.84

| end of epoch 57 | time: 130.85s | valid loss 4.19 | valid ppl 65.73

Saving Averaged!
| epoch 58 | 200/ 1106 batches | lr 20.00 | ms/batch 110.63 | loss 3.77 | ppl 43.50
| epoch 58 | 400/ 1106 batches | lr 20.00 | ms/batch 111.27 | loss 3.66 | ppl 38.86
| epoch 58 | 600/ 1106 batches | lr 20.00 | ms/batch 107.62 | loss 3.67 | ppl 39.30
| epoch 58 | 800/ 1106 batches | lr 20.00 | ms/batch 110.70 | loss 3.68 | ppl 39.55
| epoch 58 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.29 | loss 3.72 | ppl 41.06

| end of epoch 58 | time: 130.33s | valid loss 4.19 | valid ppl 65.70

Saving Averaged!
| epoch 59 | 200/ 1106 batches | lr 20.00 | ms/batch 109.21 | loss 3.76 | ppl 42.84
| epoch 59 | 400/ 1106 batches | lr 20.00 | ms/batch 112.42 | loss 3.65 | ppl 38.52
| epoch 59 | 600/ 1106 batches | lr 20.00 | ms/batch 112.32 | loss 3.68 | ppl 39.49
| epoch 59 | 800/ 1106 batches | lr 20.00 | ms/batch 111.64 | loss 3.69 | ppl 40.02
| epoch 59 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.50 | loss 3.73 | ppl 41.56

| end of epoch 59 | time: 130.76s | valid loss 4.18 | valid ppl 65.67

Saving Averaged!
| epoch 60 | 200/ 1106 batches | lr 20.00 | ms/batch 110.45 | loss 3.76 | ppl 43.05
| epoch 60 | 400/ 1106 batches | lr 20.00 | ms/batch 110.57 | loss 3.64 | ppl 38.27
| epoch 60 | 600/ 1106 batches | lr 20.00 | ms/batch 110.06 | loss 3.66 | ppl 38.81
| epoch 60 | 800/ 1106 batches | lr 20.00 | ms/batch 111.18 | loss 3.70 | ppl 40.48
| epoch 60 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.95 | loss 3.72 | ppl 41.24

| end of epoch 60 | time: 130.84s | valid loss 4.18 | valid ppl 65.64

Saving Averaged!
| epoch 61 | 200/ 1106 batches | lr 20.00 | ms/batch 112.97 | loss 3.74 | ppl 42.27
| epoch 61 | 400/ 1106 batches | lr 20.00 | ms/batch 109.70 | loss 3.62 | ppl 37.26
| epoch 61 | 600/ 1106 batches | lr 20.00 | ms/batch 109.84 | loss 3.66 | ppl 38.99
| epoch 61 | 800/ 1106 batches | lr 20.00 | ms/batch 111.17 | loss 3.67 | ppl 39.38
| epoch 61 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.05 | loss 3.71 | ppl 40.74

| end of epoch 61 | time: 130.82s | valid loss 4.18 | valid ppl 65.62

Saving Averaged!
| epoch 62 | 200/ 1106 batches | lr 20.00 | ms/batch 113.02 | loss 3.74 | ppl 41.92
| epoch 62 | 400/ 1106 batches | lr 20.00 | ms/batch 110.16 | loss 3.63 | ppl 37.73
| epoch 62 | 600/ 1106 batches | lr 20.00 | ms/batch 111.14 | loss 3.65 | ppl 38.56
| epoch 62 | 800/ 1106 batches | lr 20.00 | ms/batch 110.92 | loss 3.68 | ppl 39.74
| epoch 62 | 1000/ 1106 batches | lr 20.00 | ms/batch 110.38 | loss 3.72 | ppl 41.20

| end of epoch 62 | time: 131.65s | valid loss 4.18 | valid ppl 65.59

Saving Averaged!
| epoch 63 | 200/ 1106 batches | lr 20.00 | ms/batch 111.10 | loss 3.74 | ppl 41.89
| epoch 63 | 400/ 1106 batches | lr 20.00 | ms/batch 113.07 | loss 3.62 | ppl 37.30
| epoch 63 | 600/ 1106 batches | lr 20.00 | ms/batch 111.86 | loss 3.64 | ppl 38.16
| epoch 63 | 800/ 1106 batches | lr 20.00 | ms/batch 110.04 | loss 3.68 | ppl 39.74
| epoch 63 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.89 | loss 3.70 | ppl 40.62

| end of epoch 63 | time: 130.40s | valid loss 4.18 | valid ppl 65.57

Saving Averaged!
| epoch 64 | 200/ 1106 batches | lr 20.00 | ms/batch 113.01 | loss 3.74 | ppl 42.17
| epoch 64 | 400/ 1106 batches | lr 20.00 | ms/batch 113.32 | loss 3.63 | ppl 37.59
| epoch 64 | 600/ 1106 batches | lr 20.00 | ms/batch 111.17 | loss 3.65 | ppl 38.39
| epoch 64 | 800/ 1106 batches | lr 20.00 | ms/batch 113.85 | loss 3.67 | ppl 39.23
| epoch 64 | 1000/ 1106 batches | lr 20.00 | ms/batch 112.71 | loss 3.69 | ppl 40.10

| end of epoch 64 | time: 131.54s | valid loss 4.18 | valid ppl 65.55

Saving Averaged!
| epoch 65 | 200/ 1106 batches | lr 20.00 | ms/batch 111.88 | loss 3.72 | ppl 41.46
| epoch 65 | 400/ 1106 batches | lr 20.00 | ms/batch 112.05 | loss 3.61 | ppl 37.11
| epoch 65 | 600/ 1106 batches | lr 20.00 | ms/batch 111.35 | loss 3.65 | ppl 38.30
| epoch 65 | 800/ 1106 batches | lr 20.00 | ms/batch 110.31 | loss 3.67 | ppl 39.40
| epoch 65 | 1000/ 1106 batches | lr 20.00 | ms/batch 112.52 | loss 3.70 | ppl 40.31

| end of epoch 65 | time: 131.19s | valid loss 4.18 | valid ppl 65.54

Saving Averaged!
| epoch 66 | 200/ 1106 batches | lr 20.00 | ms/batch 110.93 | loss 3.72 | ppl 41.24
| epoch 66 | 400/ 1106 batches | lr 20.00 | ms/batch 109.33 | loss 3.60 | ppl 36.45
| epoch 66 | 600/ 1106 batches | lr 20.00 | ms/batch 110.89 | loss 3.63 | ppl 37.77
| epoch 66 | 800/ 1106 batches | lr 20.00 | ms/batch 112.14 | loss 3.65 | ppl 38.66
| epoch 66 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.35 | loss 3.68 | ppl 39.83

| end of epoch 66 | time: 130.74s | valid loss 4.18 | valid ppl 65.53

Saving Averaged!
| epoch 67 | 200/ 1106 batches | lr 20.00 | ms/batch 111.79 | loss 3.71 | ppl 40.83
| epoch 67 | 400/ 1106 batches | lr 20.00 | ms/batch 110.10 | loss 3.58 | ppl 35.97
| epoch 67 | 600/ 1106 batches | lr 20.00 | ms/batch 110.87 | loss 3.63 | ppl 37.81
| epoch 67 | 800/ 1106 batches | lr 20.00 | ms/batch 111.76 | loss 3.66 | ppl 39.01
| epoch 67 | 1000/ 1106 batches | lr 20.00 | ms/batch 109.28 | loss 3.67 | ppl 39.17

| end of epoch 67 | time: 130.49s | valid loss 4.18 | valid ppl 65.51

Saving Averaged!
| epoch 68 | 200/ 1106 batches | lr 20.00 | ms/batch 111.89 | loss 3.69 | ppl 39.88
| epoch 68 | 400/ 1106 batches | lr 20.00 | ms/batch 110.61 | loss 3.60 | ppl 36.76
| epoch 68 | 600/ 1106 batches | lr 20.00 | ms/batch 110.93 | loss 3.62 | ppl 37.51
| epoch 68 | 800/ 1106 batches | lr 20.00 | ms/batch 111.24 | loss 3.64 | ppl 38.04
| epoch 68 | 1000/ 1106 batches | lr 20.00 | ms/batch 113.06 | loss 3.66 | ppl 38.86

| end of epoch 68 | time: 131.55s | valid loss 4.18 | valid ppl 65.49

Saving Averaged!
| epoch 69 | 200/ 1106 batches | lr 20.00 | ms/batch 112.91 | loss 3.70 | ppl 40.60
| epoch 69 | 400/ 1106 batches | lr 20.00 | ms/batch 112.99 | loss 3.60 | ppl 36.51
| epoch 69 | 600/ 1106 batches | lr 20.00 | ms/batch 110.26 | loss 3.61 | ppl 36.91
| epoch 69 | 800/ 1106 batches | lr 20.00 | ms/batch 112.56 | loss 3.65 | ppl 38.48
| epoch 69 | 1000/ 1106 batches | lr 20.00 | ms/batch 109.85 | loss 3.67 | ppl 39.35

| end of epoch 69 | time: 131.35s | valid loss 4.18 | valid ppl 65.48

Saving Averaged!
| epoch 70 | 200/ 1106 batches | lr 20.00 | ms/batch 112.32 | loss 3.69 | ppl 40.03
| epoch 70 | 400/ 1106 batches | lr 20.00 | ms/batch 109.32 | loss 3.60 | ppl 36.51
| epoch 70 | 600/ 1106 batches | lr 20.00 | ms/batch 110.70 | loss 3.62 | ppl 37.19
| epoch 70 | 800/ 1106 batches | lr 20.00 | ms/batch 110.78 | loss 3.64 | ppl 37.93
| epoch 70 | 1000/ 1106 batches | lr 20.00 | ms/batch 110.63 | loss 3.66 | ppl 38.85

| end of epoch 70 | time: 130.72s | valid loss 4.18 | valid ppl 65.46

Saving Averaged!
| epoch 71 | 200/ 1106 batches | lr 20.00 | ms/batch 113.46 | loss 3.70 | ppl 40.30
| epoch 71 | 400/ 1106 batches | lr 20.00 | ms/batch 110.81 | loss 3.60 | ppl 36.43
| epoch 71 | 600/ 1106 batches | lr 20.00 | ms/batch 111.39 | loss 3.61 | ppl 37.05
| epoch 71 | 800/ 1106 batches | lr 20.00 | ms/batch 111.27 | loss 3.63 | ppl 37.65
| epoch 71 | 1000/ 1106 batches | lr 20.00 | ms/batch 113.49 | loss 3.66 | ppl 38.69

| end of epoch 71 | time: 131.20s | valid loss 4.18 | valid ppl 65.44

Saving Averaged!
| epoch 72 | 200/ 1106 batches | lr 20.00 | ms/batch 110.07 | loss 3.68 | ppl 39.65
| epoch 72 | 400/ 1106 batches | lr 20.00 | ms/batch 109.92 | loss 3.58 | ppl 35.79
| epoch 72 | 600/ 1106 batches | lr 20.00 | ms/batch 113.23 | loss 3.61 | ppl 36.88
| epoch 72 | 800/ 1106 batches | lr 20.00 | ms/batch 111.43 | loss 3.62 | ppl 37.37
| epoch 72 | 1000/ 1106 batches | lr 20.00 | ms/batch 110.51 | loss 3.66 | ppl 39.02

| end of epoch 72 | time: 130.95s | valid loss 4.18 | valid ppl 65.43

Saving Averaged!
| epoch 73 | 200/ 1106 batches | lr 20.00 | ms/batch 112.43 | loss 3.69 | ppl 39.94
| epoch 73 | 400/ 1106 batches | lr 20.00 | ms/batch 112.86 | loss 3.57 | ppl 35.61
| epoch 73 | 600/ 1106 batches | lr 20.00 | ms/batch 110.95 | loss 3.59 | ppl 36.35
| epoch 73 | 800/ 1106 batches | lr 20.00 | ms/batch 109.81 | loss 3.63 | ppl 37.60
| epoch 73 | 1000/ 1106 batches | lr 20.00 | ms/batch 109.59 | loss 3.65 | ppl 38.50

| end of epoch 73 | time: 131.34s | valid loss 4.18 | valid ppl 65.42

Saving Averaged!
| epoch 74 | 200/ 1106 batches | lr 20.00 | ms/batch 110.61 | loss 3.69 | ppl 40.07
| epoch 74 | 400/ 1106 batches | lr 20.00 | ms/batch 112.46 | loss 3.58 | ppl 35.76
| epoch 74 | 600/ 1106 batches | lr 20.00 | ms/batch 110.58 | loss 3.60 | ppl 36.70
| epoch 74 | 800/ 1106 batches | lr 20.00 | ms/batch 112.62 | loss 3.62 | ppl 37.41
| epoch 74 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.17 | loss 3.65 | ppl 38.37

| end of epoch 74 | time: 131.45s | valid loss 4.18 | valid ppl 65.41

Saving Averaged!
| epoch 75 | 200/ 1106 batches | lr 20.00 | ms/batch 111.16 | loss 3.68 | ppl 39.56
| epoch 75 | 400/ 1106 batches | lr 20.00 | ms/batch 112.05 | loss 3.57 | ppl 35.62
| epoch 75 | 600/ 1106 batches | lr 20.00 | ms/batch 109.31 | loss 3.59 | ppl 36.19
| epoch 75 | 800/ 1106 batches | lr 20.00 | ms/batch 112.52 | loss 3.61 | ppl 36.94
| epoch 75 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.73 | loss 3.64 | ppl 38.07

| end of epoch 75 | time: 130.90s | valid loss 4.18 | valid ppl 65.40

Saving Averaged!
| epoch 76 | 200/ 1106 batches | lr 20.00 | ms/batch 112.32 | loss 3.68 | ppl 39.72
| epoch 76 | 400/ 1106 batches | lr 20.00 | ms/batch 109.74 | loss 3.55 | ppl 34.81
| epoch 76 | 600/ 1106 batches | lr 20.00 | ms/batch 110.49 | loss 3.59 | ppl 36.41
| epoch 76 | 800/ 1106 batches | lr 20.00 | ms/batch 109.70 | loss 3.60 | ppl 36.45
| epoch 76 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.66 | loss 3.66 | ppl 38.81

| end of epoch 76 | time: 131.32s | valid loss 4.18 | valid ppl 65.40

Saving Averaged!
| epoch 77 | 200/ 1106 batches | lr 20.00 | ms/batch 110.93 | loss 3.66 | ppl 39.05
| epoch 77 | 400/ 1106 batches | lr 20.00 | ms/batch 110.88 | loss 3.56 | ppl 35.15
| epoch 77 | 600/ 1106 batches | lr 20.00 | ms/batch 110.62 | loss 3.58 | ppl 36.05
| epoch 77 | 800/ 1106 batches | lr 20.00 | ms/batch 111.94 | loss 3.61 | ppl 37.12
| epoch 77 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.49 | loss 3.63 | ppl 37.62

| end of epoch 77 | time: 130.89s | valid loss 4.18 | valid ppl 65.39

Saving Averaged!
| epoch 78 | 200/ 1106 batches | lr 20.00 | ms/batch 112.99 | loss 3.66 | ppl 38.79
| epoch 78 | 400/ 1106 batches | lr 20.00 | ms/batch 110.78 | loss 3.56 | ppl 35.01
| epoch 78 | 600/ 1106 batches | lr 20.00 | ms/batch 112.61 | loss 3.57 | ppl 35.67
| epoch 78 | 800/ 1106 batches | lr 20.00 | ms/batch 109.34 | loss 3.60 | ppl 36.62
| epoch 78 | 1000/ 1106 batches | lr 20.00 | ms/batch 110.55 | loss 3.62 | ppl 37.41

| end of epoch 78 | time: 131.01s | valid loss 4.18 | valid ppl 65.38

Saving Averaged!
| epoch 79 | 200/ 1106 batches | lr 20.00 | ms/batch 113.47 | loss 3.67 | ppl 39.14
| epoch 79 | 400/ 1106 batches | lr 20.00 | ms/batch 112.23 | loss 3.55 | ppl 34.90
| epoch 79 | 600/ 1106 batches | lr 20.00 | ms/batch 109.23 | loss 3.58 | ppl 35.85
| epoch 79 | 800/ 1106 batches | lr 20.00 | ms/batch 112.75 | loss 3.60 | ppl 36.63
| epoch 79 | 1000/ 1106 batches | lr 20.00 | ms/batch 112.83 | loss 3.62 | ppl 37.38

| end of epoch 79 | time: 130.35s | valid loss 4.18 | valid ppl 65.38

Saving Averaged!
| epoch 80 | 200/ 1106 batches | lr 20.00 | ms/batch 113.06 | loss 3.65 | ppl 38.33
| epoch 80 | 400/ 1106 batches | lr 20.00 | ms/batch 110.13 | loss 3.53 | ppl 34.26
| epoch 80 | 600/ 1106 batches | lr 20.00 | ms/batch 110.58 | loss 3.57 | ppl 35.51
| epoch 80 | 800/ 1106 batches | lr 20.00 | ms/batch 110.33 | loss 3.59 | ppl 36.22
| epoch 80 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.88 | loss 3.61 | ppl 37.15

| end of epoch 80 | time: 130.41s | valid loss 4.18 | valid ppl 65.38

Saving Averaged!
| epoch 81 | 200/ 1106 batches | lr 20.00 | ms/batch 111.94 | loss 3.66 | ppl 38.67
| epoch 81 | 400/ 1106 batches | lr 20.00 | ms/batch 111.80 | loss 3.54 | ppl 34.45
| epoch 81 | 600/ 1106 batches | lr 20.00 | ms/batch 110.85 | loss 3.56 | ppl 35.31
| epoch 81 | 800/ 1106 batches | lr 20.00 | ms/batch 109.87 | loss 3.59 | ppl 36.41
| epoch 81 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.29 | loss 3.61 | ppl 36.91

| end of epoch 81 | time: 130.52s | valid loss 4.18 | valid ppl 65.37

Saving Averaged!
| epoch 82 | 200/ 1106 batches | lr 20.00 | ms/batch 112.82 | loss 3.65 | ppl 38.65
| epoch 82 | 400/ 1106 batches | lr 20.00 | ms/batch 111.36 | loss 3.54 | ppl 34.51
| epoch 82 | 600/ 1106 batches | lr 20.00 | ms/batch 112.64 | loss 3.56 | ppl 35.17
| epoch 82 | 800/ 1106 batches | lr 20.00 | ms/batch 111.80 | loss 3.58 | ppl 35.96
| epoch 82 | 1000/ 1106 batches | lr 20.00 | ms/batch 110.34 | loss 3.63 | ppl 37.64

| end of epoch 82 | time: 130.95s | valid loss 4.18 | valid ppl 65.36

Saving Averaged!
| epoch 83 | 200/ 1106 batches | lr 20.00 | ms/batch 110.01 | loss 3.65 | ppl 38.37
| epoch 83 | 400/ 1106 batches | lr 20.00 | ms/batch 111.55 | loss 3.54 | ppl 34.33
| epoch 83 | 600/ 1106 batches | lr 20.00 | ms/batch 111.52 | loss 3.55 | ppl 34.83
| epoch 83 | 800/ 1106 batches | lr 20.00 | ms/batch 110.79 | loss 3.58 | ppl 35.91
| epoch 83 | 1000/ 1106 batches | lr 20.00 | ms/batch 112.42 | loss 3.61 | ppl 36.83

| end of epoch 83 | time: 130.66s | valid loss 4.18 | valid ppl 65.36

Saving Averaged!
| epoch 84 | 200/ 1106 batches | lr 20.00 | ms/batch 111.89 | loss 3.64 | ppl 38.24
| epoch 84 | 400/ 1106 batches | lr 20.00 | ms/batch 110.79 | loss 3.52 | ppl 33.95
| epoch 84 | 600/ 1106 batches | lr 20.00 | ms/batch 111.34 | loss 3.55 | ppl 34.78
| epoch 84 | 800/ 1106 batches | lr 20.00 | ms/batch 111.22 | loss 3.59 | ppl 36.08
| epoch 84 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.75 | loss 3.60 | ppl 36.72

| end of epoch 84 | time: 130.50s | valid loss 4.18 | valid ppl 65.36

Saving Averaged!
| epoch 85 | 200/ 1106 batches | lr 20.00 | ms/batch 111.04 | loss 3.65 | ppl 38.32
| epoch 85 | 400/ 1106 batches | lr 20.00 | ms/batch 110.69 | loss 3.52 | ppl 33.65
| epoch 85 | 600/ 1106 batches | lr 20.00 | ms/batch 109.67 | loss 3.55 | ppl 34.93
| epoch 85 | 800/ 1106 batches | lr 20.00 | ms/batch 108.89 | loss 3.56 | ppl 35.07
| epoch 85 | 1000/ 1106 batches | lr 20.00 | ms/batch 110.31 | loss 3.62 | ppl 37.20

| end of epoch 85 | time: 131.11s | valid loss 4.18 | valid ppl 65.35

Saving Averaged!
| epoch 86 | 200/ 1106 batches | lr 20.00 | ms/batch 111.14 | loss 3.63 | ppl 37.76
| epoch 86 | 400/ 1106 batches | lr 20.00 | ms/batch 111.91 | loss 3.52 | ppl 33.77
| epoch 86 | 600/ 1106 batches | lr 20.00 | ms/batch 112.45 | loss 3.54 | ppl 34.40
| epoch 86 | 800/ 1106 batches | lr 20.00 | ms/batch 111.93 | loss 3.57 | ppl 35.43
| epoch 86 | 1000/ 1106 batches | lr 20.00 | ms/batch 112.47 | loss 3.60 | ppl 36.59

| end of epoch 86 | time: 130.86s | valid loss 4.18 | valid ppl 65.35

Saving Averaged!
| epoch 87 | 200/ 1106 batches | lr 20.00 | ms/batch 111.02 | loss 3.64 | ppl 38.11
| epoch 87 | 400/ 1106 batches | lr 20.00 | ms/batch 110.38 | loss 3.51 | ppl 33.31
| epoch 87 | 600/ 1106 batches | lr 20.00 | ms/batch 111.61 | loss 3.53 | ppl 34.25
| epoch 87 | 800/ 1106 batches | lr 20.00 | ms/batch 111.30 | loss 3.56 | ppl 35.10
| epoch 87 | 1000/ 1106 batches | lr 20.00 | ms/batch 110.33 | loss 3.59 | ppl 36.32

| end of epoch 87 | time: 130.38s | valid loss 4.18 | valid ppl 65.34

Saving Averaged!
| epoch 88 | 200/ 1106 batches | lr 20.00 | ms/batch 112.70 | loss 3.62 | ppl 37.38
| epoch 88 | 400/ 1106 batches | lr 20.00 | ms/batch 111.16 | loss 3.51 | ppl 33.53
| epoch 88 | 600/ 1106 batches | lr 20.00 | ms/batch 109.65 | loss 3.54 | ppl 34.40
| epoch 88 | 800/ 1106 batches | lr 20.00 | ms/batch 111.54 | loss 3.56 | ppl 35.29
| epoch 88 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.62 | loss 3.60 | ppl 36.59

| end of epoch 88 | time: 131.19s | valid loss 4.18 | valid ppl 65.34

Saving Averaged!
| epoch 89 | 200/ 1106 batches | lr 20.00 | ms/batch 111.31 | loss 3.61 | ppl 37.03
| epoch 89 | 400/ 1106 batches | lr 20.00 | ms/batch 110.65 | loss 3.50 | ppl 33.22
| epoch 89 | 600/ 1106 batches | lr 20.00 | ms/batch 109.31 | loss 3.53 | ppl 34.02
| epoch 89 | 800/ 1106 batches | lr 20.00 | ms/batch 109.53 | loss 3.55 | ppl 34.90
| epoch 89 | 1000/ 1106 batches | lr 20.00 | ms/batch 109.12 | loss 3.59 | ppl 36.35

| end of epoch 89 | time: 130.54s | valid loss 4.18 | valid ppl 65.34

Saving Averaged!
| epoch 90 | 200/ 1106 batches | lr 20.00 | ms/batch 114.45 | loss 3.63 | ppl 37.64
| epoch 90 | 400/ 1106 batches | lr 20.00 | ms/batch 111.30 | loss 3.51 | ppl 33.35
| epoch 90 | 600/ 1106 batches | lr 20.00 | ms/batch 111.72 | loss 3.53 | ppl 34.01
| epoch 90 | 800/ 1106 batches | lr 20.00 | ms/batch 111.19 | loss 3.54 | ppl 34.59
| epoch 90 | 1000/ 1106 batches | lr 20.00 | ms/batch 110.96 | loss 3.59 | ppl 36.14

| end of epoch 90 | time: 131.03s | valid loss 4.18 | valid ppl 65.33

Saving Averaged!
| epoch 91 | 200/ 1106 batches | lr 20.00 | ms/batch 112.76 | loss 3.61 | ppl 36.85
| epoch 91 | 400/ 1106 batches | lr 20.00 | ms/batch 111.81 | loss 3.52 | ppl 33.70
| epoch 91 | 600/ 1106 batches | lr 20.00 | ms/batch 110.26 | loss 3.52 | ppl 33.74
| epoch 91 | 800/ 1106 batches | lr 20.00 | ms/batch 111.90 | loss 3.55 | ppl 34.72
| epoch 91 | 1000/ 1106 batches | lr 20.00 | ms/batch 113.06 | loss 3.57 | ppl 35.67

| end of epoch 91 | time: 131.45s | valid loss 4.18 | valid ppl 65.33

Saving Averaged!
| epoch 92 | 200/ 1106 batches | lr 20.00 | ms/batch 113.04 | loss 3.61 | ppl 36.80
| epoch 92 | 400/ 1106 batches | lr 20.00 | ms/batch 110.20 | loss 3.51 | ppl 33.44
| epoch 92 | 600/ 1106 batches | lr 20.00 | ms/batch 114.52 | loss 3.52 | ppl 33.88
| epoch 92 | 800/ 1106 batches | lr 20.00 | ms/batch 112.64 | loss 3.56 | ppl 35.03
| epoch 92 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.87 | loss 3.58 | ppl 35.86

| end of epoch 92 | time: 131.62s | valid loss 4.18 | valid ppl 65.33

Saving Averaged!
| epoch 93 | 200/ 1106 batches | lr 20.00 | ms/batch 112.31 | loss 3.59 | ppl 36.10
| epoch 93 | 400/ 1106 batches | lr 20.00 | ms/batch 110.96 | loss 3.50 | ppl 33.18
| epoch 93 | 600/ 1106 batches | lr 20.00 | ms/batch 112.32 | loss 3.53 | ppl 34.00
| epoch 93 | 800/ 1106 batches | lr 20.00 | ms/batch 110.78 | loss 3.55 | ppl 34.68
| epoch 93 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.56 | loss 3.58 | ppl 36.01

| end of epoch 93 | time: 130.59s | valid loss 4.18 | valid ppl 65.33

| epoch 94 | 200/ 1106 batches | lr 20.00 | ms/batch 114.22 | loss 3.59 | ppl 36.33
| epoch 94 | 400/ 1106 batches | lr 20.00 | ms/batch 111.61 | loss 3.49 | ppl 32.95
| epoch 94 | 600/ 1106 batches | lr 20.00 | ms/batch 109.69 | loss 3.52 | ppl 33.69
| epoch 94 | 800/ 1106 batches | lr 20.00 | ms/batch 111.47 | loss 3.55 | ppl 34.84
| epoch 94 | 1000/ 1106 batches | lr 20.00 | ms/batch 110.22 | loss 3.57 | ppl 35.54

| end of epoch 94 | time: 130.99s | valid loss 4.18 | valid ppl 65.32

Saving Averaged!
| epoch 95 | 200/ 1106 batches | lr 20.00 | ms/batch 112.99 | loss 3.61 | ppl 36.81
| epoch 95 | 400/ 1106 batches | lr 20.00 | ms/batch 111.03 | loss 3.49 | ppl 32.84
| epoch 95 | 600/ 1106 batches | lr 20.00 | ms/batch 111.93 | loss 3.52 | ppl 33.72
| epoch 95 | 800/ 1106 batches | lr 20.00 | ms/batch 113.99 | loss 3.55 | ppl 34.66
| epoch 95 | 1000/ 1106 batches | lr 20.00 | ms/batch 110.27 | loss 3.55 | ppl 34.84

| end of epoch 95 | time: 130.87s | valid loss 4.18 | valid ppl 65.32

Saving Averaged!
| epoch 96 | 200/ 1106 batches | lr 20.00 | ms/batch 110.37 | loss 3.60 | ppl 36.44
| epoch 96 | 400/ 1106 batches | lr 20.00 | ms/batch 111.69 | loss 3.48 | ppl 32.57
| epoch 96 | 600/ 1106 batches | lr 20.00 | ms/batch 110.17 | loss 3.52 | ppl 33.72
| epoch 96 | 800/ 1106 batches | lr 20.00 | ms/batch 111.01 | loss 3.54 | ppl 34.34
| epoch 96 | 1000/ 1106 batches | lr 20.00 | ms/batch 109.86 | loss 3.58 | ppl 36.04

| end of epoch 96 | time: 131.12s | valid loss 4.18 | valid ppl 65.32

| epoch 97 | 200/ 1106 batches | lr 20.00 | ms/batch 111.15 | loss 3.58 | ppl 36.03
| epoch 97 | 400/ 1106 batches | lr 20.00 | ms/batch 110.66 | loss 3.49 | ppl 32.65
| epoch 97 | 600/ 1106 batches | lr 20.00 | ms/batch 112.72 | loss 3.50 | ppl 33.18
| epoch 97 | 800/ 1106 batches | lr 20.00 | ms/batch 112.92 | loss 3.53 | ppl 34.07
| epoch 97 | 1000/ 1106 batches | lr 20.00 | ms/batch 112.22 | loss 3.56 | ppl 35.03

| end of epoch 97 | time: 131.51s | valid loss 4.18 | valid ppl 65.32

| epoch 98 | 200/ 1106 batches | lr 20.00 | ms/batch 112.28 | loss 3.58 | ppl 35.77
| epoch 98 | 400/ 1106 batches | lr 20.00 | ms/batch 109.91 | loss 3.48 | ppl 32.40
| epoch 98 | 600/ 1106 batches | lr 20.00 | ms/batch 111.12 | loss 3.52 | ppl 33.75
| epoch 98 | 800/ 1106 batches | lr 20.00 | ms/batch 112.41 | loss 3.53 | ppl 34.23
| epoch 98 | 1000/ 1106 batches | lr 20.00 | ms/batch 113.57 | loss 3.55 | ppl 34.76

| end of epoch 98 | time: 131.21s | valid loss 4.18 | valid ppl 65.32

Saving Averaged!
| epoch 99 | 200/ 1106 batches | lr 20.00 | ms/batch 112.74 | loss 3.58 | ppl 35.92
| epoch 99 | 400/ 1106 batches | lr 20.00 | ms/batch 112.47 | loss 3.48 | ppl 32.48
| epoch 99 | 600/ 1106 batches | lr 20.00 | ms/batch 112.83 | loss 3.50 | ppl 33.01
| epoch 99 | 800/ 1106 batches | lr 20.00 | ms/batch 111.27 | loss 3.54 | ppl 34.53
| epoch 99 | 1000/ 1106 batches | lr 20.00 | ms/batch 111.70 | loss 3.55 | ppl 34.81

| end of epoch 99 | time: 131.17s | valid loss 4.18 | valid ppl 65.32

Saving Averaged!
| epoch 100 | 200/ 1106 batches | lr 20.00 | ms/batch 112.87 | loss 3.58 | ppl 35.97
| epoch 100 | 400/ 1106 batches | lr 20.00 | ms/batch 109.93 | loss 3.46 | ppl 31.73
| epoch 100 | 600/ 1106 batches | lr 20.00 | ms/batch 111.76 | loss 3.49 | ppl 32.75
| epoch 100 | 800/ 1106 batches | lr 20.00 | ms/batch 110.81 | loss 3.53 | ppl 34.07
| epoch 100 | 1000/ 1106 batches | lr 20.00 | ms/batch 112.64 | loss 3.54 | ppl 34.61

| end of epoch 100 | time: 131.12s | valid loss 4.18 | valid ppl 65.32

What is the creator's GPU hardware and speed?

Which GPU does the first recommended command line run well on? On an NVIDIA 1050 Ti (4 GB of GPU memory) it exhausts GPU memory with a batch size of 12, but runs with a batch size of 6. With that setting, I get on the order of 300 ms per batch. Ubuntu Linux 16.04 on a cheap used Dell 7500 with two 6-core Xeons, if it matters.

How to run this code on a multi-CPU computer?

I am running this code on my Linux server; however, it seems that this Python program only runs on one of my machine's CPUs and leaves two thirds of the total RAM free. Is there any way to accelerate the program?
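
For what it's worth, the training script is written for GPU execution (note the --single_gpu flag), so a CPU-only run will be slow regardless; but PyTorch's CPU thread count can at least be checked and raised. A minimal generic sketch, not specific to this repo:

import torch

# Intra-op parallelism (matrix multiplies, elementwise ops) is governed by the
# thread count; it can also be set via the OMP_NUM_THREADS environment variable
# before Python starts.
print("threads before:", torch.get_num_threads())
torch.set_num_threads(8)  # illustrative value; match your physical core count
print("threads after:", torch.get_num_threads())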

Not able to download Penn Treebank data

Hi, Zihang

It seems that the original link for downloading the Penn Treebank data no longer exists. Could you please update this and let me know where I can download the same data to replicate your experiment?

Thanks,
Yuzhou
