Comments (3)
I also trained ResNet-101 at 384x288; my batch_size is 24.
06-25 09:26:36 Epoch 351 itr 21061536/60000: lr: 1.5625e-05 speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch global_loss: 21.4816 refine_loss: 34.5009 loss: 55.9825
06-25 09:26:37 Epoch 351 itr 21061632/60000: lr: 1.5625e-05 speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch global_loss: 18.6679 refine_loss: 28.4806 loss: 47.1486
06-25 09:26:38 Epoch 351 itr 21061728/60000: lr: 1.5625e-05 speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch global_loss: 18.8858 refine_loss: 28.5278 loss: 47.4136
06-25 09:26:39 Epoch 351 itr 21061824/60000: lr: 1.5625e-05 speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch global_loss: 24.7110 refine_loss: 38.7275 loss: 63.4385
06-25 09:26:40 Epoch 351 itr 21061920/60000: lr: 1.5625e-05 speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch global_loss: 19.4357 refine_loss: 33.6059 loss: 53.0416
06-25 09:26:42 Epoch 351 itr 21062016/60000: lr: 1.5625e-05 speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch global_loss: 21.2636 refine_loss: 36.7295 loss: 57.9931
06-25 09:26:43 Epoch 351 itr 21062112/60000: lr: 1.5625e-05 speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch global_loss: 18.0329 refine_loss: 32.2765 loss: 50.3094
06-25 09:26:44 Epoch 351 itr 21062208/60000: lr: 1.5625e-05 speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch global_loss: 17.2394 refine_loss: 27.5176 loss: 44.7570
06-25 09:26:45 Epoch 351 itr 21062304/60000: lr: 1.5625e-05 speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch global_loss: 16.4739 refine_loss: 27.0160 loss: 43.4899
06-25 09:26:46 Epoch 351 itr 21062400/60000: lr: 1.5625e-05 speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch global_loss: 18.3966 refine_loss: 36.0244 loss: 54.4210
06-25 09:26:48 Epoch 351 itr 21062496/60000: lr: 1.5625e-05 speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch global_loss: 19.4529 refine_loss: 32.0978 loss: 51.5507
06-25 09:26:49 Epoch 351 itr 21062592/60000: lr: 1.5625e-05 speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch global_loss: 21.5119 refine_loss: 34.4650 loss: 55.9769
06-25 09:26:50 Epoch 351 itr 21062688/60000: lr: 1.5625e-05 speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch global_loss: 18.6829 refine_loss: 28.4995 loss: 47.1824
06-25 09:26:51 Epoch 351 itr 21062784/60000: lr: 1.5625e-05 speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch global_loss: 18.9842 refine_loss: 34.5742 loss: 53.5584
06-25 09:26:52 Epoch 351 itr 21062880/60000: lr: 1.5625e-05 speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch global_loss: 19.8309 refine_loss: 33.7855 loss: 53.6165
06-25 09:26:54 Epoch 351 itr 21062976/60000: lr: 1.5625e-05 speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch global_loss: 19.5187 refine_loss: 31.1628 loss: 50.6815
06-25 09:26:55 Epoch 351 itr 21063072/60000: lr: 1.5625e-05 speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch global_loss: 21.1973 refine_loss: 38.4505 loss: 59.6478
06-25 09:26:56 Epoch 351 itr 21063168/60000: lr: 1.5625e-05 speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch global_loss: 15.4164 refine_loss: 25.8850 loss: 41.3013
06-25 09:26:57 Epoch 351 itr 21063264/60000: lr: 1.5625e-05 speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch global_loss: 17.2441 refine_loss: 29.4565 loss: 46.7006
06-25 09:26:59 Epoch 351 itr 21063360/60000: lr: 1.5625e-05 speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch global_loss: 21.7971 refine_loss: 37.6570 loss: 59.4541
06-25 09:27:00 Epoch 351 itr 21063456/60000: lr: 1.5625e-05 speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch global_loss: 19.3076 refine_loss: 32.4295 loss: 51.7371
These are my loss values; I hope they're helpful to you.
from tf-cpn.
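For anyone comparing their own runs against the log above: each entry packs the epoch, iteration, and the three loss terms into one line, and the total `loss` is simply `global_loss + refine_loss`. A minimal sketch of pulling those fields out with a regex (the `parse_log_line` helper is hypothetical, not part of tf-cpn; it only assumes the field layout shown in the log):

```python
import re

# Hypothetical helper: parse one tf-cpn-style training log line into a dict.
# Field names (epoch, itr, global_loss, refine_loss, loss) follow the log format.
LOG_RE = re.compile(
    r"Epoch (?P<epoch>\d+) itr (?P<itr>\d+)/\d+:.*?"
    r"global_loss: (?P<global_loss>[\d.]+) "
    r"refine_loss: (?P<refine_loss>[\d.]+) "
    r"loss: (?P<loss>[\d.]+)"
)

def parse_log_line(line):
    m = LOG_RE.search(line)
    if m is None:
        return None  # line without loss fields (e.g. a header)
    d = m.groupdict()
    return {
        "epoch": int(d["epoch"]),
        "itr": int(d["itr"]),
        "global_loss": float(d["global_loss"]),
        "refine_loss": float(d["refine_loss"]),
        "loss": float(d["loss"]),
    }

line = ("06-25 09:26:36 Epoch 351 itr 21061536/60000: lr: 1.5625e-05 "
        "speed: 1.21(0.94s r0.25)s/itr 0.21h/epoch "
        "global_loss: 21.4816 refine_loss: 34.5009 loss: 55.9825")
rec = parse_log_line(line)
print(rec["loss"])  # 55.9825
```

Running this over a whole log file makes it easy to plot the loss curves and check whether they have plateaued.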
@juciny Thanks
@juciny What GPU did you use for this? Also, do you think I need to train it to epoch 350?
I tried retraining and stopped early (epoch 100 or so) and the results look far worse. Do you get a significant performance improvement as you continue?
Related Issues (20)
- Has anyone tested on the MPII dataset?
- Strange results HOT 1
- About detect single person HOT 16
- About code running environment
- How do I test with pictures?
- bbox is necessary in Test? HOT 2
- An error occurred after I executed: `python3 mptest.py -d 0-1 -m log/model_dump/snapshot_350.ckpt`
- global_loss += tf.reduce_mean(tf.square(global_out - global_label)) / len(labels); global_loss /= 2;.so damn hard to understand
- why? gk15 = (23, 23) gk11 = (17, 17) gk9 = (13, 13) gk7 = (9, 9)
- strange results
- > @dagongji10 Excuse me, can I ask you a question? Must I provide the bbox of pictures when I test a picture? Could you please give me a script which can visualize test results like what you show? Thank you very much HOT 3
- coco dataset? which year?
- train train
- "read none image" during training? HOT 1
- from nets.basemodel import resnet50, resnet_arg_scope, resnet_v1
- labels = [label15, label11, label9, label7]
- How to apply inference on own dataset without GT_bbox(.json)?
- How to decrease Batch size HOT 4
- Using just GlobalNet
- is it possible to add new joint in the skel and retrain the model
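One of the related issues above asks about the line `global_loss += tf.reduce_mean(tf.square(global_out - global_label)) / len(labels); global_loss /= 2`, which is tied to `labels = [label15, label11, label9, label7]`: GlobalNet is supervised at four scales against targets smoothed with the four Gaussian kernels (gk15, gk11, gk9, gk7), so the loss averages the per-scale MSE and then halves it (the conventional 1/2 factor on a squared error). A minimal NumPy sketch of just that arithmetic, with illustrative array shapes that are assumptions, not the repo's actual values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Four GlobalNet outputs and their Gaussian-smoothed heatmap targets,
# one per supervision scale (illustrative shape: batch x H x W x joints).
labels = [rng.standard_normal((2, 96, 72, 17)) for _ in range(4)]
global_outs = [rng.standard_normal((2, 96, 72, 17)) for _ in range(4)]

# Mirrors the issue's snippet: accumulate per-scale MSE divided by the
# number of scales (an average), then halve the result.
global_loss = 0.0
for out, label in zip(global_outs, labels):
    global_loss += np.mean(np.square(out - label)) / len(labels)
global_loss /= 2

print(global_loss)
```

The division by `len(labels)` turns the sum over scales into a mean, so adding or removing a supervision scale does not change the loss magnitude.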