Comments (10)
Hi,
I made a similar script to yours, and I get nearly identical feature maps between PyTorch and TensorFlow for identical inputs.
I don't think that you used the Inception model from this repository with my latest changes, but rather the original Inception from torchvision. That torchvision implementation does not correspond to the one used in TensorFlow's FID. Could you please run your tests again, but with the newest model (and weights) provided here?
from pytorch-fid.
I downloaded your model and ran the script with the command python test_inception.py --load_path <your_model>, so I did use your weights.
Could you please provide your script?
What's more, in my results the difference is concentrated at the edges. Perhaps the padding is not consistent.
tf_pytorch_error.pdf
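One plausible source of edge-concentrated differences is padding: TF's 'SAME' convolutions with stride 2 on an even-sized input pad asymmetrically (extra row/column on the bottom/right only), while PyTorch's padding= argument pads all four sides. This is just a sketch of that mismatch on a toy input, not a claim about the exact layer involved:

```python
import torch
import torch.nn.functional as F

x = torch.arange(16, dtype=torch.float32).reshape(1, 1, 4, 4)
w = torch.ones(1, 1, 3, 3)

# PyTorch's padding=1 pads symmetrically on all four sides
sym = F.conv2d(x, w, stride=2, padding=1)

# TF 'SAME' with stride 2 on a 4x4 input pads only bottom/right,
# so the windows land on different pixels
asym = F.conv2d(F.pad(x, (0, 1, 0, 1)), w, stride=2)

print(sym.flatten().tolist())   # [10.0, 24.0, 51.0, 90.0]
print(asym.flatten().tolist())  # [45.0, 39.0, 66.0, 50.0]
```

Both outputs have the same shape, but the values differ everywhere the windows touch the border, which would show up exactly as edge-concentrated error.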
Sorry, you are right. I didn't use your model. Now my results are:
=> Pytorch pool3:
[0.13110925 0.5254472 0.22908828 0.02930989 0.24280292 0.51165456]
=> Tensorflow pool3:
[0.1308305 0.52602327 0.22930455 0.02920746 0.24395223 0.5096492 ]
=> Mean abs difference
0.0011506503
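For reference, the reported mean absolute difference is presumably taken over the full 2048-dimensional pool3 vectors; the six numbers printed above are only the first entries. On just those six, the same computation looks like this (a sketch, using the printed values):

```python
import numpy as np

# First six entries of the pool3 activations printed above
pt = np.array([0.13110925, 0.5254472, 0.22908828,
               0.02930989, 0.24280292, 0.51165456])
tf = np.array([0.1308305, 0.52602327, 0.22930455,
               0.02920746, 0.24395223, 0.5096492])

# Mean absolute difference over these six entries
mad = np.abs(pt - tf).mean()
print(mad)  # ~0.00072 on these six; the thread's 0.00115 is over all 2048
```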
I believe your repo is correct. But I am wondering why the official PyTorch Inception is different. Is the official network architecture different from yours? I ported the TensorFlow weights directly, so the weights I used should be exactly the same.
Yes, I discovered that there are some minor differences between the PyTorch implementation and the Inception model used by FID. There are several different Inception graphs/weights published. If you open the graph in TensorBoard, you can see some differences.
You can see the necessary changes to the model in this commit: f64228c. The changes are marked with comments beginning with Patch.
By the way, I think you can get an even lower mean difference if you use exactly the same input tensor for both TensorFlow and PyTorch. So resize your image only once, with PIL, and then skip the resize in the TensorFlow graph (use the input key FID_Inception_Net/Mul:0, but be aware that this requires your inputs to be in the range [-1, 1]).
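The suggestion above, resizing once with PIL and scaling into [-1, 1] before feeding FID_Inception_Net/Mul:0, could look like the sketch below. The 299×299 size and bilinear filter are assumptions on my part:

```python
import numpy as np
from PIL import Image

def preprocess(img):
    """Resize once with PIL, then map uint8 [0, 255] to [-1, 1],
    the range the FID_Inception_Net/Mul:0 input expects."""
    img = img.convert("RGB").resize((299, 299), Image.BILINEAR)
    return np.asarray(img, dtype=np.float32) / 127.5 - 1.0

# Feed the *same* array to both frameworks (NHWC for TF, permuted to
# NCHW for PyTorch) so that neither framework resizes a second time.
x = preprocess(Image.fromarray(np.zeros((64, 64, 3), dtype=np.uint8)))
print(x.shape, x.min(), x.max())  # (299, 299, 3), all values -1.0
```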
Yes. The difference is now at the 1e-5 level. Thank you for your work!
I also discovered where the difference in my network comes from. The TF InceptionA module's average pooling does not count the padded zeros at the edges, while PyTorch counts them. Also, on the last layer, TF's implementation uses max pooling (which is incorrect) rather than the average pooling used in PyTorch.
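The first of those mismatches maps directly onto the count_include_pad flag of PyTorch's nn.AvgPool2d (if I read the patch right, pytorch-fid sets it to False to match TF). A small demonstration on an all-ones input:

```python
import torch
import torch.nn as nn

x = torch.ones(1, 1, 4, 4)

# PyTorch default: padded zeros are included in the average,
# so outputs at the border drop below 1.0 ...
incl = nn.AvgPool2d(3, stride=1, padding=1, count_include_pad=True)(x)

# ... while the TF behaviour excludes padded zeros from the
# denominator, keeping every output exactly 1.0
excl = nn.AvgPool2d(3, stride=1, padding=1, count_include_pad=False)(x)

print(incl[0, 0, 0, 0].item())  # 4/9: corner window sees 4 ones, 5 padded zeros
print(excl[0, 0, 0, 0].item())  # 1.0
```

Interior outputs agree between the two settings; only the border differs, consistent with the edge-concentrated error seen earlier in the thread.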
@mseitzer But do you really think we should use exactly the same setup as TTUR? According to your patch, they do not use the correct Inception code. Maybe the official PyTorch weights are better for evaluation.
Well, I don't think there is much choice here. The authors of FID chose this version of Inception, and to stay comparable, everyone has to use this version now. I guess the FID authors just took the same model that Inception Score uses: https://github.com/openai/improved-gan/blob/master/inception_score/model.py
Also, this Inception model is not "wrong", it just does not directly correspond to the paper. Its feature maps should still contain what the authors of FID and Inception Score want to measure.
But I do think it is a bit concerning that FID score gives wildly different results if you use a different version of Inception. In my opinion, the metric should not be sensitive to that.
Fixed by #16
Hey @AtlantixJJ, can you tell me what change you made to the test_inception.py script that led to better results?
P.S. Never mind, the images I was feeding were corrupted, which is why I was getting wrong values. It works perfectly, thanks 💯