Comments (7)
Hello @andrewilyas and @rsokl
I get the same issue when loading the l2 0.25 and l2 1.0 models, but loading the l2 0.5 and l_inf_8 models works fine. I followed your recommendation to downgrade dill from 0.3.4 to 0.3.2, but it did not help. I am using Python 3.8. Could you help me out?
Thanks
from robustness.
Hi! Can you please indicate what version of dill is in each environment?
Sure! Both environments are using:

```
dill 0.3.2 pypi_0 pypi
```
Sorry, forgot to follow up on this! Can you also post your torch version in the 3.7 environment? Is it the same?
It is the same: 1.6.0. Both environments have the same library versions; only the Python version differs.
OK, it looks like this is a known issue with dill (uqfoundation/dill#357), but it was patched in the latest version. If you have both Python 3.7 and 3.8 on your machine, you can fix this yourself by doing:
- Update dill to 0.3.2 in both environments.
- In Python 3.7, execute:

```python
import torch, dill

# Load the checkpoint under 3.7, then re-save it so 3.8 can read it.
x = torch.load("cifar_l2_1_0.pt", pickle_module=dill)
torch.save(x, "cifar_l2_1_0.pt", pickle_module=dill)
```

- In Python 3.8, you should now be able to load the checkpoint using your attached code.
Meanwhile, we will do this for all the links we have above so that the public versions are also cross-compatible.
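For anyone hitting a similar cross-version pickling problem: the workaround above is just a round-trip, i.e. deserialize the file in an environment that can still read it, then serialize it again so the newer environment can. A minimal, stdlib-only sketch of that pattern, using `pickle` as a stand-in for `torch.save`/`torch.load` with `dill` (the checkpoint contents here are made up for illustration):

```python
import pickle


def resave(blob: bytes, loader=pickle, saver=pickle) -> bytes:
    """Deserialize `blob` with one pickling module, re-serialize with another.

    In the actual fix, `loader`/`saver` would be replaced by
    torch.load/torch.save with pickle_module=dill, run under Python 3.7.
    """
    obj = loader.loads(blob)
    return saver.dumps(obj)


# Hypothetical "old" checkpoint written by the original environment:
old_blob = pickle.dumps({"state_dict": {"weight": [0.25, 1.0]}})

# Round-trip it; the re-saved bytes now use the current module's format.
new_blob = resave(old_blob)
assert pickle.loads(new_blob) == {"state_dict": {"weight": [0.25, 1.0]}}
```

The key point is that the object graph itself is fine; only the serialized representation was incompatible, so loading and re-saving it under a working interpreter is enough.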
Great, I will give that a shot. Thank you for your help!
Closing this assuming that the above route will work without a hitch :)