igorsusmelj / pytorch-styleguide
An unofficial styleguide and best practices summary for PyTorch
License: GNU General Public License v3.0
The style guide started as a small side project for personal use. We then started using it within our company, and I have since heard from friends that some of the largest tech companies around the world are using it, or extended versions of it.
I would love to learn which companies are using this style guide. (If you're a user of the guide and want to be featured in the readme, let me know.)
Feel free to let me know if you want something else covered :)
Hi Igor,
First of all, thank you for the repo, super cool and very useful. However, I am sorry, but I have to say it: all the examples in the readme use bad coding practices. Take this:
```python
class ConvBlock(nn.Module):
    def __init__(self):
        super(ConvBlock, self).__init__()
        block = [nn.Conv2d(...)]
        block += [nn.ReLU()]
        block += [nn.BatchNorm2d(...)]
        self.block = nn.Sequential(*block)

    def forward(self, x):
        return self.block(x)
```
Why create a one-item list every time and concatenate it onto `block`? It doesn't make sense; it is confusing and useless. Do this instead:
```python
class ConvBlock(nn.Module):
    def __init__(self):
        super(ConvBlock, self).__init__()
        self.block = nn.Sequential(
            nn.Conv2d(...),
            nn.ReLU(),
            nn.BatchNorm2d(...)
        )

    def forward(self, x):
        return self.block(x)
```
Cleaner and faster to write. Or, even better:
```python
class ConvBlock(nn.Sequential):
    def __init__(self):
        super().__init__(
            nn.Conv2d(...),
            nn.ReLU(),
            nn.BatchNorm2d(...)
        )
```
No need to write the forward method.
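That works because `nn.Sequential` already defines a `forward` that chains its submodules in order. A pure-Python sketch of the idea (illustrative names only, not PyTorch's implementation):

```python
# Pure-Python sketch of why the subclassing variant needs no forward():
# a Sequential-style container's forward already chains its submodules.
# This mimics the idea only; it is not PyTorch's code.
class Sequential:
    def __init__(self, *modules):
        self._modules = list(modules)

    def forward(self, x):
        for m in self._modules:
            x = m(x)  # feed the output of one submodule into the next
        return x

    __call__ = forward  # calling the container just runs forward

block = Sequential(lambda x: x * 2, lambda x: x + 1)
result = block(3)  # (3 * 2) + 1
```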
Hope it helps and I hope to see better and better code in the future :)
I really appreciate your tutorial; it would be nice to have a complete example.
Hi Team,
I've been thinking about doing something like this and you guys already have a great head start. I'd love to be a collaborator or at least a regular contributor to this project.
Speaking of which, here are some ideas on how the guide can be improved.
And many others I forgot about.
I really enjoy this guide! However, I am not sure what the advantage of `prefetch_generator` is. It seems that `DataLoader` in PyTorch already supports prefetching.
Thank you!
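For context: `torch.utils.data.DataLoader` does prefetch when `num_workers > 0` (in recent versions the `prefetch_factor` argument controls how many batches each worker loads ahead). What `prefetch_generator` adds is a background thread on the consumer side of the loader. A stdlib-only sketch of that idea (illustrative, not the library's actual code):

```python
import queue
import threading

# Stdlib-only sketch of background prefetching, similar in spirit to
# prefetch_generator's BackgroundGenerator (illustrative, not the
# library's actual code): a worker thread keeps pulling items from the
# wrapped iterable into a bounded queue, so the consumer rarely waits.
class BackgroundPrefetcher:
    _SENTINEL = object()

    def __init__(self, iterable, max_prefetch=2):
        self._queue = queue.Queue(maxsize=max_prefetch)
        self._thread = threading.Thread(
            target=self._worker, args=(iterable,), daemon=True)
        self._thread.start()

    def _worker(self, iterable):
        for item in iterable:
            self._queue.put(item)        # blocks while the buffer is full
        self._queue.put(self._SENTINEL)  # signal exhaustion

    def __iter__(self):
        return self

    def __next__(self):
        item = self._queue.get()
        if item is self._SENTINEL:
            raise StopIteration
        return item

items = list(BackgroundPrefetcher(range(5)))
```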
After using the prefetch generator, the dataloader is not automatically released after each epoch under multi-GPU DDP, resulting in a memory leak. Has anyone else encountered this?
Contrary to what you say in the section

> A nn.module can be used on input data in two ways whereas the latter one is commonly used for better readability. self.net(input) simply uses the __call__() method of the object to feed the input through the module.
>
> output = self.net.forward(input) # or output = self.net(input)

it is not recommended to call the plain `forward` in Python.
If these were the same, the `__call__` method (which is invoked whenever you call an instance) would look like this:

```python
def __call__(self, *args, **kwargs):
    return self.forward(*args, **kwargs)
```
But instead it is slightly more complex:

```python
def __call__(self, *input, **kwargs):
    for hook in self._forward_pre_hooks.values():
        hook(self, input)
    if torch._C._get_tracing_state():
        result = self._slow_forward(*input, **kwargs)
    else:
        result = self.forward(*input, **kwargs)
    for hook in self._forward_hooks.values():
        hook_result = hook(self, input, result)
        if hook_result is not None:
            raise RuntimeError(
                "forward hooks should never return any values, but '{}'"
                "didn't return None".format(hook))
    if len(self._backward_hooks) > 0:
        var = result
        while not isinstance(var, torch.Tensor):
            if isinstance(var, dict):
                var = next((v for v in var.values() if isinstance(v, torch.Tensor)))
            else:
                var = var[0]
        grad_fn = var.grad_fn
        if grad_fn is not None:
            for hook in self._backward_hooks.values():
                wrapper = functools.partial(hook, self)
                functools.update_wrapper(wrapper, hook)
                grad_fn.register_hook(wrapper)
    return result
```
because it also deals with all the registered hooks, which wouldn't be considered when calling the plain `forward`.
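To make the difference concrete, here is a stripped-down, pure-Python sketch of that dispatch (no torch required): a hook registered on the module fires when the module is *called*, but not when `forward()` is invoked directly.

```python
# Stripped-down, pure-Python sketch of the dispatch above: hooks
# registered on the module fire through __call__, but a direct
# forward() call skips them. Simplified for illustration.
class Module:
    def __init__(self):
        self._forward_pre_hooks = []
        self._forward_hooks = []

    def forward(self, x):
        return x * 2

    def __call__(self, x):
        for hook in self._forward_pre_hooks:
            hook(self, x)
        result = self.forward(x)
        for hook in self._forward_hooks:
            hook(self, x, result)
        return result

seen = []
net = Module()
net._forward_hooks.append(lambda mod, inp, out: seen.append(out))
net(3)          # __call__ runs the hook
net.forward(3)  # direct forward call skips it
```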
```python
def multi_gpu(model):
    device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
    if device != torch.device('cpu') and torch.cuda.device_count() > 1:
        model = nn.DataParallel(model)
        print(f'Using {torch.cuda.device_count()} GPUs!')
    elif device == torch.device('cuda'):
        print('Using 1 GPU!')
    else:
        print('Using CPU!')
    model.to(device)
    return model, device
```
This is how I do it.
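The branching above can be checked in isolation by stubbing out the torch queries as parameters (a minimal sketch; the function name here is made up for this example):

```python
# Illustrative sketch (names made up for this example): the branching
# in multi_gpu above, with the torch queries stubbed out as parameters
# so the decision logic can be checked without any GPU present.
def pick_strategy(cuda_available, gpu_count):
    if cuda_available and gpu_count > 1:
        return 'DataParallel'  # wrap the model in nn.DataParallel
    if cuda_available:
        return 'single-gpu'
    return 'cpu'
```

Note that the PyTorch documentation now recommends `DistributedDataParallel` over `DataParallel` for multi-GPU training, even on a single machine.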