abdo-eldesokey / nconv

A PyTorch implementation for our work "Confidence Propagation through CNNs for Guided Sparse Depth Regression"

License: GNU General Public License v3.0

Python 99.57% Shell 0.43%

nconv's People

Contributors: abdo-eldesokey

nconv's Issues

Which dataset was used for training?

Hey there.
I'm a little confused about which dataset you used for training exp_guided_nconv_cnn_l2. It seems to be train+selval according to run-nconv-cnn.py#L67, but it is also validated on selval:

Evaluation results on [selval], Epoch [39]:

Also, what is the difference between nconv/workspace/exp_guided_nconv_cnn_l2/checkpoints/CNN_ep0039.pth.tar and the checkpoint for the test set? Which dataset did you train each of them on?

About cached memory when running the code

Hi,
I have a question about the code.
When I train the model on KittiDataset, the program keeps loading data into the cached memory until the memory is full.
When I use the "top" command in the terminal (with exp=exp_guided_enc_dec), I see:

[screenshot of `top` output]

Have you ever encountered this situation? Did you notice it? (Although it might not affect training?)
Many thanks!

About the NYU pretrained weights

Thank you for sharing this wonderful work.

When I go to the NYU repo, I can't find the NYU pretrained weights. Could you provide them?

How to reproduce the pretrained model for the unguided network

Hi, I have a few questions about your implementation.

  1. I saw that you already have a pre-trained model for unguided_network.py. How can I reproduce this pretrained model?
  2. I didn't see you use the SoftPlus function in unguided_network.py. Can you explain this? In your paper, the unguided network was proposed to use this function.

Bug when trying LiDAR only

This is the exp I use: exp_unguided_depth
But I got this:

TypeError: __init__() missing 1 required positional argument: 'padding_mode'

Could you help us figure it out?

The demo works very well when I choose -exp exp_guided_nconv_cnn_l2

Evaluate function

Hi,
Is there an error in the evaluate function in KittiDepthTrainer?

outputs *= self.params['data_normalize_factor']/256
labels *= self.params['data_normalize_factor']/256

Why do you divide the outputs by 256? I think the output should just be multiplied by data_normalize_factor.

Many thanks!

ValueError: num_samples should be a positive integeral value, but got num_samples=0

Hi!
When I run:

 python3 run-nconv-cnn.py -mode eval -exp /home/chen/nconv/workspace/exp_guided_nconv_cnn_l1

This is the error:

Traceback (most recent call last):
  File "run-nconv-cnn.py", line 56, in <module>
    dataloaders, dataset_sizes = KittiDepthDataloader(params)
  File "/home/chen/nconv/dataloader/KittiDepthDataloader.py", line 44, in KittiDepthDataloader
    dataloaders['train'] = DataLoader(image_datasets['train'], shuffle=True, batch_size=params['train_batch_sz'], num_workers=4)
  File "/home/chen/.local/lib/python3.5/site-packages/torch/utils/data/dataloader.py", line 802, in __init__
    sampler = RandomSampler(dataset)
  File "/home/chen/.local/lib/python3.5/site-packages/torch/utils/data/sampler.py", line 64, in __init__
    "value, but got num_samples={}".format(self.num_samples))
ValueError: num_samples should be a positive integeral value, but got num_samples=0

Could you tell me why?
Thanks!
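For anyone hitting this: the traceback means the dataset the DataLoader was built from is empty, usually because the dataset path in the experiment's parameter file does not point at the extracted KITTI depth annotations. A framework-free guard that fails with a clearer message before the DataLoader is constructed (paths and names here are hypothetical):

```python
import glob
import os

def collect_depth_files(root: str, split: str = "train") -> list:
    """Gather dataset files and fail loudly if the path resolves to nothing.

    An empty list here is exactly what later surfaces as
    'num_samples should be a positive integeral value, but got num_samples=0'
    inside RandomSampler.
    """
    pattern = os.path.join(root, split, "**", "*.png")
    files = sorted(glob.glob(pattern, recursive=True))
    if not files:
        raise FileNotFoundError(
            "No .png files found under %r; check the dataset path in your "
            "experiment's parameter file." % pattern
        )
    return files
```

In other words: verify that the directory configured for the KITTI data actually contains the depth PNGs before blaming the sampler.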

Trained weights

Hello,

I am very interested in running your evaluation code with the trained model. Will you provide it soon?

Thanks,
Adam

Confidence map

Hi, I recently looked into your code and have a doubt about how to obtain a confidence map from the sparse depth map. Please help me resolve this. Thanks in advance.
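For what it's worth, in normalized-convolution pipelines the input confidence for sparse depth is typically just the binary validity mask — 1 where a LiDAR measurement exists (depth > 0), 0 elsewhere — which the network then propagates and refines layer by layer. A minimal illustration (plain Python, names are mine):

```python
def initial_confidence(sparse_depth):
    """Binary input confidence: 1.0 where a depth sample exists, else 0.0.

    Missing pixels in projected LiDAR maps are conventionally 0,
    so (depth > 0) serves as the initial confidence map.
    """
    return [[1.0 if d > 0 else 0.0 for d in row] for row in sparse_depth]

depth = [[0.0, 12.5],
         [3.2, 0.0]]
print(initial_confidence(depth))  # [[0.0, 1.0], [1.0, 0.0]]
```

The normalized-convolution layers then output a continuous confidence map alongside the densified depth, rather than keeping the mask binary.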

Some questions about the training

Hello! When retraining the code on the NYU dataset, the original file nconv.py is as follows:

[screenshot of nconv.py]

But when running it, I get an error about a missing argument:

[screenshot of the missing-argument error]

And after I add the argument padding_mode='zeros', it reports a size mismatch:

[screenshot of the size-mismatch error]

So I'm confused and wondering how to fix this problem. Looking forward to your reply.
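This looks like a PyTorch version mismatch: PyTorch 1.1 added a padding_mode argument to the internal _ConvNd constructor, so subclasses written against the older signature break, and passing the new argument positionally into a differently ordered signature shifts every later argument, which would explain the size mismatch. One version-tolerant pattern is to inspect the base constructor at runtime and forward padding_mode as a keyword only when it is supported; the toy classes below merely stand in for the two signatures:

```python
import inspect

# Stand-ins for the old and new _ConvNd constructor signatures; real
# PyTorch added 'padding_mode' (and reordered arguments) in v1.1.
class OldConvBase:
    def __init__(self, in_channels, out_channels, kernel_size):
        self.args = (in_channels, out_channels, kernel_size)

class NewConvBase:
    def __init__(self, in_channels, out_channels, kernel_size,
                 padding_mode="zeros"):
        self.args = (in_channels, out_channels, kernel_size, padding_mode)

def make_conv(base_cls, in_channels, out_channels, kernel_size):
    """Construct a conv layer, forwarding padding_mode only if supported."""
    kwargs = {}
    params = inspect.signature(base_cls.__init__).parameters
    if "padding_mode" in params:
        kwargs["padding_mode"] = "zeros"  # keyword, never positional
    return base_cls(in_channels, out_channels, kernel_size, **kwargs)

old = make_conv(OldConvBase, 1, 2, 3)  # works without padding_mode
new = make_conv(NewConvBase, 1, 2, 3)  # padding_mode forwarded as keyword
```

Always passing padding_mode as a keyword avoids the positional shift; alternatively, pin the PyTorch version the repository was written against.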

About pos_fn

Hi,
I have a question about the EnforcePos part in the file nconv.py.
Why do you choose to apply the softplus function as a forward_pre_hook, instead of just using softplus in the forward function?
My understanding is that, after many training iterations, the forward_pre_hook (softplus) will have acted on the weights multiple times, making W = softplus(...softplus(W)). (Although you set beta=10, very close to ReLU, so it is almost an identity map when W > 0?) And after the backward step of the last training iteration, the softplus is not applied to the weights. (Though I suppose the weights have already converged by then?)
But I am still confused about this.
Can we simply apply softplus in the forward function, for both the train and eval phases?
Many thanks!
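The near-identity intuition above can be checked with a quick stdlib experiment: take softplus_beta(x) = log(1 + exp(beta·x))/beta, as in torch.nn.functional.softplus, and stack many applications on a positive weight. With beta = 10 the drift stays small (a toy check, not the repository's code):

```python
import math

def softplus(x: float, beta: float = 10.0) -> float:
    """Softplus with sharpness beta (same formula as F.softplus)."""
    return math.log1p(math.exp(beta * x)) / beta

w = 1.0
single = softplus(w) - w   # one application moves w by only ~4.5e-6
for _ in range(1000):      # simulate the pre-hook firing 1000 times in a row
    w = softplus(w)
print(round(w, 3))         # 1.004 -- about 0.4% drift after 1000 stacked calls
```

So repeated application does drift, just very slowly, and the drift shrinks further as weights grow. Note also that the two schemes are not strictly equivalent: a pre-hook rewrites the stored weights in place, while applying softplus inside forward would reparameterize the layer so the optimizer updates pre-softplus weights instead.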

About Normalized Convolution

Hi, Abdelrahman!
I notice that in normalized convolution the basis is set to a single constant basis function f(x) = 1. I wonder why we don't use two or more basis functions and what would happen then. I would be grateful if you would share your view on the basis setting.
