Comments (9)
So the culprit with the SwigPyObjects is in the deconvolution function, most likely TensorFlow. If I leave out deconvolution there is one more problem: the lock doesn't work as I intended. After removing the lock I can run deskew/rotate in parallel.
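For reference, a common reason a lock appears not to work with multiprocessing is that each worker process gets its own private copy of a module-level lock, so nothing is actually serialised. The lock has to be shared explicitly, e.g. through the pool initializer. A minimal sketch (the function names here are illustrative, not from lls_dd):

```python
import multiprocessing as mp

lock = None  # set per-worker by the initializer

def init_worker(shared_lock):
    # give every worker a reference to the *same* lock object
    global lock
    lock = shared_lock

def task(i):
    with lock:  # all workers now serialise on one shared lock
        return i * i

if __name__ == "__main__":
    shared = mp.Lock()
    with mp.Pool(2, initializer=init_worker, initargs=(shared,)) as pool:
        print(pool.map(task, range(4)))  # [0, 1, 4, 9]
```

Passing the lock through `initargs` at pool creation is the supported way to share it; creating the lock inside each worker would defeat the purpose.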
from lattice_lightsheet_deskew_deconv.
Closed by accident, reopening.
@VolkerH multiprocessing is very primitive, and for most functions these days that use NumPy you should use dask, or at least threads. Here's a PR where I parallelised a PyTorch inference pipeline using dask and threading locks to prevent GPU oversubscription:
https://github.com/saalfeldlab/simpleference/pull/3/files
dask uses cloudpickle, which is much more powerful than pickle, although SwigPyObjects sound hard to pickle for sure.
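The practical difference: standard pickle serialises functions by reference (module path plus name), so closures and locally defined functions fail, while cloudpickle serialises the code object and captured variables by value. A quick demonstration, assuming cloudpickle is installed (the `make_deskew` name is made up for illustration):

```python
import pickle
import cloudpickle

def make_deskew(angle):
    # a closure over `angle`: pickle can only serialise functions
    # by reference, so this local function fails
    def deskew(x):
        return x + angle
    return deskew

func = make_deskew(30)

try:
    pickle.dumps(func)
    print("pickle: ok")
except Exception as err:
    print("pickle failed:", type(err).__name__)

# cloudpickle serialises the code object and closure by value,
# which is why dask can ship such functions to workers
restored = pickle.loads(cloudpickle.dumps(func))
print(restored(10))  # 40
```

Objects wrapping native resources (SWIG proxies, GPU contexts) remain unpicklable either way; cloudpickle only extends what *Python-level* objects can be serialised.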
@jni thanks a lot. I will see whether cloudpickle in dask helps; I'll probably let this sit for a few days though.
I should be able to avoid pickling the SwigPyObjects, as the TensorFlow part is the one I don't want to run in parallel anyway. However, I will have to restructure my code and break process_file up into its different parts (reading, deskewing, deconvolving and writing).
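A rough sketch of that split (all stage names below are placeholders, not the actual lls_dd API): reading and writing can overlap with the GPU work, while the deskew/deconvolution stage stays serial in the main thread:

```python
from concurrent.futures import ThreadPoolExecutor

# Placeholder stages standing in for the real read / GPU / write steps.
def read_volume(path):
    return f"data:{path}"

def process_volume(volume):
    # deskew + deconvolve; in the real pipeline this is the serial GPU part
    return volume.upper()

def write_volume(result, sink):
    sink.append(result)

def process_files(paths):
    written = []
    # writing happens in the background while the next volume is processed
    with ThreadPoolExecutor(max_workers=2) as writer:
        for path in paths:
            volume = read_volume(path)        # could be prefetched
            result = process_volume(volume)   # keep this stage serial
            writer.submit(write_volume, result, written)
    return written  # executor shutdown waits for pending writes

print(sorted(process_files(["a", "b"])))  # ['DATA:A', 'DATA:B']
```

Because threads share memory, nothing here needs to be pickled, which sidesteps the SwigPyObject problem entirely.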
There is also some pyopencl stuff that can't be pickled:
    for obj in iterable:
  File "/home/vhil0002/.conda/pkgs/cache/su62_scratch/volker_conda/newllsm/lib/python3.6/site-packages/multiprocess/pool.py", line 735, in next
    raise value
multiprocess.pool.MaybeEncodingError: Error sending result: '<multiprocess.pool.ExceptionWithTraceback object at 0x7fd90fd115f8>'. Reason: 'TypeError("can't pickle pyopencl._cl._ErrorRecord objects",)'
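That MaybeEncodingError means the worker produced a result (here an exception wrapping a pyopencl._cl._ErrorRecord) that could not be pickled to send back to the parent process. Any object wrapping a native resource behaves the same way; a quick check using a stdlib lock as a stand-in:

```python
import pickle
import threading

def picklable(obj):
    """Return True if obj survives an attempted pickle serialisation."""
    try:
        pickle.dumps(obj)
        return True
    except (TypeError, AttributeError, pickle.PicklingError):
        return False

# Plain data is fine; objects wrapping native resources (locks,
# OpenCL handles, SWIG proxies) are not.
print(picklable([1, 2, 3]))          # True
print(picklable(threading.RLock()))  # False
```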
Ok ... I was impatient. Rather than testing dask, I just did a quick test to see whether cloudpickle can deal with the processing functions:
print("Cloudpickling deskew_function")
cpdsf = cloudpickle.dumps(deskew_func)
print("Cloudpickling rotate_func")
cprotf = cloudpickle.dumps(rotate_func)
print("Cloudpickling deconv_functions")
cpdcf = cloudpickle.dumps(deconv_functions)
Output was:
Cloudpickling deskew_function
Cloudpickling rotate_func
Cloudpickling deconv_functions
Traceback (most recent call last):
File "C:\Users\Volker\Anaconda3\envs\spimenv\Scripts\lls_dd-script.py", line 11, in <module>
load_entry_point('lls-dd==2019.3a1', 'console_scripts', 'lls_dd')()
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\site-packages\click\core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\site-packages\click\core.py", line 717, in main
rv = self.invoke(ctx)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\site-packages\click\core.py", line 1137, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\site-packages\click\core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\site-packages\click\core.py", line 555, in invoke
return callback(*args, **kwargs)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\site-packages\click\decorators.py", line 64, in new_func
return ctx.invoke(f, obj, *args, **kwargs)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\site-packages\click\core.py", line 555, in invoke
return callback(*args, **kwargs)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\site-packages\lls_dd-2019.3a1-py3.6.egg\lls_dd\cmdline.py", line 151, in process
ep.process_stack_subfolder(processcmd.ef.stacks[int(number)])
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\site-packages\lls_dd-2019.3a1-py3.6.egg\lls_dd\process_llsm_experiment.py", line 525, in process_stack_subfolder
cpdcf=cloudpickle.dumps(deconv_functions)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\site-packages\cloudpickle\cloudpickle.py", line 961, in dumps
cp.dump(obj)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\site-packages\cloudpickle\cloudpickle.py", line 267, in dump
return Pickler.dump(self, obj)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 409, in dump
self.save(obj)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 521, in save
self.save_reduce(obj=obj, *rv)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 631, in save_reduce
self._batch_setitems(dictitems)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 847, in _batch_setitems
save(v)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 521, in save
self.save_reduce(obj=obj, *rv)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 634, in save_reduce
save(state)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 751, in save_tuple
save(element)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 821, in save_dict
self._batch_setitems(obj.items())
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 847, in _batch_setitems
save(v)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 521, in save
self.save_reduce(obj=obj, *rv)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 634, in save_reduce
save(state)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 821, in save_dict
self._batch_setitems(obj.items())
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 847, in _batch_setitems
save(v)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 521, in save
self.save_reduce(obj=obj, *rv)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 634, in save_reduce
save(state)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 821, in save_dict
self._batch_setitems(obj.items())
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 847, in _batch_setitems
save(v)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 521, in save
self.save_reduce(obj=obj, *rv)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 634, in save_reduce
save(state)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 476, in save
f(self, obj) # Call unbound method with explicit self
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 821, in save_dict
self._batch_setitems(obj.items())
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 847, in _batch_setitems
save(v)
File "C:\Users\Volker\Anaconda3\envs\spimenv\lib\pickle.py", line 496, in save
rv = reduce(self.proto)
TypeError: can't pickle _thread.RLock objects
So, in contrast to dill, cloudpickle fails at serializing _thread.RLock objects, presumably also from TensorFlow. Not sure whether this happens before or after serializing the swig stuff.
Hmm. I'm not sure whether:
- PyTorch has better pickle compatibility, or
- Somehow dask doesn't need to pickle that function because workers can import directly?
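On the second point: with a purely threaded scheduler nothing crosses a process boundary, so nothing needs pickling at all. The same effect can be shown with the stdlib thread pool, which happily returns an object that pickle would reject:

```python
import threading
from concurrent.futures import ThreadPoolExecutor

def work(i):
    # an unpicklable result: fine for threads, fatal for process pools
    return threading.RLock()

# Threads share one address space, so results are passed by reference
# and never serialised.
with ThreadPoolExecutor(max_workers=2) as ex:
    results = list(ex.map(work, range(3)))

print(len(results))  # 3
```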
Hmmm. It doesn't really matter, as the GPU computation is the part I don't want to run in multiple processes anyway.
I have decided to use the pattern implemented here, adapted to either pathos or dask so that class methods can be serialized:
https://pypi.org/project/multiprocessing-generator/
It is exactly what I want: "Up to 100 elements ahead of what is consumed will be fetched by the generator in the background, which is useful when the producer and the consumer do not use the same resources (for instance network vs. CPU)."
Currently process_file takes a path and reads the file. I will separate out the file reading and change it into process_volume. Then the file reading can be handled by such a prefetching generator. For file writing I can trivially start a new process for each file.
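For what it's worth, the core of the prefetching-generator pattern is small enough to sketch with just a thread and a bounded queue. This is a simplified stand-in for what multiprocessing-generator does (it omits that library's error propagation and context-manager interface):

```python
import threading
from queue import Queue

_END = object()  # sentinel marking the end of the stream

def prefetch(iterable, maxsize=100):
    """Yield items from `iterable`, produced in a background thread
    that runs up to `maxsize` items ahead of the consumer."""
    queue = Queue(maxsize=maxsize)

    def producer():
        for item in iterable:
            queue.put(item)  # blocks once the buffer is full
        queue.put(_END)

    threading.Thread(target=producer, daemon=True).start()
    while True:
        item = queue.get()
        if item is _END:
            return
        yield item

# e.g. file reading runs ahead while the GPU processes each volume:
#   for volume in prefetch(read_volume(p) for p in paths): ...
print(list(prefetch(range(5))))  # [0, 1, 2, 3, 4]
```

The bounded queue is what caps the read-ahead at `maxsize` items, so memory use stays predictable even when reading is much faster than processing.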
Closing this until I have more time to work on it.