Comments (13)
Wait... this could be a mistake on my side. Sorry...
But I found another issue with 2D data (this time an individual image):
File "c:\users\schorb\pybdv\pybdv\converter.py", line 433, in make_bdv
factors = [[1, 1, 1]] + downscale_factors
TypeError: can only concatenate list (not "tuple") to list
This used to work before...
from pybdv.
I provide it as a tuple (I think I saw that somewhere in your code...):
downscale_factors = ([1,2,2],[1,2,2],[1,2,2],[1,4,4])
If I provide it as a list/array:
downscale_factors = [[1,2,2],[1,2,2],[1,2,2],[1,4,4]]
I get:
File "c:\users\schorb\pybdv\pybdv\converter.py", line 458, in make_bdv
enforce_consistency=enforce_consistency)
File "c:\users\schorb\pybdv\pybdv\metadata.py", line 218, in write_xml_metadata
_update_attributes(viewsets, attributes, overwrite)
File "c:\users\schorb\pybdv\pybdv\metadata.py", line 159, in _update_attributes
for name, val in this_values:
ValueError: too many values to unpack (expected 2)
OK, I think I fixed my tile issue from above. But now...
File "c:\users\schorb\pybdv\pybdv\converter.py", line 409, in make_bdv
attributes_ = validate_attributes(xml_path, attributes, setup_id, enforce_consistency)
File "c:\users\schorb\pybdv\pybdv\metadata.py", line 503, in validate_attributes
attrs_out = _validate_existing_attributes(setups, setup_id, attributes, enforce_consistency)
File "c:\users\schorb\pybdv\pybdv\metadata.py", line 466, in _validate_existing_attributes
raise ValueError("Attributes contains unexpected names")
ValueError: Attributes contains unexpected names
thisview['attributes']
Out[18]:
{'tile': {'id': 0},
'displaysettings': {'id': 0,
'color': '255 255 255 255',
'isset': 'true',
'Projection_Mode': 'Average',
'min': '408',
'max': '10927'}}
Which one creates the problem?
This happens if you write a new set-up with an attribute name that has not been added to the xml before.
E.g. if the first call to make_bdv
contains attributes={'channel': {...}, 'tile': {...}, 'displaysettings': {...}},
then all subsequent calls must also contain attributes with channel, tile and displaysettings
(and nothing else!).
Otherwise, set-ups would have different attributes, which might lead to undefined behaviour.
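A minimal sketch of that rule (check_attribute_names is a made-up helper for illustration, not part of the pybdv API; the traceback above shows pybdv doing an equivalent check in metadata.py):

```python
# Made-up helper mirroring the consistency rule described above.
def check_attribute_names(existing_names, new_attributes):
    """Raise if new_attributes does not use exactly the established names."""
    if set(new_attributes) != set(existing_names):
        raise ValueError("Attributes contains unexpected names")

established = ["channel", "tile", "displaysettings"]  # set by the first call

# same names again: fine
check_attribute_names(established, {"channel": {"id": 0},
                                    "tile": {"id": 1},
                                    "displaysettings": {"id": 0}})

# 'channel' missing: rejected
try:
    check_attribute_names(established, {"tile": {"id": 1},
                                        "displaysettings": {"id": 0}})
except ValueError as err:
    print(err)  # Attributes contains unexpected names
```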
Is there any way that allows me to create a new XML with pybdv without re-writing the h5?
Assume we have a giant data set like the platy. Now we want to generate a new metadata XML that contains the displaysettings
and some other useful additions. Currently pybdv would not allow this. Would this need another overwrite mode for make_bdv?
Something like "force-metadata" or "renew-metadata"... It would completely ignore what already exists and just write the metadata for the given ViewSetup. Obviously it cannot check that the ViewSetups written to the XML correspond to the data container, but this would be up to the user. To be on the safe side, it could maybe even create a backup of the original XML file before creating the new one.
Or it would do that if there is no original xml file (manually removed/renamed)...
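The proposed backup-then-rewrite behaviour could be sketched roughly like this (backup_then_write_xml is a hypothetical helper, not pybdv API; a real implementation would build proper BigDataViewer XML rather than take a ready-made string):

```python
import os
import shutil
import tempfile

def backup_then_write_xml(xml_path, xml_text):
    """Hypothetical sketch: back up any existing XML, then rewrite the
    metadata unconditionally, ignoring what was there before."""
    if os.path.exists(xml_path):
        shutil.copyfile(xml_path, xml_path + ".bkp")
    with open(xml_path, "w") as f:
        f.write(xml_text)

# demo in a throwaway directory
tmp_dir = tempfile.mkdtemp()
xml_path = os.path.join(tmp_dir, "dataset.xml")
backup_then_write_xml(xml_path, "<SpimData/>")  # first write: no backup yet
backup_then_write_xml(xml_path, "<SpimData><!-- renewed metadata --></SpimData>")
```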
Back to the original problem:
Now I have a proper h5 and xml, and when I call make_bdv
with metadata, I again get:
File "c:\users\schorb\pybdv\pybdv\converter.py", line 433, in make_bdv
factors = [[1, 1, 1]] + downscale_factors
TypeError: can only concatenate list (not "tuple") to list
> Something like "force-metadata" or "renew-metadata"... It would completely ignore what already exists and just write the metadata for the given ViewSetup. Obviously it cannot check that the ViewSetups written to the XML will correspond to the data container, but this would be up to the user. To be on the safe side, it could maybe even create a backup of the original XML file before creating the new one.
The problem is that this can bring attributes for different set-ups out of sync.
I don't think it's a good idea to include this in make_bdv.
This should either be done by the user, or we introduce a separate function in pybdv that only operates on metadata.
File "c:\users\schorb\pybdv\pybdv\converter.py", line 433, in make_bdv
factors = [[1, 1, 1]] + downscale_factors
TypeError: can only concatenate list (not "tuple") to list
I think the problem is that you are passing downscale_factors as a tuple, but right now it only works with a list.
I will change the line to
factors = [[1, 1, 1]] + list(downscale_factors)
to allow for both again.
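The tuple/list issue and the cast fix can be reproduced in isolation (plain Python, no pybdv needed):

```python
# Minimal repro of the error above, plus the proposed fix.
downscale_factors = ([1, 2, 2], [1, 2, 2], [1, 2, 2], [1, 4, 4])  # a tuple

# list + tuple is a TypeError:
try:
    factors = [[1, 1, 1]] + downscale_factors
except TypeError as err:
    print(err)  # can only concatenate list (not "tuple") to list

# casting to list first accepts both tuples and lists:
factors = [[1, 1, 1]] + list(downscale_factors)
print(factors[0], factors[-1])  # [1, 1, 1] [1, 4, 4]
```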
OK,
so now whenever it tries to overwrite an existing XML, even with the very same metadata, I run into:
File "c:\users\schorb\pybdv\pybdv\converter.py", line 458, in make_bdv
enforce_consistency=enforce_consistency)
File "c:\users\schorb\pybdv\pybdv\metadata.py", line 218, in write_xml_metadata
_update_attributes(viewsets, attributes, overwrite)
File "c:\users\schorb\pybdv\pybdv\metadata.py", line 159, in _update_attributes
for name, val in this_values:
ValueError: too many values to unpack (expected 2)
> OK, so now whenever it tries to overwrite an existing XML, even with the very same metadata, I run into:
I see, yeah that's an issue in the code (it must be this_values.items()); looks like there is a code path that is not covered by my tests yet...
I will fix it and let you know once the fix is there.
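The missing-.items() bug is plain Python behaviour and can be reproduced without pybdv (this_values below is just an example dict shaped like the attributes shown earlier in the thread):

```python
# Standalone illustration of iterating a dict with and without .items().
this_values = {"displaysettings": {"id": 0, "min": "408", "max": "10927"}}

# Iterating the dict directly yields its keys, so unpacking a key string
# into (name, val) fails exactly as in the traceback above:
try:
    for name, val in this_values:
        pass
except ValueError as err:
    print(err)  # too many values to unpack (expected 2)

# With .items() we get (key, value) pairs, which unpack cleanly:
for name, val in this_values.items():
    print(name, sorted(val))  # displaysettings ['id', 'max', 'min']
```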
Ok, both issues should be fixed now.
I haven't checked why the missing .items()
wasn't caught by the tests yet (it should have been...),
So there might be some other issues coming up.