ome / napari-ome-zarr

A napari plugin for zarr backed OME-NGFF images

Home Page: https://www.napari-hub.org/plugins/napari-ome-zarr

License: BSD 3-Clause "New" or "Revised" License

Language: Python 100.00%
Topics: python, napari-plugin, ngff, zarr

napari-ome-zarr's Introduction

napari-ome-zarr


A reader for zarr backed OME-NGFF images.


This napari plugin was generated with Cookiecutter using @napari's cookiecutter-napari-plugin template.

Installation

Install napari if not already installed.

You can install napari-ome-zarr via pip. Activate the same environment that you installed napari into, then:

pip install napari-ome-zarr

Usage

Napari will use the napari-ome-zarr plugin to open images that the plugin recognises as OME-Zarr. The image metadata from OMERO will be used to set channel names and rendering settings in napari:

napari "https://uk1s3.embassy.ebi.ac.uk/idr/zarr/v0.3/9836842.zarr/"

If a dialog pops up in napari asking you to choose a reader, choose napari-ome-zarr and click OK. You can prevent the dialog by adding --plugin napari-ome-zarr, as in the example below.

To open a local file:

napari --plugin napari-ome-zarr 13457227.zarr

Or in Python:

import napari

viewer = napari.Viewer()
viewer.open("https://uk1s3.embassy.ebi.ac.uk/idr/zarr/v0.4/idr0101A/13457537.zarr", plugin="napari-ome-zarr")

napari.run()

Contributing

Contributions are very welcome. Tests can be run with tox; please ensure the coverage at least stays the same before you submit a pull request.

License

Distributed under the terms of the BSD-3 license, "napari-ome-zarr" is free and open source software.

Issues

If you encounter any problems, please file an issue along with a detailed description.

napari-ome-zarr's People

Contributors

dragadoncila, dstansby, jburel, jni, joshmoore, neuromusic, pre-commit-ci[bot], psobolewskiphd, pwalczysko, sbesson, will-moore


napari-ome-zarr's Issues

Labels should be opened as label layer

After fixing the issues with my data observed in #33, the labels are now loaded by the plugin.
However, they are loaded as an image layer:
image

It would make more sense to load them as a label layer instead. Note that this is particularly annoying right now since it's not possible to convert them into a label layer due to #34.

Converting image to labels

Hello!

I have a couple of datasets which have segmentations in them, stored under /labels/mask. The plugin loads the image and the label just fine, displaying both layers:
image

However, we aim to edit the segmentations to fix some mistakes that our automatic process makes. For this we want to convert the /[1] image to a label layer by right-clicking and choosing the "Convert to labels" option:
image

This results in the error below, however. Is this a problem with our implementation, or is there something in the plugin that it doesn't like?

AttributeError                            Traceback (most recent call last)
File ~/Library/napari-0.4.18/envs/napari-0.4.18/lib/python3.9/site-packages/app_model/backends/qt/_qaction.py:62, in QCommandAction._on_triggered(self=QMenuItemAction(MenuItem(when=None, group='1_con...tle=None, toggled=None), alt=None), app='napari'), checked=False)
     58 def _on_triggered(self, checked: bool) -> None:
     59     # execute_command returns a Future, for the sake of eventually being
     60     # asynchronous without breaking the API.  For now, we call result()
     61     # to raise any exceptions.
---> 62     self._app.commands.execute_command(self._command_id).result()
        self._command_id = 'napari:layer:convert_to_labels'
        self = QMenuItemAction(MenuItem(when=None, group='1_conversion', order=None, command=CommandRule(id='napari:layer:convert_to_labels', title='Convert to Labels', category=None, tooltip=None, status_tip=None, icon=None, enablement=Expr.parse('num_selected_image_layers >= 1 or num_selected_shapes_layers >= 1 and all_selected_layers_same_type and not selected_empty_shapes_layer'), short_title=None, toggled=None), alt=None), app='napari')
        self._app = Application('napari')

File ~/Library/napari-0.4.18/envs/napari-0.4.18/lib/python3.9/site-packages/app_model/registries/_commands_reg.py:206, in CommandsRegistry.execute_command(self=<CommandsRegistry at 0x1446362b0 (101 commands)>, id='napari:layer:convert_to_labels', execute_asynchronously=False, *args=(), **kwargs={})
    202 except Exception as e:
    203     if self._raise_synchronous_exceptions:
    204         # note, the caller of this function can also achieve this by
    205         # calling `future.result()` on the returned future object.
--> 206         raise e
    207     future.set_exception(e)
    209 return future

File ~/Library/napari-0.4.18/envs/napari-0.4.18/lib/python3.9/site-packages/app_model/registries/_commands_reg.py:201, in CommandsRegistry.execute_command(self=<CommandsRegistry at 0x1446362b0 (101 commands)>, id='napari:layer:convert_to_labels', execute_asynchronously=False, *args=(), **kwargs={})
    199 future: Future = Future()
    200 try:
--> 201     future.set_result(cmd(*args, **kwargs))
        future = <Future at 0x145a592b0 state=pending>
        cmd = <function _convert_to_labels at 0x2a2ffd310>
        args = ()
        kwargs = {}
    202 except Exception as e:
    203     if self._raise_synchronous_exceptions:
    204         # note, the caller of this function can also achieve this by
    205         # calling `future.result()` on the returned future object.

File ~/Library/napari-0.4.18/envs/napari-0.4.18/lib/python3.9/site-packages/in_n_out/_store.py:936, in Store.inject_processors.<locals>._deco.<locals>._exec(*args=(), **kwargs={})
    934 @wraps(func)
    935 def _exec(*args: P.args, **kwargs: P.kwargs) -> R:
--> 936     result = func(*args, **kwargs)
        func = <function _convert_to_labels at 0x2a2421670>
        args = ()
        kwargs = {}
    937     if result is not None:
    938         self.process(
    939             result,
    940             type_hint=type_hint,
   (...)
    943             _funcname=getattr(func, "__qualname__", str(func)),
    944         )

File ~/Library/napari-0.4.18/envs/napari-0.4.18/lib/python3.9/site-packages/in_n_out/_store.py:804, in Store.inject.<locals>._inner.<locals>._exec(*args=(), **kwargs={})
    797 logger.debug(
    798     "  Calling %s with %r (injected %r)",
    799     _fname,
    800     bound.arguments,
    801     _injected_names,
    802 )
    803 try:
--> 804     result = func(**bound.arguments)
        bound = <BoundArguments (ll=[<Image layer '/' at 0x2a08226d0>])>
        func = <function _convert_to_labels at 0x142fb60d0>
        bound.arguments = {'ll': [<Image layer '/' at 0x2a08226d0>]}
    805 except TypeError as e:
    806     if "missing" not in e.args[0]:

File ~/Library/napari-0.4.18/envs/napari-0.4.18/lib/python3.9/site-packages/napari/layers/_layer_actions.py:70, in _convert_to_labels(ll=[<Image layer '/' at 0x2a08226d0>])
     69 def _convert_to_labels(ll: LayerList):
---> 70     return _convert(ll, 'labels')
        ll = [<Image layer '/' at 0x2a08226d0>]

File ~/Library/napari-0.4.18/envs/napari-0.4.18/lib/python3.9/site-packages/napari/layers/_layer_actions.py:59, in _convert(ll=[<Image layer '/' at 0x2a08226d0>], type_='labels')
     57     data = lay.to_labels()
     58 else:
---> 59     data = lay.data.astype(int) if type_ == 'labels' else lay.data
        lay = <Image layer '/ [1]' at 0x16a379f10>
        type_ == 'labels' = True
        type_ = 'labels'
     60 new_layer = Layer.create(data, lay._get_base_state(), type_)
     61 ll.insert(idx, new_layer)

AttributeError: 'MultiScaleData' object has no attribute 'astype'
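Until the napari-side conversion handles MultiScaleData, one possible workaround is to re-add the data as a Labels layer manually. A minimal sketch, assuming the layer name "/ [1]" seen in the traceback above and an existing viewer (note that this materialises each pyramid level in memory):

import numpy as np

# grab the image layer that actually holds the label data
img_layer = viewer.layers["/ [1]"]

# MultiScaleData behaves like a list of (dask) arrays, one per pyramid level;
# convert each level to an integer numpy array (this loads it into memory)
label_pyramid = [np.asarray(level).astype(np.uint32) for level in img_layer.data]

viewer.add_labels(label_pyramid, name="mask")
viewer.layers.remove(img_layer)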

Layer names

Currently, the filename is used to determine the layer names:
image
I think it would be better to use the "name" field from the multiscale spec instead (if it's given).
This would make the layers easier to distinguish.
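For reference, that name lives in the multiscales attributes of the zarr group; a minimal sketch of where it could be read from, assuming zarr-python and the local example path from the README:

import zarr

root = zarr.open_group("13457227.zarr", mode="r")
multiscales = root.attrs.get("multiscales", [])
# "name" is an optional field of each multiscales entry in the OME-NGFF spec
name = multiscales[0].get("name") if multiscales else None
print(name)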

open ngff file with lower resolution pyramid

I'm trying the viewer on this file to see if it will do zoom-based resolution fetching, but it opens at the highest resolution.

napari --plugin napari-ome-zarr 'https://dandiarchive.s3.amazonaws.com/zarr/7723d02f-1f71-4553-a7b0-47bda1ae8b42/'

I'm on a 16" MacBook Pro. I'm wondering if it's because of the display resolution, and if there is a way to force it to a different pyramid level by default.
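As a workaround, here is a sketch (assuming ome-zarr-py's Reader API) for adding only a lower-resolution pyramid level yourself instead of letting napari decide:

import napari
from ome_zarr.io import parse_url
from ome_zarr.reader import Reader

url = "https://dandiarchive.s3.amazonaws.com/zarr/7723d02f-1f71-4553-a7b0-47bda1ae8b42/"
node = list(Reader(parse_url(url))())[0]   # first node is the multiscale image
pyramid = node.data                        # list of dask arrays, highest resolution first
viewer = napari.Viewer()
viewer.add_image(pyramid[2])               # e.g. show only the third (coarser) level
napari.run()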

Install in dev mode / how to check if napari uses the plugin.

I tried to install the plugin in dev mode via pip, i.e.

git clone https://github.com/ome/napari-ome-zarr   # I used my own fork instead, but that shouldn't matter
cd napari-ome-zarr
pip install -e . --no-deps   # in conda, so don't use pip deps; all deps should be there in my case

However, when I try to run the example, it fails:

$ napari https://s3.embassy.ebi.ac.uk/idr/zarr/v0.1/6001240.zarr/                                                                                   
17:00:02 ERROR PluginError: Error in plugin 'builtins', hook 'napari_get_reader'                                                                                                       
  Cause was: ValueError('Not a zarr dataset or group: https://s3.embassy.ebi.ac.uk/idr/zarr/v0.1/6001240.zarr/')                                                                       
    in file: /home/pape/Work/software/conda/miniconda3/envs/main/lib/python3.7/site-packages/napari/utils/io.py                                                                        
    at line: 292                                                                                                                                                                       
     author: napari team                                                                                                                                                               
      email: [email protected]                                                                                                                                  
    package: napari                                                                                                                                                                    
        url: https://napari.org                                                                                                                                                        
    version: 0.3.6                                                                                                                                                                     
                                                                                                                                                                                       
17:00:02 WARNING No plugin found capable of reading 'https://s3.embassy.ebi.ac.uk/idr/zarr/v0.1/6001240.zarr/'.                                                                        
17:00:02 ERROR (1) error occurred in plugins: 'builtins'. See full error logs in "Plugins → Plugin Errors..." 
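One quick way to check whether the plugin itself is importable and recognises a path in the current environment, independent of napari's plugin registration, is to call its reader hook directly (a sketch using the napari_get_reader function this package exposes):

from napari_ome_zarr import napari_get_reader

reader = napari_get_reader("https://s3.embassy.ebi.ac.uk/idr/zarr/v0.1/6001240.zarr/")
print(reader)  # a function if the path is recognised as OME-Zarr, otherwise None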

no registered plugin named 'napari-ome-zarr'

Reported at https://forum.image.sc/t/napari-napari-ome-zarr-plugin-is-not-registered/78482 and elsewhere...

I installed the napari-ome-zarr plugin, either via pip or via the napari UI.
Then I specified the plugin on the command line, trying to open a directory that doesn't contain an OME-Zarr.

Instead of a useful message, I get:

ValueError: There is no registered plugin named 'napari-ome-zarr'.

cc @DragaDoncila

Full stack trace:

Traceback (most recent call last):
  File "/Users/wmoore/opt/anaconda3/envs/SSBD/bin/napari", line 8, in <module>
    sys.exit(main())
  File "/Users/wmoore/opt/anaconda3/envs/ssbd/lib/python3.9/site-packages/napari/__main__.py", line 570, in main
    _run()
  File "/Users/wmoore/opt/anaconda3/envs/ssbd/lib/python3.9/site-packages/napari/__main__.py", line 344, in _run
    viewer._window._qt_viewer._qt_open(
  File "/Users/wmoore/opt/anaconda3/envs/ssbd/lib/python3.9/site-packages/napari/_qt/qt_viewer.py", line 867, in _qt_open
    self.viewer.open(
  File "/Users/wmoore/opt/anaconda3/envs/ssbd/lib/python3.9/site-packages/napari/components/viewer_model.py", line 1076, in open
    self._add_layers_with_plugins(
  File "/Users/wmoore/opt/anaconda3/envs/ssbd/lib/python3.9/site-packages/napari/components/viewer_model.py", line 1276, in _add_layers_with_plugins
    layer_data, hookimpl = read_data_with_plugins(
  File "/Users/wmoore/opt/anaconda3/envs/ssbd/lib/python3.9/site-packages/napari/plugins/io.py", line 105, in read_data_with_plugins
    raise ValueError(
ValueError: There is no registered plugin named 'napari-ome-zarr'.
Names of plugins offering readers are: set()

High-content screening ngff metadata schema empty image

I am trying to open a .zarr folder with this metadata schema, choosing the napari-ome-zarr plugin as the reader. napari does not raise any error, but the image loaded is just an "empty" array.
This is my folder structure:
tree_zarr
As you can see, I have just one dataset path (0).
This is the result after I drag and drop it onto napari with the plugin:
screen-napari

This is the output of napari with -vvv:

10:22:39 DEBUG Created nested FSStore(/test2.zarr, r, {'dimension_separator': '/', 'normalize_keys': False})

10:22:39 DEBUG 0.4 matches None?

10:22:39 DEBUG 0.3 matches None?

10:22:39 DEBUG 0.2 matches None?
10:22:39 DEBUG V01:None v. 0.1
10:22:39 DEBUG treating /test2.zarr [zgroup] as Plate

10:22:39 INFO root_attr: plate

10:22:39 DEBUG {'acquisitions': [{'id': 1, 'name': 'single acquisition'}], 'columns': [{'name': '1'}], 'field_count': 1, 'name': 'test', 'rows': [{'name': 'A'}], 'version': '0.3', 'wells': [{'path': 'A/1'}]}

10:22:39 INFO plate_data: {'acquisitions': [{'id': 1, 'name': 'single acquisition'}], 'columns': [{'name': '1'}], 'field_count': 1, 'name': 'test', 'rows': [{'name': 'A'}], 'version': '0.3', 'wells': [{'path': 'A/1'}]}

10:22:39 DEBUG open(ZarrLocation(/test2.zarr/A/1))
10:22:39 DEBUG Created nested FSStore(/test2.zarr/A/1, r, {'dimension_separator': '/', 'normalize_keys': False})
10:22:39 DEBUG 0.4 matches None?
10:22:39 DEBUG 0.3 matches None?
10:22:39 DEBUG 0.2 matches None?
10:22:39 DEBUG V01:None v. 0.1

10:22:39 DEBUG treating /test2.zarr/A/1 [zgroup] as Well

10:22:39 INFO root_attr: well

10:22:39 DEBUG {'images': [{'path': '0'}], 'version': '0.3'}

10:22:39 INFO well_data: {'images': [{'path': '0'}], 'version': '0.3'}

10:22:39 DEBUG open(ZarrLocation(/test2.zarr/A/1/0))

10:22:39 DEBUG Created nested FSStore(/test2.zarr/A/1/0, r, {'dimension_separator': '/', 'normalize_keys': False})

10:22:39 DEBUG 0.4 matches 0.3?
10:22:39 DEBUG 0.3 matches 0.3?
10:22:39 WARNING version mismatch: detected:FormatV03, requested:FormatV04

10:22:39 DEBUG Created nested FSStore(/test2.zarr/A/1/0, r, {'dimension_separator': '/', 'normalize_keys': False})

10:22:39 DEBUG treating /test2.zarr/A/1/0 [zgroup] as Multiscales

10:22:39 INFO root_attr: multiscales

10:22:39 DEBUG [{'version': '0.3', 'axes': [{'name': 'y', 'type': 'space', 'unit': 'centimeter'}, {'name': 'x', 'type': 'space', 'unit':
'centimeter'}], 'datasets': [{'path': '0'}], 'metadata': {'kwargs': {'axes_names': ['y', 'x']}}}]

10:22:39 INFO root_attr: omero
10:22:39 DEBUG {'id': 1, 'version': '0.3'}

10:22:39 INFO root_attr: channels
10:22:39 DEBUG [{'active': True, 'coefficient': 1, 'color': '0000FF', 'family': 'linear', 'inverted': False, 'label': 'LaminB1', 'window': {'end': 1500, 'max': 65535, 'min': 0, 'start': 0}}]

10:22:39 INFO root_attr: rdefs

10:22:39 DEBUG {'defaultT': 0, 'defaultZ': 118, 'model': 'color'}

10:22:39 INFO datasets [{'path': '0'}]

10:22:39 INFO resolution: 0

10:22:39 INFO - shape ('y', 'x') = (8640, 7680)

10:22:39 INFO - chunks = ['2160', '2560']

10:22:39 INFO - dtype = uint16
10:22:39 DEBUG open(ZarrLocation(/test2.zarr/A/1/0/labels))

10:22:39 DEBUG Created nested FSStore(/test2.zarr/A/1/0/labels, r, {'dimension_separator': '/', 'normalize_keys': False})

10:22:39 WARNING version mismatch: detected:FormatV04, requested:FormatV03
10:22:39 DEBUG Created nested FSStore(/test2.zarr/A/1/0/labels, r, {'dimension_separator': '/', 'normalize_keys': False})

10:22:39 DEBUG treating /test2.zarr/A/1/0 [zgroup] as OMERO

10:22:39 INFO root_attr: multiscales

10:22:39 DEBUG [{'version': '0.3', 'axes': [{'name': 'y', 'type': 'space', 'unit': 'centimeter'}, {'name': 'x', 'type': 'space', 'unit': 'centimeter'}], 'datasets': [{'path': '0'}], 'metadata': {'kwargs': {'axes_names': ['y', 'x']}}}]

10:22:39 INFO root_attr: omero
10:22:39 DEBUG {'id': 1, 'version': '0.3'}
10:22:39 INFO root_attr: channels
10:22:39 DEBUG [{'active': True, 'coefficient': 1, 'color': '0000FF', 'family': 'linear', 'inverted': False, 'label': 'LaminB1', 'window': {'end': 1500, 'max': 65535, 'min': 0, 'start': 0}}]

10:22:39 INFO root_attr: rdefs
10:22:39 DEBUG {'defaultT': 0, 'defaultZ': 118, 'model': 'color'}

10:22:39 DEBUG creating lazy_reader. row:0 col:0

10:22:39 DEBUG img_pyramid_shapes: [(8640, 7680)]

10:22:39 DEBUG target_level: 0
10:22:39 DEBUG get_stitched_grid() level: 0, tile_shape: (8640, 7680)

10:22:39 DEBUG treating /test2.zarr [zgroup] as ome-zarr

10:22:39 DEBUG returning /test2.zarr [zgroup]

10:22:39 DEBUG transforming /test2.zarr [zgroup]

10:22:39 DEBUG node.metadata: {'axes': [{'name': 'y', 'type': 'space', 'unit': 'centimeter'}, {'name': 'x', 'type': 'space', 'unit': 'centimeter'}], 'metadata': {'plate': {'acquisitions': [{'id': 1, 'name': 'single acquisition'}], 'columns': [{'name': '1'}], 'field_count': 1, 'name': 'test', 'rows': [{'name': 'A'}], 'version': '0.3', 'wells': [{'path': 'A/1'}]}}}

10:22:39 DEBUG Transformed: ([dask.array<from-value, shape=(8640, 7680), dtype=uint16, chunksize=(8640, 7680), chunktype=numpy.ndarray>], {}, 'image')

10:22:39 DEBUG ImageSlice.init
10:22:39 DEBUG ImageSlice.init
10:22:39 DEBUG LOADING tile... A/1/0/0 with shape: (8640, 7680)
10:22:39 DEBUG ImageSlice.init

I hope this information is helpful for understanding the issue.
Thank you very much for your work and for the help in figuring out what's going on.

Write ome-zarr images (feature request)

Hi all,

I'm just wondering how complicated it might be to provide a writer for napari. I just tried to save a dataset in napari as "test.zarr" and it was written to disk as "test.zarr.tif".

If you think writing zarr files from napari makes sense, e.g. using this code, I'd be happy to provide a pull request.
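For illustration, here is a minimal sketch of what such a writer might call under the hood, assuming ome-zarr-py's write_image API (the array and output path are made up):

import numpy as np
import zarr
from ome_zarr.io import parse_url
from ome_zarr.writer import write_image

# made-up 3D stack to save
data = np.random.randint(0, 2**16, size=(32, 256, 256), dtype=np.uint16)

store = parse_url("test.zarr", mode="w").store
root = zarr.group(store=store)
write_image(image=data, group=root, axes="zyx")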

Thanks!

Best,
Robert

RFE: relax filename_patterns to read child elements

The filename_patterns introduced as part of the conversion of this plugin to the npe2 engine allow opening any OME-NGFF dataset ending with *.zarr.

This currently covers all the supported specifications, e.g. multiscale images

napari https://uk1s3.embassy.ebi.ac.uk/idr/zarr/v0.4/idr0083A/9822152.zarr

or plates

napari https://uk1s3.embassy.ebi.ac.uk/idr/zarr/v0.4/idr0072B/9512.zarr

As it stands, this naming pattern prevents direct access to the leaves of a dataset like the well of a plate

napari https://uk1s3.embassy.ebi.ac.uk/idr/zarr/v0.4/idr0072B/9512.zarr/L/20

the field of view of a plate

napari https://uk1s3.embassy.ebi.ac.uk/idr/zarr/v0.4/idr0072B/9512.zarr/L/20/0

or a label image associated with a multiscales image

https://uk1s3.embassy.ebi.ac.uk/idr/zarr/v0.4/idr0062A/6001247.zarr/labels/0

Using a more permissive pattern, e.g. *.zarr* or possibly * as suggested in #42, would relax this constraint and allow direct access. This is potentially outside the intent of the plugin.

Originally posted by @sbesson in #42 (comment)

Error at reading demo file

$ napari 'https://uk1s3.embassy.ebi.ac.uk/idr/zarr/v0.1/6001240.zarr/'
19:36:17 WARNING version mismatch: detected: FormatV01, requested: FormatV04
Traceback (most recent call last):
  File "/home/alobo/miniconda3/envs/napari-env/lib/python3.9/site-packages/napari/_qt/qt_viewer.py", line 953, in _qt_open
    self.viewer.open(
  File "/home/alobo/miniconda3/envs/napari-env/lib/python3.9/site-packages/napari/components/viewer_model.py", line 1102, in open
    layers = self._open_or_raise_error(
  File "/home/alobo/miniconda3/envs/napari-env/lib/python3.9/site-packages/napari/components/viewer_model.py", line 1222, in _open_or_raise_error
    raise MultipleReaderError(
napari.errors.reader_errors.MultipleReaderError: Multiple plugins found capable of reading https://uk1s3.embassy.ebi.ac.uk/idr/zarr/v0.1/6001240.zarr/. Select plugin from {'napari-ome-zarr': 'napari-ome-zarr', 'napari': 'napari builtins'} and pass to reading function e.g. `viewer.open(..., plugin=...)`.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/alobo/miniconda3/envs/napari-env/bin/napari", line 8, in <module>
    sys.exit(main())
  File "/home/alobo/miniconda3/envs/napari-env/lib/python3.9/site-packages/napari/__main__.py", line 564, in main
    _run()
  File "/home/alobo/miniconda3/envs/napari-env/lib/python3.9/site-packages/napari/__main__.py", line 340, in _run
    viewer._window._qt_viewer._qt_open(
  File "/home/alobo/miniconda3/envs/napari-env/lib/python3.9/site-packages/napari/_qt/qt_viewer.py", line 971, in _qt_open
    handle_gui_reading(filenames, self, stack, **kwargs)
  File "/home/alobo/miniconda3/envs/napari-env/lib/python3.9/site-packages/napari/_qt/dialogs/qt_reader_dialog.py", line 201, in handle_gui_reading
    open_with_dialog_choices(
  File "/home/alobo/miniconda3/envs/napari-env/lib/python3.9/site-packages/napari/_qt/dialogs/qt_reader_dialog.py", line 294, in open_with_dialog_choices
    qt_viewer.viewer.open(paths, stack=stack, plugin=plugin_name, **kwargs)
  File "/home/alobo/miniconda3/envs/napari-env/lib/python3.9/site-packages/napari/components/viewer_model.py", line 1092, in open
    self._add_layers_with_plugins(
  File "/home/alobo/miniconda3/envs/napari-env/lib/python3.9/site-packages/napari/components/viewer_model.py", line 1318, in _add_layers_with_plugins
    added.extend(self._add_layer_from_data(*_data))
  File "/home/alobo/miniconda3/envs/napari-env/lib/python3.9/site-packages/napari/components/viewer_model.py", line 1392, in _add_layer_from_data
    layer = add_method(data, **(meta or {}))
  File "/home/alobo/miniconda3/envs/napari-env/lib/python3.9/site-packages/napari/components/viewer_model.py", line 4, in add_labels
    import itertools
  File "/home/alobo/miniconda3/envs/napari-env/lib/python3.9/site-packages/napari/layers/labels/labels.py", line 283, in __init__
    data = self._ensure_int_labels(data)
  File "/home/alobo/miniconda3/envs/napari-env/lib/python3.9/site-packages/napari/layers/labels/labels.py", line 600, in _ensure_int_labels
    raise TypeError(
TypeError: Only integer types are supported for Labels layers, but data contains float64.

napari: 0.4.18
Platform: Linux-6.6.10-76060610-generic-x86_64-with-glibc2.35
System: Pop!_OS 22.04 LTS
Python: 3.9.18 | packaged by conda-forge | (main, Dec 23 2023, 16:33:10) [GCC 12.3.0]
Qt: 5.15.2
PyQt5: 5.15.10
NumPy: 1.22.4
SciPy: 1.11.3
Dask: 2021.09.0
VisPy: 0.12.2
magicgui: 0.8.1
superqt: 0.6.1
in-n-out: 0.1.9
app-model: 0.2.4
npe2: 0.7.4

OpenGL:

  • GL version: 4.6 (Compatibility Profile) Mesa 23.3.2-1pop0170423832122.04~36f1d0e
  • MAX_TEXTURE_SIZE: 16384

Screens:

  • screen 1: resolution 1920x1080, scale 1.0
  • screen 2: resolution 1920x1080, scale 1.0

Settings path:

  • /home/alobo/.config/napari/napari-env_9feb76300df8ae00e4aee3e0c1622b87abfa148e/settings.yaml

Right click on any layer fails

Normally it's possible to right-click on napari layers to change their properties, e.g. to convert an image layer to a labels layer or to group layers. E.g.:
image
But this fails for layers that were loaded with the napari-ome-zarr plugin; see the long traceback below.

I have tried this with the following settings (from napari info):

napari: 0.4.14
Platform: Linux-5.14.0-1027-oem-x86_64-with-glibc2.31
System: Ubuntu 20.04.4 LTS
Python: 3.10.2 | packaged by conda-forge | (main, Mar 8 2022, 15:52:01) [GCC 9.4.0]
Qt: 5.12.9
PyQt5: 5.12.3
NumPy: 1.22.3
SciPy: 1.8.0
Dask: 2022.02.1
VisPy: 0.9.6

OpenGL:
- GL version: 4.6 (Compatibility Profile) Mesa 21.2.6
- MAX_TEXTURE_SIZE: 16384

Screens:
- screen 1: resolution 2560x1440, scale 1.0

Plugins:
- console: 0.0.4
- napari-ome-zarr: 0.4.0
- scikit-image: 0.4.14
- svg: 0.1.5

Traceback:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
File ~/software/conda/miniconda3/envs/napari/lib/python3.10/site-packages/napari/_qt/containers/_layer_delegate.py:153, in LayerDelegate.editorEvent(self=, event=, model=, option=, index=)
    145 """Called when an event has occured in the editor.
    146 
    147 This can be used to customize how the delegate handles mouse/key events
    148 """
    149 if (
    150     event.type() == event.MouseButtonRelease
    151     and event.button() == Qt.RightButton
    152 ):
--> 153     self.show_context_menu(
        self = 
        event = 
        model = 
        option = 
        index = 
    154         index, model, event.globalPos(), option.widget
    155     )
    157 # if the user clicks quickly on the visibility checkbox, we *don't*
    158 # want it to be interpreted as a double-click.  We want the visibilty
    159 # to simply be toggled.
    160 if event.type() == event.MouseButtonDblClick:

File ~/software/conda/miniconda3/envs/napari/lib/python3.10/site-packages/napari/_qt/containers/_layer_delegate.py:185, in LayerDelegate.show_context_menu(self=, index=, model=, pos=PyQt5.QtCore.QPoint(1439, 558), parent=)
    182     self._context_menu = QtActionContextMenu(_LAYER_ACTIONS)
    184 layer_list: LayerList = model.sourceModel()._root
--> 185 self._context_menu.update_from_context(get_context(layer_list))
        layer_list = [, , , ]
        self._context_menu = 
        self = 
    186 action = self._context_menu.exec_(pos)
    187 if action is not None and isinstance(action.data(), dict):
    188     # action.data will be a callable that accepts a layer_list instance

File ~/software/conda/miniconda3/envs/napari/lib/python3.10/site-packages/napari/_qt/widgets/qt_action_context_menu.py:117, in QtActionContextMenu.update_from_context(self=, ctx=Context({'layers_selection_count': 1, 'all_layer...same_shape': True}, {}, SettingsAwareContext({})))
    115 enable = d['enable_when']
    116 if isinstance(enable, Expr):
--> 117     enable = enable.eval(ctx)
        enable = Compare(
  left=BoolOp(
    op=And(),
    values=[
      Compare(
        left=ContextKey(id='active_layer_type', ctx=Load()),
        ops=[
          Eq()],
        comparators=[
          Constant(value='image')]),
      ContextKey(id='active_layer_ndim', ctx=Load())]),
  ops=[
    Gt()],
  comparators=[
    Constant(value=2)])
        ctx = Context({'layers_selection_count': 1, 'all_layers_linked': False, 'unselected_linked_layers': 0, 'active_layer_is_rgb': False, 'active_layer_type': 'image', 'only_images_selected': True, 'only_labels_selected': False, 'only_shapes_selected': False, 'active_layer_ndim': None, 'active_layer_shape': (6, 2048, 2048), 'active_layer_dtype': 'uint16', 'all_layers_same_shape': True}, {}, SettingsAwareContext({}))
    118 item.setEnabled(bool(enable))
    119 # if it's a menu, iterate (but don't toggle visibility)

File ~/software/conda/miniconda3/envs/napari/lib/python3.10/site-packages/napari/utils/context/_expressions.py:186, in Expr.eval(self=Compare(
  left=BoolOp(
    op=And(),
    values...   Gt()],
  comparators=[
    Constant(value=2)]), context=Context({'layers_selection_count': 1, 'all_layer...same_shape': True}, {}, SettingsAwareContext({})))
    184 code = compile(ast.Expression(body=self), '<Expr>', 'eval')
    185 try:
--> 186     return eval(code, {}, context)
        code = <code object <module> at 0x7f3f939ef890, file "<Expr>", line 1>
        context = Context({'layers_selection_count': 1, 'all_layers_linked': False, 'unselected_linked_layers': 0, 'active_layer_is_rgb': False, 'active_layer_type': 'image', 'only_images_selected': True, 'only_labels_selected': False, 'only_shapes_selected': False, 'active_layer_ndim': None, 'active_layer_shape': (6, 2048, 2048), 'active_layer_dtype': 'uint16', 'all_layers_same_shape': True}, {}, SettingsAwareContext({}))
    187 except NameError:
    188     miss = {k for k in _iter_names(self) if k not in context}

File <Expr>:1

TypeError: '>' not supported between instances of 'NoneType' and 'int'

Layer name only contains first character from a multi-scale 2D image

When writing a multi-scale image with a name using ome_zarr.writer.write_multiscale, I'd expect to get that name back when reading the layer in with the reader plugin in this repo.

However, instead the name is truncated to the first character.

Here's a test that I'd expect to pass, but fails because assert "k" == "kermit" fails.

def test_read_multiscale_2d_image_with_name(path: str):
    import numpy as np
    import zarr
    from ome_zarr.io import parse_url
    from ome_zarr.writer import write_multiscale
    from napari_ome_zarr import napari_get_reader

    # write a small two-level multiscale image with a name
    data = [np.zeros((64, 32)), np.zeros((32, 16))]
    store = parse_url(path, mode="w").store
    root = zarr.group(store=store)
    write_multiscale(data, group=root, name="kermit")

    reader = napari_get_reader(path)
    layers = reader(path)

    assert len(layers) == 1
    _, read_metadata, _ = layers[0]
    assert read_metadata["name"] == "kermit"

omero metadata info used in napari

General question: what's the scope of support for omero metadata in this napari plugin? I'm thinking, for instance, of:

  • channel ID, saved in the omero metadata label field, which could set the napari layer name
  • channel color, saved in the omero metadata color field, which sets the napari layer color

There might be more info on rendering that could be used by napari layers (I'd be curious to know which ones).
I think it'd be useful to have functionality that parses omero metadata and sets layer properties in napari accordingly.
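For context, the per-channel omero metadata looks like the entry in the debug output shown earlier on this page; here is a sketch of how those fields could map onto napari layer keyword arguments (the mapping itself is a suggestion, not current plugin behaviour):

# example channel entry, values taken from the debug output above
omero_channel = {
    "label": "LaminB1",
    "color": "0000FF",
    "window": {"start": 0, "end": 1500, "min": 0, "max": 65535},
    "active": True,
}

# a possible mapping onto napari layer properties
layer_kwargs = {
    "name": omero_channel["label"],
    "contrast_limits": [omero_channel["window"]["start"], omero_channel["window"]["end"]],
    "visible": omero_channel["active"],
}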

right click on layer menu is broken

If I right-click on a layer generated by this plugin, the following error message appears in the console:

error message
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
File ~/mambaforge/envs/napari/lib/python3.9/site-packages/napari/_qt/containers/_layer_delegate.py:155, in LayerDelegate.editorEvent(self=<napari._qt.containers._layer_delegate.LayerDelegate object>, event=<PyQt5.QtGui.QMouseEvent object>, model=<napari._qt.containers.qt_layer_list.ReverseProxyModel object>, option=<PyQt5.QtWidgets.QStyleOptionViewItem object>, index=<PyQt5.QtCore.QModelIndex object>)
    147 """Called when an event has occured in the editor.
    148
    149 This can be used to customize how the delegate handles mouse/key events
    150 """
    151 if (
    152     event.type() == event.MouseButtonRelease
    153     and event.button() == Qt.RightButton
    154 ):
--> 155     self.show_context_menu(
        self = <napari._qt.containers._layer_delegate.LayerDelegate object at 0x1727589d0>
        event = <PyQt5.QtGui.QMouseEvent object at 0x17572e700>
        model = <napari._qt.containers.qt_layer_list.ReverseProxyModel object at 0x172758a60>
        option = <PyQt5.QtWidgets.QStyleOptionViewItem object at 0x28c398dd0>
        index = <PyQt5.QtCore.QModelIndex object at 0x28edb3d60>
    156         index, model, event.globalPos(), option.widget
    157     )
    159 # if the user clicks quickly on the visibility checkbox, we *don't*
    160 # want it to be interpreted as a double-click.  We want the visibilty
    161 # to simply be toggled.
    162 if event.type() == event.MouseButtonDblClick:

File ~/mambaforge/envs/napari/lib/python3.9/site-packages/napari/_qt/containers/_layer_delegate.py:187, in LayerDelegate.show_context_menu(self=<napari._qt.containers._layer_delegate.LayerDelegate object>, index=<PyQt5.QtCore.QModelIndex object>, model=<napari._qt.containers.qt_layer_list.ReverseProxyModel object>, pos=PyQt5.QtCore.QPoint(288, 594), parent=<napari._qt.containers.qt_layer_list.QtLayerList object>)
    184     self._context_menu = QtActionContextMenu(_LAYER_ACTIONS)
    186 layer_list: LayerList = model.sourceModel()._root
--> 187 self._context_menu.update_from_context(get_context(layer_list))
        layer_list = [<Image layer 'Cam2-T1' at 0x28aa5b550>, <Image layer 'Cam1-T2' at 0x173e46310>, <Image layer '2' at 0x28c3bd0a0>, <Image layer '3' at 0x28edc0ca0>]
        self._context_menu = <napari._qt.widgets.qt_action_context_menu.QtActionContextMenu object at 0x17572e550>
        self = <napari._qt.containers._layer_delegate.LayerDelegate object at 0x1727589d0>
    188 action = self._context_menu.exec_(pos)
    189 if action is not None and isinstance(action.data(), dict):
    190     # action.data will be a callable that accepts a layer_list instance

File ~/mambaforge/envs/napari/lib/python3.9/site-packages/napari/_qt/widgets/qt_action_context_menu.py:117, in QtActionContextMenu.update_from_context(self=<napari._qt.widgets.qt_action_context_menu.QtActionContextMenu object>, ctx=Context({'layers_selection_count': 1, 'all_layer...same_shape': True}, {}, SettingsAwareContext({})))
    115 enable = d['enable_when']
    116 if isinstance(enable, Expr):
--> 117     enable = enable.eval(ctx)
        enable = Compare(
  left=BoolOp(
    op=And(),
    values=[
      Compare(
        left=ContextKey(id='active_layer_type', ctx=Load()),
        ops=[
          Eq()],
        comparators=[
          Constant(value='image')]),
      ContextKey(id='active_layer_ndim', ctx=Load())]),
  ops=[
    Gt()],
  comparators=[
    Constant(value=2)])
        ctx = Context({'layers_selection_count': 1, 'all_layers_linked': False, 'unselected_linked_layers': 0, 'active_layer_is_rgb': False, 'active_layer_type': 'image', 'only_images_selected': True, 'only_labels_selected': False, 'only_shapes_selected': False, 'active_layer_ndim': None, 'active_layer_shape': (1920, 1920), 'active_layer_dtype': 'uint16', 'all_layers_same_shape': True}, {}, SettingsAwareContext({}))
    118 item.setEnabled(bool(enable))
    119 # if it's a menu, iterate (but don't toggle visibility)

File ~/mambaforge/envs/napari/lib/python3.9/site-packages/napari/utils/context/_expressions.py:186, in Expr.eval(self=Compare(
  left=BoolOp(
    op=And(),
    values...   Gt()],
  comparators=[
    Constant(value=2)]), context=Context({'layers_selection_count': 1, 'all_layer...same_shape': True}, {}, SettingsAwareContext({})))
    184 code = compile(ast.Expression(body=self), '<Expr>', 'eval')
    185 try:
--> 186     return eval(code, {}, context)
        code = <code object <module> at 0x175732240, file "<Expr>", line 1>
        context = Context({'layers_selection_count': 1, 'all_layers_linked': False, 'unselected_linked_layers': 0, 'active_layer_is_rgb': False, 'active_layer_type': 'image', 'only_images_selected': True, 'only_labels_selected': False, 'only_shapes_selected': False, 'active_layer_ndim': None, 'active_layer_shape': (1920, 1920), 'active_layer_dtype': 'uint16', 'all_layers_same_shape': True}, {}, SettingsAwareContext({}))
    187 except NameError:
    188     miss = {k for k in _iter_names(self) if k not in context}

File <Expr>:1

TypeError: '>' not supported between instances of 'NoneType' and 'int'

Looks like something that should be an int is None instead? But it's a pretty show-stopping bug, because it prevents me from converting an image layer to a label layer.

edit: here's the contents of my conda environment, in case that's of use:

conda list
# Name                    Version                   Build  Channel
aiobotocore               2.5.0                    pypi_0    pypi
aiohttp                   3.8.4                    pypi_0    pypi
aioitertools              0.11.0                   pypi_0    pypi
aiosignal                 1.3.1                    pypi_0    pypi
alabaster                 0.7.13             pyhd8ed1ab_0    conda-forge
aom                       3.5.0                h7ea286d_0    conda-forge
app-model                 0.1.4              pyhd8ed1ab_0    conda-forge
appdirs                   1.4.4              pyh9f0ad1d_0    conda-forge
appnope                   0.1.3              pyhd8ed1ab_0    conda-forge
asciitree                 0.3.3                      py_2    conda-forge
asttokens                 2.2.1              pyhd8ed1ab_0    conda-forge
async-timeout             4.0.2                    pypi_0    pypi
attrs                     23.1.0             pyh71513ae_1    conda-forge
aws-c-auth                0.6.28               h826ccd7_5    conda-forge
aws-c-cal                 0.5.27               h92f41cd_0    conda-forge
aws-c-common              0.8.20               hb547adb_0    conda-forge
aws-c-compression         0.2.17               h28992cd_0    conda-forge
aws-c-event-stream        0.3.0                h7f691c3_6    conda-forge
aws-c-http                0.7.8                he2e4218_4    conda-forge
aws-c-io                  0.13.26              h379bc4c_0    conda-forge
aws-c-mqtt                0.8.13               hdb3a981_2    conda-forge
aws-c-s3                  0.3.4                h02343f9_5    conda-forge
aws-c-sdkutils            0.1.10               h28992cd_0    conda-forge
aws-checksums             0.1.16               h28992cd_0    conda-forge
aws-crt-cpp               0.20.2               h37dd33e_9    conda-forge
aws-sdk-cpp               1.10.57             h9b90b14_14    conda-forge
babel                     2.12.1             pyhd8ed1ab_1    conda-forge
backcall                  0.2.0              pyh9f0ad1d_0    conda-forge
backports                 1.0                pyhd8ed1ab_3    conda-forge
backports.functools_lru_cache 1.6.4              pyhd8ed1ab_0    conda-forge
blosc                     1.21.4               hc338f07_0    conda-forge
bokeh                     3.1.1              pyhd8ed1ab_0    conda-forge
botocore                  1.29.76                  pypi_0    pypi
brotli                    1.0.9                h1a8c8d9_8    conda-forge
brotli-bin                1.0.9                h1a8c8d9_8    conda-forge
brotlipy                  0.7.0           py39h02fc5c5_1005    conda-forge
brunsli                   0.1                  h9f76cd9_0    conda-forge
bzip2                     1.0.8                h3422bc3_4    conda-forge
c-ares                    1.19.1               hb547adb_0    conda-forge
c-blosc2                  2.9.2                h068da5f_0    conda-forge
ca-certificates           2023.5.7             hf0a4a13_0    conda-forge
cachey                    0.2.1              pyh9f0ad1d_0    conda-forge
certifi                   2023.5.7           pyhd8ed1ab_0    conda-forge
cffi                      1.15.1           py39h7e6b969_3    conda-forge
cfitsio                   4.2.0                h2f961c4_0    conda-forge
charls                    2.4.2                h13dd4ca_0    conda-forge
charset-normalizer        3.1.0              pyhd8ed1ab_0    conda-forge
click                     8.1.3           unix_pyhd8ed1ab_2    conda-forge
cloudpickle               2.2.1              pyhd8ed1ab_0    conda-forge
colorama                  0.4.6              pyhd8ed1ab_0    conda-forge
comm                      0.1.3              pyhd8ed1ab_0    conda-forge
contourpy                 1.1.0            py39hbd775c9_0    conda-forge
cryptography              41.0.1           py39had97604_0    conda-forge
cytoolz                   0.12.0           py39h02fc5c5_1    conda-forge
dask                      2023.6.0           pyhd8ed1ab_0    conda-forge
dask-core                 2023.6.0           pyhd8ed1ab_0    conda-forge
dav1d                     1.2.1                hb547adb_0    conda-forge
debugpy                   1.6.7            py39h23fbdae_0    conda-forge
decorator                 5.1.1              pyhd8ed1ab_0    conda-forge
distributed               2023.6.0           pyhd8ed1ab_0    conda-forge
docstring_parser          0.15               pyhd8ed1ab_0    conda-forge
docutils                  0.20.1           py39h2804cbe_0    conda-forge
entrypoints               0.4                pyhd8ed1ab_0    conda-forge
executing                 1.2.0              pyhd8ed1ab_0    conda-forge
expat                     2.5.0                hb7217d7_1    conda-forge
fasteners                 0.17.3             pyhd8ed1ab_0    conda-forge
font-ttf-dejavu-sans-mono 2.37                 hab24e00_0    conda-forge
font-ttf-inconsolata      3.000                h77eed37_0    conda-forge
font-ttf-source-code-pro  2.038                h77eed37_0    conda-forge
font-ttf-ubuntu           0.83                 hab24e00_0    conda-forge
fontconfig                2.14.2               h82840c6_0    conda-forge
fonts-conda-ecosystem     1                             0    conda-forge
fonts-conda-forge         1                             0    conda-forge
freetype                  2.12.1               hd633e50_1    conda-forge
freetype-py               2.4.0              pyhd8ed1ab_0    conda-forge
frozenlist                1.3.3                    pypi_0    pypi
fsspec                    2023.6.0           pyh1a96a4e_0    conda-forge
gettext                   0.21.1               h0186832_0    conda-forge
gflags                    2.2.2             hc88da5d_1004    conda-forge
giflib                    5.2.1                h1a8c8d9_3    conda-forge
glib                      2.76.3               ha614eb4_0    conda-forge
glib-tools                2.76.3               ha614eb4_0    conda-forge
glog                      0.6.0                h6da1cb0_0    conda-forge
gst-plugins-base          1.22.3               h27255cc_1    conda-forge
gstreamer                 1.22.3               he42f4ea_1    conda-forge
heapdict                  1.0.1                      py_0    conda-forge
hsluv                     5.0.2              pyh44b312d_0    conda-forge
icu                       70.1                 h6b3803e_0    conda-forge
idna                      3.4                pyhd8ed1ab_0    conda-forge
imagecodecs               2023.1.23        py39h43d391a_0    conda-forge
imageio                   2.31.1             pyh24c5eb1_0    conda-forge
imagesize                 1.4.1              pyhd8ed1ab_0    conda-forge
importlib-metadata        6.7.0              pyha770c72_0    conda-forge
importlib_metadata        6.7.0                hd8ed1ab_0    conda-forge
importlib_resources       5.12.0             pyhd8ed1ab_0    conda-forge
in-n-out                  0.1.7              pyhd8ed1ab_0    conda-forge
ipykernel                 6.23.2             pyh5fb750a_0    conda-forge
ipython                   8.14.0             pyhd1c38e8_0    conda-forge
ipython_genutils          0.2.0                      py_1    conda-forge
jedi                      0.18.2             pyhd8ed1ab_0    conda-forge
jinja2                    3.1.2              pyhd8ed1ab_1    conda-forge
jmespath                  1.0.1                    pypi_0    pypi
jpeg                      9e                   h1a8c8d9_3    conda-forge
jsonschema                4.17.3             pyhd8ed1ab_0    conda-forge
jupyter_client            8.2.0              pyhd8ed1ab_0    conda-forge
jupyter_core              5.3.1            py39h2804cbe_0    conda-forge
jxrlib                    1.1                  h27ca646_2    conda-forge
kiwisolver                1.4.4            py39haaf3ac1_1    conda-forge
krb5                      1.20.1               h69eda48_0    conda-forge
lazy_loader               0.2                pyhd8ed1ab_0    conda-forge
lcms2                     2.15                 h481adae_0    conda-forge
lerc                      4.0.0                h9a09cb3_0    conda-forge
libabseil                 20230125.2      cxx17_h13dd4ca_2    conda-forge
libaec                    1.0.6                hb7217d7_1    conda-forge
libarrow                  12.0.0           hbfb5349_8_cpu    conda-forge
libavif                   0.11.1               h9f83d30_2    conda-forge
libblas                   3.9.0           17_osxarm64_openblas    conda-forge
libbrotlicommon           1.0.9                h1a8c8d9_8    conda-forge
libbrotlidec              1.0.9                h1a8c8d9_8    conda-forge
libbrotlienc              1.0.9                h1a8c8d9_8    conda-forge
libcblas                  3.9.0           17_osxarm64_openblas    conda-forge
libclang                  14.0.6          default_h5dc8d65_1    conda-forge
libclang13                14.0.6          default_hc7183e1_1    conda-forge
libcrc32c                 1.1.2                hbdafb3b_0    conda-forge
libcurl                   8.1.2                h912dcd9_0    conda-forge
libcxx                    16.0.6               h4653b0c_0    conda-forge
libdeflate                1.17                 h1a8c8d9_0    conda-forge
libedit                   3.1.20191231         hc8eb9b7_2    conda-forge
libev                     4.33                 h642e427_1    conda-forge
libevent                  2.1.12               h2757513_1    conda-forge
libexpat                  2.5.0                hb7217d7_1    conda-forge
libffi                    3.4.2                h3422bc3_5    conda-forge
libgfortran               5.0.0           12_2_0_hd922786_31    conda-forge
libgfortran5              12.2.0              h0eea778_31    conda-forge
libglib                   2.76.3               h24e9cb9_0    conda-forge
libgoogle-cloud           2.11.0               h5263b79_1    conda-forge
libgrpc                   1.55.1               hc384137_1    conda-forge
libiconv                  1.17                 he4db4b2_0    conda-forge
liblapack                 3.9.0           17_osxarm64_openblas    conda-forge
libllvm14                 14.0.6               hd1a9a77_3    conda-forge
libnghttp2                1.52.0               hae82a92_0    conda-forge
libogg                    1.3.4                h27ca646_1    conda-forge
libopenblas               0.3.23          openmp_hc731615_0    conda-forge
libopus                   1.3.1                h27ca646_1    conda-forge
libpng                    1.6.39               h76d750c_0    conda-forge
libpq                     15.3                 h7126958_1    conda-forge
libprotobuf               4.23.2               hf32f9b9_5    conda-forge
libsodium                 1.0.18               h27ca646_1    conda-forge
libsqlite                 3.42.0               hb31c410_0    conda-forge
libssh2                   1.11.0               h7a5bd25_0    conda-forge
libthrift                 0.18.1               ha061701_2    conda-forge
libtiff                   4.5.0                h5dffbdd_2    conda-forge
libutf8proc               2.8.0                h1a8c8d9_0    conda-forge
libvorbis                 1.3.7                h9f76cd9_0    conda-forge
libwebp-base              1.3.0                h1a8c8d9_0    conda-forge
libxcb                    1.13              h9b22ae9_1004    conda-forge
libzlib                   1.2.13               h53f4e23_5    conda-forge
libzopfli                 1.0.3                h9f76cd9_0    conda-forge
llvm-openmp               16.0.6               h1c12783_0    conda-forge
locket                    1.0.0              pyhd8ed1ab_0    conda-forge
lz4                       4.3.2            py39hb35ce34_0    conda-forge
lz4-c                     1.9.4                hb7217d7_0    conda-forge
magicgui                  0.7.2              pyhd8ed1ab_0    conda-forge
markdown-it-py            3.0.0              pyhd8ed1ab_0    conda-forge
markupsafe                2.1.3            py39h0f82c59_0    conda-forge
matplotlib-inline         0.1.6              pyhd8ed1ab_0    conda-forge
mdurl                     0.1.0              pyhd8ed1ab_0    conda-forge
msgpack-python            1.0.5            py39haaf3ac1_0    conda-forge
multidict                 6.0.4                    pypi_0    pypi
mypy_extensions           1.0.0              pyha770c72_0    conda-forge
mysql-common              8.0.33               h7b5afe1_0    conda-forge
mysql-libs                8.0.33               hb292caa_0    conda-forge
napari                    0.4.16          pyh275ddea_0_pyqt    conda-forge
napari-console            0.0.8              pyhd8ed1ab_0    conda-forge
napari-ome-zarr           0.5.2                    pypi_0    pypi
napari-plugin-engine      0.2.0              pyhd8ed1ab_2    conda-forge
napari-svg                0.1.7              pyhd8ed1ab_0    conda-forge
ncurses                   6.4                  h7ea286d_0    conda-forge
nest-asyncio              1.5.6              pyhd8ed1ab_0    conda-forge
networkx                  3.1                pyhd8ed1ab_0    conda-forge
npe2                      0.7.0              pyhd8ed1ab_0    conda-forge
nspr                      4.35                 hb7217d7_0    conda-forge
nss                       3.89                 h789eff7_0    conda-forge
numcodecs                 0.11.0           py39h23fbdae_1    conda-forge
numpy                     1.24.3                   pypi_0    pypi
numpydoc                  1.5.0              pyhd8ed1ab_0    conda-forge
ome-zarr                  0.7.1                    pypi_0    pypi
openjpeg                  2.5.0                hbc2ba62_2    conda-forge
openssl                   3.1.1                h53f4e23_1    conda-forge
orc                       1.8.4                h13b7ede_0    conda-forge
packaging                 23.1               pyhd8ed1ab_0    conda-forge
pandas                    2.0.2            py39h6b13a34_0    conda-forge
parso                     0.8.3              pyhd8ed1ab_0    conda-forge
partd                     1.4.0              pyhd8ed1ab_0    conda-forge
pcre2                     10.40                hb34f9b4_0    conda-forge
pexpect                   4.8.0              pyh1a96a4e_2    conda-forge
pickleshare               0.7.5                   py_1003    conda-forge
pillow                    9.4.0            py39h8bd98a6_1    conda-forge
pint                      0.22               pyhd8ed1ab_1    conda-forge
pip                       23.1.2             pyhd8ed1ab_0    conda-forge
pkgutil-resolve-name      1.3.10             pyhd8ed1ab_0    conda-forge
platformdirs              3.6.0              pyhd8ed1ab_0    conda-forge
ply                       3.11                       py_1    conda-forge
pooch                     1.7.0              pyha770c72_3    conda-forge
prompt-toolkit            3.0.38             pyha770c72_0    conda-forge
prompt_toolkit            3.0.38               hd8ed1ab_0    conda-forge
psutil                    5.9.5            py39h02fc5c5_0    conda-forge
psygnal                   0.9.0              pyhd8ed1ab_0    conda-forge
pthread-stubs             0.4               h27ca646_1001    conda-forge
ptyprocess                0.7.0              pyhd3deb0d_0    conda-forge
pure_eval                 0.2.2              pyhd8ed1ab_0    conda-forge
pyarrow                   12.0.0          py39hf40061a_8_cpu    conda-forge
pycparser                 2.21               pyhd8ed1ab_0    conda-forge
pydantic                  1.10.9           py39h0f82c59_0    conda-forge
pygments                  2.15.1             pyhd8ed1ab_0    conda-forge
pyopengl                  3.1.6              pyhd8ed1ab_1    conda-forge
pyopenssl                 23.2.0             pyhd8ed1ab_1    conda-forge
pyproject_hooks           1.0.0              pyhd8ed1ab_0    conda-forge
pyqt                      5.15.7           py39h7fba1b6_3    conda-forge
pyqt5-sip                 12.11.0          py39h23fbdae_3    conda-forge
pyrsistent                0.19.3           py39h02fc5c5_0    conda-forge
pysocks                   1.7.1              pyha2e5f31_6    conda-forge
python                    3.9.16          hea58f1e_0_cpython    conda-forge
python-build              0.10.0             pyhd8ed1ab_1    conda-forge
python-dateutil           2.8.2              pyhd8ed1ab_0    conda-forge
python-tzdata             2023.3             pyhd8ed1ab_0    conda-forge
python_abi                3.9                      3_cp39    conda-forge
pytomlpp                  1.0.13           py39haaf3ac1_0    conda-forge
pytz                      2023.3             pyhd8ed1ab_0    conda-forge
pywavelets                1.4.1            py39h4d8bf0d_0    conda-forge
pyyaml                    6.0              py39h02fc5c5_5    conda-forge
pyzmq                     25.1.0           py39h1e134f0_0    conda-forge
qt-main                   5.15.8               hfe8d25c_6    conda-forge
qtconsole-base            5.4.3              pyha770c72_0    conda-forge
qtpy                      2.3.1              pyhd8ed1ab_0    conda-forge
re2                       2023.03.02           hc5e2d97_0    conda-forge
readline                  8.2                  h92ec313_1    conda-forge
requests                  2.31.0             pyhd8ed1ab_0    conda-forge
rich                      13.4.2             pyhd8ed1ab_0    conda-forge
s3fs                      2023.6.0                 pypi_0    pypi
scikit-image              0.20.0           py39hd28f0be_1    conda-forge
scipy                     1.9.1            py39h737da60_0    conda-forge
setuptools                68.0.0             pyhd8ed1ab_0    conda-forge
shellingham               1.5.1              pyhd8ed1ab_0    conda-forge
sip                       6.7.9            py39hb198ff7_0    conda-forge
six                       1.16.0             pyh6c4a22f_0    conda-forge
snappy                    1.1.10               h17c5cce_0    conda-forge
snowballstemmer           2.2.0              pyhd8ed1ab_0    conda-forge
sortedcontainers          2.4.0              pyhd8ed1ab_0    conda-forge
sphinx                    7.0.1              pyhd8ed1ab_0    conda-forge
sphinxcontrib-applehelp   1.0.4              pyhd8ed1ab_0    conda-forge
sphinxcontrib-devhelp     1.0.2                      py_0    conda-forge
sphinxcontrib-htmlhelp    2.0.1              pyhd8ed1ab_0    conda-forge
sphinxcontrib-jsmath      1.0.1                      py_0    conda-forge
sphinxcontrib-qthelp      1.0.3                      py_0    conda-forge
sphinxcontrib-serializinghtml 1.1.5              pyhd8ed1ab_2    conda-forge
stack_data                0.6.2              pyhd8ed1ab_0    conda-forge
superqt                   0.4.1              pyhd8ed1ab_0    conda-forge
tblib                     1.7.0              pyhd8ed1ab_0    conda-forge
tifffile                  2023.4.12          pyhd8ed1ab_0    conda-forge
tk                        8.6.12               he1e0b03_0    conda-forge
toml                      0.10.2             pyhd8ed1ab_0    conda-forge
tomli                     2.0.1              pyhd8ed1ab_0    conda-forge
toolz                     0.12.0             pyhd8ed1ab_0    conda-forge
tornado                   6.3.2            py39h0f82c59_0    conda-forge
tqdm                      4.65.0             pyhd8ed1ab_1    conda-forge
traitlets                 5.9.0              pyhd8ed1ab_0    conda-forge
typer                     0.9.0              pyhd8ed1ab_0    conda-forge
typing-extensions         4.6.3                hd8ed1ab_0    conda-forge
typing_extensions         4.6.3              pyha770c72_0    conda-forge
tzdata                    2023c                h71feb2d_0    conda-forge
urllib3                   1.26.16                  pypi_0    pypi
vispy                     0.10.0           py39h6eccaaf_0    conda-forge
wcwidth                   0.2.6              pyhd8ed1ab_0    conda-forge
wheel                     0.40.0             pyhd8ed1ab_0    conda-forge
wrapt                     1.15.0           py39h02fc5c5_0    conda-forge
xorg-libxau               1.0.11               hb547adb_0    conda-forge
xorg-libxdmcp             1.1.3                h27ca646_0    conda-forge
xyzservices               2023.5.0           pyhd8ed1ab_1    conda-forge
xz                        5.2.6                h57fd34a_0    conda-forge
yaml                      0.2.5                h3422bc3_2    conda-forge
yarl                      1.9.2                    pypi_0    pypi
zarr                      2.15.0             pyhd8ed1ab_0    conda-forge
zeromq                    4.3.4                hbdafb3b_1    conda-forge
zfp                       1.0.0                hb6e4faa_3    conda-forge
zict                      3.0.0              pyhd8ed1ab_0    conda-forge
zipp                      3.15.0             pyhd8ed1ab_0    conda-forge
zlib-ng                   2.0.7                h1a8c8d9_0    conda-forge
zstd                      1.5.2                hf913c23_6    conda-forge

Support bioformats2raw.layout

As noted in previous discussion at #21, the data generated with:

$ bioformats2raw image directory.zarr

cannot be opened with:

$ napari directory.zarr.

What should be the expected behaviour in this case?

  • Open ALL the images under directory.zarr/ in multiple layers (a rough sketch of this option follows the list)? The user could then turn off or delete the layers that they don't want to use. This should probably be the default behaviour in the case where there is only 1 image. But what if there are 3, or 100?
  • In the case of an OME-NGFF plate, we show a grid of thumbnails for the first image in each Well. We could try to do the same for non-plate bioformats2raw.layout, but this doesn't give users an easy way to open the Images themselves.
  • Any other way that a napari file-reader plugin can direct users to the images within the directory.zarr?
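
A rough sketch of the first option, assuming the bioformats2raw series are simply the numeric groups ("0", "1", ...) directly under directory.zarr (paths are placeholders; this is not current plugin behaviour):

import zarr
import napari

viewer = napari.Viewer()
root = zarr.open_group("directory.zarr", mode="r")
# bioformats2raw writes one numeric group per series under the root
series = sorted((k for k in root.group_keys() if k.isdigit()), key=int)
for name in series:
    # hand each series to the plugin so it is opened as a normal multiscales image
    viewer.open(f"directory.zarr/{name}", plugin="napari-ome-zarr")
napari.run()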

Is there an equivalent of the `channel_axis` property?

Hello,

Apologies again for another naive question. Is there an equivalent of the channel_axis argument found in napari.view_image() that I can use when loading an image with viewer.open('path to ome-zarr')?

To give some context, I am trying to load multiplexed imaging data with n number of markers and I would like each marker to be on its own layer.

Here is what I used to do previously with ome.tiff. Once I load the image, I overlay things that are of interest for data analysis. Now I would like to do the same, but with an ome-zarr folder.

Thank you very much.
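
One possible workaround (a sketch, not a plugin feature): read the multiscale arrays with ome-zarr-py and add them yourself, where channel_axis is available. The path and the channel-first axis order are assumptions.

import napari
from ome_zarr.io import parse_url
from ome_zarr.reader import Reader

# the first node returned by the reader is the full-resolution multiscales image
image_node = list(Reader(parse_url("path/to/image.ome.zarr"))())[0]
pyramid = image_node.data  # list of dask arrays, one per resolution level

viewer = napari.Viewer()
# channel_axis=0 puts each marker on its own layer (assumes channels are the first axis)
viewer.add_image(pyramid, channel_axis=0, multiscale=True)
napari.run()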

napari failing on labels v0.2 version of idr:6001247

(z) /tmp $ NAPARI_CATCH_ERRORS=0 napari 6001247.zarr/
/usr/local/anaconda3/envs/z/lib/python3.9/site-packages/napari/__main__.py:419: UserWarning: pythonw executable not found.
To unfreeze the menubar on macOS, click away from napari to another app, then reactivate napari. To avoid this problem, please install python.app in conda using:
conda install -c conda-forge python.app
  warnings.warn(msg)
WARNING: Layer-backing can not be explicitly controlled on 10.14 when built against the 10.14 SDK
15:00:36 WARNING Layer-backing can not be explicitly controlled on 10.14 when built against the 10.14 SDK
Traceback (most recent call last):
  File "/usr/local/anaconda3/envs/z/bin/napari", line 8, in <module>
    sys.exit(main())
  File "/usr/local/anaconda3/envs/z/lib/python3.9/site-packages/napari/__main__.py", line 420, in main
    _run()
  File "/usr/local/anaconda3/envs/z/lib/python3.9/site-packages/napari/__main__.py", line 304, in _run
    viewer = view_path(  # noqa: F841
  File "/usr/local/anaconda3/envs/z/lib/python3.9/site-packages/napari/view_layers.py", line 169, in view_path
    return _make_viewer_then('open', args, kwargs)
  File "/usr/local/anaconda3/envs/z/lib/python3.9/site-packages/napari/view_layers.py", line 119, in _make_viewer_then
    method(*args, **kwargs)
  File "/usr/local/anaconda3/envs/z/lib/python3.9/site-packages/napari/components/viewer_model.py", line 856, in open
    self._add_layers_with_plugins(
  File "/usr/local/anaconda3/envs/z/lib/python3.9/site-packages/napari/components/viewer_model.py", line 929, in _add_layers_with_plugins
    added.extend(self._add_layer_from_data(*_data))
  File "/usr/local/anaconda3/envs/z/lib/python3.9/site-packages/napari/components/viewer_model.py", line 1006, in _add_layer_from_data
    raise exc
  File "/usr/local/anaconda3/envs/z/lib/python3.9/site-packages/napari/components/viewer_model.py", line 1003, in _add_layer_from_data
    layer = add_method(data, **(meta or {}))
  File "<string>", line 4, in add_labels
  File "/usr/local/anaconda3/envs/z/lib/python3.9/site-packages/napari/layers/labels/labels.py", line 202, in __init__
    self._properties, self._label_index = self._prepare_properties(
  File "/usr/local/anaconda3/envs/z/lib/python3.9/site-packages/napari/layers/labels/labels.py", line 429, in _prepare_properties
    label_index = cls._map_index(properties)
  File "/usr/local/anaconda3/envs/z/lib/python3.9/site-packages/napari/layers/labels/labels.py", line 511, in _map_index
    max_len = max(len(x) for x in properties.values())
  File "/usr/local/anaconda3/envs/z/lib/python3.9/site-packages/napari/layers/labels/labels.py", line 511, in <genexpr>
    max_len = max(len(x) for x in properties.values())
TypeError: len() of unsized object

Plugin opens image, but does not display data (everything is 0); SOLVED: must use "/" dimension separator.

Hello OME team,

Thank you so much for all your open source efforts, I'm really interested in using the ome-ngff-zarr format, especially if it becomes more widely used by an array of third party tools such as napari.

I've generated an ome-ngff-zarr file which I can drag and drop into napari; I select this plugin to open it, and the file opens with no complaints. What I see is an image layer with the correct dimensions, but all voxels are 0-valued. If I read the datasets from this ome-ngff-zarr file as numpy arrays and write them as tiff files, those open just fine in napari, so the voxel data in the file is fine. So something about the file metadata is either not correct or not being processed correctly by napari-ome-zarr.

The file is 1 channel, 512 voxels along each spatial axis, and there are 5 scale levels. Here is the zarr.tree() representation:
[screenshot: zarr.tree() output]

Here are the contents of the top level .zattrs file. Can you tell me if anything in this looks suspicious or incorrect?

 {
    "multiscales": [
        {
            "@type": "ngff:Image",
            "axes": [
                {
                    "name": "c",
                    "type": "channel",
                    "unit": null
                },
                {
                    "name": "z",
                    "type": "space",
                    "unit": "micrometer"
                },
                {
                    "name": "y",
                    "type": "space",
                    "unit": "micrometer"
                },
                {
                    "name": "x",
                    "type": "space",
                    "unit": "micrometer"
                }
            ],
            "coordinateTransformations": [],
            "datasets": [
                {
                    "coordinateTransformations": [
                        {
                            "scale": [
                                1.0,
                                2.0,
                                1.0,
                                1.0
                            ],
                            "type": "scale"
                        },
                        {
                            "translation": [
                                1.0,
                                0.0,
                                0.0,
                                0.0
                            ],
                            "type": "translation"
                        }
                    ],
                    "path": "scale0/image"
                },
                {
                    "coordinateTransformations": [
                        {
                            "scale": [
                                1.0,
                                4.0,
                                2.0,
                                2.0
                            ],
                            "type": "scale"
                        },
                        {
                            "translation": [
                                1.0,
                                1.0,
                                0.5,
                                0.5
                            ],
                            "type": "translation"
                        }
                    ],
                    "path": "scale1/image"
                },
                {
                    "coordinateTransformations": [
                        {
                            "scale": [
                                1.0,
                                8.0,
                                4.0,
                                4.0
                            ],
                            "type": "scale"
                        },
                        {
                            "translation": [
                                1.0,
                                3.0,
                                1.5,
                                1.5
                            ],
                            "type": "translation"
                        }
                    ],
                    "path": "scale2/image"
                },
                {
                    "coordinateTransformations": [
                        {
                            "scale": [
                                1.0,
                                16.0,
                                8.0,
                                8.0
                            ],
                            "type": "scale"
                        },
                        {
                            "translation": [
                                1.0,
                                7.0,
                                3.5,
                                3.5
                            ],
                            "type": "translation"
                        }
                    ],
                    "path": "scale3/image"
                },
                {
                    "coordinateTransformations": [
                        {
                            "scale": [
                                1.0,
                                32.0,
                                16.0,
                                16.0
                            ],
                            "type": "scale"
                        },
                        {
                            "translation": [
                                1.0,
                                15.0,
                                7.5,
                                7.5
                            ],
                            "type": "translation"
                        }
                    ],
                    "path": "scale4/image"
                }
            ],
            "name": "image",
            "version": "0.4"
        }
    ]
}
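
For reference, a minimal sketch of writing an array with the nested "/" dimension separator that resolved this, using plain zarr-python (shape, chunks and paths are placeholders; the ome-zarr-py writer creates a nested store by default):

import numpy as np
import zarr

data = np.zeros((1, 512, 512, 512), dtype="uint16")  # placeholder volume
# dimension_separator="/" makes chunk keys nested, e.g. scale0/image/0/0/0/0
store = zarr.DirectoryStore("image.ome.zarr", dimension_separator="/")
root = zarr.group(store=store)
root.create_dataset("scale0/image", data=data, chunks=(1, 64, 64, 64))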

Error loading HCS plates from S3

The plugin loads an HCS plate (sparse or otherwise) fine when saved locally (though I'd prefer it not throw warnings for every missing well); however, when I try to load the same file from an S3 bucket (viewer.open(s3_path, plugin="napari-ome-zarr")) it throws the error: Exception: Could not find first well.

Is it not possible to load an HCS plate from S3? I figured with zarr.storage.FSStore and potentially fsspec.mapping.FSMap this should be doable. I will dig into the code further but any help or insight would be appreciated.

Note the plugin is able to load an image from S3 when a specific field is defined (e.g. viewer.open(s3_path + "/A/2/0", plugin="napari-ome-zarr")).

Code

import zarr
import string
from numpy import ones
import napari
import subprocess

from ome_zarr.io import parse_url
from ome_zarr.writer import write_image, write_plate_metadata, write_well_metadata


local_path = "test.ome.zarr"
s3_path = "s3://test-bucket/test.ome.zarr"

# write file to local
def write_HCS_plate(
    file_path, 
    well_paths = ["A/2", "B/3"], 
    num_rows=2, 
    num_cols=3, 
    field_paths = ["0", "1", "2"],
    ):
    store = parse_url(str(file_path), mode="w").store
    root = zarr.group(store=store)

    row_names = string.ascii_uppercase[: num_rows]
    col_names = list(map(str, range(1, num_cols + 1)))
    write_plate_metadata(root, row_names, col_names, well_paths)
    for wp in well_paths:
        row, col = wp.split("/")
        row_group = root.require_group(row)
        well = row_group.require_group(col)
        write_well_metadata(well, field_paths)
        for field in field_paths:
            image = well.require_group(str(field))
            write_image(ones((1, 1, 1, 256, 256)), image)

write_HCS_plate(local_path)

# read file from local
viewer = napari.Viewer()
viewer.open(local_path, plugin="napari-ome-zarr")

# copy file to s3
p = subprocess.Popen(f'aws s3 sync {local_path} {s3_path}', shell=True)
p.wait()

# read file from s3
viewer = napari.Viewer()
viewer.open(s3_path, plugin="napari-ome-zarr")

One related gripe: the only place I can find the pattern for writing HCS plates is in ome_zarr.tests.test_reader.test_multiwells_plate. It would be helpful to (1) provide an example in the documentation or (2) provide a wrapper function in ome_zarr.writer. I will see if I can provide a pull request for (1), given it's less intrusive.

UPDATE: added an example to the docs via this pull request: ome/ome-zarr-py#317

Error

Full Error
---------------------------------------------------------------------------
Exception                                 Traceback (most recent call last)
Cell In[19], line 13
     12 viewer = napari.Viewer()
---> 13 viewer.open(s3_path, plugin="napari-ome-zarr")

File [~/micromamba/envs/rb_p310/lib/python3.10/site-packages/napari/components/viewer_model.py:1092], in ViewerModel.open(self, path, stack, plugin, layer_type, **kwargs)
   1089 _path = [_path] if not isinstance(_path, list) else _path
   1090 if plugin:
   1091     added.extend(
-> 1092         self._add_layers_with_plugins(
   1093             _path,
   1094             kwargs=kwargs,
   1095             plugin=plugin,
   1096             layer_type=layer_type,
   1097             stack=_stack,
   1098         )
   1099     )
   1100 # no plugin choice was made
   1101 else:
   1102     layers = self._open_or_raise_error(
   1103         _path, kwargs, layer_type, _stack
   1104     )

File [~/micromamba/envs/rb_p310/lib/python3.10/site-packages/napari/components/viewer_model.py:1292], in ViewerModel._add_layers_with_plugins(self, paths, stack, kwargs, plugin, layer_type)
   1290 else:
   1291     assert len(paths) == 1
-> 1292     layer_data, hookimpl = read_data_with_plugins(
   1293         paths, plugin=plugin, stack=stack
   1294     )
   1296 # glean layer names from filename. These will be used as *fallback*
   1297 # names, if the plugin does not return a name kwarg in their meta dict.
   1298 filenames = []

File [~/micromamba/envs/rb_p310/lib/python3.10/site-packages/napari/plugins/io.py:77], in read_data_with_plugins(paths, plugin, stack)
     74     assert len(paths) == 1
     75 hookimpl: Optional[HookImplementation]
---> 77 res = _npe2.read(paths, plugin, stack=stack)
     78 if res is not None:
     79     _ld, hookimpl = res

File [~/micromamba/envs/rb_p310/lib/python3.10/site-packages/napari/plugins/_npe2.py:63], in read(paths, plugin, stack)
     61     npe1_path = paths[0]
     62 try:
---> 63     layer_data, reader = io_utils.read_get_reader(
     64         npe1_path, plugin_name=plugin
     65     )
     66 except ValueError as e:
     67     # plugin wasn't passed and no reader was found
     68     if 'No readers returned data' not in str(e):

File [~/micromamba/envs/rb_p310/lib/python3.10/site-packages/npe2/io_utils.py:66], in read_get_reader(path, plugin_name, stack)
     62 if stack is None:
     63     # "npe1" old path
     64     # Napari 0.4.15 and older, hopefully we can drop this and make stack mandatory
     65     new_path, new_stack = v1_to_v2(path)
---> 66     return _read(
     67         new_path, plugin_name=plugin_name, return_reader=True, stack=new_stack
     68     )
     69 else:
     70     assert isinstance(path, list)

File [~/micromamba/envs/rb_p310/lib/python3.10/site-packages/npe2/io_utils.py:165], in _read(paths, stack, plugin_name, return_reader, _pm)
    160     read_func = rdr.exec(
    161         kwargs={"path": paths, "stack": stack, "_registry": _pm.commands}
    162     )
    163     if read_func is not None:
    164         # if the reader function raises an exception here, we don't try to catch it
--> 165         if layer_data := read_func(paths, stack=stack):
    166             return (layer_data, rdr) if return_reader else layer_data
    168 if plugin_name:

File [~/micromamba/envs/rb_p310/lib/python3.10/site-packages/npe2/manifest/contributions/_readers.py:60], in ReaderContribution.exec..npe1_compat(paths, stack)
     57 @wraps(callable_)
     58 def npe1_compat(paths, *, stack):
     59     path = v2_to_v1(paths, stack)
---> 60     return callable_(path)

File [~/micromamba/envs/rb_p310/lib/python3.10/site-packages/napari_ome_zarr/_reader.py:105], in transform..f(*args, **kwargs)
    102 def f(*args: Any, **kwargs: Any) -> List[LayerData]:
    103     results: List[LayerData] = list()
--> 105     for node in nodes:
    106         data: List[Any] = node.data
    107         metadata: Dict[str, Any] = {}

File [~/micromamba/envs/rb_p310/lib/python3.10/site-packages/ome_zarr/reader.py:625], in Reader.__call__(self)
    624 def __call__(self) -> Iterator[Node]:
--> 625     node = Node(self.zarr, self)
    626     if node.specs:  # Something has matched
    627         LOGGER.debug("treating %s as ome-zarr", self.zarr)

File [~/micromamba/envs/rb_p310/lib/python3.10/site-packages/ome_zarr/reader.py:59], in Node.__init__(self, zarr, root, visibility, plate_labels)
     57     self.specs.append(PlateLabels(self))
     58 elif Plate.matches(zarr):
---> 59     self.specs.append(Plate(self))
     60     # self.add(zarr, plate_labels=True)
     61 if Well.matches(zarr):

File [~/micromamba/envs/rb_p310/lib/python3.10/site-packages/ome_zarr/reader.py:478], in Plate.__init__(self, node)
    476 super().__init__(node)
    477 LOGGER.debug("Plate created with ZarrLocation fmt: %s", self.zarr.fmt)
--> 478 self.get_pyramid_lazy(node)

File [~/micromamba/envs/rb_p310/lib/python3.10/site-packages/ome_zarr/reader.py:504], in Plate.get_pyramid_lazy(self, node)
    502 well_spec: Optional[Well] = well_node.first(Well)
    503 if well_spec is None:
--> 504     raise Exception("Could not find first well")
    505 self.numpy_type = well_spec.numpy_type
    507 LOGGER.debug("img_pyramid_shapes: %s", well_spec.img_pyramid_shapes)

Exception: Could not find first well

Versions

napari-ome-zarr = 0.5.2
ome-zarr = 0.8.0
zarr = 2.16.0
napari = 0.4.18
python = 3.10.12

issues with loading labels data with this plugin

I'm working on saving a dataset that contains multiple segmentations per image (manual segmentations by different people). Trying stuff out, I'm having three related issues:

  • the labels layer is invisible by default. This isn't super great for the intended use of the dataset. I understand that there may be performance reasons why that wouldn't work, but it might be worth considering the alternatives (e.g. only one labels layer is visible by default). Overall this would be minor if not for:
  • If I have more than one labels group in the same ome-zarr dataset, only the first group is loaded. (1) And,
  • If I only have labels in the ome-zarr dataset, the plugin fails to read at all. (2)

I test reading in with napari --plugin napari-ome-zarr path/to/example.zarr. I've got two scripts to reproduce the issue, both based on the write_image script from the ome-zarr-py docs.

Any advice here is appreciated! I don't know how much is intentional design vs how much is simply historical.

(1) Writing an ome-zarr image with two labels groups (only the first group is loaded)
import numpy as np
import zarr
import os

from skimage.data import binary_blobs
from ome_zarr.io import parse_url
from ome_zarr.writer import write_image

path = "test_ngff_image.zarr"
os.mkdir(path)

mean_val=10
size_xy = 128
size_z = 10
rng = np.random.default_rng(0)
data = rng.poisson(mean_val, size=(size_z, size_xy, size_xy)).astype(np.uint8)

# write the image data
store = parse_url(path, mode="w").store
root = zarr.group(store=store)
write_image(
        image=data, group=root, axes="zyx",
        scaler=None,  # don't create multiscales
        storage_options=dict(chunks=(1, size_xy, size_xy))
        )
# optional rendering settings
root.attrs["omero"] = {
    "channels": [{
        "color": "00FFFF",
        "window": {"start": 0, "end": 20, "min": 0, "max": 255},
        "label": "random",
        "active": True,
    }]
}


# add labels...
blobs = binary_blobs(length=size_xy, volume_fraction=0.1, n_dim=3).astype('int8')
blobs2 = binary_blobs(length=size_xy, volume_fraction=0.1, n_dim=3).astype('int8')
# blobs will contain values of 1, 2 and 0 (background)
blobs += 2 * blobs2

# label.shape is (size_xy, size_xy, size_xy), Slice to match the data
label = blobs[:size_z, :, :]

# write the labels to /labels
labels_grp = root.create_group("labels")
# the 'labels' .zattrs lists the named labels data
label_name = "blobs"
labels_grp.attrs["labels"] = [label_name]
label_grp = labels_grp.create_group(label_name)
# need 'image-label' attr to be recognized as label
label_grp.attrs["image-label"] = {}

write_image(label, label_grp, axes="zyx", scaler=None)


# label.shape is (size_xy, size_xy, size_xy), Slice to match the data
label2 = blobs[-size_z:, :, :]

# write the labels to /labels
labels_grp2 = root.create_group("labels2")
# the 'labels' .zattrs lists the named labels data
label_name2 = "blobs2"
labels_grp2.attrs["labels"] = [label_name2]
label_grp2 = labels_grp2.create_group(label_name)
# need 'image-label' attr to be recognized as label
label_grp2.attrs["image-label"] = {}

write_image(label2, label_grp2, axes="zyx", scaler=None)
(2) Writing an ome-zarr file with no image but with a labels group (reading fails altogether)
import numpy as np
import zarr
import os

from skimage.data import binary_blobs
from ome_zarr.io import parse_url
from ome_zarr.writer import write_image

path = "test_ngff_image_labels_only.zarr"
os.mkdir(path)

mean_val=10
size_xy = 128
size_z = 10
rng = np.random.default_rng(0)
data = rng.poisson(mean_val, size=(size_z, size_xy, size_xy)).astype(np.uint8)

# write the image data
store = parse_url(path, mode="w").store
root = zarr.group(store=store)

# add labels...
blobs = binary_blobs(length=size_xy, volume_fraction=0.1, n_dim=3).astype('int8')
blobs2 = binary_blobs(length=size_xy, volume_fraction=0.1, n_dim=3).astype('int8')
# blobs will contain values of 1, 2 and 0 (background)
blobs += 2 * blobs2

# label.shape is (size_xy, size_xy, size_xy), Slice to match the data
label = blobs[:size_z, :, :]

# write the labels to /labels
labels_grp = root.create_group("labels")
# the 'labels' .zattrs lists the named labels data
label_name = "blobs"
labels_grp.attrs["labels"] = [label_name]
label_grp = labels_grp.create_group(label_name)
# need 'image-label' attr to be recognized as label
label_grp.attrs["image-label"] = {}

write_image(label, label_grp, axes="zyx", scaler=None)

Reading from private S3 bucket

Is there a way to read from a private bucket? I would expect that in such a case I would be asked for credentials before proceeding, but at the moment reading from a non-public bucket fails with ERROR 403, message='Forbidden'.
Am I missing something or is it simply not implemented/possible?
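
One thing worth trying (a hedged sketch, not documented plugin behaviour): pass an s3:// URL so the store is built with s3fs, which picks up credentials from the standard AWS environment variables or ~/.aws/credentials; an anonymous HTTPS request against a private bucket will always return 403.

import os
import napari

# placeholders only; real credentials should come from your AWS configuration
os.environ.setdefault("AWS_ACCESS_KEY_ID", "...")
os.environ.setdefault("AWS_SECRET_ACCESS_KEY", "...")

viewer = napari.Viewer()
viewer.open("s3://my-private-bucket/image.ome.zarr", plugin="napari-ome-zarr")  # hypothetical path
napari.run()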

Napari colormaps within OME-Zarr Metadata

I am noticing that the OME-Zarr specification has an option for a "color" property within the channel metadata. We use this property to define the colormap for when napari-ome-zarr opens the dataset. However, it seems that doing this creates a ton of new colormaps within napari titled "unnamed" and doesn't actually use a built-in napari colormap.

Is there a way to define the metadata such that we use napari's built-in colormaps? I see that _reader.py looks for a metadata key called colormap which gets converted to a vispy colormap, but I am not entirely sure how to include this in the OME metadata within the zarr file, or whether this vispy colormap will correspond to one of napari's built-in maps rather than creating a new 'unnamed' one.

Let me know if I can clarify anything here. Thanks for the help
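
One workaround sketch in the meantime: let the plugin open the file, then reassign napari's built-in colormaps by name on the returned layers (the path, layer order and colormap names below are assumptions):

import napari

viewer = napari.Viewer()
layers = viewer.open("image.ome.zarr", plugin="napari-ome-zarr")  # hypothetical path
# replace the "unnamed" colormaps built from the hex colors with built-in ones
for layer, cmap in zip(layers, ["blue", "green", "red"]):
    layer.colormap = cmap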

Support coordinateTransformations on multiscales v0.4

Currently we only read the coordinateTransformations on the first dataset of the first multiscales item.
We should also read the coordinateTransformations on the multiscales itself (if present).

Ideally this should be combined with coordinateTransformations on the dataset.
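
A hedged sketch of what "combined" could mean for the simple case where both levels carry a plain scale transform in the same axis order (translations and other transform types are left out):

def combined_scale(multiscales_transforms, dataset_transforms):
    # per-axis product of the multiscales-level and dataset-level scale factors
    ms = next((t["scale"] for t in multiscales_transforms or [] if t["type"] == "scale"), None)
    ds = next((t["scale"] for t in dataset_transforms if t["type"] == "scale"), None)
    if ms is None:
        return ds
    return [m * d for m, d in zip(ms, ds)]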

Demo zarr does not work

Hi,

It appears that in the current versions (0.5.0 and main), opening the demo URL no longer works:

napari "https://uk1s3.embassy.ebi.ac.uk/idr/zarr/v0.3/9836842.zarr/"

yields

napari.errors.reader_errors.NoAvailableReaderError: No plugin found capable of reading https://uk1s3.embassy.ebi.ac.uk/idr/zarr/v0.3/9836842.zarr/

versions:

napari : 0.4.16
zarr : 2.11.3
napari_ome_zarr : 0.5.1.dev4+g4e506d

Best way to save/view RGB images

Hi,

Not really an issue but more of a question:
What is currently the best way to save and view multi-resolution RGB images with ome-zarr-py/napari-ome-zarr?

E.g. how can I save an RGB array x = np.random.randint(0,100,(100,100,3)) as a multi-resolution ome-zarr such that it is displayed correctly in napari?

Since napari expects RGB images in YXC order but the OME-Zarr spec requires the channel axis to come before the spatial dimensions, I assume this might involve some custom dimension-swap transform in the metadata, but I have no idea whether that is even possible ;)

Is there something easy that achieves what I want which I might have missed?
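
For what it's worth, one approach that should work today (a sketch, not an official recommendation): store the data channel-first as the spec requires, then move the channel axis back to the end when viewing so napari can treat it as RGB. The file path and axis order are assumptions.

import napari
from ome_zarr.io import parse_url
from ome_zarr.reader import Reader

pyramid = list(Reader(parse_url("rgb.ome.zarr"))())[0].data  # list of (c, y, x) dask arrays
rgb_pyramid = [level.transpose(1, 2, 0) for level in pyramid]  # (c, y, x) -> (y, x, c)

viewer = napari.Viewer()
viewer.add_image(rgb_pyramid, rgb=True, multiscale=True)
napari.run()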

Duplicated layers loaded with s3fs & fsspec 2023.12.2

I'm running into an issue when setting up fresh environments with the napari-ome-zarr plugin and could trace it back to an interaction with fsspec==2023.12.2 & s3fs==2023.12.2.

Using different OME-Zarr datasets (for example this public dataset: https://zenodo.org/records/10257149), I see the same layers being loaded multiple times. Sometimes a layer is loaded 3x, sometimes 46x.
The same dataset behaves differently depending on whether I load it from my local SSD (more duplicates) or from slower remote storage.

[screenshot: duplicated layers in the napari layer list]

The issue is reproducible for local storage and remote shares, on Windows & Mac, with napari 0.4.18, 0.4.19 & 0.5.0dev versions from the main branch.

The thing that fixes it is reverting to an older version of fsspec & s3fs, e.g. by doing: pip install s3fs==2023.9.2


I couldn't trace the underlying issue yet. I don't know if it's a bug in fsspec, s3fs or the way ome-zarr py is using the libraries. Just thought I'd document it here for the moment in case other napari-ome-zarr users come across it. Downgrading the fsspec & s3fs versions does solve it (that was a wasted afternoon to figure out, so maybe it saves someone else time to find this here ;))

Problems opening a 384 well-plate

Hi,

I have a 384 well-plate ome-zarr file where each well has a single field of shape (4, 5, 4096, 4096) (CZYX). When I try to open this I get the following error message:
ValueError: Shape of individual tiles in multiscale (16384, 24576) cannot exceed GL_MAX_TEXTURE_SIZE 16384. Rendering is currently in 2D mode.

Are "labels" supported by the plugin?

I have an ome.zarr file with a multi-channel image and two label images, a cell segmentation and a nucleus segmentation:

$ tree -L 2 MMStack_Pos0.ome.zarr
MMStack_Pos0.ome.zarr
├── 0
...
└── labels
    ├── cells
    └── nuclei

When I open this file via the napari-ome-zarr plugin the image is opened and displayed correctly, but the labels are not opened:

[video: napari-ome-zarr.mp4]

Are "labels" supported by the plugin? (If yes, then maybe something with how I save the labels is incorrect.)
The example file is available here: https://oc.embl.de/index.php/s/ezHha9qb8HjxSfd

Open data from existing group?

Thanks for providing this great plugin!

We have been exploring storing and reading OME-Zarr data in Arraylake. For our application, it's not possible to just pass an HTTP URL directly. We would need to open the Zarr data outside of Napari and then pass a Zarr group directly. Something like this:

import napari
import arraylake as al

client = al.Client()
repo = client.get_repo("earthmover-demos/ome-zarr")
group = repo.root_group["path-to-OME-Zarr-group"]  # type: Zarr.group

viewer = napari.Viewer()
viewer.open(group, plugin="napari-ome-zarr")

Today this does not work (ValueError: Given reader 'napari' is not a compatible reader for ['2']. No compatible readers are available for ['2'].)

Supporting opening a group directly would also perhaps fix the compatibility issues with WebKnossos described in #66.
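
In the meantime, a workaround sketch that reuses the group variable from the snippet above (assuming it follows the v0.4 multiscales layout): pull the pyramid out of the already-opened group and add it to napari directly.

import dask.array as da
import napari

datasets = group.attrs["multiscales"][0]["datasets"]
pyramid = [da.from_zarr(group[d["path"]]) for d in datasets]

viewer = napari.Viewer()
viewer.add_image(pyramid, multiscale=True)
napari.run()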

Failed to open a zarr file

Dear All,

I am trying to visualize an ome.zarr file in napari on Windows 10.

I converted a dataset to zarr from the https://camelyon17.grand-challenge.org/Data/

I downloaded it using the following script

import argparse
import ftplib
import os


def download_camelyon16_image(filename):
    filename = filename.lower()
    if os.path.exists(filename):
        print(f"The image [{filename}] already exist locally.")
    else:
        print(f"Downloading '{filename}'...")
        prefix = filename.split("_")[0].lower()
        if prefix == "test":
            folder_name = "testing/images"
        elif prefix in ["normal", "tumor"]:
            folder_name = f"training/{prefix}"
        else:
            raise ValueError(
                f"'{filename}' not found on the server."
                " File name should be like 'test_001.tif', 'tumor_001.tif', or 'normal_001.tif'"
            )
        path = f"gigadb/pub/10.5524/100001_101000/100439/CAMELYON16/{folder_name}/"
        ftp = ftplib.FTP("parrot.genomics.cn")
        ftp.login("anonymous", "")
        ftp.cwd(path)
        ftp.retrbinary("RETR " + filename, open(filename, "wb").write)
        ftp.quit()


if __name__ == "__main__":
    arg_parser = argparse.ArgumentParser()
    arg_parser.add_argument(
        "filename",
        type=str,
        help="The image name in Camelyon16 dataset to be downloaded.",
    )
    args = arg_parser.parse_args()
    download_camelyon16_image(args.filename)

I then converted it into zarr using the following script:

import numpy as np
import zarr
import os
#Just to load the openslide dll
os.add_dll_directory('openslide-win64-20171122/bin/')
from openslide import OpenSlide

#Require pip install openslide-python
slide = OpenSlide('data/tumor_001.tif')
file_name = 'data/tumor_001.zarr'

root = zarr.open_group(file_name, mode='a')

for i in range(0, 3):
    print(i, 10)
    shape = (slide.level_dimensions[i][0], slide.level_dimensions[i][1], 4)
    z1 = root.create_dataset(str(i), shape=shape, chunks=(300, 300, None),
                             dtype='uint8')

    # image = np.asarray(slide.read_region((0, 0), i,
    #                    slide.level_dimensions[i])).transpose(1, 0, 2)
    # z1[:] = image

    for j in range(slide.level_dimensions[i][0]//1528):
        print(j, slide.level_dimensions[i][0]/1528)
        image = np.asarray(slide.read_region((j*1528*(2**i), 0), i,
                           (1528, slide.level_dimensions[i][1]))).transpose(1, 0, 2)
        z1[j*1528:(j+1)*1528] = image

I installed the plugin for napari (pip install ome-zarr) and tried to open the file, but I get the following error:

napari "data\tumor_001.zarr"
10:46:57 ERROR exception calling callback for <Future at 0x2173e8e27c0 state=finished returned ChunkRequest>
Traceback (most recent call last):
  File "c:\gbw_myprograms\anaconda3\envs\napari-env2\lib\concurrent\futures\_base.py", line 328, in _invoke_callbacks
    callback(self)
  File "c:\gbw_myprograms\anaconda3\envs\napari-env2\lib\site-packages\napari\components\experimental\chunk\_pool.py", line 162, in _on_done
    self._on_done_loader(request)
  File "c:\gbw_myprograms\anaconda3\envs\napari-env2\lib\site-packages\napari\components\experimental\chunk\_loader.py", line 225, in _on_done
    self.cache.add_chunks(request)
  File "c:\gbw_myprograms\anaconda3\envs\napari-env2\lib\site-packages\napari\components\experimental\chunk\_cache.py", line 108, in add_chunks
    self.chunks[request.location] = request.chunks
  File "c:\gbw_myprograms\anaconda3\envs\napari-env2\lib\site-packages\napari\_vendor\experimental\cachetools\cachetools\lru.py", line 19, in __setitem__
    cache_setitem(self, key, value)
  File "c:\gbw_myprograms\anaconda3\envs\napari-env2\lib\site-packages\napari\_vendor\experimental\cachetools\cachetools\cache.py", line 47, in __setitem__
    raise ValueError('value too large')

Any tips will be appreciated
Benjamin

installation page usage error, ValueError: No plugin found capable of reading 'https://s3.embassy.ebi.ac.uk/idr/zarr/v0.1/6001240.zarr/'.

python = 3.9
conda = 4.10.3
os = windows 10

Hi,
I have been playing around with ome-zarr writer and encountered ValueError: No plugin found capable of reading test.zarr
This led me to see if I could load the test data. However, when I run

napari "https://s3.embassy.ebi.ac.uk/idr/zarr/v0.1/6001240.zarr/"

from console, this also returns

ValueError: No plugin found capable of reading 'https://s3.embassy.ebi.ac.uk/idr/zarr/v0.1/6001240.zarr/'

I have tested and confirmed error in a new conda environment setup like so...

conda create -n napari_test_3pt9 python=3.9
conda activate napari_test_3pt9 
conda install pip
pip install napari[all]
pip install napari-ome-zarr

I expected to be able to load the test data as per the usage section on the installation page

Open labels in https://s3.embassy.ebi.ac.uk/idr/zarr/v0.1/6001247.zarr

We only have a single public example of labels at https://s3.embassy.ebi.ac.uk/idr/zarr/v0.1/6001247.zarr

e.g. I wanted to provide this as a response to https://twitter.com/jm_mightypirate/status/1447486365252399108

But can't get it working...

Starting with recent releases:

napari-0.4.11
napari-ome-zarr==0.2.0
ome-zarr==0.1.0
zarr==2.9.5

$ napari https://s3.embassy.ebi.ac.uk/idr/zarr/v0.1/6001247.zarr -vvv

This opens the image (2 channels), but no labels layer.

Labels seen in verbose logging...

10:53:09 DEBUG open(ZarrLocation(https://s3.embassy.ebi.ac.uk/idr/zarr/v0.1/labels))
10:53:09 DEBUG https://s3.embassy.ebi.ac.uk/idr/zarr/v0.1/labels
10:53:09 DEBUG Created legacy flat FSStore(https://s3.embassy.ebi.ac.uk/idr/zarr/v0.1/labels, r)
10:53:09 DEBUG https://s3.embassy.ebi.ac.uk/idr/zarr/v0.1/labels/.zarray
10:53:09 DEBUG https://s3.embassy.ebi.ac.uk/idr/zarr/v0.1/labels/.zgroup
10:53:10 WARNING version mismatch: detected:FormatV03, requested:FormatV01
10:53:10 DEBUG https://s3.embassy.ebi.ac.uk/idr/zarr/v0.1/labels
10:53:10 DEBUG Created nested FSStore(https://s3.embassy.ebi.ac.uk/idr/zarr/v0.1/labels, r, {'dimension_separator': '/', 'normalize_keys': True})
10:53:10 DEBUG https://s3.embassy.ebi.ac.uk/idr/zarr/v0.1/labels/.zarray
10:53:10 DEBUG https://s3.embassy.ebi.ac.uk/idr/zarr/v0.1/labels/.zgroup

but nothing after that...

TODO: try more versions...

Unable to load webknossos hosted zarr datasets

When testing loading ome/zarr data from webknossos, I could not see any data; everything is shown black. This might be due to 1) Fortran (column-major) order data and/or 2) the dimension order cxyz.

Steps to reproduce:

  1. I adapted filename_patterns in napari.yaml of this plugin to be able to open the webknossos zarr-streamed datasets (e.g. adding https://*.webknossos.org/data/zarr/*)
  2. Opening any of these datasets does not show any data:
  • python3 -m napari "https://data-humerus.webknossos.org/data/zarr/scalable_minds/l4_sample_dev/color"
  • python3 -m napari "https://data-humerus.webknossos.org/data/zarr/scalable_minds/skin/color"

Please note that the datasets for the different resolutions can be loaded via zarr-python, e.g. using

zarr.open_array("https://data-humerus.webknossos.org/data/zarr/scalable_minds/skin/color/1")

Additional info:

$ cat /etc/debian_version 
10.12
$ python --version
Python 3.8.11
$ napari --version
napari version 0.4.16
$ pip3 list | grep napari-ome-zarr
napari-ome-zarr                   0.5.2
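
A workaround sketch in the meantime: open one magnification level directly with zarr-python (which works, as noted above) and reorder the axes for napari; the (c, x, y, z) axis order is an assumption based on the report.

import dask.array as da
import napari
import zarr

arr = zarr.open_array(
    "https://data-humerus.webknossos.org/data/zarr/scalable_minds/skin/color/1"
)
data = da.from_zarr(arr).transpose(0, 3, 2, 1)  # assumed (c, x, y, z) -> (c, z, y, x)

viewer = napari.Viewer()
viewer.add_image(data, channel_axis=0)
napari.run()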

Move to new plugin architecture

Do we need to additionally start testing with the new engine?

We recommend moving your plugin to the new plugin architecture as soon as possible. The conversion process is simple and is explained in this guide - note that this is all still a work in progress however, and there may be frequent changes made to the docs.

The testing process for plugins should not change however, save for needing an explicit dependency on npe2, for now.

Originally posted by @DragaDoncila in #26 (comment)

Handle AnnData regions tables

With ome/ome-zarr-py#256 (work in progress) we hope to have AnnData tables available to napari-ome-zarr.

I've been using this to read "points" tables and display as tracks:
#81

So now I also want to handle "regions" tables in napari.

I've been looking at the https://github.com/kevinyamauchi/ome-ngff-tables-prototype/blob/main/examples/save_load_squidpy_segment.py example.

The logic as I follow it goes (a rough code sketch follows the list):

  • The labels layer is added first without any table data
  • Read the AnnData as dataframe (X and var) and combine it with the obs to give "features_table"
  • get all the label values from the label.data
  • get the label IDs from the features_table, e.g. cell_id column, and check if we're missing any label values
  • If so, duplicate a table row and assign a cell_id so that ALL label values have a corresponding row in the table
  • Sort the features_table by cell_id, so that the order of rows matches the order of label values in the label layer
  • Pick a random column to use to color the labels - convert to float and use the range to assign colors
  • Pass the features_table dataframe to napari as "features" on an NEW labels layer
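
A rough code sketch of that logic (dense X and an in-memory label image assumed; the cell_id column name is taken from the example, and the exact row-to-label mapping napari expects may differ between versions):

import numpy as np
import pandas as pd

def features_for_labels(adata, label_image, index_col="cell_id"):
    # combine X (as a dataframe with var names as columns) with obs into one table
    table = pd.concat(
        [pd.DataFrame(np.asarray(adata.X), columns=list(adata.var_names)),
         adata.obs.reset_index(drop=True)],
        axis=1,
    )
    # make sure every label value in the image (including 0) has a row, blank except for the id
    missing = np.setdiff1d(np.unique(label_image), table[index_col])
    if missing.size:
        table = pd.concat([table, pd.DataFrame({index_col: missing})], ignore_index=True)
    # sort so the row order follows the label value order
    return table.sort_values(index_col).reset_index(drop=True)

# viewer.add_labels(label_image, features=features_for_labels(adata, label_image))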

Then you can use https://www.napari-hub.org/plugins/napari-properties-viewer to show a table of features for each label:

In this screenshot, I edited the code to use cell_id for color rather than a random column:

[screenshot: labels colored by cell_id alongside the properties table]

I was thinking of doing something similar in napari-ome-zarr with the following differences:

  • Try not to duplicate the labels layer. Just add table info to a single labels layer.
  • Avoid duplicating rows of data. Either create "blank" rows that only have the cell_id OR possibly use a dictionary for features with cell_id as key instead of a dataframe (might be less performant)?
  • Don't pick a random column for color - sometimes this causes an error if a string column is picked from the obs data. Either just pick cell_id as above, or don't add color at all.

cc @kevinyamauchi @LucaMarconato @giovp

Support for N5 drag and drop

Hi, I gather that ome-zarr is able to handle N5 files (via https://zarr.readthedocs.io/en/stable/api/n5.html). Some of our workflows currently use a mix of N5 and ome-zarr (e.g. from MoBIE projects in Fiji), with a view to using only ome-zarr in the future. If drag-and-drop of N5 into the napari plugin were possible, it would save us from writing custom scripts or converting large files. I don't envisage needing to write N5, only read it.

Cheers,
Martin
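
In the meantime, a workaround sketch (not plugin functionality): open the N5 container with zarr's N5 store and hand an array to napari directly. The container and dataset paths are hypothetical.

import dask.array as da
import napari
import zarr
from zarr.n5 import N5FSStore

root = zarr.open_group(N5FSStore("/path/to/dataset.n5"), mode="r")
data = da.from_zarr(root["setup0/timepoint0/s0"])  # hypothetical BigDataViewer-style dataset path

viewer = napari.Viewer()
viewer.add_image(data)
napari.run()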

Re-activate Qt tests in non-OSX workflows

Qt on Windows and Ubuntu are going to take more work.
But ultimately, I'm fairly confident that GH Actions are now properly testing the cross-platform support.

not sure if it will be useful to you... but I have a branch of napari working with GH actions on all platforms (there may of course be other OMERO specific stuff that still need to be figured out): https://github.com/tlambert03/napari/blob/feature/github-tests/.github/workflows/tests.yml

Originally posted by @tlambert03 in ome/ome-zarr-py#47 (comment)

Writing Labels

Hello!

Is there any way I can edit and write the labels stored in an OME-Zarr file? We want to use Napari and this plugin to do mask corrections, but my labels seem read-only so far.
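
As far as I can tell, the layers this plugin creates are backed by lazily loaded dask arrays, so edits made in napari are not written back to disk automatically. A hedged sketch of writing corrections back yourself (the paths and layer name are assumptions, and the edited labels are assumed to fit in memory):

import napari
import numpy as np
import zarr

viewer = napari.current_viewer()  # the already-open viewer holding the edited "mask" layer
edited = np.asarray(viewer.layers["mask"].data)
root = zarr.open_group("image.ome.zarr", mode="r+")  # hypothetical path
root["labels/mask/0"][:] = edited  # overwrite the full-resolution label dataset
# lower-resolution pyramid levels of the label would need to be regenerated separately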

Cannot open OME-Zarr created with bioformats2raw

Hi all,

We have an OME-Zarr (190 GB) created with a freshly installed bioformats2raw, installed into a new conda environment (python=3.8) using the ome channel on conda-forge on a Windows 10 machine. The top-level file structure looks as follows:

lung.zarr
├── 0
├── OME
├── .zattrs
└── .zgroup

The call to bioformats2raw was:
bioformats2raw image.pattern d:/lung_2x/lung.zarr where image.pattern contains fused_ch<00-16>.tif. Each .tif file is ~14 GB, is a 3D stack created by the BigStitcher plugin in FIJI, and contains one channel.

Using a brand new conda environment (python=3.9) with the latest Napari and napari-ome-zarr, installed using pip install napari[all] napari-ome-zarr, we try to open the file using napari d:/lung_2x/lung.zarr and get the following error: ValueError: No plugin found capable of reading 'D:\\lung_2x\\lung.zarr'.

If we instead call Napari using napari d:/lung.zarr/0, it does open the data, with the channels appearing on one of the sliders.

Any suggestions on how to get Napari to properly open this as an OME-Zarr?

Thanks.

IndexError: index 144 is out of bounds for axis 1 with size 26

With origin/main, I still see the error from #6: TypeError: len() of unsized object

With this PR, the image & label layers load.

I note that I see a separate error when viewing the labels in 3D:

/usr/local/anaconda3/envs/z/lib/python3.9/site-packages/napari/layers/labels/labels.py in _get_value_ray(self=<Labels layer '0'>, start_point=<class 'numpy.ndarray'> (3,) float64, end_point=<class 'numpy.ndarray'> (3,) float64, dims_displayed=[2, 3, 4])
    972                 sample_points, self._display_bounding_box(dims_displayed)
    973             ).astype(int)
--> 974             values = im_slice[tuple(clamped.T)]
        values = undefined
        im_slice = <class 'numpy.ndarray'> (257, 26, 31) int64
        global tuple = undefined
        clamped.T = <class 'numpy.ndarray'> (3, 504) int64
    975             nonzero_indices = np.flatnonzero(values)
    976             if len(nonzero_indices > 0):

IndexError: index 144 is out of bounds for axis 1 with size 26

Originally posted by @joshmoore in #12 (comment)

cc: @jni

PyPI not available

According to the README, napari-ome-zarr should be available on PyPI, but

WARNING: Value for scheme.headers does not match. Please report this to <https://github.com/pypa/pip/issues/9617>
distutils: /home/pape/Work/software/conda/miniconda3/envs/torch-em-test/include/python3.8/UNKNOWN
sysconfig: /home/pape/Work/software/conda/miniconda3/envs/torch-em-test/include/python3.8
WARNING: Additional context:
user = False
home = None
root = None
prefix = None
ERROR: Could not find a version that satisfies the requirement napari-ome-zarr (from versions: none)
ERROR: No matching distribution found for napari-ome-zarr

Fail to read multi-image.ome.zarr from local filesystem

Napari fails to open the local file multi-image.ome.zarr obtained from the OME NGFF prototypes with the error:

shape = image[0].shape
AttributeError: 'list' object has no attribute 'shape'

Steps to reproduce:
Download and unzip the ome-ngff prototypes from https://oc.embl.de/index.php/s/0TPPw4CUt0G6nng

conda create -y -n napari-env -c conda-forge python=3.9  
conda activate napari-env  
conda install -c conda-forge napari  
pip install napari-ome-zarr  
napari './ome-ngff-prototypes/v0.4/multi-image.ome.zarr'   

However, it works over s3:

napari 'https://s3.embl.de/i2k-2020/ngff-example-data/v0.4/multi-image.ome.zarr'

So I'm not sure whether it's a napari plugin problem or a problem with the downloaded files.

Can't open Zarr files with labels

Hello,

When I try to open an OME-Zarr file with labels inside it, I get an unspecified error and the file fails to load. This happens with the newest napari (0.4.19) and the newest version of the plugin. I do not have this issue on an older napari (0.4.18), possibly with an older version of the plugin. I think it might be related to the issue I posted here: ome/ome-zarr-py#357

CLI: napari <PATH> fails for some ome.zarr files

For me opening the data from https://oc.embl.de/index.php/s/wfpdHpgHseGUTYK fails if I open it via $ napari MMStack_Pos0.ome.zarr. But opening it via drag and drop works. See the full stack trace below.
Note that opening other ome.zarrs works via the CLI, e.g. $ napari https://uk1s3.embassy.ebi.ac.uk/idr/zarr/v0.3/idr0079A/9836998.zarr, and I also successfully tried with some other files from local disc.

$ napari MMStack_Pos0.ome.zarr
Traceback (most recent call last):
  File "/home/pape/software/conda/miniconda3/envs/napari/bin/napari", line 10, in <module>
    sys.exit(main())
  File "/home/pape/software/conda/miniconda3/envs/napari/lib/python3.10/site-packages/napari/__main__.py", line 449, in main
    _run()
  File "/home/pape/software/conda/miniconda3/envs/napari/lib/python3.10/site-packages/napari/__main__.py", line 314, in _run
    viewer = view_path(  # noqa: F841
  File "/home/pape/software/conda/miniconda3/envs/napari/lib/python3.10/site-packages/napari/view_layers.py", line 178, in view_path
    return _make_viewer_then('open', args, kwargs)
  File "/home/pape/software/conda/miniconda3/envs/napari/lib/python3.10/site-packages/napari/view_layers.py", line 126, in _make_viewer_then
    method(*args, **kwargs)
  File "/home/pape/software/conda/miniconda3/envs/napari/lib/python3.10/site-packages/napari/components/viewer_model.py", line 907, in open
    self._add_layers_with_plugins(
  File "/home/pape/software/conda/miniconda3/envs/napari/lib/python3.10/site-packages/napari/components/viewer_model.py", line 952, in _add_layers_with_plugins
    layer_data, hookimpl = read_data_with_plugins(
  File "/home/pape/software/conda/miniconda3/envs/napari/lib/python3.10/site-packages/napari/plugins/io.py", line 61, in read_data_with_plugins
    res = _npe2.read(path, plugin)
  File "/home/pape/software/conda/miniconda3/envs/napari/lib/python3.10/site-packages/napari/plugins/_npe2.py", line 40, in read
    layer_data, reader = read_get_reader(path, plugin_name=plugin)
  File "/home/pape/software/conda/miniconda3/envs/napari/lib/python3.10/site-packages/npe2/io_utils.py", line 44, in read_get_reader
    return _read(path, plugin_name=plugin_name, return_reader=True)
  File "/home/pape/software/conda/miniconda3/envs/napari/lib/python3.10/site-packages/npe2/io_utils.py", line 131, in _read
    layer_data = read_func(path)
  File "/home/pape/software/conda/miniconda3/envs/napari/lib/python3.10/site-packages/napari/types.py", line 144, in reader_function
    result = func(*args, **kwargs)
  File "/home/pape/software/conda/miniconda3/envs/napari/lib/python3.10/site-packages/napari/utils/io.py", line 236, in magic_imread
    image, zarr_shape = read_zarr_dataset(filename)
  File "/home/pape/software/conda/miniconda3/envs/napari/lib/python3.10/site-packages/napari/utils/io.py", line 328, in read_zarr_dataset
    image.append(read_zarr_dataset(os.path.join(path, subpath))[0])
  File "/home/pape/software/conda/miniconda3/envs/napari/lib/python3.10/site-packages/napari/utils/io.py", line 329, in read_zarr_dataset
    shape = image[0].shape
AttributeError: 'list' object has no attribute 'shape'

napari reader preferences are not respected

Hi,

When dragging/dropping a zarr folder into napari while multiple plugins capable of reading zarr exist, the napari reader preference dialog is currently always displayed, without the option of saving the choice as part of the preferences (which would be the expected behavior).

I guess that might be due to a missing napari manifest (napari.yaml) entry that would register a filename pattern, so simply adding this might already solve the issue.

Cheers,
M

PS: thanks a lot for the plugin, it's very useful!

Can't open volumetric data via S3 (works locally!)

Opening the data generated with https://github.com/ome/ome-ngff-prototypes/tree/main/workflows/mobie-example via s3 does not work. Opening it locally works without issues.

Here's napari when opening it locally via $ napari Covid19-S4-Area2/images/ome-zarr/raw.ome.zarr:
[screenshot: napari after opening the dataset locally]

And here's napari when opening it from s3, either with
$ napari https://s3.embl.de/i2k-2020/project-ome-zarr/Covid19-S4-Area2/images/ome-zarr/raw.ome.zarr or napari https://uk1s3.embassy.ebi.ac.uk/bia-mobie-test/Covid19-S4-Area2/images/ome-zarr/raw.ome.zarr. (The dataset is available both via the EMBL-Heidelberg and EBI S3):
[screenshot: napari after opening the dataset from S3]

Note that the data can be accessed from these s3 stores via BigDataViewer (Plugin->BigDataViewer->OME ZARR->Open OME ZARR from S3..., needs MoBIE update site and latest version of it).

opening zarr file just the lowest pyramid level is loaded

I have a zarr file made according to the High Content Screening metadata and each field of view has 5 pyramid levels.
When I open the file in napari it loads just the lowest resolution level, and in the debug output I see these messages that I think could be useful:

10:56:36 DEBUG img_pyramid_shapes: [(3, 19, 12960, 15360), (3, 19, 6480, 7680), (3, 19, 3240, 3840), (3, 19, 1620, 1920), (3, 19, 810, 960)]
10:56:36 DEBUG target_level: 4
10:56:36 DEBUG get_stitched_grid() level: 4, tile_shape: (3, 19, 810, 960)

From my understanding seems that this behavior is related to this https://github.com/ome/ome-zarr-py/blob/6dbd227f9af0cfe78b2b6e28b40c0c9b750d6a05/ome_zarr/reader.py#L502

Am I wrong? Have you got any clue how to solve the problem?
Thank you very much
