
brainglobe / brainglobe-atlasapi


A lightweight Python module to interact with atlases for systems neuroscience

Home Page: https://brainglobe.info/documentation/brainglobe-atlasapi/index.html

License: BSD 3-Clause "New" or "Revised" License

neuroscience python microscopy registration visualisation brain atlases

brainglobe-atlasapi's Introduction

brainglobe-atlasapi


The brainglobe atlas API (brainglobe-atlasapi) provides a common interface for programmers to download and process brain atlas data from multiple sources.

Atlases available

A number of atlases are in development; those currently available are listed in the documentation.

Installation

brainglobe-atlasapi works with Python >3.6, and can be installed from PyPI with:

pip install brainglobe-atlasapi

Usage

Full information can be found in the documentation

Python API

List of atlases

To see a list of atlases use brainglobe_atlasapi.show_atlases

from brainglobe_atlasapi import show_atlases
show_atlases()
#                                Brainglobe Atlases
# ╭──────────────────────────────────┬────────────┬───────────────┬──────────────╮
# │ Name                             │ Downloaded │ Local version │    Latest    │
# │                                  │            │               │   version    │
# ├──────────────────────────────────┼────────────┼───────────────┼──────────────┤
# │ allen_human_500um                │     ✔      │      0.1      │     0.1      │
# │ mpin_zfish_1um                   │     ✔      │      0.3      │     0.3      │
# │ allen_mouse_50um                 │     ✔      │      0.3      │     0.3      │
# │ kim_unified_25um                 │     ✔      │      0.1      │     0.1      │
# │ allen_mouse_25um                 │     ✔      │      0.3      │     0.3      │
# │ allen_mouse_10um                 │     ✔      │      0.3      │     0.3      │
# │ example_mouse_100um              │    ---     │      ---      │     0.3      │
# ╰──────────────────────────────────┴────────────┴───────────────┴──────────────╯

Using the atlases

All the features of each atlas can be accessed via the BrainGlobeAtlas class.

e.g. for the 25um Allen Mouse Brain Atlas:

from brainglobe_atlasapi.bg_atlas import BrainGlobeAtlas
atlas = BrainGlobeAtlas("allen_mouse_25um")

The various files associated with the atlas can then be accessed as attributes of the class:

# reference image
reference_image = atlas.reference
print(reference_image.shape)
# (528, 320, 456)

# annotation image
annotation_image = atlas.annotation
print(annotation_image.shape)
# (528, 320, 456)

# a hemispheres image (value 1 in left hemisphere, 2 in right) can be generated
hemispheres_image = atlas.hemispheres
print(hemispheres_image.shape)
# (528, 320, 456)

Brain regions

There are multiple ways to work with individual brain regions. To see a dataframe of each brain region, with its unique ID, acronym and full name, use atlas.lookup_df:

atlas.lookup_df.head(8)
#      acronym         id                           name
# 0       root        997                           root
# 1       grey          8  Basic cell groups and regions
# 2         CH        567                       Cerebrum
# 3        CTX        688                Cerebral cortex
# 4      CTXpl        695                 Cortical plate
# 5  Isocortex        315                      Isocortex
# 6        FRP        184  Frontal pole, cerebral cortex
# 7       FRP1         68          Frontal pole, layer 1

Each brain region can also be accessed by its acronym, e.g. for primary visual cortex (VISp):

from pprint import pprint
VISp = atlas.structures["VISp"]
pprint(VISp)
# {'acronym': 'VISp',
#  'id': 385,
#  'mesh': None,
#  'mesh_filename': PosixPath('/home/user/.brainglobe/allen_mouse_25um_v0.3/meshes/385.obj'),
#  'name': 'Primary visual area',
#  'rgb_triplet': [8, 133, 140],
#  'structure_id_path': [997, 8, 567, 688, 695, 315, 669, 385]}

Note on coordinates in brainglobe-atlasapi

Working with both image coordinates and cartesian coordinates in the same space can be confusing! In brainglobe-atlasapi, the origin is always assumed to be in the upper left corner of the image (sectioning along the first dimension), the "ij" convention. This means that when plotting meshes and points in cartesian systems, you might encounter confusing behaviour because one cartesian axis is inverted with respect to ij coordinates (the vertical axis increases going up, while image row indices increase going down).

To keep things as consistent as possible, in brainglobe-atlasapi the origin of the mesh coordinates is assumed to coincide with the 0 index of the image stack, and mesh coordinates increase in the same direction as the stack indices. To handle transformations between your data space and brainglobe-atlasapi, you might find the brainglobe-space package helpful.
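To make the flip concrete, here is a toy sketch of mapping ij points into a y-up cartesian frame (the helper name is hypothetical, not part of brainglobe-atlasapi):

```python
import numpy as np

def ij_points_to_cartesian(points_ij, n_rows):
    """Flip the vertical axis so (row, col) points plot correctly in a
    cartesian (y-up) system: row 0, the top of the image, maps to max y."""
    pts = np.asarray(points_ij, dtype=float)
    pts[:, 0] = (n_rows - 1) - pts[:, 0]
    return pts

# A point at the image top (row 0) and one at the bottom (row 9):
flipped = ij_points_to_cartesian([[0, 3], [9, 3]], n_rows=10)
```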

Contributing to brainglobe-atlasapi

Contributions to brainglobe-atlasapi are absolutely encouraged, whether you want to fix bugs, add or request new features, or simply ask questions.

If you would like to contribute to brainglobe-atlasapi (or any of the downstream tools like brainrender etc.) please get in touch by opening a new issue or pull request on GitHub. Please also see the developers guide.

Someone might have already asked a question you might have, so if you're not sure where to start, check out the issues (and the issues of the other repositories).

Citation

If you find the BrainGlobe Atlas API useful, please cite the paper in your work:

Claudi, F., Petrucco, L., Tyson, A. L., Branco, T., Margrie, T. W. and Portugues, R. (2020). BrainGlobe Atlas API: a common interface for neuroanatomical atlases. Journal of Open Source Software, 5(54), 2668, https://doi.org/10.21105/joss.02668

Don't forget to cite the developers of the atlas that you used!


Atlas Generation and Adding a New Atlas

For full instructions to add a new BrainGlobe atlas, please see here.

The brainglobe_atlasapi.atlas_generation submodule contains code for the generation of cleaned-up data, for the main brainglobe_atlasapi module. This code was previously the bg-atlasgen module.

To contribute

  1. Fork this repo
  2. Clone your fork, e.g. git clone https://github.com/brainglobe/brainglobe-atlasapi (replacing brainglobe with your GitHub username)
  3. Install an editable version of the package by running pip install -e . within the cloned directory
  4. Create a script to package your atlas, and place it into brainglobe_atlasapi/atlas_generation/atlas_scripts. Please see the other scripts for examples.

Your script should contain everything required to run. The raw data should be hosted on a publicly accessible repository so that anyone can run the script to recreate the atlas.

If you need to add any dependencies, please add them as an extra in the pyproject.toml file, e.g.:

[project.optional-dependencies]
allenmouse = ["allensdk"]
newatlas = ["dependency_1", "dependency_2"]

brainglobe-atlasapi's People

Contributors

adamltyson, alessandrofelder, chrisroat, dstansby, fedeclaudi, igortatarnikov, k-meech, nickdelgrosso, orena1, paddyroddy, polarbean, pre-commit-ci[bot], raacampbell, sdiebolt, seankmartin, vigji, viktorpm, willgraham01, yoda-vid


brainglobe-atlasapi's Issues

Artifacts when visualizing Atlas Structure Meshes

Hi,

I tried doing some visualization of the atlas meshes in Napari, which takes in the mesh vertices (mesh.points) and the mesh faces (mesh.cells[0].data), and noticed quite a bit of artifacting when the mesh is rotated in the viewer.

[screenshot: artifacts visible when the mesh is rotated in the viewer]

This goes away when the mesh is in one direction:

[screenshot: mesh viewed from the orientation without artifacts]

But all the faces disappear when the mesh is rotated 180 degrees from that:

[screenshot: faces disappear after a 180 degree rotation]

After digging into it a bit (I thought at first that it was a napari rendering issue), I think maybe it has to do with the vertex index order in the face data. This seems to be a problem even when the original .obj file is loaded with the original normals data (normal vectors visualized below, they look quite good):

[image: normal vectors after loading the original .obj file]

But when the lighting is tested in blender, a similar problem occurs (looking differently because of the different lighting model, but the same orientation behavior relative to the point light happens):

[image: lighting test in blender with a central point light]

Recalculating the normals in blender produces a much more even lighting, but new artifacts appear as well:

[image: blender render after recalculating normals]

In short, I think that if the mesh data were either saved or loaded differently, artifacts in the 3D mesh visualization would disappear.

I hope this was clear. Here's a script to reproduce and explore the artifacts:

import numpy as np
import napari
from bg_atlasapi import BrainGlobeAtlas

with napari.gui_qt():

    viewer = napari.Viewer(ndisplay=3)

    atlas = BrainGlobeAtlas("allen_mouse_25um")
    mesh = atlas.mesh_from_structure("CH")

    verts = (mesh.points - mesh.points.mean(axis=0)) / 100
    faces = mesh.cells[0].data
    values = np.ones(len(mesh.points))

    viewer.add_surface(name="CH", data=(verts, faces, values), opacity=0.3)

Formatting of atlases table

Trying to list the atlases on my laptop, the table generated by list_atlases() in a notebook is impossible to read because everything is too squeezed (always in notebooks, and in the terminal unless I widen it a lot from the default).

[screenshot: squeezed atlases table in a notebook]

Maybe we can drop the remote URL?

Moving toward better atlas versioning

This issue comes from a long discussion that I had with @nickdelgrosso on those points. Will lay it out in full here, so that we can discuss it better at the meeting.

How does the atlas generation workflow work now

Currently, there are 3 steps at work behind the curtains when a user asks for a bg-atlasapi atlas.

  1. Download the data from a remote source (eg, the Allen website, or a BrainGlobe GIN repo), and structure it according to BrainGlobe specifications;
  2. Upload the compressed result to the brainglobe GIN repo.

(These two steps are taken care of by brainglobe developers, using the code in bg-atlasgen. There, consistency of the atlas structure across atlases is ensured by always invoking the same wrap_atlas() function, and some utilities make it easy to re-run the generation of all atlases if something changes in the format.)

  3. Download the atlas to the user's directory

(This third step is what happens when a user instantiates an atlas locally, if the atlas is not yet available.)

The current atlas versioning schema consists of a major and a minor number; the minor is upgraded for fixes to individual atlases, and the major when something changes in the brainglobe atlas format (e.g. a new metadata field, or a different format for the stacks). Every time there is a change in the source data or in the brainglobe format, brainglobe developers need to run steps 1 and 2 again for one or more atlases. Code in bg-atlasapi checks the user's local version against the available remote versions, and prompts them to download again if required.
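The major.minor comparison described above can be sketched as follows (function names are hypothetical, not the actual bg-atlasapi internals):

```python
def parse_version(version):
    """Split a 'major.minor' atlas version string into comparable integers."""
    major, minor = version.split(".")
    return int(major), int(minor)

def update_available(local, remote):
    """True when the remote atlas version is newer than the local one.
    Tuple comparison handles both minor fixes and major format changes."""
    return parse_version(remote) > parse_version(local)
```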

Problems of the current formulation

  • at the moment there is no easy way to redo an analysis using an old version of bg-atlasapi and get an atlas that would work with that version, which would be very important for reproducibility;
  • we are not using the version control functionality that GIN might offer at all;
  • we need some more solid validation code that, given a folder with an atlas, validates it and checks that nothing would break when using it (important e.g. while developing an atlas). At the moment this is a bit scattered over both bg-atlasgen and bg-atlasapi, and to understand what an atlas should look like a person actually needs to check both places. Validation code should live entirely in bg-atlasapi, although a separate repository for generating the brainglobe-maintained atlases is still helpful.

What we propose

  • Have a GIN repository for each atlas, and release it with a DOI every time a stable version is ready; in this way, it is citable, archived, and GIN ensures at least 10 years' permanence of the data;
  • Have validation code in bg-atlasapi that can ensure a folder is a valid atlas source, to facilitate atlas development by third parties, who would then not have to look into the bg-atlasgen repo (which should nonetheless remain for our own use);
  • Have code in the BrainGlobeAtlas class to try to instantiate it with an arbitrary stable version (or, for developers, even with a specific commit of the atlas repo); validation code would ensure that that specific version of the atlas is compatible with the user's bg-atlasapi version.

There is an additional problem: different atlas resolutions end up producing a lot of duplication (e.g. meshes are duplicated for each atlas package at a specific resolution). Ideally, we should not need a different GIN repo for each resolution of an atlas; on the other hand, we also don't want to download a zipped file with all the resolutions if we only care about one. It would be great to hear smart ideas on this point!

Here's a diagram of the new flow from @nickdelgrosso:
[diagram: proposed atlas generation and versioning flow]

Dealing with additional atlas resolutions

Currently, atlases are only defined at original resolutions (i.e. 10, 25 & 50um for the ARA and 125um for the Swanson rat atlas). I would like to support additional resolutions, e.g.:

  • I use a 100um ARA for automated testing and troubleshooting (anything bigger takes too long to register on 1/2 CPU cores)
  • I only have this atlas from the Kim lab at one resolution, and would like to support lower resolutions (a number of users have asked for lower resolutions because they don't need the full resolution, and would rather have quicker registration using less RAM).

I'm not sure exactly how to deal with this. Maybe:

  • For the ARA, we could generate the 100um atlas by just downsampling the 50um atlas from the API.
  • For the Kim atlas, which is defined in ARA coordinate space, I was planning on using the 10um annotations image file I have, and then combining it with the 10um template downloaded from the API. The other resolutions could be supported by downsampling the annotations file and combining with the 25um or 50um atlas downloaded from the Allen.
  • Any other atlas without an API could just be downsampled normally.
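The downsampling idea for annotation images can be sketched with plain striding, a form of nearest-neighbour sampling that keeps the integer region IDs intact (averaging-based interpolation would invent meaningless in-between labels). This is an illustration, not the actual generation code:

```python
import numpy as np

def downsample_annotation(annotation, factor):
    """Downsample an integer annotation stack by striding along each axis.
    Every voxel in the output is a voxel from the input, so no new
    (invalid) region IDs can appear."""
    return annotation[::factor, ::factor, ::factor]

# Toy 4x4x4 "annotation" with one label per voxel:
annotation = np.arange(64).reshape(4, 4, 4)
small = downsample_annotation(annotation, 2)
```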

Speedup testing

Since we should be testing only the main API, which hopefully is pretty contained in terms of dependencies and OS-specific interactions, maybe we can restrict testing to Linux and streamline the setup a bit? We can always add other OSs back later once things are more stable.

moving structures/brainatlas-api

Hey @vigji ,

Is there any reason why you have for instance structures/brainatlas_api/simple_tree.py instead of having it under brainatlas_api/structures ?

I think that having all code under brainatlas_api/.. will make it easier to package the whole thing into a pip-installable package, though surely @adamltyson knows better about this. Either way it might be good to keep all code in the same place.

By the way thanks for all the work you've done so far, Adam and I will get started with the Rat atlas soon. It looks like it's all coming together rather nicely!

Mesh generations: keeps only one hemisphere

The function mesh_utils.extract_mesh_from_mask calls vedo's mesh.extractLargestRegion() (here) after using marching cubes on an image stack to generate the mesh.

This is a problem for atlases with two hemispheres, as brain regions which are not connected along the midline will generate two large meshes, and one will be excluded. This has been fixed by others by selecting the two largest structures, but it remains to be implemented in the atlas generation code.
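A possible fix, sketched here with scipy on the binary mask before meshing (keep_n_largest_regions is a hypothetical helper; the actual pipeline uses vedo):

```python
import numpy as np
from scipy import ndimage

def keep_n_largest_regions(mask, n=2):
    """Keep only the n largest connected components of a binary mask,
    so bilateral structures retain both hemispheres before marching cubes."""
    labels, n_components = ndimage.label(mask)
    if n_components <= n:
        return mask.astype(bool)
    sizes = ndimage.sum(mask, labels, index=range(1, n_components + 1))
    largest = np.argsort(sizes)[-n:] + 1  # label IDs of the n largest
    return np.isin(labels, largest)

# Two "hemisphere" blobs separated by the midline, plus a speck of noise:
mask = np.zeros((1, 20), dtype=bool)
mask[0, 0:5] = True    # left blob, size 5
mask[0, 8:12] = True   # right blob, size 4
mask[0, 15] = True     # noise, size 1
cleaned = keep_n_largest_regions(mask, n=2)
```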

BUG: get_structure_parent doesn't work for Mouse Atlas

Using get_structure_parent with ABA25Um fails because:


E           AttributeError: 'ABA25Um' object has no attribute 'structures_ids'

I'll try to work on brainatlas tonight, updating the human atlas and testing stuff out, but errors like this should have been caught by the testing...

Location of atlas folder and config file

This was raised and discussed a bit in #3; I'm addressing it again if we want to aim for a first working release.
We need a way of specifying for brainglobe what the default location of the atlases is. Ideally, it should be easy for the user to fix or change. In the future we might want to support atlases in multiple locations, but at the moment we will stick to a single folder, so the config just needs to contain a single path.

The options would be:

  1. The current option: a variable in the package
  2. A variable read from a config (.json?) file.

I think that in #3 we converged on the second option, in which case we need to decide where to keep this file. Again there are two alternatives:
i. In the brainglobe package directory;
ii. In the root directory.

The first option has the disadvantage of being messy to find and modify manually; the second might not be accessible e.g. on clusters, as I think @adamltyson was pointing out (I have no experience with clusters). I think we can opt for i, but it would be nice if there was a way of changing it that does not require the user to browse all the way into the package and modify the file manually, but instead a little command (maybe a prompt command?) like brainatlas_api configure --atlas_path location that checks whether the new path is valid and updates the file. Would this be OK? Is the prompt command overkill?
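A rough sketch of what such a configure utility could do, using a JSON file (the function name, config layout and key are assumptions, not the actual implementation):

```python
import json
import tempfile
from pathlib import Path

def configure_atlas_path(config_file, new_path):
    """Validate new_path and store it in a small JSON config file,
    sketching the proposed `configure --atlas_path` behaviour."""
    new_path = Path(new_path)
    if not new_path.is_dir():
        raise ValueError(f"{new_path} is not a valid directory")
    config_file = Path(config_file)
    # Preserve any other settings already in the file:
    config = json.loads(config_file.read_text()) if config_file.exists() else {}
    config["atlas_path"] = str(new_path)
    config_file.write_text(json.dumps(config, indent=2))
    return config

# Demonstrate with throwaway paths:
with tempfile.TemporaryDirectory() as tmp:
    cfg = configure_atlas_path(Path(tmp) / "config.json", tmp)
```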

Sort out confusion with cartesian/ij indexing

As per the slack discussion:

There is some ambiguity when displaying numpy arrays, coming from the fact that image-first applications (e.g. matplotlib, brainrender) have the convention that as the numpy array indices increase in a slice, you move downward and rightward in the image. Cartesian plotting, by contrast, assumes that as values increase you move upward and rightward in the plot. As a result, inconsistencies can happen. Napari takes care of this by adopting the image-first convention for points as well.

this has a series of consequences:

1.

If we want to describe the orientation of the raw ARA as we get it, using the bg-space convention, it is actually "asr" and not "asl". To be sure about this, we can do the following:

import napari
from allensdk.core.reference_space_cache import ReferenceSpaceCache

spacecache = ReferenceSpaceCache(
    manifest=download_dir_path / "manifest.json",
    # downloaded files are stored relative to here
    resolution=100,
    reference_space_key="annotation/ccf_2017"
    # use the latest version of the CCF
)
ara_ref, h = spacecache.get_template_volume()
ara_ref[:, :, :57] = ara_ref[:, :, :57] / 2
viewer = napari.view_image(ara_ref)

and we can confirm that what gets dimmed is the right hemisphere and not the left one. Conclusion: document how the ARA description is confusing, and move the standard BG orientation to "asr".

2.

This introduces confusion for applications using cartesian indexes. As a simple example, this

import matplotlib.pyplot as plt
import napari
import numpy as np
from bg_atlasapi import BrainGlobeAtlas

at = BrainGlobeAtlas("example_mouse_100um")
mod_ref = at.reference.copy()
# Create some points: 2 in the left hemisphere, 1 in the right
three_pts = np.array([[50, 40, 80], [60, 40, 80], [60, 40, 20]])
# Take points from root mesh for scatterplot:
bounds = at.root_mesh().points[::10, :] / 100
# Display everything in napari:
viewer = napari.view_image(mod_ref)
viewer.add_points(bounds, size=1, n_dimensional=True)
viewer.add_points(three_pts, size=10, n_dimensional=True)
fig = plt.figure()
ax = fig.add_subplot(111, projection='3d')
ax.scatter(bounds[:, 0], bounds[:, 1], bounds[:, 2], s=1)
ax.scatter(three_pts[:, 0], three_pts[:, 1], three_pts[:, 2], s=20)

produces a correct napari view (with 2 dots in the left hemisphere, and 1 in the right one), but a flipped scatterplot, as I guess brainrender would too. The easiest solution, ugly as it might sound, would be to just invert one axis (it does not really matter which one) in cartesian-indexed applications such as brainrender. This:

fig = plt.figure()
ax = fig.add_subplot(111, projection='3d')
ax.scatter(-bounds[:, 0], bounds[:, 1], bounds[:, 2], s=1)
ax.scatter(-three_pts[:, 0], three_pts[:, 1], three_pts[:, 2], s=20)

produces a plot with the correct orientation. For 2D applications (we don’t have any), one would have to flip the y when displaying.

3.

Finally, we always need to consider that when looking at sliced data (both with matplotlib and the napari viewer in 2D mode), we are looking from the perspective of the slicing coordinate at point 0: for "asr" orientation, this means looking from the front, which inverts left and right hemispheres (the left side of the image is the right hemisphere).

This is 100% a display problem, as it will arise with matching underlying stack and probe coordinates just by using different kinds of views. So I would not solve it by messing with the data (e.g. exporting "brainrender-compatible" coordinates), but by making one application (I would suggest brainrender, by flipping an axis) compatible with the other.

What needs to be done:

  • a thorough section in the documentation that discusses this;
  • either simply move the standard orientation to asr, or better, address this concurrent description when instantiating the SpaceConvention from bg-space, in a way that resolves the ambiguity (see brainglobe/brainglobe-space#6). That solution would be the neatest, because visualisation tools could then use SpaceConvention methods to figure out the correct flips;
  • making sure brainrender displays images coherently with the napari viewer, ideally going through the above point.

Final repo name before release

Are we sure about brainatlas-api? Before making it permanent, I would reconsider bgatlas-api for consistency, to have bg somewhere :)

Define hemisphere values as a class attribute.

Currently, when users want to work with the hemisphere data, they have to dig into the code. It would be useful for atlas.hemispheres to use atlas.left_hemisphere_value and atlas.right_hemisphere_value.

This would allow custom hemisphere values (maybe for integration with existing pipelines), but mostly it would allow them to be more easily queried (some_array[atlas.hemispheres == atlas.left_hemisphere_value]).
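A small sketch of how the proposed attributes would make queries read (the attribute values 1 and 2 come from the hemispheres image described in the README; the constant names mirror the proposal and are not existing API):

```python
import numpy as np

# Stand-ins for the proposed atlas.left_hemisphere_value / right_hemisphere_value:
LEFT_HEMISPHERE_VALUE = 1
RIGHT_HEMISPHERE_VALUE = 2

# Tiny mock hemispheres image and some per-voxel data:
hemispheres = np.array([[1, 1, 2, 2],
                        [1, 1, 2, 2]])
cell_counts = np.array([[3, 1, 4, 1],
                        [5, 9, 2, 6]])

# Named values make the masking self-documenting:
left_total = cell_counts[hemispheres == LEFT_HEMISPHERE_VALUE].sum()
right_total = cell_counts[hemispheres == RIGHT_HEMISPHERE_VALUE].sum()
```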

Versioning atlases

I think we should also have a way of nicely integrating the versioning of this package with the versioning that I hope GIN can provide for the packages deployed there. If we introduce new features in the API, they could require the latest version of an atlas to be re-downloaded by a user who has an outdated one. Vice versa, maybe sometimes we should ensure that outdated API versions ask for legacy versions of the atlases.

@adamltyson @FedeClaudi any thought on this? Does this make sense?

Distributing custom atlases

I have the first occurrence in the lab of a scenario where I would need to generate a high-res atlas of a subregion of the fish brain. We mentioned this briefly with @adamltyson.

As I said there, for now I would keep it as a completely separate atlas package. Maybe for such scenarios we can just add special metadata fields like "main reference atlas" (or something similar) and "origin", and then handle transformations with bg-space.

My main doubt at the moment is how to allow users to distribute custom atlases like this one, outside the general ones. I can make a lab GIN repo with the package, and write custom functions for our custom download to the brainglobe dir; once there, everything can be handled. Alternatively, we could have a command line argument to install custom new atlases from a link, maybe with a special prefix/suffix to make sure they stay out of the updating functions for core atlases?

I just wanted to know whether you guys have some thoughts about this.

Visible or hidden folder?

We are currently using a hidden folder, .brainglobe, for storing the atlases. This matches the behaviour of several packages that need to store private information, but in our case the user might want to browse the files, and as the folder is quite large on disk it should be easy to find.

@adamltyson you were using the hidden path for cellfinder; do you have arguments in favour of a hidden folder? Otherwise I would just make it visible.

Many missing meshes in Allen Atlas

Hello,

I used brainrender recently and when loading brain regions of the Allen Atlas, the following brain regions had no corresponding mesh.

The region 1000 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1002 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1007 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1009 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1011 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1014 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1018 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1025 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1027 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1033 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1037 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1041 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1049 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1056 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1057 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1064 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1069 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1080 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1083 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1084 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1091 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1097 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1099 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1100 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1117 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1123 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 1132 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 182305689 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 312782546 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 312782574 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 312782628 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 840 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 843 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 848 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 856 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 863 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 864 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 867 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 877 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 879 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 886 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 894 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 895 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 896 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 901 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 904 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 911 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 912 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 917 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 918 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 922 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 926 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 932 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 933 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 936 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 938 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 942 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 944 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 948 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 951 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 956 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 957 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 958 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 960 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 961 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 967 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 968 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 972 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 976 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 984 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 985 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 987 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 993 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping
The region 997 doesn't seem to belong to the atlas being used: allen_mouse_25um. Skipping

For instance, 997 is the whole brain mesh.
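One way to avoid printing one warning per id would be to filter the requested ids against the atlas' structure lookup up front and warn once. A rough sketch (the function name and the shape of `structures` are assumptions, not the actual API):

```python
# Hypothetical sketch: drop region ids absent from the loaded atlas and
# emit a single summary warning, instead of one line per id.
def filter_known_regions(region_ids, structures):
    # `structures` stands in for the atlas' id -> metadata lookup
    known = [rid for rid in region_ids if rid in structures]
    missing = sorted(set(region_ids) - set(structures))
    if missing:
        print(f"{len(missing)} region ids not in atlas, e.g. {missing[:3]}")
    return known

structures = {997: "root", 8: "grey", 567: "CH"}
known = filter_known_regions([997, 8, 944], structures)  # 944 is dropped
```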

Add a -h flag option for the CLI

When you type an invalid command for the CLI, the help message isn't very helpful (it tells you to use another invalid command):

ValueError: Invalid command download. use "brainglobe -h for more info."
(atlas) atyson@enc2-node7:~/python/brainreg$ brainglobe -h                          
Usage: brainglobe [OPTIONS] COMMAND
Try 'brainglobe --help' for help.

Error: no such option: -h

Can we add a -h option with click (I've never used it)?
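Yes — click supports extra help aliases via its documented `context_settings` mechanism. A sketch (the `brainglobe` group and `show` command names are illustrative stand-ins for the real CLI):

```python
import click

# Documented click feature: register "-h" as an alias for "--help"
CONTEXT_SETTINGS = {"help_option_names": ["-h", "--help"]}

@click.group(context_settings=CONTEXT_SETTINGS)
def brainglobe():
    """BrainGlobe command line tool."""

@brainglobe.command()
def show():
    """List available atlases."""
    click.echo("listing atlases...")
```

With this, both `brainglobe -h` and `brainglobe --help` print the usage message instead of raising `no such option: -h`.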

Atlas.root

Hey @vigji I'm getting up to speed with all the great work you've done recently. I have one small question/comment about this:

https://github.com/brainglobe/brainatlas-api/blob/54b37bed32b12265a7d66aabb68d0dcaea2035bb/brainatlas_api/core.py#L52

Given that root is used in some atlases (e.g. Allen) to refer to the lowest hierarchy structure (i.e. the whole brain), won't the use of self.root to refer to the base path cause confusion?

e.g. here:
https://github.com/brainglobe/brainatlas-api/blob/54b37bed32b12265a7d66aabb68d0dcaea2035bb/brainatlas_api/core.py#L116

I know that self.root is mostly internal and users won't be using it but maybe we can give it a different name?

Dealing with inputs in microns vs voxel spaces

Currently, functions like https://github.com/brainglobe/brainatlas-api/blob/bece6190d0feb3118a5b09feb4d5d93ab5499d56/brainatlas_api/core.py#L123

accept input coordinates expressed in voxel space. We should add the possibility to pass inputs as microns and use the atlas' resolution to convert between the two. This would make life easier when using meshes (which are expressed in microns) as a starting point, and it might be more intuitive for users.

Ideally we'd let users pass the coordinates as they want, and give them an option like as_microns. Then behind the scenes we use the atlas' resolution to convert from microns to voxels and vice versa.
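A minimal sketch of the proposed round-trip, assuming the atlas exposes a per-axis `resolution` in microns; the function names are illustrative, not the actual API:

```python
# Convert between micron and voxel coordinates using the atlas resolution.
def microns_to_voxels(coords_um, resolution):
    return tuple(int(round(c / r)) for c, r in zip(coords_um, resolution))

def voxels_to_microns(coords_vox, resolution):
    return tuple(c * r for c, r in zip(coords_vox, resolution))

resolution = (25, 25, 25)  # e.g. allen_mouse_25um
print(microns_to_voxels((5000, 2500, 7500), resolution))  # -> (200, 100, 300)
```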

List available and downloadable atlases

I'm moving this bit of the conversation out of the GUI discussion.

I agree it would definitely be a good thing to have. Happy if you @FedeClaudi can have a look into it; I will merge the config changes as soon as possible so that you can start from there, given that it will strongly interact with these parts of the code.

I guess just a little function like brainglobe_api.list_atlases() that prints all the local atlases and their directories, and all the remote ones.

This raises two points that we can discuss in this context:

  • currently, keeping the "version" value at the level of the BGAtlas class is quite useless. Maybe there should be a global minimal (major) version requirement on the atlases for the brainatlas_api, so that only atlases with at least that major number are listed as available;
  • should we figure out a way of "proofreading" the atlases, maybe in some sort of CI-like fashion? I remember @adamltyson raised the problem of space constraints on Travis testing; I don't know the boundaries there.
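A rough sketch of such a `list_atlases()` helper, including the major-version gating idea from the first bullet (all names, data, and the gating rule are illustrative):

```python
# Compare locally installed atlas versions against a remote manifest,
# hiding atlases below a minimum supported major version.
MIN_MAJOR = 0  # hypothetical global version requirement

def list_atlases(local, remote):
    rows = []
    for name, latest in sorted(remote.items()):
        if int(latest.split(".")[0]) < MIN_MAJOR:
            continue  # skip atlases below the supported major version
        rows.append((name, local.get(name, "-"), latest))
    return rows

local = {"allen_mouse_25um": "0.3"}
remote = {"allen_mouse_25um": "0.3", "allen_human_500um": "0.1"}
for name, local_v, latest in list_atlases(local, remote):
    print(f"{name:20s} local: {local_v:>5s} latest: {latest:>5s}")
```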

Hierarchical structure with treelib

Heya,

I've used treelib in
https://github.com/brainglobe/brainatlas-api/blob/d6adc0e49d844717203576b465f7666b845d742c/brainatlas_api/core.py#L205

and it looks like it's a nice little library that does one thing but it does it very well.

We have a few functions like get_structure_descendants that we could implement using the AllenSDK tree machinery, but I feel like they'd work best with a treelib Tree. I think using the Allen code to go from structures.json to some hierarchical structure is good, as it has a few checks in place to make sure everything is correct, but we could build a treelib Tree in parallel and use that for most operations. It's maybe a bit redundant, but it offers a very intuitive interface (I'm biased though, since I really dislike the way the structure tree is implemented by the AllenSDK).

What do you all think about it?

Adding pycln

Hi,

Opening it here but valid throughout the org.
We could add pycln to our pre-commit stack.
It automatically removes unused imports from all files. I have set it up in pyinspect to run before flake8, so that it fixes things before flake8 stops the commit from going through.

Lemme know!

Extract cellfinder csv -> json conversion

There's a tool to generate the brain region json, but the functionality to convert the csv format to json should be extracted in case users want to define their atlas using the csv.

This could be an optional part of a generalised atlas generation function that could be edited by 3rd party users.

Improve consistency in core class

I think we should structure the methods in the core class in a consistent way, to increase clarity and maintainability.

We will often do things either "by region name" or "by region id", and either with single regions or with lists of regions.

I think it would be good not to have tons of duplicated methods to deal with this; we could probably have a decorator that parses the input, checks whether the value is a string, an int, a list of strings, or a list of ints, prepends a conversion if required, iterates if required, but then always applies a single method.
In the backend methods we should always use either acronyms or region ids; I would go for ids.

If the region is the returned value (e.g., get_region_name_from_coords(coords)), I would not duplicate methods for ids and names, but make it a keyword argument (get_region_from_coords(coords, as_name=True)).

We might be used to different ways of doing this in our packages, but such duplication tends to be quite bug-prone, I think.

Also, get_x_from_y is long and redundant, maybe we could switch to x_from_y.
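The decorator idea above could be sketched roughly like this (a toy illustration, with `acronym_to_id` standing in for the atlas' real structure table):

```python
from functools import wraps

acronym_to_id = {"root": 997, "grey": 8, "CH": 567}  # toy lookup

def normalise_regions(func):
    """Wrap a method so it always receives a list of integer region ids."""
    @wraps(func)
    def wrapper(regions, *args, **kwargs):
        if not isinstance(regions, (list, tuple)):
            regions = [regions]  # single region -> list
        ids = [acronym_to_id[r] if isinstance(r, str) else int(r)
               for r in regions]
        return func(ids, *args, **kwargs)
    return wrapper

@normalise_regions
def get_region_names(region_ids):
    # backend works purely on ids, as proposed above
    id_to_acronym = {v: k for k, v in acronym_to_id.items()}
    return [id_to_acronym[i] for i in region_ids]

print(get_region_names("grey"))       # -> ['grey']
print(get_region_names([997, "CH"]))  # -> ['root', 'CH']
```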

BUG - AttributeError: type object 'tqdm' has no attribute 'wrapattr'

In
https://github.com/brainglobe/brainatlas-api/blob/73b3e06a597f3843840feb9a9f612c4df9eb5d35/brainatlas_api/utils.py#L42

I get

Traceback (most recent call last):
  File "/Users/federicoclaudi/Documents/Github/BrainRender/workspace.py", line 12, in <module>
    scene = Scene(atlas = 'allen_human_500um_v0.1')  # specify that you want a view from the top
  File "/Users/federicoclaudi/Documents/Github/BrainRender/brainrender/scene.py", line 103, in __init__
    self.atlas = atlasclass(**atlas_kwargs)
  File "/Users/federicoclaudi/miniconda3/envs/brainrender/lib/python3.7/site-packages/brainatlas_api/bg_atlas.py", line 56, in __init__
    self.download_extract_file()
  File "/Users/federicoclaudi/miniconda3/envs/brainrender/lib/python3.7/site-packages/brainatlas_api/bg_atlas.py", line 79, in download_extract_file
    utils.retrieve_over_http(self.remote_url, destination_path)
  File "/Users/federicoclaudi/miniconda3/envs/brainrender/lib/python3.7/site-packages/brainatlas_api/utils.py", line 42, in retrieve_over_http
    with tqdm.wrapattr(
AttributeError: type object 'tqdm' has no attribute 'wrapattr'

I propose we move to rich progress bars anyway and ditch tqdm, or at least add a requirement for the tqdm version to be >= 4.46.1
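If we keep tqdm, even a simple version gate (or a `tqdm>=4.46.1` pin in setup.py) would turn the opaque `AttributeError` into a clear message. A pure-python sketch of the comparison (helper names are illustrative):

```python
# Compare dotted version strings numerically, e.g. to gate on tqdm's version.
def version_tuple(version):
    return tuple(int(p) for p in version.split(".")[:3])

def check_min_version(installed, required="4.46.1"):
    return version_tuple(installed) >= version_tuple(required)

print(check_min_version("4.46.1"))  # -> True
print(check_min_version("4.32.0"))  # -> False
```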

Specifying atlas orientation

I am trying to figure out a convenient way of describing the orientation of the atlas being imported so that the stack reorientation and affine matrix generation can be handled in a function.

Do you think something like this would make sense? Specifying the "sectioning" of the atlas, although maybe more intuitive, is prone to lots of ambiguities, as each cut can be oriented in two ways. Therefore, one should specify the axes' name order and the axes' directions. Is this intuitive enough, and does it adhere to the conventions in people's minds?

# Describe the order and direction of the atlas axes; the tuple specifies (0, 0, 0) for each of the stack axes:
atlas_origin = ("posterior", "superior", "right")
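For what it's worth, per-axis flips could be derived mechanically from such an origin tuple. The sketch below only handles axis direction, assuming the axis order already matches the target convention (axis permutation is omitted, and all names are illustrative):

```python
# Opposite anatomical directions along each axis
PAIRS = {"anterior": "posterior", "posterior": "anterior",
         "superior": "inferior", "inferior": "superior",
         "left": "right", "right": "left"}

def flips_to_target(origin, target=("anterior", "superior", "right")):
    # True where a stack axis must be reversed to match the target origin
    return tuple(o == PAIRS[t] for o, t in zip(origin, target))

atlas_origin = ("posterior", "superior", "right")
print(flips_to_target(atlas_origin))  # -> (True, False, False)
```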

Creating root mesh

Hey,
for the human brain atlas, there's no 'root' mesh; I think the annotated volume only has the labels of the 'leaves' of the structure tree (the same goes for the rat atlas):

[screenshot]

So I'm creating the root mesh by taking all voxels that have value >0, setting them to 1 and reconstructing the surface from that. However, in the case of the human brain there is still some structure 'under the brain surface' which makes the root mesh kinda ugly:
[screenshot]

Does anyone know of a better way to create the root mesh?
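The binarisation step described above is simple in numpy; filling internal holes (e.g. with scipy.ndimage.binary_fill_holes) before extracting the surface (e.g. with skimage.measure.marching_cubes) might help with the structure 'under the brain surface', though both calls are suggestions rather than tested fixes. A toy sketch of the masking step:

```python
import numpy as np

# Fake annotation volume with one labelled region
annotation = np.zeros((4, 4, 4), dtype=int)
annotation[1:3, 1:3, 1:3] = 42

# Every labelled voxel belongs to "root"; a hole-filling step
# (scipy.ndimage.binary_fill_holes) could follow before meshing.
root_mask = (annotation > 0).astype(np.uint8)
print(root_mask.sum())  # -> 8 labelled voxels
```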

Remove atlas_gen from the distributed package

Currently, both brainatlas_api and atlas_gen will be distributed based on setup.py.

Shall we remove atlas_gen, and assume that users who install from pypi will only be using brainatlas_api, and anyone who wants to use atlas_gen will install from github?

Documentation

Considering that this package (and others under the brainglobe umbrella) expose an API, we should think about documentation.

I don't have much experience with this kind of documentation, but building it with sphinx from the docstrings seems like the easiest way. Does anyone have any other ideas or objections?

We should also decide on a standard format. brainatlas-api looks to be using the numpy format currently. Ideally I'd like to move to the default sphinx/reST format, mostly because it's the format already used by brainrender, cellfinder, amap, etc.

Structure of the json metadata file

I added a draft of the json metadata file; I think it's a good place to start agreeing on how to describe the atlas in the class!

I merged it already; I don't know if it's better to discuss this here or in the PR. Anyway, the structure I had in mind was:

  • general header (name, publication, atlas website, author maybe, etc.)

  • list of atlas "data contents", with keywords that allow us to interact with the files (as they could have different formats); I think the only mandatory ones should be:

    1. "brain": the average reference brain;
    2. "regions": not sure it should be mandatory, but a volume with numbers corresponding to regions
    3. Any additional files (different stacks in standard space useful for transformations, e.g. filtered); we should probably not commit to rigid names here and keep all of those bundled in a supplementary_files dictionary
  • Affine transformation (grid_to_world, with world defined as per CCF I would say)

  • remote URL for download;

  • Regions hierarchy: this could either go here or (to keep this file cleaner, as it will be quite large) somewhere else.

Let me know what you think about this and I can work on a more concrete class!
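As a concrete strawman, the structure above might serialise to something like this (every value below is invented purely for illustration):

```json
{
  "name": "example_mouse_100um",
  "citation": "Example et al., 2020",
  "atlas_link": "https://example.org/atlas",
  "resolution": [100, 100, 100],
  "data_contents": {
    "brain": "reference.tiff",
    "regions": "annotation.tiff",
    "supplementary_files": {}
  },
  "grid_to_world": [[100, 0, 0, 0], [0, 100, 0, 0], [0, 0, 100, 0], [0, 0, 0, 1]],
  "remote_url": "https://example.org/example_mouse_100um_v0.1.tar.gz",
  "structures": "structures.json"
}
```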

Creation of new atlases.

Hello there,

I have been really interested to see cellfinder and brainglobe grow in the last few years, great job on this open-source package! But I'm working with rat brains, and currently, there are no compatible rat atlases available. Are there plans to create a compatible rat atlas? And if not, what would it take to create a rat atlas in the correct format to work with other brainglobe packages? I'm interested in contributing if my skill set allows for it.

Best regards,
Allison McDonald
PhD student
Behavioural & Translational Neuroscience Lab, VUmc, Amsterdam, NL.


Hemisphere annotation correct?

Hi, I have a quick question about reading out hemisphere info with atlas.hemisphere_from_coords(coords, as_string=True).
I have loaded the atlas with atlas = BrainGlobeAtlas("allen_mouse_50um").
When reading out coords from left to right, the hemisphere starts as "left" and changes to "right". So the red x here returns "left" as hemisphere and the blue x returns "right".
[screenshot]
Shouldn't it be the other way around (hemispheres being "mouse-centred")? Or am I doing something wrong?
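For reference, here is a minimal sketch of what a midline-split hemispheres stack looks like. Which half should be labelled "left" is exactly the open question here, so the convention below (1 = left at index 0) is an explicit assumption, and the function name mirrors, but is not, the real `make_hemispheres_stack`:

```python
import numpy as np

def make_hemispheres_stack(shape, lr_axis=2):
    # 1 = left, 2 = right (assumed convention), split at the midline
    # of the left-right axis
    stack = np.ones(shape, dtype=np.uint8)
    midline = shape[lr_axis] // 2
    index = [slice(None)] * len(shape)
    index[lr_axis] = slice(midline, None)
    stack[tuple(index)] = 2
    return stack

stack = make_hemispheres_stack((2, 2, 4))
print(stack[0, 0])  # -> [1 1 2 2]
```

Getting the labels right requires knowing which end of the left-right axis index 0 sits at, which is what the atlas orientation metadata should encode.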

kim_unified_25um has some errors

There seem to be two major errors:

  • The annotations and reference images are not aligned. The reference image also isn't aligned with the allen 25um atlas, which it probably should be.
  • There are border regions in the annotation image with incorrect values.

opening .obj

Hey everyone,

Currently .obj files with mesh data are opened and returned as triplets of vertices, faces and edges, which is not very useful.

Given that meshes are used almost exclusively for visualisation (in brainrender), I'd suggest opening them with vtkplotter.vtkio.load, which returns an instance of vtkplotter's Mesh class, which has loads of useful functions. A lot of the brainrender functionality that will be moved to brainatlas-api relies on the Mesh class.

An alternative would be to use meshpart, but to be honest I haven't used it much, so I don't know how good it is.

The only downside I can see is that this will require adding vtkplotter to the requirements, which comes with a bunch of dependencies of its own. It doesn't cause problems in my experience, but it does make the package less lightweight.

What do people think?

Feature Proposal: Conversion method between Numpy and Physical Coordinate Spaces

Hi,

Sometimes I'd like to know where a particular pixel in the reference or annotations template is located in microns or millimeters. The data for calculating this is already present in the atlas data (for example, via the shape_um, resolution, and space attributes), but having a simpler translation method would be helpful.

I'm not sure what the best api for this would be. Some ideas off the top of my head:

  1. bg_atlas.to_microns(coords: Tuple[Union[int, List[int]], Union[int, List[int]], Union[int, List[int]]]) -> Tuple[List[int], List[int], List[int]]
    • pros: simple to implement, at first. Easy to find.
    • cons: future requests for different units, or just slices, may make this method quite complicated.
  2. An attribute on the image data itself: bg_atlas.reference[10, 10, 10].um # ([25], [25], [25])
    • pros: Straightforward api. No interface changes needed even with slices.
    • cons: needs some kind of subclassing of ndarray or something like xarray (like in this example from their tutorial). Much more work initially, but may save effort in the long run.

Thoughts?

Best wishes,

Nick
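Idea 1 above could be sketched like this (illustrative only, not a proposed final signature):

```python
# Scale per-axis voxel indices (ints or lists of ints) by the atlas
# resolution in microns, returning lists per axis as in idea 1.
def to_microns(coords, resolution):
    out = []
    for axis_vals, res in zip(coords, resolution):
        if isinstance(axis_vals, int):
            axis_vals = [axis_vals]  # promote a single index to a list
        out.append([v * res for v in axis_vals])
    return tuple(out)

print(to_microns((10, 10, 10), (25, 25, 25)))  # -> ([250], [250], [250])
```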

structure_from_coords finds too many regions

[screenshot]

Hey guys, I'm making a probe planning app, as you can see above.
I use the structure_from_coords method to find which brain regions the probe passes through. This works mostly fine but occasionally finds too much stuff:

For instance here:
[screenshot]

it finds some stuff in the hypothalamus, miles away from the probe. I think that this is a bug with the atlas function and not with my probe planner stuff, but might require some investigation.

BrainGlobeAtlas.hemispheres is not correct

I think bg_atlasapi.utils.make_hemispheres_stack doesn't assume the correct orientation.

We should be able to use BrainGlobeAtlas.metadata.orientation to do this properly.
