alleninstitute / mouse_connectivity_models

Python package providing mesoscale connectivity models for mouse.

Home Page: http://mouse-connectivity-models.readthedocs.io/en/latest/

License: Other

Python 99.89% Shell 0.10% Makefile 0.01%
python scikit-learn neuroscience open-science

mouse_connectivity_models's Introduction

mouse_connectivity_models


mouse_connectivity_models is a Python module for constructing and testing mesoscale connectivity models using data from the Allen Institute for Brain Science.

It provides models written in the scikit-learn estimator style and has been used in the publication linked below (a minimal usage sketch follows the links):

Download: http://download.alleninstitute.org/publications/A_high_resolution_data-driven_model_of_the_mouse_connectome/

Website: http://mouse-connectivity-models.readthedocs.io/en/latest/
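
For orientation, here is a minimal usage sketch of loading the published voxel model, assuming the API shown in the issues further down; the manifest path is illustrative, and the cached model data is downloaded on first use:

from mcmodels.core import VoxelModelCache

# Load (downloading on first use) the cached voxel-scale connectivity model.
cache = VoxelModelCache(manifest_file='voxel_model_manifest.json')
voxel_array, source_mask, target_mask = cache.get_voxel_connectivity_array()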

Installation

Dependencies

mouse_connectivity_models requires:

  • Python (>= 2.7 or >= 3.4)
  • scikit-learn (>= 0.22.1)
  • allensdk (>= 2.10.1)

For running the examples, Matplotlib >= 1.3.1 is required.

We have only tested and used this package on Linux.

User installation

We use Git for version control and GitHub for hosting our main repository.

You can check out the latest sources and install using pip:

$ git clone git@github.com:AllenInstitute/mouse_connectivity_models.git
$ cd mouse_connectivity_models
$ pip install .

Level of Support

We are not currently supporting this code; we are releasing it to the community AS IS and cannot provide any guarantees of support. The community is welcome to submit issues, but you should not expect an active response.

Contributing

We encourage the community to contribute! Please first review the Allen Institute Contributing Agreement (https://github.com/AllenInstitute/mouse_connectivity_models/blob/master/CONTRIBUTING.md), then refer to the contributing guide (http://AllenInstitute.github.io/mouse_connectivity_models/contributing.html).

Installing the dev requirements

Use pipenv to install the dev dependencies. If you do not have pipenv currently installed:

$ pip install pipenv

Then install the dev dependencies:

$ pipenv install --dev

This will create a virtual environment on your machine for this project. To activate the virtual environment (for development):

$ pipenv shell

Testing

After installation, you can launch the test suite from outside the source directory (mcmodels) using pytest:

$ pytest mcmodels

Help and Support

Documentation

The documentation that supports mouse_connectivity_models can be found at the Website.

mouse_connectivity_models's People

Contributors

dependabot[bot], drjigsaw, jknox13, kamdh, kharris, nbingo, nilegraddis


mouse_connectivity_models's Issues

From Voxel To Coordinates

Hi,

I've only recently started using this awesome research, so I might be missing something obvious here. Is there a way to know the coordinates in a given reference frame for one of the voxels in VoxelModel?

I would need this for visualisation and to sort voxels based on their spatial location, but couldn't figure out a way to do this.
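
One hedged, self-contained illustration (not from the original thread; the toy volume below stands in for the model's source/target mask): voxel indices on the 100 um CCF grid scale directly to micron coordinates in the CCF reference frame.

import numpy as np

# Toy boolean volume with the shape of the 100 um CCF annotation (an assumption here).
volume = np.zeros((132, 80, 114), dtype=bool)
volume[66, 40, 57] = True                       # pretend this voxel belongs to the model

voxel_indices = np.argwhere(volume)             # (n_voxels, 3) array of (AP, DV, ML) indices
resolution_um = 100.0                           # the voxel model is defined on the 100 um grid
ccf_coords_um = voxel_indices * resolution_um   # CCF coordinates in microns
print(ccf_coords_um)                            # [[6600. 4000. 5700.]]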

Thank you!

typo in sphinx docs

I think the structure set ID on the landing page of your docs should be 167587189 (not 165787189). Would you mind updating it?

docs cleanup

We need to:

  1. replace the "data portal" logo in the top right with something specific to this project.
  2. fix the "questions" section on the sidebar so that it refers to this project and not the AllenSDK :P

Incompatibility with new allensdk API

Hi, the module seems to be incompatible with the new allensdk Cache API:

download/mouse_connectivity_models> pip install .
Processing /home/metis/download/mouse_connectivity_models
    ERROR: Command errored out with exit status 1:
     command: /home/metis/venv/conda3/bin/python -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-req-build-34hlah93/setup.py'"'"'; __file__='"'"'/tmp/pip-req-build-34hlah93/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' egg_info --egg-base pip-egg-info
         cwd: /tmp/pip-req-build-34hlah93/
    Complete output (11 lines):
    Traceback (most recent call last):
      File "<string>", line 1, in <module>
      File "/tmp/pip-req-build-34hlah93/setup.py", line 13, in <module>
        import mcmodels
      File "/tmp/pip-req-build-34hlah93/mcmodels/__init__.py", line 16, in <module>
        from . import core
      File "/tmp/pip-req-build-34hlah93/mcmodels/core/__init__.py", line 12, in <module>
        from .voxel_model_api import VoxelModelApi
      File "/tmp/pip-req-build-34hlah93/mcmodels/core/voxel_model_api.py", line 7, in <module>
        from allensdk.api.cache import cacheable, Cache
    ModuleNotFoundError: No module named 'allensdk.api.cache'
    ----------------------------------------
ERROR: Command errored out with exit status 1: python setup.py egg_info Check the logs for full command output.
>>> import allensdk
>>> import allensdk.api
>>> import allensdk.api.cache
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'allensdk.api.cache'
>>> allensdk.__version__
'2.10.2'
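
A hedged compatibility shim (an assumption, not the project's actual fix): newer allensdk releases appear to have moved Cache out of allensdk.api.cache, so the import in mcmodels/core/voxel_model_api.py could try the old location and fall back to a newer one:

try:
    # Layout expected by mcmodels/core/voxel_model_api.py
    from allensdk.api.cache import cacheable, Cache
except ModuleNotFoundError:
    # Newer layout (assumed here); adjust to wherever your allensdk version exposes Cache
    from allensdk.api.warehouse_cache.cache import cacheable, Cache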

Consider saving models weights/nodes to '.npy'

Hi,

I've noticed that the model parameters downloaded from VoxelModelCache are stored as .csv.gz. Given that these are quite large files, it takes several minutes to load them when calling get_voxel_connectivity_array.

An alternative would be to save them as .npy.gz. It seems that .npy is much faster to load, giving a significant performance boost: on my machine, loading went from about 5 minutes to about 30 seconds.

If you still wish to keep them as .gz, saving/loading to .npy.gz is as simple as:

import gzip

import numpy as np


def load_npy_from_gz(filepath):
    # Decompress on the fly and hand the file object to numpy.
    with gzip.GzipFile(filepath, "r") as f:
        return np.load(f)


def save_npy_to_gz(filepath, data):
    # Write the array through a gzip stream so the on-disk file stays compressed.
    with gzip.GzipFile(filepath, "w") as f:
        np.save(f, data)
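
A quick round-trip check of the helpers (the array and filename are illustrative):

arr = np.arange(6).reshape(2, 3)
save_npy_to_gz("weights.npy.gz", arr)
assert np.array_equal(load_npy_from_gz("weights.npy.gz"), arr)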

Hope this helps,
Fede

nonnegative_linear component test results in an error during pytest

Hello, I got an error during pytest. Do you know how to solve the following error?
I am using Python 3.7 on macOS Catalina.

=================================================== test session starts ====================================================
platform darwin -- Python 3.7.4, pytest-5.3.5, py-1.8.1, pluggy-0.13.1
rootdir: /Users/kfujii2/projects/Allen_injection_site_analysis/Reimann_etal/mouse_connectivity_models
collected 109 items

mouse_connectivity_models/mcmodels/core/tests/test_base.py .....                                                     [  4%]
mouse_connectivity_models/mcmodels/core/tests/test_experiment.py .......                                             [ 11%]
mouse_connectivity_models/mcmodels/core/tests/test_masks.py ..........                                               [ 20%]
mouse_connectivity_models/mcmodels/core/tests/test_utils.py ..                                                       [ 22%]
mouse_connectivity_models/mcmodels/core/tests/test_voxel_model_api.py ........                                       [ 29%]
mouse_connectivity_models/mcmodels/core/tests/test_voxel_model_cache.py ...........                                  [ 39%]
mouse_connectivity_models/mcmodels/models/homogeneous/tests/test_homogeneous_model.py ...                            [ 42%]
mouse_connectivity_models/mcmodels/models/homogeneous/tests/test_subset_selection.py ...                             [ 44%]
mouse_connectivity_models/mcmodels/models/voxel/tests/test_regionalized_model.py .......                             [ 51%]
mouse_connectivity_models/mcmodels/models/voxel/tests/test_voxel_connectivity_array.py .................             [ 66%]
mouse_connectivity_models/mcmodels/models/voxel/tests/test_voxel_model.py ....                                       [ 70%]
mouse_connectivity_models/mcmodels/regressors/nonnegative_linear/tests/test_base.py .F..                             [ 74%]
mouse_connectivity_models/mcmodels/regressors/nonnegative_linear/tests/test_ridge.py .F.                             [ 77%]
mouse_connectivity_models/mcmodels/regressors/nonparametric/tests/test_kernels.py .......                            [ 83%]
mouse_connectivity_models/mcmodels/regressors/nonparametric/tests/test_nadaraya_watson.py ............               [ 94%]
mouse_connectivity_models/mcmodels/tests/test_utils.py ......                                                        [100%]

========================================================= FAILURES =========================================================
_______________________________________________ test_nonnegative_regression ________________________________________________

    def test_nonnegative_regression():
        # ------------------------------------------------------------------------
        # test shape incompatibility
        X, y = np.ones((10, 1)), np.ones((11, 1))

        assert_raises(ValueError, nonnegative_regression, X, y)

        # ------------------------------------------------------------------------
        # test X.ndim != 2
        X = np.linspace(-10, 10, 100)
        y = 4*X

        assert_raises(ValueError, nonnegative_regression, X, y)

        # ------------------------------------------------------------------------
        # test function output
        X = X.reshape(-1, 1)

        coef, res = nonnegative_regression(X, y)

        assert_allclose(coef[0], 4.)
        assert_allclose(res[0], 0.0, atol=1e-10)

        # ------------------------------------------------------------------------
        # test sample_weight
        sample_weight = 1.0

        coef, res = nonnegative_regression(X, y)
        coef_sw, res_sw = nonnegative_regression(X, y, sample_weight)

        assert_array_almost_equal(coef.ravel(), coef_sw)
        assert_array_almost_equal(res, res_sw)

        # ------------------------------------------------------------------------
        # test sample_weight shape incompatibility
        sample_weight = np.ones(11)

>       assert_raises(ValueError, nonnegative_regression, X, y, sample_weight)

mouse_connectivity_models/mcmodels/regressors/nonnegative_linear/tests/test_base.py:67:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/unittest/case.py:756: in assertRaises
    return context.handle('assertRaises', args, kwargs)
/usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/unittest/case.py:178: in handle
    callable_obj(*args, **kwargs)
/usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/unittest/case.py:201: in __exit__
    self.obj_name))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <unittest.case._AssertRaisesContext object at 0x1401bd8d0>
standardMsg = 'ValueError not raised by nonnegative_regression'

    def _raiseFailure(self, standardMsg):
        msg = self.test_case._formatMessage(self.msg, standardMsg)
>       raise self.test_case.failureException(msg)
E       AssertionError: ValueError not raised by nonnegative_regression

/usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/unittest/case.py:135: AssertionError
____________________________________________ test_nonnegative_ridge_regression _____________________________________________

    def test_nonnegative_ridge_regression():
        # ------------------------------------------------------------------------
        # test shape incompatibility
        X, y = np.ones((10, 1)), np.ones((11, 1))
        alpha = np.zeros(1)

        assert_raises(ValueError, nonnegative_ridge_regression, X, y, alpha)

        # ------------------------------------------------------------------------
        # test X.ndim != 2
        X = np.linspace(-10, 10, 100)
        y = 4*X

        assert_raises(ValueError, nonnegative_ridge_regression, X, y, alpha)

        # ------------------------------------------------------------------------
        # test incompatible alpha shape
        X = X.reshape(-1, 1)
        alpha = np.zeros(2)

        assert_raises(ValueError, nonnegative_ridge_regression, X, y, alpha)

        # ------------------------------------------------------------------------
        # test sample_weight
        alpha = np.arange(X.shape[1])
        sample_weight = 1.0

        coef, res = _solve_ridge_nnls(X, y.reshape(-1, 1), alpha, 'SLSQP')
        coef_sw, res_sw = nonnegative_ridge_regression(X, y, alpha, sample_weight)

        assert_array_almost_equal(coef.ravel(), coef_sw)
        assert_array_almost_equal(res, res_sw)

        # ------------------------------------------------------------------------
        # test sample_weight shape incompatibility
        sample_weight = np.ones(11)

>       assert_raises(ValueError, nonnegative_ridge_regression, X, y, alpha, sample_weight)

mouse_connectivity_models/mcmodels/regressors/nonnegative_linear/tests/test_ridge.py:61:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/unittest/case.py:756: in assertRaises
    return context.handle('assertRaises', args, kwargs)
/usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/unittest/case.py:178: in handle
    callable_obj(*args, **kwargs)
/usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/unittest/case.py:201: in __exit__
    self.obj_name))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <unittest.case._AssertRaisesContext object at 0x140350810>
standardMsg = 'ValueError not raised by nonnegative_ridge_regression'

    def _raiseFailure(self, standardMsg):
        msg = self.test_case._formatMessage(self.msg, standardMsg)
>       raise self.test_case.failureException(msg)
E       AssertionError: ValueError not raised by nonnegative_ridge_regression

/usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/unittest/case.py:135: AssertionError
===================================================== warnings summary =====================================================
/usr/local/lib/python3.7/site-packages/sklearn/utils/deprecation.py:144
  /usr/local/lib/python3.7/site-packages/sklearn/utils/deprecation.py:144: FutureWarning: The sklearn.linear_model.base module is  deprecated in version 0.22 and will be removed in version 0.24. The corresponding classes / functions should instead be imported from sklearn.linear_model. Anything that cannot be imported from sklearn.linear_model is now part of the private API.
    warnings.warn(message, FutureWarning)

/usr/local/lib/python3.7/site-packages/sklearn/utils/deprecation.py:144
  /usr/local/lib/python3.7/site-packages/sklearn/utils/deprecation.py:144: FutureWarning: The sklearn.metrics.scorer module is  deprecated in version 0.22 and will be removed in version 0.24. The corresponding classes / functions should instead be imported from sklearn.metrics. Anything that cannot be imported from sklearn.metrics is now part of the private API.
    warnings.warn(message, FutureWarning)

-- Docs: https://docs.pytest.org/en/latest/warnings.html
======================================== 2 failed, 107 passed, 2 warnings in 1.55s =========================================

mcmodels not found although it's in site-packages

Hi!

First I want to say, wonderful what you have created! And I'm excited to start working with your software/data.

Since I'm fairly new to Python, I reckon it's probably a newbie question, but I've tried my best to figure it out, so bear with me.

Using:
macOS Mojave 10.14.2
Python 3.6.0 :: Anaconda 4.3.1 (x86_64)
allensdk==0.16.0
mouse-connectivity-models==0.0.1

After following these instructions:

$ git clone https://github.com/AllenInstitute/mouse_connectivity_models.git
$ cd mouse_connectivity_models
$ pip install .

'$ pytest mcmodels' gives:
ERROR: file not found: mcmodels

mcmodels, with all its subfolders/files, is in site-packages, yet '$ pip freeze' doesn't show mcmodels as an installed package either.
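
A hedged note (not from the original thread): pytest treats a bare 'mcmodels' argument as a filesystem path, so when the package only exists in site-packages it can instead be run by import name with 'pytest --pyargs mcmodels'.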

Thanks in advance

region overlays

Provide a way to overlay region labels and boundaries on top view or flatmap projected data.

can't import fn_temp_dir when running pytest mcmodels

Hi,

I tried to install the package, which seemed to go fine at first. But when trying 'pytest mcmodels' I get the following error:

____________________________ ERROR collecting mcmodels/core/tests/test_voxel_model_api.py _____________________________
ImportError while importing test module 'C:\Users\Christian\Desktop\Projekte\mouse_connectivity_models\mcmodels\core\tests\test_voxel_model_api.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
mcmodels\core\tests\test_voxel_model_api.py:9: in
from allensdk.test_utilities.temp_dir import fn_temp_dir
E ImportError: cannot import name 'fn_temp_dir' from 'allensdk.test_utilities.temp_dir' (C:\Users\Christian\Anaconda3\lib\site-packages\allensdk\test_utilities\temp_dir.py)
___________________________ ERROR collecting mcmodels/core/tests/test_voxel_model_cache.py ____________________________
ImportError while importing test module 'C:\Users\Christian\Desktop\Projekte\mouse_connectivity_models\mcmodels\core\tests\test_voxel_model_cache.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
mcmodels\core\tests\test_voxel_model_cache.py:11: in
from allensdk.test_utilities.temp_dir import fn_temp_dir
E ImportError: cannot import name 'fn_temp_dir' from 'allensdk.test_utilities.temp_dir' (C:\Users\Christian\Anaconda3\lib\site-packages\allensdk\test_utilities\temp_dir.py)

I am using:

Python 3.7.4
allensdk==0.16.3
mouse-connectivity-models==0.0.1

Is it possible that the function fn_temp_dir was renamed to temp_dir in allensdk.test_utilities.temp_dir?
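
A hedged local patch, based purely on the rename suspected above (and only valid if temp_dir actually exists in that module), would be to change the failing import in the two test files:

# Hypothetical alias, assuming the fixture was renamed from fn_temp_dir to temp_dir
from allensdk.test_utilities.temp_dir import temp_dir as fn_temp_dir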

Thanks for your time
Christian

animation code

Hi, it would be great if the code to make the cortical_projection.gif were included in the "examples" directory.

RegionalizedModel not working as expected?

Dear Joseph,

first of all thanks for the great work.

We are trying to build a regionalized version of the connectome with brain regions that are not necessarily part of the summary structures. I tried the following:

from mcmodels.core import VoxelModelCache
from mcmodels.models.voxel import RegionalizedModel
import numpy as np

cache = VoxelModelCache(manifest_file='connectivity/voxel_model_manifest.json')

model, right_hem_mask, whole_brain_mask = cache.get_voxel_connectivity_array()

source_key = right_hem_mask.get_key(structure_ids=[31,864]) # ACA, DORsm
target_key = whole_brain_mask.get_key(structure_ids=[31,864]) # ACA, DORsm

regional_model = RegionalizedModel.from_voxel_array(model, source_key, target_key)

ncd = regional_model.normalized_connection_density

and according to these examples (here and bottom of here) I was expecting a 2x4 numpy array as a result; instead I got a 2x2.

I then tried to include the hemisphere_id for the target key variable, and it seemed to work

target_key_right_hem = whole_brain_mask.get_key(structure_ids=[31,864],hemisphere_id=2) # ACA, DORsm for right hem
target_key_left_hem = whole_brain_mask.get_key(structure_ids=[31,864],hemisphere_id=1) # ACA, DORsm for left hem

regional_model_right_hem = RegionalizedModel.from_voxel_array(model, source_key, target_key_right_hem)
regional_model_left_hem = RegionalizedModel.from_voxel_array(model, source_key, target_key_left_hem)

ncd_right = regional_model_right_hem.normalized_connection_density
ncd_left = regional_model_left_hem.normalized_connection_density

new_ncd = np.concatenate((ncd_right, ncd_left), axis=1)

Is the second piece of code the right way to obtain what we want, or should the first one have output a 2x4 matrix?

Thanks for your time
Ludovico

Documentation request: how are the files in cortical_coordinates generated?

Dear contributors to the mouse_connectivity_models,

I am looking for specific information about the files located in cortical_coordinates.

The process by which a curved cortical coordinate system has been obtained is documented on pages 6 and 7 of the technical white paper:

After the borders of isocortex were defined, Laplace’s equation was solved between pia and white matter surfaces resulting in intermediate equi-potential surfaces (Figure 4A). Streamlines were computed by finding orthogonal (steepest descent) path through the equi-potential field (Figure 4B). Information at different cortical depths can then be projected along the streamlines to allow integration or comparison. Streamlines were used to facilitate the annotation of the entire isocortex, including higher visual areas.
...
Annotation of the Isocortex in 3-D Space. The isocortex was annotated from surface views using the curved cortical coordinate system described above. The curved cortical coordinate system has an advantage in that it allows the translation of any point from 2-D surface views into 3-D space or vice versa. Thus, mapping isocortex from surface views is a different approach
compared with conventional 3-D mouse brain atlases that are built from a series of 2-D coronal sections, such as the ARA (Figure 1, version 2).

Although I find the above process clear, I cannot infer from this description how the pia and white matter surfaces have been flattened to fit in a 2D numpy array such as those of dorsal_flatmap_paths_100.h5 and top_view_paths_100.h5. In other words, what is the recipe for building the view_lookup arrays held by these files? Was some area-preserving transformation applied to the dorsal and top surfaces of the isocortex volume? If so, which one?

Many thanks in advance for your help,
Luc.

As a side note:

In [1]: import h5py

In [2]: with h5py.File('~/Downloads/dorsal_flatmap_paths_100.h5', 'r') as f:
   ...:     view_lookup = f['view lookup'][:]
   ...:     paths = f['paths'][:]
   ...:

In [3]: view_lookup.shape
Out[3]: (136, 272)

The returned shape doesn't match the expected value of (132, 114) that is indicated in cortical_map.py.

README gif

Use cortical flat map to make gif for README

subset voxel_array by region

Hello, thanks for the great package and documentation.
I'm trying to figure out how to extract voxel-level connectivity to and from a region of interest (say, VISp). Note that I want the voxel-level data, not a regionalized summary. From the examples provided, I see how to use masks to extract a regionalized summary. But I want to be able to do something like this:
#VISp is region 385
region_mask = Mask.from_cache(cache, hemisphere_id=2, structure_ids=[385])
from_VISp = voxel_array[region_mask,:]
to_VISp = voxel_array[:,region_mask]

Obviously, this doesn't work as written. How can I find the indices to subset the voxel array in this fashion? Thanks for any suggestions you can offer.
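
A hedged sketch (not from the original thread): the masks returned by get_voxel_connectivity_array can build per-voxel keys with get_key, and, assuming each key holds one entry per row/column of the voxel array (as the from_voxel_array error message in a later issue suggests), numpy can recover the indices to slice with:

import numpy as np
from mcmodels.core import VoxelModelCache

cache = VoxelModelCache(manifest_file='voxel_model_manifest.json')   # path is illustrative
voxel_array, source_mask, target_mask = cache.get_voxel_connectivity_array()

source_key = source_mask.get_key(structure_ids=[385])   # VISp = 385
target_key = target_mask.get_key(structure_ids=[385])

from_VISp = voxel_array[np.where(source_key == 385)[0], :]   # rows for VISp voxels
to_VISp = voxel_array[:, np.where(target_key == 385)[0]]     # columns for VISp voxels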

RegionalizedModel.from_voxel_array() method gives ValueError

Below is the code that is used to construct a 43 by 43 isocortex connectivity matrix using the spatial model. However, it raises a ValueError as shown below.

cache = VoxelModelCache()

voxel_array, source_mask, target_mask = cache.get_voxel_connectivity_array()

region_ids = np.load('isocortex_id_list.npy') # the 43 isocortex ROIs id here

source_key = source_mask.get_key(region_ids, hemisphere_id=2)

target_key = source_mask.get_key(region_ids, hemisphere_id=2)

regional_model = RegionalizedModel.from_voxel_array(voxel_array, source_key, target_key) # ValueError here: columns of nodes and elements in target_key must be of equal size
regional_model.normalized_connection_density

NCD = regional_model.normalized_connection_density
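
A hedged observation (not part of the original report): both keys above are built from source_mask, so target_key's length matches the source voxels rather than the array's columns, which is what the error message describes; building it from target_mask may resolve it:

target_key = target_mask.get_key(region_ids, hemisphere_id=2)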

RegionalData injection_structure_ids

Passing injection_structure_ids to the initialization of RegionalData leads to the projections attribute having zero columns outside of the injection structure.

HomogeneousModel is not producing what is expected?

We used the lines of code below to generate a connectivity matrix based on the Oh method. However, it produces a matrix that appears to be wrong: it has zero correlation with the new voxel-model-based matrix, which does not make sense.

cache = VoxelModelCache()
voxel_data = cache.get_experiment_data(cre=False, injection_structure_ids=iso_list,
                                       projection_structure_ids=iso_list,
                                       injection_hemisphere_id=2, projection_hemisphere_id=2)
# iso_list is the list of ids of the 43 isocortex ROIs, right hemisphere only (we tried both hemispheres as well)
regional_data = RegionalData.from_voxel_data(voxel_data)
reg = HomogeneousModel()
reg.fit(regional_data.injections, regional_data.projections)
homogeneous_connectivity = np.array(reg.weights)

top_view_paths_100.h5 is corrupted

The file top_view_paths_100.h5 is missing a path on the interior of the cortex. This causes it to show up as a small empty voxel when plotting.

mouse_connectivity_models cannot be installed

I am having problems installing the tool mouse_connectivity_models following the instructions given in the README.

I am following the instructions to install the package on Ubuntu 20.04.3, using pip version 22.0.3:

git clone https://github.com/AllenInstitute/mouse_connectivity_models.git
cd mouse_connectivity_models/
pip install .

But then I get the following error:


Processing /home/adietz/Local/1.22_NSE_Testing/mouse_connectivity_models
  Preparing metadata (setup.py) ... error
  error: subprocess-exited-with-error
  
  × python setup.py egg_info did not run successfully.
  │ exit code: 1
  ╰─> [12 lines of output]
      Traceback (most recent call last):
        File "<string>", line 2, in <module>
        File "<pip-setuptools-caller>", line 34, in <module>
        File "/home/adietz/Local/1.22_NSE_Testing/mouse_connectivity_models/setup.py", line 13, in <module>
          import mcmodels
        File "/home/adietz/Local/1.22_NSE_Testing/mouse_connectivity_models/mcmodels/__init__.py", line 16, in <module>
          from . import core
        File "/home/adietz/Local/1.22_NSE_Testing/mouse_connectivity_models/mcmodels/core/__init__.py", line 8, in <module>
          from .base import VoxelData, RegionalData
        File "/home/adietz/Local/1.22_NSE_Testing/mouse_connectivity_models/mcmodels/core/base.py", line 13, in <module>
          import numpy as np
      ModuleNotFoundError: No module named 'numpy'
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed

× Encountered error while generating package metadata.
╰─> See above for output.

note: This is an issue with the package mentioned above, not pip.
hint: See above for details.

I would appreciate it if that could be fixed.
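
A hedged note (not from the original thread): the traceback shows setup.py importing mcmodels, which in turn imports numpy, at metadata-generation time, so installing numpy into the environment first (pip install numpy) before running pip install . may get past this error.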
