jonescompneurolab / hnn-core

Simulation and optimization of neural circuits for MEG/EEG source estimates

Home Page: https://jonescompneurolab.github.io/hnn-core/

License: BSD 3-Clause "New" or "Revised" License

Python 97.39% Makefile 0.07% AMPL 2.35% Jupyter Notebook 0.20%
neuron-simulator eeg meg computational-modeling

hnn-core's Introduction

hnn-core


[Screenshot: HNN GUI]

About

This is a leaner and cleaner version of the code from the original HNN repository.

The Human Neocortical Neurosolver (HNN) is an open-source neural modeling tool designed to help researchers and clinicians interpret human brain imaging data. Building on the original HNN repository, HNN-core provides a convenient way to run simulations of an anatomically and biophysically detailed dynamical system model of human thalamocortical brain circuits with only a few lines of code. Given its modular, object-oriented design, HNN-core makes it easy to generate and evaluate hypotheses on the mechanistic origin of signals measured with magnetoencephalography (MEG), electroencephalography (EEG), or intracranial electrocorticography (ECoG). A unique feature of the HNN model is that it accounts for the biophysics generating the primary electric currents underlying such data, so simulation results are directly comparable to source-localized data (current dipoles in units of nano-Ampere-meters); this enables precise tuning of model parameters to match characteristics of recorded signals. Multimodal neurophysiology data such as local field potential (LFP), current-source density (CSD), and spiking dynamics can also be simulated simultaneously with current dipoles.

While the HNN-core API is designed to be flexible and serve users with varying levels of coding expertise, the HNN-core GUI is designed to be useful to researchers with no formal computational neural modeling or coding experience.

For more information visit https://hnn.brown.edu. There, we describe the use of HNN in studying the circuit-level origin of some of the most commonly measured MEG/EEG and ECoG signals: event related potentials (ERPs) and low frequency rhythms (alpha/beta/gamma).

Contributors are very welcome. Please read our contributing guide if you are interested.

Dependencies

hnn-core requires Python (>=3.8) and the following packages:

  • numpy
  • scipy
  • matplotlib
  • Neuron (>=7.7)

Optional dependencies

GUI

  • ipywidgets
  • voila
  • ipympl
  • ipykernel

Note: Please follow the GUI installation section to install the correct GUI dependency versions automatically.

Optimization

  • scikit-learn

Parallel processing

  • joblib (for simulating trials simultaneously)
  • mpi4py (for simulating the cells in parallel for a single trial). Also depends on:
    • openmpi or other mpi platform installed on system
    • psutil

Installation

We recommend the Anaconda Python distribution. To install hnn-core, simply do:

$ pip install hnn_core

and it will install hnn-core along with the dependencies which are not already installed.

Note that if you installed Neuron using the traditional installer package, it is recommended to remove it first and unset PYTHONPATH and PYTHONHOME if they were set. This is because the pip installer works better with virtual environments such as the ones provided by conda.

If you want to track the latest developments of hnn-core, you can install the current version of the code (nightly) with:

$ pip install --upgrade https://api.github.com/repos/jonescompneurolab/hnn-core/zipball/master

To check if everything worked fine, you can do:

$ python -c 'import hnn_core'

and it should not give any error messages.

Installing optimization dependencies

If you are using Bayesian optimization, then scikit-learn is required. Install hnn-core with scikit-learn using the following command:

$ pip install hnn_core[opt]

GUI installation

To install the GUI dependencies along with hnn-core, a simple tweak to the above command is needed:

$ pip install hnn_core[gui]

Note: if you are using zsh on macOS, the command is:

$ pip install hnn_core'[gui]'

To start the GUI, please do:

$ hnn-gui

Parallel backends

For further instructions on installation and usage of parallel backends for using more than one CPU core, refer to our parallel backend guide.

Note for Windows users

Install Neuron using the precompiled installers before installing hnn-core. Make sure that:

$ python -c 'import neuron'

does not throw any errors before running the install command. If you encounter errors, please get help from NEURON forum. Finally, do:

$ pip install hnn_core[gui]

Documentation and examples

Once you have tested that hnn_core and its dependencies were installed, we recommend downloading and executing the example scripts provided on the documentation pages (as well as in the GitHub repository).

Note that Python plots are by default non-interactive (blocking): each plot must thus be closed before code execution continues. We recommend using an interactive Python interpreter such as IPython:

$ ipython --matplotlib

and executing the scripts using the %run magic:

%run plot_simulate_evoked.py

When executed in this manner, the scripts will execute entirely, after which all plots will be shown. For an even more interactive experience, in which you execute code and interrogate plots in sequential blocks, we recommend editors such as VS Code and Spyder.

Bug reports

Use the GitHub issue tracker to report bugs. For user questions and scientific discussions, please join the HNN Google group.

Interested in Contributing?

Read our contributing guide.

Roadmap

Read our roadmap.

Citing

If you use HNN-core in your work, please cite our publication in JOSS:

Jas et al., (2023). HNN-core: A Python software for cellular and circuit-level interpretation of human MEG/EEG. Journal of Open Source Software, 8(92), 5848, https://doi.org/10.21105/joss.05848

hnn-core's People

Contributors

alexrockhill, blakecaldwell, carolinafernandezp, chenghuzi, cjayb, dylansdaniels, gtdang, jasmainak, kenloi, klankinen, kmilo9999, kohl-carmen, mjpelah, mkhalil8, mohdsherif, ntolley, orbekolo, raj1701, rythorpe, samadpls, samikane, spbrandt, stephanie-r-jones, yarikoptic


hnn-core's Issues

clean up handling of external 'input' feeds

While working on #103 (after #98 broke the alpha example) we got hit by inconsistent handling of 'common' and 'unique' (renamed in #103) parameter sets (see discussion there for context).

The (new) test_feed.py:test_extfeed() illustrates the confusion: Poisson and Gaussian input parameters are always created by create_pext, but the 'common' ones aren't.

Variable naming (git grep p_ext) is quite poor, but rather than fiddling with names, it seems time to refactor the parameter handling process.

I suppose we're talking about an internal refactoring, where the API is less important than code clarity? Users shouldn't need to know anything about how the parameters are handled internally, right? The statefulness of the ExtFeed object is currently not used, so is it time to re-write the methods as functions? Or is there a higher meaning that's not obvious?

Some points to discuss here:

  • should parameter-files be flat, or more structured? (backwards compatibility-issue vs. clarity)
  • per-need inclusion of external feeds into net, which the gid_dict would then reflect (currently inconsistent for common vs. unique types, where no common input gids are created in the network object when t0 > tstop)

Support NetPyNE model specification

The code Salvador has written for HNN can replace the network-building code in network.py and cell.py (maybe feed.py). This would allow users to modify the network configuration in an ad hoc fashion. It also would integrate with the web-based HNN GUI (bokeh-based) being developed by MetaCell.

So the question is how can these 4 files be integrated into mne-neuron, and lead towards a single code base for HNN?
https://github.com/jonescompneurolab/hnn/tree/netpyne/netpyne

gid-to-node assignment should happen within Cell object

Rather than having a NeuronNetwork instance modify a cell's gid-to-node assignment within a parallel context (i.e., via _PC.set_gid2node(gid, rank)), we should try delegating the task to the _Cell and _ArtificialCell classes. This way, the Network or NeuronNetwork classes can simply create a list of gid values and let the cell objects handle their respective NEURON backends.
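The proposed delegation could be sketched as follows. This is illustrative only, not the actual hnn-core API: a plain dict-backed class stands in for NEURON's ParallelContext, and the class and method names are invented for the example.

```python
# Sketch of gid registration delegated to the cell (illustrative names).

class FakeParallelContext:
    """Minimal stand-in for NEURON's h.ParallelContext."""
    def __init__(self):
        self.gid2node = {}

    def set_gid2node(self, gid, rank):
        self.gid2node[gid] = rank


class Cell:
    def __init__(self, gid):
        self.gid = gid

    def register_gid(self, pc, rank):
        # The cell, not the network, touches the parallel context.
        pc.set_gid2node(self.gid, rank)


pc = FakeParallelContext()
rank = 0
cells = [Cell(gid) for gid in range(3)]
for cell in cells:
    cell.register_gid(pc, rank)
# pc.gid2node is now {0: 0, 1: 0, 2: 0}
```

With this shape, the network only allocates gid values; everything backend-specific stays inside the cell classes.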

Tests are not runnable

It's very cool that results can be compared to HNN on new commits. However, I ran into a couple of issues running the tests locally:

  1. Tests should probably not depend on mne:
========================================================= ERRORS =========================================================
_________________________________ ERROR collecting mne_neuron/tests/test_compare_hnn.py __________________________________
ImportError while importing test module '/Users/blake/repos/mne-neuron-new/mne_neuron/tests/test_compare_hnn.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
mne_neuron/tests/test_compare_hnn.py:6: in <module>
    from mne.utils import _fetch_file
E   ModuleNotFoundError: No module named 'mne'
!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!! Interrupted: 1 errors during collection !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
================================================ 1 error in 0.37 seconds =================================================
  2. Where is /test_data/ (forgive my ignorance of Travis CI)?
    https://github.com/jasmainak/mne-neuron/blob/master/mne_neuron/tests/test_compare_hnn.py#L16

Hangs on import

Hi there,

After installing hnn-core on my Mac, as per instructions in the README.rst file, python hangs (for several hours!) when I try to import hnn_core. Do I need a certain version of Anaconda (e.g., 2 or 3) for this to work?

See standard output below:
$ python
Python 3.6.8 |Anaconda, Inc.| (default, Dec 29 2018, 19:04:46)
[GCC 4.2.1 Compatible Clang 4.0.1 (tags/RELEASE_401/final)] on darwin
Type "help", "copyright", "credits" or "license" for more information.

>>> import hnn_core

Replicate periodic evoked inputs with rhythmic inputs

I currently have a param file with evoked inputs that have periodic spacing. My goal is to replicate the simulation with only rhythmic inputs; however, there seems to be a discrepancy in how synaptic weights are assigned for evoked vs. rhythmic inputs. The following screenshots demonstrate this issue and their corresponding param files are attached. Note that, according to the param files, the synaptic conductances for proximal and distal inputs are congruent across simulations.
[Screenshots attached]

I realize that there is a difference in how jitter is applied to individual evoked inputs vs. a series of rhythmic inputs; however, the fact that the simulations above produce dipoles that are different by an order of magnitude is confusing.

param_files.zip

Calcium dynamics

As many of you know, I have spent the last year or so changing the model's L5 pyramidal cells so that the simulated voltage traces in the dendrites look more similar to electrophysiological data recorded from pyramidal neuron dendrites, especially during calcium spikes/bursts. To do this, I changed L5_pyramidal.py so that the dendritic Na, Kv, and Ca (L-type) conductances are distributed as functions of the distance from the soma, rather than setting the whole dendrite equal to one value. Here is the modified file for anyone who wants to look at or use these changes (saved as .txt because GitHub doesn't allow attaching .py files): L5_pyramidal.txt

For anyone using it

Important: I believe that if you are only changing this file out for the existing L5_pyramidal.py file you will run into an error with __create_all_IClamp() that will stop the code from working. I will get around to fixing this soon but in the meantime just copy the version of this function that's in the original code and replace this function. The reason is that I added functionality to inject current into the dendrite for some of my simulations but that involved changing other files as well. Sorry about this!

Within the file I have included a way to switch back and forth from these distributions ("updated_HNN_dends") to the regular HNN distributions ("classic_HNN_dends") which are constant values for everything except HCN channels. Just comment one or the other out under the _biophys_dends function like so

def __biophys_dends(self):
    # self.classic_HNN_dends()
    self.updated_HNN_dends()

You have to save the file, then quit HNN and restart it in order for this to do anything because this code gets run right when it is launched.

Right now the dendritic and somatic conductances are still adjustable from the GUI or the param files. The parameters I've been using for these three channels (Cell Parameters > L5 Pyr Biophysics) are:

L5Pyr_soma_gkbar_hh2: 0.06
L5Pyr_soma_gnabar_hh2: 0.32
L5Pyr_soma_gbar_ca: 10.0
L5Pyr_dend_gkbar_hh2: 0.0001
L5Pyr_dend_gnabar_hh2: 0.0028
L5Pyr_dend_gbar_ca: 40.0

The dendritic conductances are now functions of these values. Some of these changes are pretty drastic. With these values, the channel density distribution will now look like this (dotted lines are "classic" conductances).
[Figure: channel density distributions; dotted lines show the "classic" conductances]
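As a toy illustration of the idea above (not the actual L5_pyramidal.py code), a conductance can be made a smooth function of distance from the soma instead of a single constant. The sigmoid form and its parameters (`half_dist`, `slope`) are invented for this sketch:

```python
import numpy as np

# Toy distance-dependent conductance: interpolate from a somatic value to a
# distal dendritic value with a sigmoid. All parameters are illustrative.

def dend_gbar(dist_um, g_soma, g_dend, half_dist=300.0, slope=50.0):
    """Conductance at distance `dist_um` from the soma (microns)."""
    frac = 1.0 / (1.0 + np.exp(-(dist_um - half_dist) / slope))
    return g_soma + (g_dend - g_soma) * frac

dists = np.array([0.0, 300.0, 1000.0])
g = dend_gbar(dists, g_soma=10.0, g_dend=40.0)
# near the soma g is close to 10, at half_dist it is 25, distally it approaches 40
```

The same function could be evaluated per segment when inserting mechanisms, which is what makes the dendritic values "functions of" the GUI-adjustable somatic/dendritic parameters.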

Files
I have roughly hand-tuned the evoked input parameters for the threshold and suprathreshold ERP parameter files so that, with these changes to the L5 pyramidal cells, the output provides a good fit to the data. These need to be improved/optimized more. Once again I had to upload as .txt but they should be .param files.
[Figure: dipole fit, threshold ERP parameters]
ERP_yes.txt

[Figure: dipole fit, suprathreshold ERP parameters]
ERP_suprathreshold.txt

For HNN development

I am sure there is a much better way for this code to be written and organized! Not to mention that hnn-core gets rid of the L5_pyramidal.py file. Some things to think about include

  • whether there would be a way for users to easily switch back and forth from the "classic" and "updated" distributions from the GUI, rather than the bad solution I have of changing the source code and relaunching
  • whether the parameters I have characterizing the distributions (e.g. the distance where the calcium conductance levels off) ought to be hard-coded as they are now or adjustable by the user.

Also, I changed and added a lot more files in order to be able to record and plot voltage traces, calcium currents, and calcium conductances in individual compartments of the L5 pyramidal neurons. I think some of these plots could be useful to include in the new versions of HNN. I've just created a public repository SMPugliese/hnn_calcium (https://github.com/SMPugliese/hnn_calcium) with the most up to date files I have on my computer. Let me know if there is another place you would prefer me to put these files.

add manifest.in

A MANIFEST.in file is needed when installing with pip (not working at the moment). It can be used to include the parameter files ...

write txt files containing dipoles and spiking

It looks like a dipole.write() function exists but it needs to be fixed so that the name is assigned appropriately for each dipole file created.

A similar function would be great for network spiking so that we can generate data files similar to those produced by HNN.
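A fixed writer might look like the following sketch. This is not the actual dipole.write implementation; `times` and the per-trial arrays are stand-ins for a Dipole object's data, and the filename pattern is an assumption. The point is that the trial index is baked into each filename, which is the naming problem described above:

```python
import os
import tempfile

import numpy as np

# Hypothetical per-trial dipole writer: one txt file per trial, each row
# holding a time point and a dipole value. Names and format are illustrative.
times = np.linspace(0.0, 170.0, 5)          # stand-in time axis (ms)
trials = [np.zeros(5), np.ones(5)]          # stand-in dipole data per trial

outdir = tempfile.mkdtemp()
fnames = []
for trial_idx, dpl_data in enumerate(trials):
    fname = os.path.join(outdir, 'dpl_%d.txt' % trial_idx)
    np.savetxt(fname, np.column_stack([times, dpl_data]), fmt='%10.5f')
    fnames.append(fname)
```

A spike writer could follow the same pattern, writing one row per spike (time, gid) to match the files HNN produces.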

Buggy examples

Hi all,
I was just trying the examples in trunk on macos and was unable to get any of the 3 to run.

  1. plot_simulate_somato fails looking for data (/Users/bloyl/mne_data/MNE-somato-data/MEG/somato/sef_raw_sss.fif); maybe the mne dataset has been reorganized?

  2. plot_simulate_alpha.py and plot_simulate_evoked both fail with a variant of

File "/Users/bloyl/Projects/hnn-core/hnn_core/basket.py", line 27, in _biophysics
    self.soma.insert('hh2')
ValueError: argument not a density mechanism name.

add functionality for running batch simulations

Enable users to run a non-interactive batch of simulations of the HNN model by specifying a set of parameter values to use. This could be used for sweeping a range of parameter values (e.g. optimization), a list of specific parameter files to iterate over, and/or a distribution to draw from in choosing parameter values (e.g. priors for SNPE).

This will be a new interface for running simulations via hnn_core.simulator with the parameters to use specified as input and the results retrievable by their respective Python objects, and optionally saving the results/parameter files to disk.

Investigate what can be learned from how it is done in scipy and NetPyNE batch simulations.
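A minimal batch driver along these lines could look like the sketch below. This is not an hnn-core interface: `run_simulation` is a placeholder for the real solver call, and the grid-sweep shape is just one of the modes described above (the same loop could iterate over param files or prior draws instead).

```python
from itertools import product

# Illustrative batch driver: sweep a grid of parameter values, run one
# simulation per combination, collect results keyed by the parameter tuple.

def run_simulation(params):
    # Placeholder: a real version would build the network, run the solver,
    # and return dipole/spike objects (optionally saving them to disk).
    return params['gbar'] * params['tstop']

grid = {'gbar': [0.1, 0.2], 'tstop': [170, 710]}
keys = list(grid)
results = {}
for values in product(*grid.values()):
    params = dict(zip(keys, values))
    results[values] = run_simulation(params)
# results maps each (gbar, tstop) combination to its output
```

Each iteration is independent, so the loop body is a natural unit to hand to joblib or MPI workers later.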

move analysis code from HNN GUI to start hnn_core.analysis submodule

Purpose:
To make code that was previously available only in the HNN GUI dual-purpose, so it can also serve the analysis of simulations run by hnn-core from the command line.

Rationale:
Since HNN GUI will import hnn-core already for hnn_core.simulator to run the NEURON solver, this will be a better home for the analysis functions. These functions will benefit from documentation and unit testing as part of hnn-core

Proposed components:

  • Simulated LFP in hnn_core.simulator (#150)
  • CSD calculation and plotting (#68)
  • Objective functions for model optimization (parts from #77)
  • Wrappers for parameter estimation with SNPE
  • Spectral analysis
  • RMSE calculation to data (#83 - closed, needs PR)

In addition to moving the code, the following improvements will be made:

  • Code cleanup (Use library functions, formatting, etc.)
  • Unit testing
  • Documentation

git bisect to find bug

Running git bisect to find the difference between HNN and mne-neuron results gave:

c76a27b is the first bad commit
commit c76a27b
Author: Mainak Jas [email protected]
Date: Wed Feb 6 17:03:18 2019 -0500

[MRG]: refactor parconnect (#6)

* MAINT: refactor parconnect

* FIX remove redundant self

* FIX A_weight

* MAINT: use postsyns instead of dendrites

* DOC synapse -> receptor

a better API for hnn-core

@blakecaldwell to continue off our discussion from today, I was proposing something along the lines of

tmax = 0.17
params = read_params('N20.json')

for t in np.arange(0, tmax, 0.5):
    net = Network(params)
    # extfeeds = create_extfeeds(params)
    extfeeds = [ExtFeed('proximal', t=0.05),
                ExtFeed('distal', t=0.1)]
    dipoles = simulate_dipole(net, extfeeds, t=tmax, tstep=0.03)

I think you had two objections:

  1. How will this be backwards compatible with HNN?
  2. What is the advantage of doing this over params.update

For 2., I think the answer is that params is an antipattern in the same way kwargs is. There are many blog posts or stack overflow posts which explain this -- e.g., here.

As for 1., we could choose to be backwards compatible by adding an additional argument params to simulate_dipole that defaults to None but with a deprecation warning that it will be removed in the next release cycle.
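A deprecation shim of that kind could be sketched as below. The signature is illustrative (taken from the proposal above, not the final API), and the body is a placeholder: the only point is that passing `params` still works but emits a DeprecationWarning.

```python
import warnings

# Sketch of a backwards-compatible simulate_dipole: `params` is kept as a
# deprecated keyword defaulting to None. Names and defaults are illustrative.

def simulate_dipole(net, extfeeds=None, t=170.0, tstep=0.025, params=None):
    if params is not None:
        warnings.warn("The 'params' argument is deprecated and will be "
                      "removed in the next release cycle; pass ExtFeed "
                      "objects instead.", DeprecationWarning)
    return {'t': t, 'tstep': tstep}  # placeholder for the real dipoles

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter('always')
    simulate_dipole(net=None, params={'tstop': 170})
# caught now holds one DeprecationWarning
```

Callers on the old API keep working for one release cycle while seeing the warning, which is the usual pattern for this kind of migration.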

Print simulation progress messages to screen with MPIBackend

The code for MPIBackend in #79 waits for the subprocess to complete after all trials have run before printing progress messages to the screen. This was done for simplicity as interactively communicating the stdout from the child process while the job is running will require some handling using select in parallel_backends.py.

In short, the progress messages should appear the same regardless of the backend used (JoblibBackend or MPIBackend).
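The select-based handling mentioned above could be sketched like this (Unix pipes; illustrative only, not the parallel_backends.py code). A toy child process stands in for the MPI subprocess, and lines are echoed into a list as they arrive rather than after the job finishes:

```python
import select
import subprocess
import sys

# Stream a child process's stdout line by line instead of waiting for exit.
# The child here is a toy stand-in for the MPI simulation subprocess.
child_code = ("import time\n"
              "for i in range(3):\n"
              "    print('trial', i, flush=True)\n"
              "    time.sleep(0.05)\n")
proc = subprocess.Popen([sys.executable, '-c', child_code],
                        stdout=subprocess.PIPE, text=True)
lines = []
while True:
    ready, _, _ = select.select([proc.stdout], [], [], 1.0)
    if ready:
        line = proc.stdout.readline()
        if line == '':      # EOF: child closed its stdout
            break
        lines.append(line.rstrip())
proc.wait()
# lines == ['trial 0', 'trial 1', 'trial 2'], collected as they were printed
```

A real version would print (or log) each line immediately, so the user sees the same per-trial progress with MPIBackend as with JoblibBackend.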

multiple trials with consistent seeding

Implement a scheme for seeding multiple trials that allows for validating results with JoblibBackend and MPIBackend in #79. Ideally, this will produce the same results as HNN GUI. After this point, we can implement a plan for making changes such as how trials are seeded with provisions for being able to go back and test the "original" seeding scheme.

Note that #55 started to address this, but RMSE is a somewhat separate feature. The checks in test_hnn_core.py are more thorough than RMSE.
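One backend-independent seeding scheme (an illustration, not necessarily what hnn-core ended up implementing) is to derive one child seed per trial from a single base seed, so trial k draws the same stream whether trials run serially, under joblib, or under MPI:

```python
import numpy as np

# Per-trial seeding via SeedSequence.spawn: deterministic and order-independent.

def run_trial(seed_seq, n=4):
    rng = np.random.default_rng(seed_seq)
    return rng.random(n)  # stand-in for one simulated trial

# "Serial backend": run all three trials in order.
serial = [run_trial(s) for s in np.random.SeedSequence(42).spawn(3)]

# "Parallel worker": re-derive the seeds and run only trial 2 in isolation;
# it reproduces the serial result for that trial exactly.
replay = run_trial(np.random.SeedSequence(42).spawn(3)[2])
```

Because each trial's stream depends only on (base seed, trial index), results can be validated bit-for-bit across JoblibBackend and MPIBackend.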

modernize setup file

  • custom mechanisms should be built at install time
  • requirements should be added to the install script or as a requirements.txt file

Replicate periodic evoked inputs with rhythmic inputs

Moving this to hnn-core from hnn

From @rythorpe in https://github.com/jonescompneurolab/hnn/issues/208

I currently have a param file with evoked inputs that have periodic spacing. My goal is to replicate the simulation with only rhythmic inputs; however, there seems to be a discrepancy in how synaptic weights are assigned for evoked vs. rhythmic inputs. The following screenshots demonstrate this issue and their corresponding param files are attached. Note that, according to the param files, the synaptic conductances for proximal and distal inputs are congruent across simulations.

Parallel Context as a mutually exclusive option to joblibs

https://github.com/hnnsolver/hnn-core/blob/0aa973ef430e67e3266883e909ea02945d41a1ac/hnn_core/dipole.py#L27-L34

@jasmainak I'm preparing a PR to add ParallelContext (n_cores > 1, but not compatible with n_jobs > 1). A function similar to _clone_and_simulate will be called, and the net.* functions need to be part of it. Can you explain why these functions need to be separate from Network(params, n_cores)? All of _clone_and_simulate gets pickled, right? Perhaps these net.* calls could be part of an appropriately named method of Network.

simplify process for initializing cell-to-cell connections

Currently, various functions and wrappers are involved in allowing a NeuronNetwork instance to establish cell-to-cell (i.e., local network and feed) connections:

  1. NeuronNetwork()._build() calls _parnet_connect()

  2. NeuronNetwork()._parnet_connect() then calls parconnect(), parreceive(), and parconnect_ext() within one of the child cell classes (e.g., Pyramidal).

  3. Pyramidal().parconnect() then calls Pyramidal._connect() and Pyramidal().parreceive() and Pyramidal().parconnect_ext() calls Pyramidal()._parconnect_from_src() (all of which are inherited from the parent _Cell class). (Same for the basket cell class.)

  4. Finally, _PC.gid_connect() is called from the Pyramidal()._parconnect_from_src() which is the lowest common python element in establishing a cell-to-cell connection.

It would be nice to rearrange and simplify this process so that it is more transparent to Network how and when network connections are made. Ideally, there would be a single function, a method in _Cell, that Network could call in order to create a given cell-to-cell connection.

add CSD plot for comparing against animal studies

Background

The ability to calculate LFP for simulated electrodes in the model (#150) allows time vs. depth laminar CSD plots to be generated. Laminar CSD (electrodes arranged vertically) is the primary goal of this feature because laminar CSD is commonly used in animal studies, where a laminar probe is inserted into the cortex to measure extracellular potentials. In these experimental studies, CSD is calculated from the bandpass-filtered LFP data (the low-frequency portion of extracellular potentials) to identify underlying neuronal activity at specific cortical layers. Such activity is identified as a "sink" (net influx of current into cells) or "source" (efflux into extracellular space) in the temporal-spatial domain.

Figure 5 from Sherman et al. 2016 shows laminar CSD plots of an anesthetized mouse and an awake monkey.


Sherman MA, Lee S, Law R, et al. Neural mechanisms of transient neocortical beta rhythms: Converging evidence from humans, computational modeling, monkeys, and mice. Proc Natl Acad Sci U S A. 2016;113(33):E4885-E4894. doi:10.1073/pnas.1604135113

CSD can be calculated via

numpy.diff(LFP_data, n=2, axis=ax) / spacing_mm**2

The sign may be adjusted to make current sinks negative and sources positive.
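Putting that together with the Vaknin boundary handling from requirement 4 below, a minimal numpy sketch could look like this. The function shape and the 0.1 mm spacing are assumptions for the example; only the second-spatial-difference formula and the pad-the-end-channels idea come from the text:

```python
import numpy as np

# Sketch of the CSD estimate: second spatial difference of the LFP, with
# Vaknin padding (duplicate the first and last channels) so CSD is defined
# at every original electrode. `lfp` is (n_channels, n_times).

def compute_csd(lfp, spacing_mm=0.1):
    padded = np.vstack([lfp[:1], lfp, lfp[-1:]])     # Vaknin (1988) padding
    # Negate so sinks (current into cells) come out negative.
    return -np.diff(padded, n=2, axis=0) / spacing_mm ** 2

rng = np.random.default_rng(0)
lfp = rng.standard_normal((16, 100))   # 16 laminar electrodes, 100 samples
csd = compute_csd(lfp)
# csd.shape == (16, 100): one CSD trace per original electrode
```

The spline interpolation and iCSD variants listed in the requirements would build on top of this basic estimate rather than replace it.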

Goal

Create laminar CSD plots from simulated data that are comparable to Figure 5 in Sherman et al. 2016

Requirements

  1. Averaging LFP over multiple trials is typically done before computing CSD

  2. CSD plots are created using a 2D spline over values at each electrode depth. Support various spline functions.

  3. Users need to be able to easily adjust the X,Y scales to match simulated data with their experimental data

  4. The CSD of the first and last channels cannot be computed using the second spatial derivative. The Vaknin method is used to derive approximate CSD signals at these points.

    G. Vaknin, P.G. DiScenna, T.J. Teyler, A method for calculating current source density (CSD) analysis without resorting to recording sites outside the sampling volume, Journal of Neuroscience Methods, Volume 24, Issue 2, 1988, Pages 131-135, ISSN 0165-0270, https://doi.org/10.1016/0165-0270(88)90056-8.

  5. Implement the iCSD method as an alternative to the numpy.diff() approach above

Procedure for building docs

So when trying to run Sphinx-Gallery I get an error that is easy to understand. However, I suspect I'm missing a step in your workflow rather than actually needing to touch examples/README.txt. Could you include more verbose instructions in CONTRIBUTING?

(hnn) Blakes-MacBook-Pro:doc blake$ make html
sphinx-build -b html -d _build/doctrees   . _build/html
Running Sphinx v2.1.1
making output directory... done
generating gallery...

Exception occurred:
  File "/Users/blake/miniconda3/envs/hnn/lib/python3.6/site-packages/sphinx_gallery/gen_gallery.py", line 262, in generate_gallery_rst
    .format(examples_dir))
FileNotFoundError: Main example directory /Users/blake/repos/mne-neuron-new/doc/../examples does not have a README.txt file. Please write one to introduce your gallery.
The full traceback has been saved in /var/folders/zq/5r57yxls19q2wcq98g5r4gb40000gn/T/sphinx-err-ikb8ajow.log, if you want to report the issue to the developers.
Please also report this if it was a user error, so that a better error message can be provided next time.
A bug report can be filed in the tracker at <https://github.com/sphinx-doc/sphinx/issues>. Thanks!
make: *** [html] Error 2

add example for alpha waves

Looking at the diff, probably only a few params need to be changed:

$ diff default.param OnlyRhythmicDistal.param 
2c2
<     "tstop": 170,
---
>     "tstop": 710.,
5c5
<     "N_trials": 1,
---
>     "N_trials": 0,
8,11c8,11
<     "save_spec_data": 0,
<     "f_max_spec": 100,
<     "dipole_scalefctr": 3000,
<     "dipole_smooth_win": 30,
---
>     "save_spec_data": 1,
>     "f_max_spec": 40.,
>     "dipole_scalefctr": 150000.0,
>     "dipole_smooth_win": 0,
119,123c119,123
<     "gbar_L2Pyr_L2Pyr_ampa": 0.0005,
<     "gbar_L2Pyr_L2Pyr_nmda": 0.0005,
<     "gbar_L2Basket_L2Pyr_gabaa": 0.05,
<     "gbar_L2Basket_L2Pyr_gabab": 0.05,
<     "gbar_L2Pyr_L5Pyr": 0.00025,
---
>     "gbar_L2Pyr_L2Pyr_ampa": 5e-4,
>     "gbar_L2Pyr_L2Pyr_nmda": 5e-4,
>     "gbar_L2Basket_L2Pyr_gabaa": 5e-2,
>     "gbar_L2Basket_L2Pyr_gabab": 5e-2,
>     "gbar_L2Pyr_L5Pyr": 2.5e-4,
126,134c126,134
<     "gbar_L5Pyr_L5Pyr_nmda": 0.0005,
<     "gbar_L5Basket_L5Pyr_gabaa": 0.025,
<     "gbar_L5Basket_L5Pyr_gabab": 0.025,
<     "gbar_L2Pyr_L2Basket": 0.0005,
<     "gbar_L2Basket_L2Basket": 0.02,
<     "gbar_L2Pyr_L5Basket": 0.00025,
<     "gbar_L5Pyr_L5Basket": 0.0005,
<     "gbar_L5Basket_L5Basket": 0.02,
<     "t0_input_prox": 1000.0,
---
>     "gbar_L5Pyr_L5Pyr_nmda": 5e-4,
>     "gbar_L5Basket_L5Pyr_gabaa": 2.5e-2,
>     "gbar_L5Basket_L5Pyr_gabab": 2.5e-2,
>     "gbar_L2Pyr_L2Basket": 5e-4,
>     "gbar_L2Basket_L2Basket": 2e-2,
>     "gbar_L2Pyr_L5Basket": 2.5e-4,
>     "gbar_L5Pyr_L5Basket": 5e-4,
>     "gbar_L5Basket_L5Basket": 2e-2,
>     "t0_input_prox": 2000.0,
136,138c136,138
<     "tstop_input_prox": 1001,
<     "f_input_prox": 10.0,
<     "f_stdev_prox": 20.0,
---
>     "tstop_input_prox": 710.,
>     "f_input_prox": 10.,
>     "f_stdev_prox": 20.,
151c151
<     "t0_input_dist": 1000,
---
>     "t0_input_dist": 50.,
153,155c153,155
<     "tstop_input_dist": 1001,
<     "f_input_dist": 10.0,
<     "f_stdev_dist": 20.0,
---
>     "tstop_input_dist": 710.,
>     "f_input_dist": 10.,
>     "f_stdev_dist": 20.,
158c158
<     "input_dist_A_weight_L2Pyr_ampa": 0.0,
---
>     "input_dist_A_weight_L2Pyr_ampa": 5.4e-5,
163c163
<     "input_dist_A_weight_L5Pyr_ampa": 0.0,
---
>     "input_dist_A_weight_L5Pyr_ampa": 5.4e-5,
166,167c166,167
<     "t_evprox_1": 26.61,
<     "sigma_t_evprox_1": 2.47,
---
>     "t_evprox_1": 1000,
>     "sigma_t_evprox_1": 2.5,
169c169
<     "gbar_evprox_1_L2Pyr_ampa": 0.01525,
---
>     "gbar_evprox_1_L2Pyr_ampa": 0.0,
171c171
<     "gbar_evprox_1_L2Basket_ampa": 0.08831,
---
>     "gbar_evprox_1_L2Basket_ampa": 0.0,
173c173
<     "gbar_evprox_1_L5Pyr_ampa": 0.00865,
---
>     "gbar_evprox_1_L5Pyr_ampa": 0.0,
175c175
<     "gbar_evprox_1_L5Basket_ampa": 0.19934,
---
>     "gbar_evprox_1_L5Basket_ampa": 0.0,
177,178c177,178
<     "t_evdist_1": 63.53,
<     "sigma_t_evdist_1": 3.85,
---
>     "t_evdist_1": 2000.0,
>     "sigma_t_evdist_1": 6.,
180,187c180,187
<     "gbar_evdist_1_L2Pyr_ampa": 0.000007,
<     "gbar_evdist_1_L2Pyr_nmda": 0.004317,
<     "gbar_evdist_1_L2Basket_ampa": 0.006562,
<     "gbar_evdist_1_L2Basket_nmda": 0.019482,
<     "gbar_evdist_1_L5Pyr_ampa": 0.142300,
<     "gbar_evdist_1_L5Pyr_nmda": 0.080074,
<     "t_evprox_2": 137.12,
<     "sigma_t_evprox_2": 8.33,
---
>     "gbar_evdist_1_L2Pyr_ampa": 0.0,
>     "gbar_evdist_1_L2Pyr_nmda": 0.0,
>     "gbar_evdist_1_L2Basket_ampa": 0.0,
>     "gbar_evdist_1_L2Basket_nmda": 0.0,
>     "gbar_evdist_1_L5Pyr_ampa": 0.0,
>     "gbar_evdist_1_L5Pyr_nmda": 0.0,
>     "t_evprox_2": 2000.0,
>     "sigma_t_evprox_2": 7.,
189c189
<     "gbar_evprox_2_L2Pyr_ampa": 1.438840,
---
>     "gbar_evprox_2_L2Pyr_ampa": 0.0,
191c191
<     "gbar_evprox_2_L2Basket_ampa": 0.000003,
---
>     "gbar_evprox_2_L2Basket_ampa": 0.0,
193c193
<     "gbar_evprox_2_L5Pyr_ampa": 0.684013,
---
>     "gbar_evprox_2_L5Pyr_ampa": 0.0,
195c195
<     "gbar_evprox_2_L5Basket_ampa": 0.008958,
---
>     "gbar_evprox_2_L5Basket_ampa": 0.0,
197c197
<     "sync_evinput": 0,
---
>     "sync_evinput": 1,
224c224
<     "Itonic_T_L5Basket": -1.0
---
>     "Itonic_T_L5Basket": -1.0,

alpha simulation fails (NEURON?)

Walkthroughs in examples for evoked and gamma work fine, but alpha fails mysteriously. I've set 'tstop': 110.0 for speed, but below is equally valid for the default 710.0.

In the hopes of finding a bug in params, I even tried this (3 keys differ, not that they seemed important):

params_fname = op.join(hnn_core_root, '../hnn/param', 'OnlyRhythmicDist.param')

But I get the same plots. The same example works fine in hnn, in the same conda env, where NEURON is from PyPI (7.8.0-127-g91f6e623+).

I guess what I'm asking is: any pointers on where to start debugging my setup? I suppose it's likely my NEURON installation, though I've been through the tests for nrnpython, which run quite happily in a parallel context (openmpi, mpi4py) on my dual-core macOS Catalina laptop.

discrepancy between simulation trial variability in HNN vs hnn-core

I'm experiencing an issue where I get significantly different results, particularly noticeable in the inter-trial variability, in HNN versus hnn-core. Running the attached param file in HNN (top) and hnn-core (bottom) using the attached python script, I get the figures shown below. Furthermore, I haven't been able to reproduce this issue using the default parameter set.

[Screenshots attached: HNN (top) vs. hnn-core (bottom)]

issue_demo.zip
