nexus

Investigating the performance of the cross-correlation method by English et al. [2017][1] for inferring functional connectivity in an adaptive-exponential integrate-and-fire (aEIF) neuron model by Brette et al. [2005][2] on small-scale neuronal networks of different activity patterns (synchronous & regular / asynchronous & regular) and topologies (random / scale-free).

Acknowledgements

This code was written as part of a semester project at the ETH Zürich Bio Engineering Laboratory for my MSc Biotechnology degree. The project was under the thoughtful supervision of Kim Taehoon (@arahangua).

Acknowledgements of Code

This project builds upon the work of other members of the research group at ETH, who generously provided their brainwork/code:

  • Kim Taehoon (@arahangua)
    • /simulations/wp2*.py -> originals & variations of provided scripts
    • /conInf -> structure of analysis derived from his projects
  • Christian Donner (@christiando)
    • /tools/spyCon -> package written for functional connectivity inference

Structure of the Project

core.py - the main script to generate networks, run neuronal network simulations on top of them, classify their activity, and infer their functional connectivity.
core_runner.py - allows forcing asynchronous/synchronous or physical behaviour by restarting simulations of core.py.

modules

  • classifySim : classifies the activity of neuronal simulations from /simulations/simData (some scripts are capable of parallel processing)
  • conInf : contains scripts exploring the spyCon package and implements functional connectivity inference using the spyCon package
  • netGen : generates networks of random and scale-free topologies using NetworkX and implements methods to analyse the topology (see the sketch after this list)
  • simulations : implements simulations of neuronal networks with the Brian2 simulator and saves the results in /simulations/simData
  • tools : the /spycon package (provided by Christian Donner) + helper scripts
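
The netGen module relies on NetworkX; a minimal sketch of how the two topologies might be generated and analysed is given below. The generator functions, network size, and parameters shown here are illustrative assumptions, not necessarily the ones used in netGen.

    import networkx as nx

    N = 100  # illustrative network size; the project's networks may differ

    # Random (Erdos-Renyi) topology with connection probability p
    random_net = nx.gnp_random_graph(N, p=0.1, directed=True, seed=42)

    # Scale-free topology (Barabasi-Albert preferential attachment, undirected)
    scale_free_net = nx.barabasi_albert_graph(N, m=3, seed=42)

    # Simple topology analysis: mean degree and clustering coefficient
    degrees = [d for _, d in scale_free_net.degree()]
    print("mean degree:", sum(degrees) / N)
    print("average clustering:", nx.average_clustering(scale_free_net))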

Abstract of Project

In the pursuit of comprehending the enigmatic nature of our brain, understanding the operational principles of neural circuits is a main goal. The defining characteristic of any circuit is its connectivity. Yet, investigating the physical neuron-to-neuron connections in the living brain on a large scale is presently still infeasible. Thus, other methods to learn about the connectivity of neuronal circuits are being explored, for instance via neural activity. The dawn of high-density, multi-electrode implants gives hope of recording large-scale neuronal activity on the single-neuron level in the next decades.

Given neuronal activity recordings, the functional connectivity of a network may be inferred from statistical correlation. The term functional connectivity distinguishes the connectivity found by correlation from the physical connections, called structural connectivity. It may already contain the operational principles of the brain we seek to find.

Here, we evaluate the performance of a widely used functional connectivity inference method by English et al. [2017][1] on small-scale networks. We generate neural activity in silico on a known random network structure and then evaluate the inferred connectivity against this ground truth.
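
The English et al. estimator itself is accessed through the spyCon package; as a stand-in, the following sketch illustrates the general evaluation logic with a simple lagged cross-correlation score compared against a known adjacency matrix. All names, sizes, and the toy data are hypothetical and not the project's actual pipeline.

    import numpy as np
    from sklearn.metrics import roc_auc_score

    def cross_correlation_scores(counts, max_lag=5):
        """Score each ordered pair (i, j) by the peak of the lagged
        cross-correlation between their z-scored, binned spike counts."""
        n, t = counts.shape
        z = counts - counts.mean(axis=1, keepdims=True)
        z = z / (counts.std(axis=1, keepdims=True) + 1e-12)
        scores = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                if i == j:
                    continue
                scores[i, j] = max(np.dot(z[i, :t - lag], z[j, lag:]) / t
                                   for lag in range(1, max_lag + 1))
        return scores

    # Toy usage: random ground-truth adjacency vs. independent Poisson activity
    rng = np.random.default_rng(0)
    true_adj = rng.random((20, 20)) < 0.1
    counts = rng.poisson(1.0, size=(20, 1000))
    scores = cross_correlation_scores(counts)
    off_diag = ~np.eye(20, dtype=bool)
    print("ROC AUC:", roc_auc_score(true_adj[off_diag], scores[off_diag]))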

The model used to simulate neurons is the prominent adaptive-exponential integrate-and-fire (aEIF) model by Brette et al. [2005][2], which captures the fundamental exponential and adapting behaviour of the action potential.
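
For reference, a minimal Brian2 sketch of an unconnected aEIF (AdEx) population is given below, using the standard parameter values from Brette & Gerstner [2]. The project's own simulations in /simulations use their own parameters and synaptic coupling, so this is only an illustration.

    from brian2 import NeuronGroup, SpikeMonitor, run, pF, nS, mV, ms, nA

    # Standard aEIF parameter values from Brette & Gerstner (2005)
    C = 281*pF; gL = 30*nS; EL = -70.6*mV
    VT = -50.4*mV; DeltaT = 2*mV
    tauw = 144*ms; a = 4*nS; b = 0.0805*nA; Vr = -70.6*mV

    eqs = """
    dv/dt = (gL*(EL - v) + gL*DeltaT*exp((v - VT)/DeltaT) + I - w) / C : volt
    dw/dt = (a*(v - EL) - w) / tauw : amp
    I : amp
    """

    neurons = NeuronGroup(10, eqs, threshold='v > VT + 5*DeltaT',
                          reset='v = Vr; w += b', method='euler')
    neurons.v = EL
    neurons.I = 1*nA  # constant drive, just to elicit spiking

    spikes = SpikeMonitor(neurons)
    run(500*ms)
    print(spikes.num_spikes, "spikes")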

The performance of the connectivity inference algorithm is evaluated at the extrema of synchrony of sensible network activity. To this end, an extensive parametric study in the adaptation and conductance space of the neuron model was conducted, successfully identifying regimes of asynchronous and synchronous activity.
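
The exact synchrony criterion used in classifySim is not spelled out here; a generic population-synchrony measure along the following lines could separate the two regimes (a sketch under that assumption).

    import numpy as np

    def population_synchrony(binned_spikes):
        """Variance of the population-averaged activity normalised by the mean
        single-neuron variance (in the spirit of Golomb's synchrony measure).
        Values near 1 indicate synchronous firing, values near 0 asynchrony."""
        pop_rate = binned_spikes.mean(axis=0)
        return pop_rate.var() / (binned_spikes.var(axis=1).mean() + 1e-12)

    # Toy usage: independent Poisson trains give a value close to 1/N
    rng = np.random.default_rng(1)
    async_counts = rng.poisson(0.2, size=(50, 2000))
    print(population_synchrony(async_counts))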

Particular cases of very synchronous network activity lead to poor performance of the inference algorithm, yet this study falls short of the quantitative statements it aimed for.

Further, an attempt is presented to explore network activity and the performance of the algorithm on more neuro-physiological network topologies, namely scale-free networks.

Full Report

Soon to be on g15n.net/ETH/nexus

Known Issues

brian2 gets slow / Cython cache full

After running many simulations (>10000), brian2 got very slow for me, while Python itself was fine. The issue turned out to be Cython cache files from brian2 accumulating until no cache space was free, which apparently obstructs the normal functioning of brian2. To fix, run:

from brian2 import clear_cache
clear_cache('cython')

or delete all files in ~/.cython/brian_extensions/ ; the files are named *gnu.so or *.lock

If the list of files is too long to delete with "rm *.lock", run:

find . -name "*.lock" -print0 | xargs -0 rm

References

[1] English, D. F., McKenzie, S., Evans, T., Kim, K., Yoon, E., & Buzsáki, G. (2017). Pyramidal Cell-Interneuron Circuit Architecture and Dynamics in Hippocampal Networks. Neuron, 96(2), 505-520.e7. https://doi.org/10.1016/j.neuron.2017.09.033

[2] Brette, R., & Gerstner, W. (2005). Adaptive exponential integrate-and-fire model as an effective description of neuronal activity. Journal of Neurophysiology, 94(5), 3637–3642. https://doi.org/10.1152/jn.00686.2005
