arf

advanced recording format for physiology and behavior

License: GNU General Public License v2.0

The Advanced Recording Format (ARF) is an open standard for storing data from neuronal and behavioral experiments in a portable, high-performance, archival format. The goal is to enable labs to share data and tools, and to allow valuable data to be accessed and analyzed for many years in the future.

ARF is built on the HDF5 format, and all arf files are accessible through standard HDF5 tools, including interfaces to HDF5 written for other languages (e.g., MATLAB or Python). ARF comprises a set of specifications on how different kinds of data are stored. The organization of ARF files is based around the concept of an entry, a collection of data channels associated with a particular point in time. An entry might contain one or more of the following:

  • raw extracellular neural signals recorded from a multichannel probe
  • spike times extracted from neural data
  • acoustic signals from a microphone
  • times when an animal interacted with a behavioral apparatus
  • the times when a real-time signal analyzer detected vocalization

Entries and datasets have metadata attributes describing how the data were collected. Datasets and entries retain these attributes when copied or moved between arf files, helping to prevent data from becoming orphaned and uninterpretable.
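
Because arf files are ordinary HDF5 files, the entry/channel layout can be sketched with h5py alone. The following is illustrative only: the group, dataset, and attribute names here are examples chosen for this sketch, not the normative schema (see specification.org for the required attributes):

import time
import numpy as np
import h5py

# An entry is an HDF5 group associated with a point in time.
with h5py.File("example.arf", "w") as fp:
    entry = fp.create_group("test_0001")
    entry.attrs["timestamp"] = [int(time.time()), 0]  # illustrative attribute
    # Each channel within the entry gets its own dataset.
    pcm = entry.create_dataset("pcm", data=np.zeros(1000, dtype="<i2"))
    pcm.attrs["sampling_rate"] = 20000  # illustrative attribute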

This repository contains:

  • The specification for ARF (in specification.org)
  • A fast, type-safe C++ interface for reading and writing arf files
  • A python interface for reading and writing arf files (based on h5py)
  • A very rudimentary MATLAB interface

contributing

ARF is under active development, and we welcome comments and contributions from neuroscientists and behavioral biologists interested in using it. We’re particularly interested in use cases that don’t fit the current specification. Please post issues or contact Dan Meliza (dan at meliza.org) directly.

installation

ARF files require HDF5>=1.8 (http://www.hdfgroup.org/HDF5).

The python interface requires Python>=2.6, numpy>=1.6, and h5py>=2.0 (http://code.google.com/p/h5py/). To install the module:

python setup.py install
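
Once installed, the module is used together with h5py. Below is a minimal sketch; the open_file, create_entry, and create_dataset helpers and their arguments are assumptions based on the python interface, so consult the module’s docstrings for the exact signatures:

import numpy as np
import arf

# Assumed API: open_file/create_entry/create_dataset; verify against the
# installed module before relying on these signatures.
fp = arf.open_file("example.arf", "w")
entry = arf.create_entry(fp, "test_0001", timestamp=0)
arf.create_dataset(entry, "pcm", np.zeros(1000, dtype="<i2"),
                   sampling_rate=20000)
fp.close()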

To use the C++ interface, you need boost>=1.42 (http://boost.org). In addition, if writing multithreaded code, HDF5 needs to be compiled with --enable-threadsafe. The interface is header-only and does not need to be compiled. To install:

make install

To install the MATLAB interface, add the matlab subdirectory to MATLAB’s search path. The MATLAB interface is not yet up to the 2.0 specification.

version information

The specification and implementations provided in this project use a form of semantic versioning (http://semver.org). Specifications receive a major and minor version number. Changes to minor version numbers must be backwards compatible. The current version of the ARF specification is 2.0.

Implementation versions are synchronized with the major version of the specification. For example, the python arf package version 2.1.0 is compatible with any ARF version 2.x.

Because of this synchronization, minor version number increments to implementations may break API backwards compatibility, but retain compatibility with the underlying file format. These backwards incompatibilities will be clearly noted.

There was no public release of ARF prior to 2.0.

access ARF files with HDF5 tools

This section describes how to inspect ARF files using standard tools, in the event that the interfaces described here cease to function.

The structure of an ARF file can be explored using the h5ls tool. For example, to list entries:

$ h5ls file.arf
test_0001                Group
test_0002                Group
test_0003                Group
test_0004                Group

Each entry appears as a Group. To list the contents of an entry, use path notation:

$ h5ls file.arf/test_0001
pcm                      Dataset {609914}

This shows that the data in test_0001 is stored in a single node, pcm, with 609914 data points. Typically each channel will have its own dataset.

The h5dump command can be used to output data in binary format. See the HDF5 documentation for details on how to structure the output. For example, to extract sampled data to a 16-bit little-endian file (i.e., PCM format):

$ h5dump -d /test_0001/pcm -b LE -o test_0001.pcm file.arf
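
The same extraction can be scripted with any HDF5 binding. For example, a short h5py sketch (the dataset path and 16-bit sample type are taken from the example above):

import h5py

# Read the sampled dataset and write it as raw little-endian 16-bit PCM.
with h5py.File("file.arf", "r") as fp:
    data = fp["/test_0001/pcm"][:]
with open("test_0001.pcm", "wb") as out:
    out.write(data.astype("<i2").tobytes())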

related projects

  • pandora (https://github.com/G-Node/pandora) has the most similar goals and is also implemented on top of HDF5. The data schema is considerably more complex.
  • neuroshare (http://neuroshare.org) is a set of routines for reading and writing data in various proprietary and open formats.
