
neurotechx / eeg-101

243 stars · 35 watchers · 55 forks · 64.52 MB

Interactive neuroscience tutorial app using Muse and React Native to teach EEG and BCI basics.

License: ISC License

JavaScript 67.91% Java 28.18% Python 2.64% Objective-C 0.88% Starlark 0.38%
eeg muse android plotting react-native bci

eeg-101's Introduction

banner

An Interactive EEG tutorial that _taught_ EEG and BCI basics.

EEG 101 is dead, may it rest in peace.

Due to Interaxon rescinding support for its Muse SDK, it is no longer possible for open-source developers to build mobile apps that connect to Muse devices, including this one. Thus, we have stopped development here and removed the app from the Google Play Store.

If you are interested in learning or teaching EEG, we recommend you check out the EEGEdu project, which we see as the spiritual successor to EEG 101, expanding on the EEG 101 experience by adding features that we could only have dreamed of when we started this project back in 2016.

Good luck and happy neurohacking,
Dano

Overview

  • Teaches the basics of EEG, including where signals come from, how devices work, and how to process data
  • Contains a general purpose binary classifier for EEG data
  • Streams data from the Muse with the LibMuse Java API
  • Built with React Native for Android
  • Completely free, open-source, and available for use/adaptation in any project

Video Walkthrough

https://www.youtube.com/watch?v=fDQZ5zWVjY0&feature=youtu.be

Lesson Content

  • Neurophysiology of EEG
  • EEG hardware
  • Filtering
  • Epoching
  • Artefact Removal
  • The Fourier Transform
  • The Power Spectral Density Curve
  • Brain waves
  • Brain-Computer Interfaces
  • Machine Learning

How it works

screens

Our goal with EEG 101 was to create a flexible base for EEG and BCI mobile development that novice programmers can build on top of for multiple platforms with different EEG devices. To satisfy those concerns, we've built the app in React Native, which allows for fast, straightforward front-end development and the potential to port to iOS, Web, or Desktop (Electron) in the future.

Currently, EEG 101 is split right down the middle between Java and React. If you're interested in how we connect to the Muse, process EEG data, and plot the results in real time, check out the graph and signal classes in the android source folders. Our implementations are all (for the most part) typical Android components written in Java.

If you'd like to use EEG 101 as a base for your own app in React Native, take a look at how we've written the tutorial in the src folder. Connecting to a Muse and plotting real-time EEG data is as simple as using one of the Native components we have already prepared.

Setup (instructions may be out of date)

  1. Install and set up React Native. Note: EEG 101 uses lots of native code, so create-react-native-app and Expo are not an option. Follow the instructions for "Building Apps with Native Code." You may also need to install the JDK, Node, Watchman, and the Gradle Daemon
  2. Install yarn
  3. Clone this repo git clone https://github.com/NeuroTechX/eeg-101.git
  4. Download the LibMuse SDK from Muse's developer website. We've already taken care of integrating the SDK library into the app, so just make sure you place libmuse_android.so in <clonedRepoName>/EEG101/android/app/src/main/jniLibs/armeabi-v7a/ and libmuse_android.jar in <clonedRepoName>/EEG101/android/app/libs/
  5. Run yarn install in the EEG101 folder
  6. Connect an Android device with USB debug mode enabled. Because the LibMuse library depends on an ARM architecture, EEG 101 will not build in an emulator
  7. Run react-native start to start the React packager
  8. In a new terminal, run adb reverse tcp:8081 tcp:8081 to ensure the debug server is connected to your device, and then react-native run-android to install EEG 101

Common setup problems

  1. Gradle build error: Attribute "title" has already been defined
  2. INSTALL_FAILED_UPDATE_INCOMPATIBLE: Package com.eeg_project signatures do not match the previously installed version; ignoring!
  • Solution: Uninstall any pre-existing versions of the app on your device
  3. Could not connect to development server
  • Solution: Make sure that the device is connected, run adb reverse tcp:8081 tcp:8081, and restart the React packager (react-native start)
  4. Could not get BatchedBridge
  • Solution: Run adb reverse tcp:8081 tcp:8081 again and reload
  5. Error retrieving parent for item: No resource found that matches the given name 'android:TextAppearance.Material.Widget.Button.Borderless.Colored'

eeg-101's People

Contributors

braintrain3000, dennlinger, dependabot[bot], hubertjb, jdpigeon, kowalej, mikeperrotta, nicproulx, nuks, rcassani, rlhoste, tmcneely, twuilliam, vasyl91, zaeburn


eeg-101's Issues

Color blindness compatibility

It's come to my attention that the colors we used to label electrodes, which we grabbed from the official Muse app, don't play too well with color blindness, specifically deuteranopia. It seems that the greenish and violet colors are hard to distinguish for some people.

We should probably review the colors that we use in all the electrodediagram assets and change them to a new color palette.

Redesign Sandbox screen

The sandbox screen is looking a little clunky. A redesign is probably necessary to make it look more professional. This might involve a dropdown menu and more advanced navigation for the app as a whole.

Localization

The application should be localized to support a wider, international user base. We can make use of https://github.com/AlexanderZaytsev/react-native-i18n.

Text should be centralized to translation files for each target language and referenced accordingly using key identifiers.

Suggested languages to begin: English, French, and Spanish.

Create offline CSV-reading options for graphs

  1. Add the OpenCSV library to the project and create a method (in a new CSVReader class?) that can read CSVs (~5 s of raw EEG data) into primitive Java arrays.
  • Since most graphs only show single-channel data, it might be easiest to go straight to 1D arrays. For a scene in which we want to switch between different electrodes, having each channel stored as a different 1D array would work
  2. Create an OfflineDataListener class that updates eegBuffer and calls the plot update at the same frequency as the real MuseDataListener (220-256 Hz)

  3. Add bridging functions that allow EEGGraph, FilterGraph, and PSDGraph to be instantiated from JS in this offline mode, loading CSVs and starting the OfflineDataListener instead of MuseDataListener
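The CSV-reading step could be sketched as follows (Python for illustration; the real implementation would be Java with OpenCSV, and the headerless 4-channel row format is an assumption):

```python
import csv
import io

SAMPLE_RATE = 220  # assumed: Muse 2014 models sample at 220 Hz, 2016 models at 256 Hz

def read_channels(csv_text, n_channels=4):
    """Read a headerless CSV of raw EEG (one row per sample, one column
    per electrode) into one list per channel."""
    channels = [[] for _ in range(n_channels)]
    for row in csv.reader(io.StringIO(csv_text)):
        for channel, value in zip(channels, row):
            channel.append(float(value))
    return channels

# ~5 s of data would be SAMPLE_RATE * 5 rows; two rows here for illustration
demo = "1.0,2.0,3.0,4.0\n5.0,6.0,7.0,8.0\n"
chans = read_channels(demo)
# chans[0] is the first electrode's samples: [1.0, 5.0]
```

Each returned per-channel list maps directly onto the one-1D-array-per-electrode layout proposed above.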

Create new content

We've included what we think is a basic, yet thorough introduction to EEG in EEG 101. However, we're always open to new content and remixing of old stuff.

The app is composed of scenes, each of which has a graphic (either an image, animation, or EEG visualization) and one to several blocks of writing. Each block of writing has a subtitle and some text. Swiping to the left advances to the next block (the graphic up above is unchanged). High-level info can be linked to with pop-up windows triggered by selecting certain highlighted sections of text.

Here's an example:
scene 1 small

  • Title: Introduction
  • Subtitle: Your brain produces electricity
  • Text: Using the EEG...
  • Pop-up link: EEG

Selecting the pop-up link leads to something like this. This is where we can include 'undergraduate-level' information:
scene 1 popup small

All current lesson text for the app can be found in this document

Optimize Muse connection code for different phones

Some more work should go into making sure the connection code works for all types of phones (e.g. the Google Pixel seemed to have problems, surprisingly). Support from the Interaxon team would be nice here.

Classifier data visualization

In order to help users understand the ML process, we should visualize some important data behind the classifier trained in the BCI screens.

This should probably include:

  • a line plot showing the relative importance of all the features used to train the GNB
  • a clustered XY plot to show the separation of features

Remove dataSource runnable in EEGGraph and FilterGraph; Improve performance

Now that filtering and data recording are being performed in the muse data listener without any noticeable performance hit, we might as well re-evaluate how we are plotting incoming data.

Currently, we have two runnables, a plotUpdater and a dataSource. It is probably necessary to keep the dataSource for the PSD, where the FFT has to be performed, but for EEGGraph and FilterGraph with no FFT, it can probably be removed.

We should experiment to see whether incoming data should be added to the dataSeries in either the listener or the plotUpdater thread. Furthermore, we should experiment with adding multiple data points at a time and how fast our plotUpdater thread should run.

With some haphazard fiddling, I was able to get 4 seconds of fully sampled EEG plotting at around 23fps with the dataSeries code in plotUpdater. We should see if this could be improved

Make cute illustration of how to wear the Muse

It'd be nice to have a little image of the correct Muse position for all the people whose first intuition about how to wear it is horribly, horribly wrong.

Something like this, but ours, and better!

Add feedback button on End slide

Hopefully, this will allow us to see how we can improve the app, and also give people some easy way of contributing.

Could probably just open an email intent to my neurotechx account

App not found on Google Play

Hello, I'm new here and just discovered this project. Maybe it's a stupid question, but the link to the Play Store seems to be broken, and if I search in the store I only find "Neurodoro", a pomodoro timer, from NeuroTechX.
So did you take it down again? Maybe the information in the readme file should be updated then.

Sleep monitor, detection of certain waves

I would like to modify your awesome app to create a sleep tracking app with an alarm clock feature. To do it, I need to know how to transform the BCI section to read and identify particular waves and attribute them to particular sleep stages, then perform certain actions: e.g. attribute theta activity to Stage 1, sleep spindles to Stage 2, and delta waves to Stages 3 & 4, triggering an alarm when sleep spindles are detected.
Anyone willing to help?

EDIT: such function could also be suitable for EEG 101 itself as the app could show what kind of waves are visible on screen in real time.

Lottie Animation Issues

Current issues with animations

Epoching

  • Works when loaded through android assets
  • Throws UnexpectedNativeTypeException: TypeError: expected dynamic type int64, but had type double error when loaded in JS src. From ReadableNativeMap.getInt(Native Method)

Awake Asleep

  • Throws You must set an images folder before loading an image. Set it with LottieComposition#setImagesFolder or LottieDrawable#setImagesFolder when loaded through android assets
  • Throws UnexpectedNativeType error when loaded from JS

Fourier

(Shape Layers, Trim paths on the shape layers, no masks, null objects, and parenting)

  • Loads through android assets
  • Throws UnexpectedNativeType error when loaded from JS

Watermelon (example)

  • Works when loaded through android
  • Works when loaded through JS

Notes

  • Changing map.getInt to map.getDouble and casting to int results in animations that load but don't play (even when rounding to the most accurate int)
  • Example JSONs appear to have only ints in the objects in the layers array:

    "ip": 44,
    "op": 90,
    "st": -44,
    "bm": 0,
    "sr": 1

Whereas our custom JSONs have doubles in these same objects:

    "ip": -2.99999974034093,
    "op": 112.999990219508,
    "st": -2.99999974034093,
    "bm": 0,
    "sr": 1

UI changes needed for tablet

Because of the larger screen size, some things on the tablet are a little off. Here's a list I created when going through the tutorial.

  • Offline mode button text too big
  • ElectrodeSelector should be larger
  • SideMenu text should be a little bit bigger
  • SideMenu 'Offline Mode' or 'Connected' title should be centered
  • More space around boxes in PopUpList

Return Muse SDK to repo

Interaxon closed access to their SDK; maybe it should be returned to this repo? New users who don't have the .so and .jar files can't use EEG 101 now.

Implement Java classifier for BCI task

We want to write a simple, general purpose, supervised EEG classifier that EEG 101 users can play with at the end of the tutorial to try and distinguish between different brain states (eyes open vs. closed / relaxed vs. concentrated).

The first implementation of this will likely be in pure Java, using a fairly simple linear classification algorithm (i.e. Gaussian Naive Bayes).
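For illustration, the core of a Gaussian Naive Bayes classifier is small enough to sketch here (Python rather than the planned Java; the band-power feature values below are made up):

```python
import math
from collections import defaultdict

class GaussianNB:
    """Minimal Gaussian Naive Bayes: fit per-class feature means/variances,
    then predict by maximum log-likelihood."""

    def fit(self, X, y):
        groups = defaultdict(list)
        for features, label in zip(X, y):
            groups[label].append(features)
        self.stats = {}
        for label, rows in groups.items():
            columns = list(zip(*rows))
            means = [sum(c) / len(c) for c in columns]
            variances = [sum((v - m) ** 2 for v in c) / len(c) + 1e-9
                         for c, m in zip(columns, means)]
            self.stats[label] = (means, variances, len(rows) / len(X))
        return self

    def predict(self, x):
        def log_likelihood(label):
            means, variances, prior = self.stats[label]
            ll = math.log(prior)
            for v, m, var in zip(x, means, variances):
                ll += -0.5 * math.log(2 * math.pi * var) - (v - m) ** 2 / (2 * var)
            return ll
        return max(self.stats, key=log_likelihood)

# Made-up band-power features [alpha, beta]: closing the eyes tends to raise alpha
X = [[5.0, 1.0], [6.0, 1.2], [1.0, 3.0], [1.2, 3.5]]
y = ["closed", "closed", "open", "open"]
clf = GaussianNB().fit(X, y)
# clf.predict([5.5, 1.1]) → "closed"
```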

BCI Sandbox Screen

In order for people to easily use the BCI, it would be helpful to create a new screen where all of the ClassifierModule functions can be accessed.

With ClassifierTest (in history) as example, design and code work should make this a pleasant screen to use.

Design lessons might inspire signal sandbox design

LibMuse Android SDK dependency

Hi,

Does your project depend on the LibMuse Android SDK? If yes, then the license of the SDK states that "distribution of your application in a commercial store is not covered by the LibMuse SDK developer license". Can you elaborate a bit on this topic? Can people base a project on your eeg-101 code and then put it in the Play Store?

Fixing incorrect teaching material and translations

Going through the app again, I noticed there are a few inconsistencies or incorrect information in the teaching material. I suggest we keep this issue open to flag any such mistakes and then fix them.

In the English version:

  • "When the eyes are closed, there is often ... The ability to detect alpha waves when the eyes are closed varies greatly from person to person." I don't think 'greatly' is correct, although there is some variation. Maybe look for a reference here.
  • "However, when large numbers of cortical neurons fire rhythmically, their activity can produce electric fields that are large enough to cross the surface of the skull." Technically speaking, it always crosses to the surface of the skull. Instead we should say that it's large enough to be picked up there.
  • "Thus, data gathered from EEG devices with different reference electrode placement can vary considerably." It's not a matter of devices, but just of where the reference is... E.g., we could re-reference Muse data.
  • "They have been associated with alertness, concentration and the active firing of neurons hard at work." I feel like the part about neurons hard at work doesn't mean much. We should maybe rephrase this.
  • "USING YOUR MIND TO CONTROL THIS PHONE" It could be a tablet too. :P

In the translations (if ever you have a minute @rcassani @neurohazardous @dennlinger :) ):

  • In slide 1, the English version used to say the noise introduced by blinks is due to muscle activity. This has been changed to correctly state that noise is caused by eyeball movement. We need to fix the translations.

If you find any other mistake please add them on this issue!

Signal quality indicator

With our current BCI and future ERP slides, signal quality will be even more important. We should add a reusable signal quality screen with nice animations so that users are guided toward getting things working as well as possible.

Connect to Muse2 Model: MU-03 crashes the app

Hi all - we bought the Muse2 from Muse and they shipped us a model MU-03 device. It appears to have some problems with older software. Attempting to connect to the device from this software on an Android device crashes the app. Suggestions?

Find citations

We've always liked the idea of having citations to support the material taught in EEG 101, but over the course of writing and development we lost track of where we learned things.

Going through our lesson material and adding citations would be extremely helpful, even if it were just a few good review articles that we could point eager learners to.

Current barriers to run on iOS?

@jdpigeon What are the current barriers to making this run on iOS?

Are the android and iOS folders in EEG101 still necessary with React-Native?

Does the project still build/run on Android as described in the Readme?

App crashes when bt is enabled

A clean app installed via react-native run-android as in the setup instructions crashes after clicking "Ok, it's on" on the "Step 1" slide. This happens when Bluetooth is on. While Bluetooth is disabled, it crashes on the next slide, when the app prompts to turn Bluetooth on. The app from the Play Store works flawlessly.

Design Artefact Removal Graph

For our lesson on artefact detection and removal we want to have a dynamic visualization that illustrates how epochs with too much variance are excluded from analysis.

Our first idea was to keep the same graph as the basic EEG Graph, but change the color of sections of the data when variance is above a certain threshold. However, it turns out we're limited in what we can do with the plotting library. Turning only certain sections of the line a different color is not possible.
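Whatever the visualization ends up looking like, the exclusion logic itself is straightforward to sketch (illustrative Python; the threshold value is arbitrary):

```python
def reject_noisy_epochs(epochs, threshold):
    """Split epoch indices into kept and rejected based on per-epoch variance,
    mirroring the artefact-removal idea described above."""
    kept, rejected = [], []
    for i, epoch in enumerate(epochs):
        mean = sum(epoch) / len(epoch)
        variance = sum((v - mean) ** 2 for v in epoch) / len(epoch)
        (rejected if variance > threshold else kept).append(i)
    return kept, rejected

clean = [0.0, 1.0, 0.0, -1.0]    # small variance: kept
blink = [0.0, 50.0, -60.0, 5.0]  # large deflection (e.g. an eye blink): rejected
kept, rejected = reject_noisy_epochs([clean, blink], threshold=10.0)
# kept == [0], rejected == [1]
```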

Signal Quality Indicator

With the updated Noise Detector code, it should be fairly easy to have a long running signal quality indicator component.

This could be a component mounted in index that will appear whenever noise is detected in an epoch with a graphic indicating at which point on the head the noise appears.

OR

this could be a transition screen when starting the tutorial (like in the calm app)

Perhaps a mixture of the transition screen (for the tutorial) and a little widget (for sandbox and BCI) would be best

Question about plotting alpha/beta waves

Hello,

I've been playing with this repository and it has been very helpful with furthering my understanding of EEG data! Thank you for the hard work and multiple implementations.

However, I am still confused on how I should go about plotting a series for each of the specific channels (alpha, beta, gamma, theta, and delta). I.e. I'd like a graph that shows the current alpha value vs the current beta value over time. I tried plotting data from listeners on ALPHA_ABSOLUTE and BETA_ABSOLUTE, but I do not get similar results to other muse apps (such as observing higher beta waves when breaking concentration).

Any advice on how to plot the values from these channels separately? Or should I follow the PSD example from this repository?

Thanks,
Matt
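Following the PSD example is probably the right call: compute band power from each epoch's power spectrum yourself rather than relying on the ALPHA_ABSOLUTE/BETA_ABSOLUTE listeners. A minimal sketch in Python (the 220 Hz sampling rate is an assumption for the 2014 Muse; a real app would use Welch's method over a sliding window):

```python
import numpy as np

FS = 220  # assumed sampling rate (Muse 2014 models; 2016 models use 256 Hz)

def band_power(epoch, low, high, fs=FS):
    """Average power of `epoch` within [low, high] Hz, from a plain periodogram."""
    spectrum = np.abs(np.fft.rfft(epoch)) ** 2 / len(epoch)
    freqs = np.fft.rfftfreq(len(epoch), d=1.0 / fs)
    mask = (freqs >= low) & (freqs <= high)
    return spectrum[mask].mean()

t = np.arange(FS) / FS                   # one second of samples
alpha_wave = np.sin(2 * np.pi * 10 * t)  # pure 10 Hz "alpha" oscillation
alpha = band_power(alpha_wave, 8, 13)
beta = band_power(alpha_wave, 13, 30)
# alpha dominates for this signal: alpha > beta
```

Computing one alpha and one beta value per epoch and plotting those over time gives the alpha-vs-beta graph described in the question.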

Navigation menu

We need a navbar so we can jump from any portion of the app to any other without messing up state or losing data.

This might require more Redux integration for certain things (Classifier), but hopefully react-router helps keep track of things

Write Cross Validator in Java

Writing cross validation with Java primitives is turning out to be harder than I thought. I've started a new branch, cross_val, with half of a cross validator written.

Weka and Smile's CV functions both rely on the ease of working with objects and lists over primitive arrays. We should probably do the cross val with LinkedLists instead of primitives.
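For reference, the fold-splitting and scoring loop is small when written with lists (Python for illustration, mirroring the suggestion to prefer lists over primitive arrays; `train_and_score` is a hypothetical callback):

```python
def k_fold_indices(n_samples, k):
    """Split range(n_samples) into k contiguous folds of near-equal size."""
    sizes = [n_samples // k + (1 if i < n_samples % k else 0) for i in range(k)]
    folds, start = [], 0
    for size in sizes:
        folds.append(list(range(start, start + size)))
        start += size
    return folds

def cross_validate(X, y, k, train_and_score):
    """Average train_and_score(train_X, train_y, test_X, test_y) over k folds."""
    scores = []
    for fold in k_fold_indices(len(X), k):
        test = set(fold)
        train_X = [x for i, x in enumerate(X) if i not in test]
        train_y = [v for i, v in enumerate(y) if i not in test]
        test_X = [X[i] for i in fold]
        test_y = [y[i] for i in fold]
        scores.append(train_and_score(train_X, train_y, test_X, test_y))
    return sum(scores) / k

# k_fold_indices(10, 3) == [[0, 1, 2, 3], [4, 5, 6], [7, 8, 9]]
```

In Java, LinkedList (or ArrayList) index sets would play the role of the Python lists here.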

Fix highpass filter issues

Highpass filter and Bandpass filter produce weird data, including ringing after blinks and odd spectrograms.

Hubert has confirmed that the coefficients generated by our filter code are appropriate and that the issues are not resulting from our filter implementation. Likely, the issue is due to our code collecting data from the Muse, and the sampling rate.
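One quick sanity check consistent with the sampling-rate hypothesis: digital filter cutoffs are specified relative to Nyquist, so designing for one rate but feeding data sampled at another silently shifts the effective cutoff even when the coefficients themselves are correct. An illustrative Python sketch (the 256 vs. 220 Hz mismatch is hypothetical):

```python
def normalized_cutoff(cutoff_hz, fs):
    """Digital filter design normalizes frequencies by Nyquist (fs / 2)."""
    return cutoff_hz / (fs / 2.0)

design_fs = 256.0  # hypothetical rate assumed at design time
actual_fs = 220.0  # hypothetical rate the data actually arrives at

wn = normalized_cutoff(1.0, design_fs)     # 1 Hz high-pass designed for 256 Hz
effective_cutoff = wn * (actual_fs / 2.0)  # where that filter really cuts at 220 Hz
# effective_cutoff == 0.859375 Hz: the coefficients are "correct",
# but the cutoff moves with the rate mismatch
```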

Error during setup

I got an error during setup when I run the command: "react-native run-android".
The error is:

FAILURE: Build failed with an exception.

  • What went wrong:
    Could not determine java version from '11.0.2'.

Have you any suggestion to solve this problem?

Make body text change size depending on language

Currently, some of the body text overflows and collides with itself in other languages.

To fix this, we should make the size of the bodytext responsive to the current language. We can grab this data through the I18n library. It'll be best to do this by migrating some of the styles to a global stylesheet.

Upgrade AndroidPlot Version

Let's move from version 1.2.2 to 1.4.2.
... I don't know. Better performance?

Changing the Gradle import throws some errors with the LineAndPointFormatter constructor, so it looks like we'll have to re-write some of our plot initialization code
