
finalproject

Notes

I could not get npm run deploy to work.

If you run the code, you'll need to use Chrome, which supports the Web MIDI API, and you'll need a MIDI synthesizer connected to whatever default port Web MIDI chooses. I use SimpleSynth, a software synth for Mac.
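
For reference, here's a minimal sketch of grabbing a MIDI output with the Web MIDI API. The repo's actual port-selection code may differ; this just shows why a synth such as SimpleSynth needs to be listening on whatever port comes up first.

```js
// Minimal Web MIDI sketch: take the first available output and play a note.
navigator.requestMIDIAccess().then((access) => {
  const output = access.outputs.values().next().value; // first output port
  if (!output) {
    console.warn('No MIDI output found - connect a synth (e.g. SimpleSynth).');
    return;
  }
  output.send([0x90, 60, 100]); // note-on: middle C, velocity 100
  output.send([0x80, 60, 0], performance.now() + 500); // note-off 500 ms later
});
```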

Introduction

I've built a relatively simple musical rhythm visualizer. The rhythm is generated from live computer-keyboard input and from hard-coded rhythms played by a basic looper. It is interpreted using a simple musical-analysis model of what I'm calling 'gravity' for now, and the results drive the generation of an impulse-response curve that controls the motion of a multi-segmented 'tail' modeled as a spring-mass-damper simulation. Particle effects are created in various ways based on the motion of the tail.

The system has many options for changing behavior, and I've only begun to explore it for now.

Goal

My goal for this project is to use my recently gained computer-graphics skills to further a long-held vision I've had for creating music visualizations based on musical analysis, rather than the typical, relatively limiting audio-waveform analysis. I worked on similar ideas in a small arts-performance startup 11 years ago, where we provided visuals for live performances of electronic-classical fusion. However, my graphics-programming and general programming skills were much more rudimentary then, and the technology we could afford was much simpler. Suffice it to say we made some interesting visuals, but it didn't last as a project.

My work for this project is much closer to what I've envisioned. My goal is to interpret music into a language of motion and dynamic expression, and then map that onto visual expression. One part of this goal is to develop a system built from smaller pieces that follow relatively simple physical and emotive principles, connected together to create emergent behaviors that can be explored and 'tamed' to tease out visually relevant expressions of musical experience.

Main features

Rhythm analysis

This is very simple at this point, but sufficient as a proof of concept. Impulse responses are assigned to notes in a rhythm based on tonality (just two tones, low and high) and metric phase. The mappings that define the impulse responses are user-assignable.
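
To make that concrete, here's a hypothetical sketch of such a mapping. The identifiers and numbers are illustrative only, not the repo's actual ones; it assumes a beatPhase value in [0,1) measuring position within a four-beat measure.

```js
// Hypothetical 'gravity' mapping: each note is keyed by its tone (low/high)
// and its metric phase, and looks up a user-assignable impulse-response shape.
const impulseMap = {
  low:  { onBeat:  { amplitude: 1.0, decay: 0.30 },  // strong, slow decay
          offBeat: { amplitude: 0.5, decay: 0.15 } },
  high: { onBeat:  { amplitude: 0.7, decay: 0.10 },
          offBeat: { amplitude: 0.3, decay: 0.05 } },
};

function impulseForNote(tone, beatPhase) {
  // beatPhase in [0,1): 0 = downbeat. With four beats per measure, phases
  // falling near a quarter-note boundary count as 'onBeat'.
  const onBeat = (beatPhase * 4) % 1 < 0.1;
  return impulseMap[tone][onBeat ? 'onBeat' : 'offBeat'];
}
```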

Visuals

A main path is generated by applying the impulse responses from the rhythmic analysis as displacement from the z-axis of a curve.
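
In sketch form, assuming (for illustration) exponentially decaying impulses; the actual curve shapes are user-assignable, per the mapping sketched above.

```js
// Sum every sounded note's impulse response at time t; the result displaces
// the path from the z-axis. Uses the impulse shapes from the sketch above.
function pathDisplacement(notes, t) {
  let d = 0;
  for (const n of notes) {
    const dt = t - n.time;
    if (dt >= 0) d += n.impulse.amplitude * Math.exp(-dt / n.impulse.decay);
  }
  return d;
}
```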

A sphere follows this main path, and is itself followed by a tail consisting of multiple segments. The motion of the segments is modeled as a spring-mass-damper system, which provides a nice organic movement in response to the simple motion curve generated by the rhythmic analysis. All aspects are customizable.
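
A minimal sketch of one segment's update, in the spirit of the 2D demo cited under Technical/algorithmic tools below (constants and names here are illustrative, and the real code runs in 3D):

```js
// One spring-mass-damper step for a tail segment, pulled toward the segment
// (or sphere) ahead of it. Semi-implicit Euler integration.
function stepSegment(seg, target, dt) {
  const k = 120; // spring stiffness
  const c = 8;   // damping coefficient
  const m = 1;   // segment mass
  const fx = k * (target.x - seg.x) - c * seg.vx; // F = -k*x - c*v
  const fy = k * (target.y - seg.y) - c * seg.vy;
  seg.vx += (fx / m) * dt;
  seg.vy += (fy / m) * dt;
  seg.x += seg.vx * dt;
  seg.y += seg.vy * dt;
}
```

Chaining the updates (segment i targets segment i-1, and segment 0 targets the sphere) is what produces the whip-like organic lag.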

A particle generator creates particles on demand, based on the position and velocity of each tail segment. Many parameters allow for varied effects, as can be seen in the short demo video. The shapes and curves that emerge from the particles are a simple example of the kind of emergent behavior I'm hoping to achieve and explore: from the simple motion of a sphere, followed by the simulated natural motion of the tail, the particles generate organic-looking curves and surfaces that are intimately related to the musical dynamics.
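
A hypothetical sketch of the emission step (the parameter names are mine, not the repo's):

```js
// Spawn one particle per tail segment, inheriting its position and a scaled
// copy of its velocity; lifetime and size come from user-set options.
function emitParticles(segments, particles, opts) {
  for (const seg of segments) {
    particles.push({
      x: seg.x, y: seg.y,
      vx: seg.vx * opts.velocityScale,
      vy: seg.vy * opts.velocityScale,
      life: opts.lifetime, // seconds until the particle fades out
      size: opts.baseSize,
    });
  }
}
```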

Technical/algorithmic tools

Spring-mass-damper system - modified from a 2D follow-the-mouse demo 'Multi-Instantiable Elastic Mouse Trailer v0.1', by Matt Evans, 2013.

file:///Users/michael/Box%20Sync/Penn/MES/700-006-Procedural-Gfx-Spr-2017/final_project/elastic-mouse-orig/Physics%20Bungee-Rope%20Cursor%20Trailer%20_%20Matt%20Evans.html

Design

Hey, look! It's my authentic hand-drawn flow chart! Well, see diagram.jpeg; I couldn't get it to embed in markdown on short notice.

Results

Very rough clips of the visualizer in action on YouTube: https://youtu.be/qOaH4mM9ke8

During the second-to-last clip, YouTube's encoding makes the particles nearly impossible to see.

Evaluation

I'm very happy with what I've accomplished, although of course I'd initially thought I'd get more features into it. The greatest thing is that I now have this framework built and can add to it. I'll need to clean up the framework, but that's little work compared to what I've already put into it.

Future work

Lots to do! I will continue working on this as time permits. My short list is:

  • Clean up the framework to allow for better musical expression and translation mapping, among other things.
  • Tighten up the simulation timing of the tail and particles to get more consistent and tightly controlled output. This should yield more consistent shapes and patterns that will look nice. From there, randomness can be added for effect. I've already started this.
  • Add a multi-note music expression that reacts to musical tension/release of various rhythmic patterns.
  • Expand the impulse mapping options of the 'gravity' musical expression translation, including allowing for longer time-scale impulses, e.g. from measure to measure.
  • Switch color controls from RGB to HSL for much nicer interpolation (see the sketch after this list).
  • Add lighting and size-change effects to older particles as they fade into the distance. They could pulse as the musical time progresses and matches the same part of the beat and/or measure during which each originally played. Should look very nice.
  • Code some longer rhythmic sequences to show a more varied 'history' as particles fade to the background and linger.
  • Add a midi file reader.
  • Use the particles as control points to create surfaces.
  • Add multiple paths that respond to different simultaneous musical analyses, and see how they contrast and align.
  • At some point, when time really allows, rebuild the whole thing in Houdini, Derivative's TouchDesigner, or something like that.
  • And so on, for many years to come if all goes well.
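
On the RGB-to-HSL item above: component-wise RGB interpolation passes through muddy midpoints (e.g. red to green via brown), while interpolating hue around the color wheel stays vivid. A minimal hue-aware lerp sketch:

```js
// Interpolate two HSL colors, taking the shortest path around the hue circle.
function lerpHsl(a, b, t) {
  const dh = ((b.h - a.h + 540) % 360) - 180; // signed hue diff in [-180,180)
  return {
    h: (a.h + dh * t + 360) % 360,
    s: a.s + (b.s - a.s) * t,
    l: a.l + (b.l - a.l) * t,
  };
}
```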

Acknowledgements

Thanks to:

Rachel: for the idea to use impulse curves instead of going for a PID controller. Definitely saved a lot of time, and with the smoothing provided by the dynamically modeled tail, it works well enough.

Austin: for help with getting MIDI output going. It was about to drive me nuts!!

Jen, my wife: for covering lots of extra ground around the house while I went deep down the rabbit hole.
