
Real-time audio-reactive music visualization

This is me having a go at real-time audio- (and human) reactive music visualization, made with Unity.

What I want:

  • Have Unity scenes, MIDI controllers, and some real-time beat detection and frequency analysis.
  • Use the MIDI controllers to steer prepared or procedurally generated elements in the Unity scenes. The elements may also act on their own, based on the information gathered through frequency analysis and beat detection.
  • Be able to improvise these "visuals" while my friends DJ.

How I try to achieve this (technically):

  • Using Keijiro's RtMidi wrapper and a yet-to-be-determined MQTT Unity plugin, I'm able to get user input from MIDI controllers or from an MQTT publisher into Unity.
  • Using a very nice third-party program called Wavesum, I get a beat-clock signal (over MIDI) into Unity. The beat is detected in real time by listening to my computer's microphone input.
  • Using Unity's own FFT implementation, I'm able to get a real-time frequency spectrum analysis (see the sketch after this list).
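
For reference, a minimal sketch of that FFT sampling via AudioSource.GetSpectrumData, assuming an AudioSource on the same GameObject (the class name and the bass-bin range are illustrative):

    using UnityEngine;

    [RequireComponent(typeof(AudioSource))]
    public class SpectrumSampler : MonoBehaviour
    {
        // Buffer size must be a power of two between 64 and 8192.
        private readonly float[] _spectrum = new float[1024];
        private AudioSource _source;

        void Awake()
        {
            _source = GetComponent<AudioSource>();
        }

        void Update()
        {
            // Fills _spectrum with the magnitudes of the frequency bins.
            _source.GetSpectrumData(_spectrum, 0, FFTWindow.BlackmanHarris);

            // Crude "bass energy" estimate from the lowest bins, as an example.
            float bass = 0f;
            for (int i = 0; i < 8; i++) bass += _spectrum[i];
            // Feed 'bass' into whatever scene element should react to it.
        }
    }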

Why do I want this:

  • I like the idea of molding the elements of music, as you listen to it, into visual shapes and behaviours. To me, adding a visual dimension to the music means supporting it in telling its story.
  • When I was looking for solutions to do this, the only really viable one I came across was Magic Music Visuals. It has a great community and tremendous devs, but due to some limitations (by design), I kept looking until I concluded that a game engine like Unity is probably the perfect tool for me. The great advantage of Magic Music Visuals is that you don't need any programming knowledge, which is also what had scared me off game engines earlier.

What does this mean for you:

Nothing really. This repo exists so we can have a nice Git and GitHub workflow, and I don't see a reason to keep the code private. Most likely the project won't compile or run for you, though, as it uses assets that I'm either not willing or not able to share. Have a look at the license and use at your own risk :), as usual.

If you still want to try, continue reading.

How to set up the project

  • have a look at ProjectSettings/ProjectVersion.txt to find out which Unity version to use
  • clone the repo
  • open it in Unity
  • resolve the problems resulting from the last paragraph above. Specifically:
    • Remove the line starting with "ch.brogli.richfx-as-unitypackage" from Packages/manifest.json (see the snippet after this list). Rich FX is a great asset from the Unity Asset Store. For my convenience I've packaged it into a separate repo, so the Unity Package Manager can pull it automatically and it doesn't clutter the Assets folder. Since this is a paid asset, I can't share it here, so you either have to buy the asset and fix the references to it, or remove any references to it altogether.
    • Remove references to any videos, images, image sequences, or other files that I keep outside of this Git repo and the runtime itself. They're referenced by the app, so you'd either have to remove those references or live with the exceptions.
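
For illustration, the entry to remove in Packages/manifest.json looks roughly like this (the URL is a placeholder, not the real value):

    {
        "dependencies": {
            "ch.brogli.richfx-as-unitypackage": "https://example.com/placeholder.git"
        }
    }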

How to create your own scene

Use Ctrl+N or File > New Scene, then select the BSII_templateScene template. Save your new scene to Assets/_bsII/_Scenes.

The template scene contains "TempDevObjects"; these are needed while developing the scene, but they remove themselves once in play mode. You can ignore them.

Please clone the volume profile (on the main cam); otherwise your changes there will edit the profile used in the template scene. You can disable most overrides, but you must not remove "camera copy", which is needed so the output of the main cam is picked up and displayed on the UI cam.

Once you have finished implementing your scene, add it to the build settings. Then, in Assets/_bsII/Resources/SceneScreenshots, add a screenshot of your scene whose filename matches the scene's build index. It will then be picked up by the UI and the scene selection logic (see the sketch below).
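
As a rough sketch (only the Resources folder name comes from this repo; the class and method names are illustrative), the lookup boils down to something like:

    using System.IO;
    using UnityEngine;
    using UnityEngine.SceneManagement;

    public static class SceneScreenshotLookup
    {
        public static Sprite ForBuildIndex(int buildIndex)
        {
            // The screenshot's filename must match the scene's build index.
            var sprite = Resources.Load<Sprite>("SceneScreenshots/" + buildIndex);
            if (sprite == null)
            {
                string sceneName = Path.GetFileNameWithoutExtension(
                    SceneUtility.GetScenePathByBuildIndex(buildIndex));
                Debug.LogWarning($"No screenshot found for scene '{sceneName}' (index {buildIndex}).");
            }
            return sprite;
        }
    }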


Issues

feat: multi monitor setup 3

  • make sure all combos of different resolutions work for both the main and the secondary monitor
  • have scene content react to one or two monitors (see the sketch after this list)
    • 2nd monitor render texture
    • content on main monitor
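
A minimal sketch of activating a second monitor with Unity's Display API (the class name is illustrative):

    using UnityEngine;

    public class MultiDisplayBootstrap : MonoBehaviour
    {
        void Start()
        {
            // Display.displays[0] is the primary display and is always active.
            // Note: multi-display activation only works in player builds,
            // not in the editor.
            if (Display.displays.Length > 1)
            {
                Display.displays[1].Activate();
            }
        }
    }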

UI: scene selection

SceneSelector

Handling selection

  • get all scene names and their build index
  • map image names to scene names
  • throw a warning if an image name can't be mapped, or if there's a scene name without an image
  • set the build index as the Button's value or name, and
  • pass it to the handleClick method

UX

  • start with the first scene selected
  • left and right arrows move the selection through all buttons: keep track of all buttons and the currently selected one, and change the style of the selected button (see the sketch after this list)

Scrolling

  • keep track of all buttons and the visible subset
  • if current selection moves out of the window, translate all buttons accordingly and update the visible subset
  • update scrollbar position (in rows)
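
A rough sketch of that arrow-key navigation, assuming UnityEngine.UI Buttons and the legacy Input manager (all names and the highlight style are placeholders):

    using UnityEngine;
    using UnityEngine.UI;

    public class SceneSelector : MonoBehaviour
    {
        [SerializeField] private Button[] _buttons; // one per scene, in build-index order
        private int _selected;

        void Update()
        {
            int delta = 0;
            if (Input.GetKeyDown(KeyCode.LeftArrow)) delta = -1;
            if (Input.GetKeyDown(KeyCode.RightArrow)) delta = 1;
            if (delta == 0) return;

            // Un-highlight the old selection, clamp, highlight the new one.
            Highlight(_buttons[_selected], false);
            _selected = Mathf.Clamp(_selected + delta, 0, _buttons.Length - 1);
            Highlight(_buttons[_selected], true);
        }

        private static void Highlight(Button b, bool on)
        {
            // Swap the button's color to mark the current selection.
            b.image.color = on ? Color.yellow : Color.white;
        }
    }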

feat: multi monitor setup 1

  • relearn the old example of copying the main cam to a render texture, and apply it to the project

if one monitor:
display the UI and the scene on the main monitor
if two monitors:
display the scene on the main monitor; display the UI and the render texture on the secondary monitor (see the sketch below)
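
The core of the copy step might look like this (field names are placeholders; the RenderTexture would then be shown, e.g., by a RawImage on the secondary monitor's UI):

    using UnityEngine;

    public class MainCamToTexture : MonoBehaviour
    {
        [SerializeField] private Camera _mainCam;
        [SerializeField] private RenderTexture _target;

        void OnEnable()
        {
            // While targetTexture is set, the camera renders into the
            // texture instead of directly to the screen.
            _mainCam.targetTexture = _target;
        }

        void OnDisable()
        {
            _mainCam.targetTexture = null;
        }
    }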

design new input system

UML

input controller
- extract sets of keys
-> toggle, trigger, fader
- eg: MelodyKeys class, containing list of MelodyKey (extends on/off-button) objects, consisting of up-event, down-event, isPressed:bool
- eg: intensity-slider (extends fader), consisting of on-event, off-event, isOn:bool, currentValue, on/off-threshold
- streamline key states: up-event, down-event, isPressed

    -> userInputs: melodykeys (down, up, isPressed -> toggle), explosionkeys (trigger), streamkeys (toggle), multiPurposeFaders (fader), volume (toggle, fader), beatcontrol (4-4, 2-4, 8-4, 16-4, 1-4, 1-8) (toggle, fader), effects (stroboscope (toggle), colorInversion (toggle), colorJumper (toggle), fadetoblack (fader), fadetoblur (fader))
    -> musicInputs: volume values, FFT values, beats (4-4, 2-4, 8-4, 16-4, 1-4, 1-8)

notes:
have the userInput/musicInput updater update the inputs, but update music values only if the inputs are on.

userInputUpdater has two implementations: one for MIDI and one for keyboard.

values get updated in a container object ("provider"?) for the musicInputs and userInput objects; other objects hook in here for events (see the sketch below).
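
As a sketch of that streamlined key state (the class name follows the notes above; the rest is an assumption):

    using System;

    public class MelodyKey
    {
        public event Action DownEvent;
        public event Action UpEvent;
        public bool IsPressed { get; private set; }

        // Called by the MIDI or keyboard userInputUpdater implementation.
        public void SetPressed(bool pressed)
        {
            if (pressed == IsPressed) return;
            IsPressed = pressed;
            if (pressed) DownEvent?.Invoke();
            else UpEvent?.Invoke();
        }
    }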

read physical to logical input mapping from file

{
    "midiDeviceInputs": {
        "noteInputs": {
            "channelNr_noteNr": {
                "collection": true,
                "collectionType": "ToggeledUserInputs",
                "elementType": "ToggeledUserInput",
                "targetProperty": "MelodyKeys",
                "targetIndex": 0
            }
        },
        "ccInputs": {
            "channelNumber_ccNumber": {
                "targetProperty": "fourInFour"
            }
        }
    }
}
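
A sketch of loading such a mapping with Json.NET (assuming the Newtonsoft.Json package is available; the C# types simply mirror the sample above and may not match the real schema):

    using System.Collections.Generic;
    using System.IO;
    using Newtonsoft.Json;

    public class InputMappingEntry
    {
        public bool collection;
        public string collectionType;
        public string elementType;
        public string targetProperty;
        public int targetIndex;
    }

    public class MidiDeviceInputs
    {
        // Keys follow the "channelNr_noteNr" / "channelNumber_ccNumber" pattern.
        public Dictionary<string, InputMappingEntry> noteInputs;
        public Dictionary<string, InputMappingEntry> ccInputs;
    }

    public class InputMapping
    {
        public MidiDeviceInputs midiDeviceInputs;
    }

    public static class InputMappingLoader
    {
        public static InputMapping Load(string path) =>
            JsonConvert.DeserializeObject<InputMapping>(File.ReadAllText(path));
    }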
