melanchall / drywetmidi
.NET library to read, write, process MIDI files and to work with MIDI devices
Home Page: https://melanchall.github.io/drywetmidi
License: MIT License
Hey @melanchall,
Thanks for providing this library!
How would you go about getting the total length of a MIDI file?
I've looked at the example you've provided in the README, but this gives you the timing of the last note-off event:
TempoMap tempoMap = midiFile.GetTempoMap();
TimeSpan midiFileDuration = midiFile.GetTimedEvents().LastOrDefault(e => e.Event is NoteOffEvent)?.TimeAs<MetricTimeSpan>(tempoMap) ?? new MetricTimeSpan();
I need the length of the file including any "unused space".
In the scenario above, what I'm looking for is the value 768 ticks (96 ticks * 8 beats), but if I use the provided code, I get 696 (96 ticks * 7.25 beats, the position of the cursor).
I've tried to look through the wiki, but I can't seem to figure it out.
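For what it's worth, a standard MIDI file doesn't actually store trailing "unused space": the file ends at its last event, which is why the snippet reports 696. If the goal is the DAW-style length rounded up to the next bar boundary, that can be computed by hand. A minimal sketch, assuming 96 ticks per quarter note and a 4/4 time signature (in real code these values would come from the file's TimeDivision and time signature events):

```csharp
const int TicksPerQuarterNote = 96;   // assumption; read from the file's TimeDivision
const int BeatsPerBar = 4;            // assumption; read from the time signature
const int TicksPerBar = TicksPerQuarterNote * BeatsPerBar; // 384

long lastEventTime = 696; // tick time of the last note-off from the snippet above

// Round up to the next bar boundary
long paddedLength = (lastEventTime + TicksPerBar - 1) / TicksPerBar * TicksPerBar;
// paddedLength is 768, i.e. 96 ticks * 8 beats
```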
Hello and thanks for making the library!
I have just started working on creating a MIDI sequencer using DryWetMidi, and I'm trying to figure out a way to create a MIDI clock that I can use to synchronize my software with a DAW. What I'm wishing for is something similar to the MidiInternalClock class in the Sanford MIDI library, which offers a way to configure pulses per quarter note, an OnTick event (as well as events for Start, Stop and Continue), and a simple way to read and set the current time in ticks. I haven't really found anything equivalent in DryWetMidi. I'd appreciate any tips or suggestions!
During the Windows App Certification Kit the test failed in the Debug configuration test section
FAILED
Debug configuration
Error Found: The debug configuration test detected the following errors:
El archivo binario Melanchall.DryWetMidi.dll se crea en modo de depuración.
which translates...
The binary file Melanchall.DryWetMidi.dll is built in debug mode
Impact if not fixed: Microsoft Store doesn’t allow a debug version of an app.
How to fix: Please make sure the app isn’t linking to any debug versions of a framework and it is built with release configuration with optimization enabled. If this is a managed app please make sure you have installed the correct version of .NET framework.
Thanks in advance
Hi Melanchall: is it possible to send received MIDI events to multiple output devices (MIDI routing)? If so, could you provide an example or reference, please?
Thanks.
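One way to route incoming events to several outputs is simply to forward each received event inside the EventReceived handler. A sketch, assuming the device names below exist on the machine (the names are placeholders):

```csharp
using Melanchall.DryWetMidi.Devices;

var input = InputDevice.GetByName("My MIDI Keyboard"); // placeholder name
var outputs = new[]
{
    OutputDevice.GetByName("Synth A"), // placeholder name
    OutputDevice.GetByName("Synth B")  // placeholder name
};

input.EventReceived += (sender, e) =>
{
    // Forward every received event to all configured output devices
    foreach (var output in outputs)
        output.SendEvent(e.Event);
};
input.StartEventsListening();
```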
First… amazing library! Thank you, very much.
I want to use drywetmidi to play back MIDI events. It seems that my source event times and lengths are not quite compatible with yours.
For example, my program is generating midi events with the following structure:
struct Event<T>
{
public int RelativeBar { get; set; }
public double Position { get; set; }
public double Length { get; set; }
public T Data { get; set; }
}
RelativeBar: Simply the bar in which an event starts.
Position: The beat on which an event starts. 1 would mean the first beat; 2.5 would mean halfway between the 2nd and 3rd beat.
Length: The duration of the event in beats. 1.33 would mean roughly 1 + 1/3 beats.
You can see that I'm representing the number of beats as a double where the value represents {Beats}.{Cents}. I borrowed this system directly from my favorite DAW, Reaper.
It doesn't seem that BarBeatTimeSpan offers parity with this representation. It has Bars/Beats/Ticks. Beats is a whole number, and Ticks seems to store an additional MidiTimeSpan value that doesn't really have anything to do with bars/beats.
I'm sure I can do the math on my own to convert my fractional beats to your ITimeSpan, but I wanted to ask whether I'm missing something in the library that handles this for me.
Thanks for your help!
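For reference, converting the fractional-beat representation above to MIDI ticks is plain arithmetic once the PPQN (ticks per quarter note) and beats per bar are known. A sketch with assumed values (480 PPQN, 4/4 time):

```csharp
using System;

const int Ppqn = 480;      // assumption; read from the file's time division
const int BeatsPerBar = 4; // assumption; read from the time signature

// Bars and positions are 1-based, matching the struct above
long ToTicks(int relativeBar, double position)
{
    double beatsFromStart = (relativeBar - 1) * BeatsPerBar + (position - 1);
    return (long)Math.Round(beatsFromStart * Ppqn);
}

long start = ToTicks(2, 2.5);                // bar 2, halfway between beats 2 and 3: 2640 ticks
long length = (long)Math.Round(1.33 * Ppqn); // 1.33 beats
```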
Hi, I have an InputDevice that is receiving incoming MIDI messages. However, in the event handler I just have the MidiEventReceivedEventArgs object with its Event property, so how can I detect whether it was a NoteEvent, a ControlChangeEvent or a ProgramChangeEvent?
Thanks!!
BTW: Nice library!!
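Since MidiEventReceivedEventArgs.Event is typed as the base MidiEvent, the concrete type can be recovered with C# pattern matching. A minimal sketch (namespace and type names follow the pre-5.0 API used elsewhere in these issues):

```csharp
using System;
using Melanchall.DryWetMidi.Smf;
using Melanchall.DryWetMidi.Devices;

private void OnEventReceived(object sender, MidiEventReceivedEventArgs e)
{
    switch (e.Event)
    {
        case NoteOnEvent noteOn:
            Console.WriteLine($"Note On: {noteOn.NoteNumber}, velocity {noteOn.Velocity}");
            break;
        case NoteOffEvent noteOff:
            Console.WriteLine($"Note Off: {noteOff.NoteNumber}");
            break;
        case ControlChangeEvent cc:
            Console.WriteLine($"CC {cc.ControlNumber} = {cc.ControlValue}");
            break;
        case ProgramChangeEvent pc:
            Console.WriteLine($"Program Change: {pc.ProgramNumber}");
            break;
    }
}
```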
Consider the following code snippet:
playManager.EventPlayed += CCEventHandler;
private void CCEventHandler(object sender, MidiEventPlayedEventArgs e)
{
var ccEvent = e.Event as ControlChangeEvent;
if (ccEvent != null && ccEvent.ControlNumber == 15 && ccEvent.ControlValue == 0)
{
Console.WriteLine("Event triggered");
playManager.MoveToStart();
}
}
There is a single CC 15,0 event one bar before the end of my song, but when the playback manager reaches the CC event and jumps in time, it causes the playback manager to go into an infinite loop of playing the previous note (as a continuous, extremely loud tone).
I have tried changing the event to other types including SysEx events with the same result. I can't find any documentation nor sample code on the use of the EventPlayed event handler, so I am stumped for the moment.
I started writing a program to translate MIDI files into C source code for a digital music box created by @technoblogy
http://www.technoblogy.com/show?11TQ
I downloaded the midi file for this tune:
https://musicboxmaniacs.com/explore/melody/scarborough-fair_4694/
The notes collection contains all the notes, but the chords collection is missing the last chord of 4 notes.
Sorry to be such a total ignorant time-wasting newbie but how would I use the DryWetMidi package in a Windows Form Application?
I've downloaded the NuGet package and I can see the reference, have added the using Melanchall.DryWetMidi; statements and have added the BuildMoonlightSonata.cs from CodeProject and drywetmidi-examples-createfile.cs from git to my solution but I can't seem to access the methods.
The master build doesn't seem to have any forms (shock! horror!) and I'm all lost without my forms.
Ultimately I'd like to use the MIDI file split functions on a bunch of midi files. So maybe I need to learn WPF?
My apologies for asking such a dumb question.
Thanks
Is there any way to send raw MIDI messages using SendEvent() in the form of a byte[] and bypass the object wrappers? Because sometimes the user needs to be able to send a non-predetermined or ambiguous MIDI command, including ones not defined in the standard MIDI spec.
For example the managed-midi library allows you to do this as such:
outputDevice.Send(new byte [] {0x90, 0x40, 0x70}, 0, 3, 0);
I was hoping that drywetmidi has an equivalent function. Thanks!
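I'm not aware of a raw byte[] overload on OutputDevice.SendEvent, but recent versions of the library ship a BytesToMidiEventConverter in the Tools namespace that may serve as a bridge (worth verifying against the installed version):

```csharp
using Melanchall.DryWetMidi.Tools;

// Convert raw bytes into a MidiEvent, then send it as usual
var converter = new BytesToMidiEventConverter();
var midiEvent = converter.Convert(new byte[] { 0x90, 0x40, 0x70 }); // Note On, note 64, velocity 112
outputDevice.SendEvent(midiEvent);
```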
Hi Melanchall,
Is there a way to send an MMC Start event to a DAW? I want to send Start (Play) to Cakewalk and Studio One. What event class should I use to do that?
Thanks in advance.
I've integrated DWM into my Unity project (2019.1.0f2) for MIDI file playback and parsing. I've called playback.Start() and the MIDI file will not play. I have tried several MIDI files and nothing will play.
// Use the first available OutputDevice (Microsoft GS Wavetable Synth)
var outputDevice = OutputDevice.GetAll().ToArray()[0];
// The MIDI file is copied to this location earlier
var playback = MidiFile.Read(Application.dataPath + @"/Scripts/in.mid").GetPlayback(outputDevice);
Later, in a coroutine (so the MIDI file starts playing at a specific time):
playback.Start();
Playback doesn't want to start, even on the main thread.
Also, playback.Play() works, but it freezes the thread, which I don't want, and it also drops a lot of notes, even on really simple MIDI files.
No exceptions are thrown.
What am I missing? And is there some fix to the dropped notes?
I am having noticeable latency from when a key is pressed on digital piano and when sound is produced. The latency time is constant (around ~100ms if I had to guess).
Output device is the Microsoft GS Wavetable Synth.
The application is running on Unity.
How can I reduce this latency?
device = Melanchall.DryWetMidi.Devices.InputDevice.GetByName(deviceName);
device.EventReceived += OnEventReceived;
device.StartEventsListening();
private void OnEventReceived(object sender, MidiEventReceivedEventArgs e)
{
    // The received event is on e.Event, not on the args object itself
    if (e.Event.EventType == Melanchall.DryWetMidi.Smf.MidiEventType.NoteOn)
    {
        var note = (NoteOnEvent)e.Event;
        Debug.Log(note.Velocity);
        Debug.Log($"{note.GetNoteName()}{note.GetNoteOctave()}");
    }
    if (App.Settings.Midi.outputDevice != null)
    {
        App.Settings.Midi.outputDevice.SendEvent(e.Event);
    }
}
thanks!
As mentioned in #28, I think this should be a requirement for #1, because the current MIDI device code only supports Windows (and there are libraries that already support cross-platform low-level MIDI device access, such as https://github.com/atsushieno/managed-midi).
The rest of the library already works well cross-platform on Unity, and at least builds for .NET Core (I haven't done a lot of testing with it on that runtime yet).
Thanks for this amazing library!
Did you write code to create automatic chords?
I mean, if I want to use your library to make a playback, do I have to enter code for all the sequences (drums, piano, guitar etc.), or do you have a better way to suggest?
Thanks again!
Hi,
Me again, and now with my game, soon to be ready, built using your source <3
Take a look: https://www.youtube.com/watch?v=fqLiBD0p2cI
But i am not here only to show you the game, but to ask about notes.
As you can see in this video, some notes next to each other are drawn at different distances, even though in the MIDI file the distance between them is equal.
I think the way I'm using to get notes is not the best. The song time is not the exact time of a note, so I can't get the note from the MIDI file just by querying the song time; I need to get the next nearest note based on the last note time and the new time within an interval.
Is there a better way to get the exact note using the song time?
I have created a Dictionary to cache the notes for difficulties, keyed by note time. The way I get the note is:
tempNotes.Where(pairs => pairs.Key == actualTime || (pairs.Key > lastTime && pairs.Key < position)).ToList().ForEach(pairs =>
{
    ...
    lastTime = actualTime;
});
Where the key is the note time.
And I run this loop on every frame.
If you know a better way to get and put the note times in my Dictionary, or a better way to get the next note, I'll be glad if you can help.
Thank you!
Edit: This only happens with notes that are very close together; with other notes there are no problems.
Originally I was using this code:
((NotesCollection)noteMan.Notes).ToList()
    .GroupBy(gb => new {
        Bar = gb.TimeAs<MusicalTimeSpan>(tempoMap).Bars,
        Beat = gb.TimeAs<MusicalTimeSpan>(tempoMap).Beats + 1
    })
But now these calls are marked red (compile errors) after the update. Is there a way I can get them back?
Hi,
I'm trying to install using the NuGet instructions, and Powershell gives this error:
Install-Package : A parameter cannot be found that matches parameter name 'Version'.
At line:1 char:44
~~~~~~~~
I'm new to .NET, so maybe I'm missing something obvious.
I am listening to the InputDevice and want to get the note name and velocity from NoteOn and NoteOff events, and also the sustain pedal event (on piano).
I know the EventType, but don't know what to do next:
if (e.EventType == DryWetMidi.Smf.MidiEventType.NoteOn)
{
// -> NoteName ???
// -> velocity ???
}
thank you
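One possible shape for the handler body, using the NoteEvent utility extensions; note that the sustain pedal arrives as a Control Change with controller number 64 (that number comes from the MIDI spec, not from the library):

```csharp
if (e.Event is NoteOnEvent noteOn)
{
    var noteName = noteOn.GetNoteName();   // e.g. NoteName.A
    var octave = noteOn.GetNoteOctave();
    var velocity = noteOn.Velocity;
}
else if (e.Event is ControlChangeEvent cc && cc.ControlNumber == 64)
{
    // Sustain pedal: values >= 64 conventionally mean "pedal down"
    bool pedalDown = cc.ControlValue >= 64;
}
```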
Right, so, still messing around with this library, and I noticed that you are using TotalMicroseconds for MetricTimeSpan. However, that is wrong, as the value is not microseconds but milliseconds. Microseconds are 1000 times smaller than milliseconds, heh.
Hello,
This is not an issue (sorry), but I am wondering how I can access all NoteOn events, and these attributes:
Note-on time (seconds),
Length of note (seconds),
I tried the code below, but it's giving incorrect values.
IEnumerable<Melanchall.DryWetMidi.Smf.Interaction.Note> notes = midi.GetNotes();
var tempoMap = midi.GetTempoMap();
foreach (var note in notes)
{
    // note's duration (seconds)
    var lengthSeconds = note.LengthAs<MetricTimeSpan>(tempoMap).TotalMicroseconds / 1000000f;
    // note's start time (seconds)
    var timeSeconds = note.TimeAs<MetricTimeSpan>(tempoMap).TotalMicroseconds / 1000000f;
}
Thanks!
Sorry if this is a stupid question, but my MIDI file only plays the drum part.
var outputDevice = OutputDevice.GetByName("Microsoft GS Wavetable Synth");
var midiFile = MidiFile.Read(musicitem);
var playback = midiFile.GetPlayback(outputDevice);
playback.Start();
Here is the MIDI file: https://drive.google.com/open?id=1jlFHLMfyCNGQqlFlYv5aNXlcox6EGexd
I want my MIDI file to play all of the tracks/channels.
Hey
I am looking through the documentation but I can't seem to find what I need
I want to split a midi file into windows of 8 bars, with a step size of 1 bar
Any suggestions?
Cheers
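I don't know of a built-in sliding-window splitter, but a sketch of one approach is converting bar boundaries to ticks with TimeConverter and filtering timed events per window (type names follow the pre-5.0 API; BarBeatTimeSpan may be named differently in your version):

```csharp
using System.Linq;
using Melanchall.DryWetMidi.Smf.Interaction;

var tempoMap = midiFile.GetTempoMap();
var allEvents = midiFile.GetTimedEvents().ToList();
long lastTime = allEvents.Last().Time;

for (int startBar = 0; ; startBar++) // 1-bar step
{
    long windowStart = TimeConverter.ConvertFrom(new BarBeatTimeSpan(startBar, 0), tempoMap);
    if (windowStart > lastTime)
        break;

    // 8-bar window
    long windowEnd = TimeConverter.ConvertFrom(new BarBeatTimeSpan(startBar + 8, 0), tempoMap);
    var window = allEvents.Where(ev => ev.Time >= windowStart && ev.Time < windowEnd);
    // ... build and save a MidiFile from the window's events here
}
```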
Hi!
Is there a way to set the MIDI instrument to the Track?
I would like to append one midi file to another.
I did the following:
I opened the first file (multiple tracks)
I opened the second file (multiple tracks) - and changed the beginning of all tracks to the end of the first midi file.
I added then all tracks of file2 to file1.
That seems not to be sufficient. Do I have to recalculate all delta-times of file2 using the ticks-per-quarter-note value of file1?
Thanks for your support!
Regards
Thomas
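For what it's worth: yes, if the two files' time divisions differ, file2's delta-times are in different units and must be rescaled before the chunks are merged. A sketch, assuming both files use ticks-per-quarter-note time division:

```csharp
using System;
using Melanchall.DryWetMidi.Smf;

var ppq1 = ((TicksPerQuarterNoteTimeDivision)file1.TimeDivision).TicksPerQuarterNote;
var ppq2 = ((TicksPerQuarterNoteTimeDivision)file2.TimeDivision).TicksPerQuarterNote;
double scale = (double)ppq1 / ppq2;

// Rescale every delta-time of file2 into file1's tick units
foreach (var trackChunk in file2.GetTrackChunks())
    foreach (var midiEvent in trackChunk.Events)
        midiEvent.DeltaTime = (long)Math.Round(midiEvent.DeltaTime * scale);
```

Rounding each delta independently can drift slightly over a long file; converting to absolute times, scaling those, and converting back avoids the accumulation.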
I would like to send MIDI events to the Recording without using the InputDevice.
Perhaps if we change OnEventReceived to be public, then other objects could subscribe to it?
Something like this:
customEventGenerator.ProcessMidiEvent += recording.OnEventReceived
And the InputDevice could be optional in the Recording class?
Hi!
Any plans for making this library dotnet core-compatible?
Kind regards,
Nico Witteman
How can I convert a format-0 chunk into several chunks, split by MIDI channel?
I created this code, but I assume this can be done in a much better way:
// Create an array with 16 entries
var timedEvents = FourBitNumber.Values.Select(channel => new List<ChannelEvent>()).ToArray();
foreach (TrackChunk track in gMidiFile.GetTrackChunks())
{
long lTime = 0;
foreach (MidiEvent ev in track.Events)
{
lTime += ev.DeltaTime;
if (ev is ChannelEvent)
{
ev.DeltaTime = lTime; // Use absolute times to split
timedEvents[((ChannelEvent)ev).Channel].Add((ChannelEvent)ev);
}
}
track.Events.RemoveAll(ev => ev is ChannelEvent);
}
TrackChunk[] tcarray = new TrackChunk[16];
int index = 0;
for (int i = 0; i < 16; i++)
{
//---------------------------------------------
//Change the absolute times back to delta-time
//---------------------------------------------
ChannelEvent[] cev = (ChannelEvent[]) timedEvents[i].ToArray();
if (cev.Length == 0)
continue;
//------------------------------------------
// calculate deltas
//------------------------------------------
long[] deltas = new long[cev.Length];
deltas[0] = cev[0].DeltaTime;
for (int j = 1; j < cev.Length; j++)
deltas[j] = cev[j].DeltaTime - cev[j - 1].DeltaTime;
//------------------------------------------
// copy deltas to events
//------------------------------------------
for (int k = 0; k < cev.Length; k++)
cev[k].DeltaTime = deltas[k];
// Create a new TrackChunk with the events
tcarray[index] = new TrackChunk(timedEvents[i]);
index++;
}
gMidiFile.Chunks.AddRange(tcarray);
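For comparison, recent versions of the library include a Splitter tool whose SplitByChannel extension may replace most of the hand-rolled code above (verify it exists in the installed version):

```csharp
using System.Collections.Generic;
using Melanchall.DryWetMidi.Smf;
using Melanchall.DryWetMidi.Tools;

// Each resulting MidiFile contains the events of exactly one MIDI channel
IEnumerable<MidiFile> filesByChannel = midiFile.SplitByChannel();
```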
Hi @melanchall ! Great lib, trying to use it in my project. I need a visual MIDI "player" and found that GetTimedEvents works quite well for this purpose. However, one piece of the puzzle is missing: I need to know the note length for every NoteOn. This seems to only be available for objects generated from Smf.Interaction, but not for events loaded with MidiFile. Is that correct? If so, I think we should try to add this functionality. Otherwise, please tell me how to do this; maybe it's very easy and I just didn't figure it out.
How can I mute/unmute midi-channels, while the Playback is running?
From discussion in #1:
Hi Max,
I switched to framework, and I think the library is very useful. Now I have this issue:
In a MIDI file bwv227.zip
there is a track named "Ténor". In SequenceTrackNameEvent.Text it is represented as "T?nor". That is a pity, because I want to write the tracks away as files with their trackname. This one fails. Any suggestions?
Nico
Hi Nico,
It is because the Text of any text-based meta event is processed in ASCII encoding. Obviously the é symbol doesn't belong to this encoding. I'm planning to add support for other encodings later. Maybe as a first iteration I will introduce a kind of TextEncoding property in ReadingSettings/WritingSettings so you will be able to specify the desired encoding for string serialization/deserialization. Yes, sounds like a plan.
Thank you for your feedback and for reminding me about non-ASCII encodings in text-based meta events. I'll try to provide a solution in the next release, which should be at the beginning of August.
Max
Thanks Max, I will await the update.
Nico
I see that in version 3.0.0 there are a lot of changes.
Are you going to update the Wiki, or should I try to learn it by myself?
I use DryWetMidi to develop a site for building playbacks.
I need every track to be a different instrument, and I build the tracks using PatternBuilder.
If it is possible to add a ProgramChange function to PatternBuilder, I think it would be useful, and not just for me...
I am wondering how I can adjust the tempo of a MIDI file by a percentage value.
For example - make it 20% faster. Thanks!
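One way to approach this is to scale every SetTempoEvent in the file: the event stores microseconds per quarter note, so playing 20% faster means dividing that value by 1.2. A sketch (if the file contains no SetTempoEvent at all, one would have to be inserted, since the 120 BPM default is only implicit):

```csharp
using System.Linq;
using Melanchall.DryWetMidi.Smf;

const double SpeedFactor = 1.2; // 20% faster

foreach (var trackChunk in midiFile.GetTrackChunks())
{
    foreach (var setTempo in trackChunk.Events.OfType<SetTempoEvent>())
    {
        // Fewer microseconds per quarter note = faster playback
        setTempo.MicrosecondsPerQuarterNote =
            (long)(setTempo.MicrosecondsPerQuarterNote / SpeedFactor);
    }
}
```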
Hi,
Sorry, this is not an issue, but I want to use this library in a Unity project and I don't know how. Do you know if it is possible, and how to use it?
I found other MIDI readers, but what I want is only a MIDI file reader; I don't want to play anything.
Thanks!
So right now, if you need to handle multiple MIDI files of various text encodings at runtime, you are somewhat out of luck and would need to continuously re-parse the whole MIDI file until the correct encoding is found.
Going from this:
public string Text { get; set; }
protected sealed override void ReadContent(MidiReader reader, ReadingSettings settings, int size)
{
ThrowIfArgument.IsNegative(nameof(size), size,
"Text event cannot be read since the size is negative number.");
if (size == 0)
return;
textData = reader.ReadBytes(size);
var encoding = settings.TextEncoding ?? SmfUtilities.DefaultEncoding;
Text = encoding.GetString(textData);
}
to something like this (this isn't 100% working code, just something to give an idea of what I'm saying here):
protected byte[] _textData = null;
protected string _textEncoded;
public string Text
{
get
{
if (_textData != null)
{
var encoding = settings.TextEncoding ?? SmfUtilities.DefaultEncoding;
_textEncoded = encoding.GetString(_textData);
return _textEncoded;
}
else
{
return _textEncoded;
}
}
set => _textEncoded = value;
}
protected sealed override void ReadContent(MidiReader reader, ReadingSettings settings, int size)
{
ThrowIfArgument.IsNegative(nameof(size), size, "Text event cannot be read since the size is negative number.");
if (size == 0)
return;
_textData = reader.ReadBytes(size);
}
It would need some other api changes perhaps with how settings and/or encodings are handled.
Hi melanchall,
I would like to know if drywetmidi is compatible with Mono. I need to know this because I want to compile my .NET application (which uses drywetmidi) with Mono in order to run on macOS and Linux.
Looking forward to your kind response.
Eduardo.
Would it be possible to implement event handling for changes in the event list of a chunk?
I think this would be necessary for using the lists in a DataGridView with data binding.
Regards
Thomas
Code to reproduce:
PatternBuilder patternBuilder = new PatternBuilder();
patternBuilder.SetProgram(GeneralMidiProgram.Accordion);
patternBuilder.Note(NoteName.C);
MidiFile midiFile = new MidiFile(new[]
{
patternBuilder.Build().ToTrackChunk(TempoMap.Default, (FourBitNumber)1)
});
Expected Result: Program index will be changed on channel 1 and note will play on channel 1.
Result: Program index has been changed on channel 0 and note plays on channel 1.
Thank you for any attention brought to this issue. My apologies if this is the result of a misunderstanding of DryWetMIDI on my part.
Does this library support changing volume of notes you want to add to Trackchunk. If it does: How to implement it? If it doesn't: Are there any workarounds?
Thanks in advance
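If "volume" here means per-note loudness, that's the note's velocity (0-127), which can be set when building notes or edited afterwards with a notes manager. A sketch using the pre-5.0 namespace names:

```csharp
using Melanchall.DryWetMidi.Common;
using Melanchall.DryWetMidi.Smf.Interaction;

using (var notesManager = trackChunk.ManageNotes())
{
    foreach (var note in notesManager.Notes)
        note.Velocity = (SevenBitNumber)100; // 0-127
} // changes are saved back to the chunk when the manager is disposed
```

Channel-wide volume is a different mechanism: a Control Change event with controller number 7 (from the MIDI spec).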
Hello,
I am using Logic Pro X 10.0.0 to create piano midi files.
Each of these files has a single tempo, and it is found in the last midi event (as a tempo-change event).
I used these midi files in another program, and the playback was fine.
However, Drywetmidi does not seem to correctly parse them and always uses the default (120bpm) tempo.
For a quick fix (untested) I modified the CollectTempoChanges() method in TempoMapManager.cs:
private void CollectTempoChanges()
{
int count = 0;
var lastEvent = _timedEventsManagers.SelectMany(m => m.Events).Last();
foreach (var tempoEvent in GetTimedEvents(IsTempoEvent))
{
count += 1;
var setTempoEvent = tempoEvent.Event as SetTempoEvent;
if (setTempoEvent == null)
continue;
// if:
// - the last event in the midi file is the tempo event, and
// - there is only one tempo event,
// then make the time of the tempo event = 0
// this is so if the last event is the tempo event
// we capture that tempo for the entire file
// (a fix for Logic Pro X)
bool oneTempoChangeAndIsLastEvent = count == 1 && lastEvent == tempoEvent;
long time = oneTempoChangeAndIsLastEvent ? 0 : tempoEvent.Time;
TempoMap.Tempo.SetValue(time, new Tempo(setTempoEvent.MicrosecondsPerQuarterNote));
}
}
I also attached an example midi file, which was exported from Logic Pro X 10.0.0:
prelude_124bpm.mid.zip
It seems possible to get the musical notes from a MusicTheory.Chord object by using Parse.
But how can I get the musical name of an Smf.Interaction.Chord object?
(Musical name, e.g. "Cmaj")
Thanks and best regards
Thomas
If a chunk has, as its first entry, something other than a channel event (e.g. a track name), then the Playback functionality does not use the shifted time.
I think the first entry should not be the one used for the shift; maybe the first channel event should be shifted instead.
Also, shouldn't removeAt or remove of the EventsCollection update the delta-time of the following event?
Regards
Thomas
Instead sending midi events to an external 'output device', I would like to play local .aiff files. Is this possible?
For example, this Unity project uses NAudio to play aiff samples: https://github.com/catdevpete/Unity-MIDI-Piano/tree/master/Assets
thanks
So, I was wondering if it's possible to play a MIDI file and not output it to a MIDI device, but instead execute code whenever a MIDI event happens (such as NoteOn).
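I believe a playback object can be created without an output device, and its EventPlayed event fires as each event's time is reached, which may be exactly this use case (type names follow the pre-5.0 API used elsewhere in these issues; verify against your version):

```csharp
using System;
using Melanchall.DryWetMidi.Smf;
using Melanchall.DryWetMidi.Smf.Interaction;
using Melanchall.DryWetMidi.Devices;

var midiFile = MidiFile.Read("song.mid"); // placeholder path

// No output device: nothing is sent to hardware, only EventPlayed is raised
var playback = midiFile.GetPlayback();
playback.EventPlayed += (sender, e) =>
{
    if (e.Event is NoteOnEvent noteOn)
        Console.WriteLine($"Note On reached at playback time: {noteOn.NoteNumber}");
};
playback.Start();
```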
I've found that drywetmidi can be used to read & write MIDI files, but I have to save the file before I can check the musical change.
Is there a way to play a fragment in memory?
If not, I suggest adding it.
Do you have timeline for the next nuget release? I'm looking for netstandard support.
@melanchall I decided to open a separate bug for this since it's not really related to #8. I noticed that with the same test midi file I provided there it takes ~17 sec on my i7 laptop's core to extract each TimedObject's absolute time:
MetricTimeSpan t = event.TimeAs<MetricTimeSpan>(tmap)
and for Notes also:
MetricTimeSpan l = note.LengthAs<MetricTimeSpan>(tmap)
So it's rather slow... a few times slower than my older implementation. Maybe some optimization can be done around recurrent TimeAs / LengthAs calls; some unnecessarily repeated operations could be skipped?
test code:
using System;
using System.Collections.Generic;
using System.Diagnostics;
using System.IO;
using Melanchall.DryWetMidi.Smf;
using Melanchall.DryWetMidi.Smf.Interaction;
namespace test
{
class MainClass
{
public static void Main()
{
Stream stream = new FileStream("/home/dsutyagin/Downloads/test.mid", FileMode.Open, FileAccess.Read);
MidiFile midiFile = MidiFile.Read(stream);
Stopwatch stopwatch = Stopwatch.StartNew();
IEnumerable<ITimedObject> timedObjectsAndNotes = midiFile.GetTimedEvents().ExtractNotes();
Console.WriteLine(stopwatch.ElapsedMilliseconds);
stopwatch.Restart();
TempoMap tmap = midiFile.GetTempoMap();
Console.WriteLine(stopwatch.ElapsedMilliseconds);
stopwatch.Restart();
int i = 0;
foreach (ITimedObject x in timedObjectsAndNotes)
{
// comment me N1 - saves 6 sec
MetricTimeSpan t = x.TimeAs<MetricTimeSpan>(tmap);
i++;
Note note = x as Note;
if (note != null)
{
i++;
//comment me N2 - saves 10 sec
MetricTimeSpan l = note.LengthAs<MetricTimeSpan>(tmap);
}
}
Console.WriteLine(stopwatch.ElapsedMilliseconds);
}
}
}
sample output:
70
146
16918
Press any key to continue...