MIDI Sync-Up - Calling All Innovators


Wingnut

Hi gang.

    Question:  Has anyone thought about syncing MIDI to scene events?  MIDI sequences allow insertion of SYSEX messages...  and often these SYSEX messages (in among the note on/off data)... contain something called MMC... MIDI Machine Control... used to automatically turn knobs/adjust settings... on certain MIDI playback gear (such as soundcards and MIDI synthesizers).

But we demented experimenters... would LOVE to access those sysex messages... and make our 3D mesh do things... at that playback time.  We (I) want to make my mesh... dance to the beat of the midi song.  If I insert sysex messages into my midi song, and IF I can "see" those events happen with a scene observer... that would just TOTALLY ROCK!  (literally!)
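For the curious, spotting one of those sysex messages in a raw byte stream might look roughly like this (a sketch: the function name and the "type-BJS" tagging idea are invented, not part of the MIDI spec; 0x7D is the real "non-commercial/educational" manufacturer ID, a sane tag for private show-control data):

```javascript
// Sketch only: extractSysExPayload is an invented helper, not a standard API.
// A SysEx message starts with 0xF0 and ends with 0xF7; in between sit the
// manufacturer ID byte(s) and the payload.
function extractSysExPayload(bytes) {
  if (bytes.length < 2 || bytes[0] !== 0xF0 || bytes[bytes.length - 1] !== 0xF7) {
    return null; // not a complete SysEx message
  }
  return bytes.slice(1, -1); // [manufacturerId, ...showControlData]
}

// 0x7D is the non-commercial manufacturer ID, so a scene observer could
// safely dispatch on the bytes that follow it:
const payload = extractSysExPayload([0xF0, 0x7D, 0x01, 0x42, 0xF7]);
// payload is [0x7D, 0x01, 0x42]
```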

I know @davrous is a musician/composer, and he knows some seriously-advanced stuff about JS audio.  I hope I can "entice" him to come aboard this cause.

Also, I don't want to use code to play the song, noteON-by-noteOFF.  Ideally, I want to insert an HTML <audio> element into the dom tree, and if the song contains sysex messages of type-BJS, I want them to activate sysexMessage observers in the scene.

Would THAT be cool, or what?  You ain't NEVER seen "Mesh'n'Light Shows" like the ones Mr. Wingnut could produce... if that "choreography system" was operational!  YUM!  UberYUM!

Ok girls... can it be done?  Is there a wall to cross?  A river to ford?  Do we have the bridge-making gear?  Thoughts welcome... any thought from any one.  :)  Thx!

(Again, let's NOT make scene code play (poke) each note, step-by-step.  That type of system will not stay in-tempo.  Let's avoid that idea.)

Link to comment
Share on other sites

LOL!  What a great helper you are!  (Wingy hands you a broom)  Here... keep the thread clean.  :D  And don't molest the girlies. 

But yeah... asking JS to reach-across into... soundcard-land.  Or... into midi-port land.  hmm.  Or asking soundcard-land to yell at JS when you see a JS-targeted SYSEX message roll-thru.

But this is important.  This opens up... webGL movies.... all based upon LONG midi file.  Could be a 24 hour song... if your webGL story takes that long to present.  FUN!

Some events... could be told to start .wav/mp4/whatever files... too.  Want your mesh talking to each other?  No problems.


Yea, could be cool. One starting direction might be:

1. review: each of the (many) MIDI.js libs (license, documentation) for a JS event-dispatch solution...

         - top of Google: https://galactic.ink/midi-js/

2. proof example of JS-MIDI events: a test case eventing MIDI...

3. example-bjs: link it to the BABYLON.Mesh, Observer...

Test: Multi-colored effusive rgb disco-ball. Or, 70's dance floor... Spherical Harmonics? Idk, Digress...
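Step 3 might be sketched with a plain observable (BABYLON.Observable has the same add/notifyObservers shape; this stand-in and all its names are just illustration):

```javascript
// Minimal stand-in for BABYLON.Observable, to show the wiring only.
class MidiObservable {
  constructor() { this.observers = []; }
  add(fn) { this.observers.push(fn); }
  notifyObservers(ev) { for (const fn of this.observers) fn(ev); }
}

const onMidiEvent = new MidiObservable();

// Scene side: pulse the disco ball whenever a note-on arrives.
let pulses = 0;
onMidiEvent.add((ev) => {
  if (ev.type === "noteOn") pulses++; // e.g. mesh.scaling.scaleInPlace(1.1)
});

// MIDI side: the message handler from step 2 would call this per event.
onMidiEvent.notifyObservers({ type: "noteOn", note: 60 });
```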

Similarly, it's possible to use the Web Audio API for oscillators, gain, compression, reverb, delay, filter, pan, etc... at the same time (AudioContext).

...


15 minutes ago, enwolveren said:

Similarly, it's possible to use the Web Audio API for oscillators, gain, compression, reverb, delay, filter, pan, etc... at the same time (AudioContext).

Realtime song gen, right?  Can't do it.  Outlawed in the first post.  :)  Song won't stay in tempo... latency problem due to morphing spherical harmonic that Elderwolfy put into the scene.  :D

Gotta get sysex messages from midi file... real time, while being played.  No cheating.  :)  The midi port or soundcard... is the boss... sets the pace.... stuff like that.  If song tempo-slips at all, we're all gonna be arrested for interference with hardware.


Hi D.  Yeah, thx.  Actually, hmm... in MS land... I think there is a midi-mapper thing.  The only part I would be interested-in... is "intercepting" the midi OUT data.  I have no need to trigger mesh with midi-IN device, but others might, someday.

Users are allowed to select which "player" handles their .mid mimetype, I guess.  Do all these players... use a "mapper" of some kind?  (asking anyone). 

Wouldn't it be great if the JS object... could interface to these mappers... AS IF the JS object were a midi hardware device?  Perhaps read-only, from the mapper?  hmm.


How about the Web MIDI API (https://webaudio.github.io/web-midi-api/)?  It sounds like it's what we are looking for.

EDIT: Here is a neat tutorial with a demo included (https://www.toptal.com/web/creating-browser-based-audio-applications-controlled-by-midi-hardware).  It looks like it's more about playing MIDI via the web, but the W3C spec also hints at events from the MIDI device about the notes being played (https://webaudio.github.io/web-midi-api/#MIDIMessageEvent).
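To make that concrete, here is a minimal sketch of the Web MIDI route from the spec (the classify helper and handler names are mine; note that Web MIDI inputs are MIDI ports, hardware or virtual, not the browser's own .mid player):

```javascript
// Invented helper: classify the raw bytes of a MIDIMessageEvent.
function classifyMidiMessage(data) {
  const status = data[0];
  if (status === 0xF0) return "sysex";           // System Exclusive
  const kind = status & 0xF0;                    // high nibble = message type
  if (kind === 0x90) return data[2] > 0 ? "noteOn" : "noteOff"; // vel 0 = off
  if (kind === 0x80) return "noteOff";
  return "other";
}

// Browser wiring (needs user permission; SysEx needs an explicit opt-in):
if (typeof navigator !== "undefined" && navigator.requestMIDIAccess) {
  navigator.requestMIDIAccess({ sysex: true }).then((access) => {
    for (const input of access.inputs.values()) {
      input.onmidimessage = (e) => {
        if (classifyMidiMessage(e.data) === "sysex") {
          // react in the scene here
        }
      };
    }
  });
}
```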


Thanks guys... these look very interesting.  At least some folks are thinking about it, eh?

Unfortunately, there is no mention of MIDI in the html5 spec.  Not that there SHOULD be... but... I think the audio element is not being considered AT ALL... for playing midi files.  That's sad.

This probably means that only <embed> and <object> can play midi, and they will hand-it-off to system... guided by the .mid mimetype to find a player.

Compared to an .mp4 or .wav source for an <audio> element... umm... I don't know how it compares.  The <audio> element seems to have quite a few limitations for the allowed file/mime types, (.mid apparently isn't one of them).  <audio> probably uses a player built-into the browser, or something like that. 

"Stream-audio" (samples) like .wav, aiff, mp3, etc... don't need access to the wavetables on the soundcard/on-board audio.  Only midi needs the wavetable... to get its instrument sounds.  hmm.

Thanks for the responses and research, guys.  Reading reading reading.  :)


I've thought about this as well.  I'm also a musician; I've performed over 100 shows in my life and used to teach lead guitar.  What annoys me is having animations that don't sync up to music.  I know it's easier to sync MIDI because of the sysex messages, but I'd love to be able to sync MP3s according to certain frequencies (most likely bass).
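A rough sketch of that MP3 idea with the Web Audio AnalyserNode (the function names, band width, and threshold are all guesses to tune by ear):

```javascript
// Invented helper: average the lowest frequency bins, roughly the bass band.
function bassEnergy(freqData, bins = 8) {
  let sum = 0;
  for (let i = 0; i < bins && i < freqData.length; i++) sum += freqData[i];
  return sum / bins; // byte frequency data, so 0..255
}

// Browser wiring (assumes an <audio> element already playing the MP3):
function watchBass(audioEl, onKick, threshold = 200) {
  const ctx = new AudioContext();
  const src = ctx.createMediaElementSource(audioEl);
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 256; // 128 frequency bins
  src.connect(analyser);
  analyser.connect(ctx.destination); // keep the song audible
  const data = new Uint8Array(analyser.frequencyBinCount);
  (function tick() {
    analyser.getByteFrequencyData(data);
    if (bassEnergy(data) > threshold) onKick(); // e.g. flash the lights
    requestAnimationFrame(tick);
  })();
}
```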


Cool.  Yeah, syncing the sampled-type audio would definitely be nice, too.  I bet there's more enthusiasm for THAT... than there is for listening-for-sysex from a midi player.

People sort-of forgot about midi... but it won't die.  The Chuck E. Cheese band will ALWAYS need it.  :)

That thing that JcPalmer pointed-at... that was weird.  They send the midi file into a converter (pre-processing it into another file type), a converter available live online, and it converts to something... that looked like "blocks" of samplings.  Then they feed that into a JS synth, which has no wavetables, or maybe software-based wavetables with a half-hour load-time.  And the embedded sysex messages could be lost along the way.

"Faked" midi playback, as Davrous puts it.  *nod*

Can you imagine how wonderful a midi timeline/eventScheduler would be?  Playground movies!  2 hour run times... an entire war saga... red mesh vs blue mesh...  big orchestration... tunt... tuh tuh tuh tunt... marching into battle...tymps... swell... swell... SWELL...  CRESCENDO!!!  Cannon fire.  :)

A six-stringer eh?  You probably play bass just fine, too.  Likely some keys, too.  Very cool.

Once upon a time, I decided I would learn everything I could... about my Alesis s4 rack synth (a QuadraSynth without a keyboard).  The s4 is a wonderful sound module, and by sending the right sysex at the right time (from a Cakewalk midi)... I learned to make that S4 SING!  :) After some of my demented experiments were done, I could hear that s4 panting and wheezing.  :)  Check this mess out...  http://webpages.charter.net/wingthing/html/s4/s4sysex_p08.txt

What kind of idiot... would make such a... ahh nevermind.  We music tards are a little "off"... you know well.  :D

Yeah, keeping motion in-sync with music... most people still HAVE TO render to video.  What fun is that?  The gamepad doesn't work during play!  heh


  • 2 months later...

There is no technical problem with synchronising scene events and sound. You don't need the MIDI API for this task at all.

Look to http://tinyurl.com/yasvy3t9
More examples here
https://surikov.github.io/riffshare/tools.html

It sounds fine, it looks nice.
It uses WebAudioFont (see https://www.npmjs.com/package/webaudiofont)

But you should be a musician to synchronise music pieces and actions in a game. That is an absolutely different task.

In ordinary games, you bind sound effects to scene events.
In your case it's the opposite: you should bind game events to musical events.


Feel free to ask for help with any problem in this area.

P.S.
I don't support SSSynthesiser anymore. Look at WebAudioFont. It has about 1000 instruments and drums, an equalizer, a reverberator, etc.

 



Hey, thanks for the reply and great examples.

Still, this feels like some kind of work-around.  The scene code would need to "make" the calls to play the notes, right?

(perhaps I don't understand real well - sorry)

Ideally, in my fantasy, midi is playing in a classic windows media player "embed" (object or audio element)... and whatever midi player is being used, it can be told to stream all noteON, noteOFF, and sysex events... via canvas.addEventListener("midiEvent", onMidiEvent).

Wouldn't that be the cat's meow?  (great)

Seeing that I sequenced (created) the midi song, I would be the one inserting sysex/mmc/showControl messages in-between downbeats (I am a musician - rumor has it)... and inside my onMidiEvent()... if event.type = "showControl"... ahem ahem ahem.  :)

You could do entire "movies" made from a 2 hour midi song...  packed with music and show-control events, using the wavetables built-into the sound cards.  The midi players are all running at hardware speeds, and they would not bog our BJS scenes at all.  They are the boss.  They set the pace.  Our scene does nothing at all about playing notes.  It just listens.

THAT is what I would truly wish-for.  I don't want to play any music with JS scene code.  Let the standard midi players - WMP, winAmp, bsPlayer, whatever... do that for me, and they ONLY send us midi events.  We're not talking waves/samples here at all.  Just midi.

Sorry.  I hope I didn't pee on the campfire.  I just don't like having to play notes WITH scene code... if that is indeed what your demos are doing.  I prefer that the hardware-level midi players... just stream their stuff at our scene... so we can try to sync-up.

Actually not necessarily "hardware-level", but "system-level"... stuff written in faster code... whatever Windows Media Player is coded-with.  :)

In my experience with Windows and webcams, I found that I needed to go through an "activeX" thing... to get my camera to appear on a local webpage.  I have a feeling that the same thing is needed... to "reach across Security Gulch" and connect to system midi players.


Hi guys, thanks for the replies.  Apparently I am not explaining well.

@sssurikov ...

<script src="https://surikov.github.io/webaudiofont/npm/dist/WebAudioFontPlayer.js"></script>

<script src="https://surikov.github.io/webaudiofontdata/sound/0291_LesPaul_sf2.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/0280_LesPaul_sf2.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/0250_Chaos_sf2_file.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/0170_JCLive_sf2_file.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/0000_Chaos_sf2_file.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/0330_SoundBlasterOld_sf2.js"></script>
<!--<script src="https://surikov.github.io/webaudiofontdata/sound/0340_Aspirin_sf2_file.js"></script>-->
<script src="https://surikov.github.io/webaudiofontdata/sound/0390_GeneralUserGS_sf2_file.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/0480_GeneralUserGS_sf2_file.js"></script>

<script src="https://surikov.github.io/webaudiofontdata/sound/12835_0_Chaos_sf2_file.js"></script>
<!--<script src="https://surikov.github.io/webaudiofontdata/sound/12840_26_JCLive_sf2_file.js"></script>-->
<script src="https://surikov.github.io/webaudiofontdata/sound/12838_22_FluidR3_GM_sf2_file.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/12841_26_JCLive_sf2_file.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/12842_26_JCLive_sf2_file.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/12845_26_JCLive_sf2_file.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/12846_26_JCLive_sf2_file.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/12849_26_JCLive_sf2_file.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/12851_26_JCLive_sf2_file.js"></script>
	

Scene code... using a JS-based midi player, IF it plays midi at all (as opposed to a json conversion of a midi file).  I don't want to load or use a JS-based player.  I don't want JS having ANYTHING to do with creating the sounds/notes.  Obviously, the webaudiofont system is packed with JS music-playing-scripts, notes, etc.  This might be the best we have, currently, and I appreciate you showing it to me, and your coding work.  But, it's not what is sought... and it is much more complicated than what I described.

Same for you @Raggar.  view-source:http://playground.vormplus.be/webgl-music-video/js/music-video.js

Look at all the music-playing stuff in that file.  FAR more than canvas.addEventListener("midiEvent", onMidiEvent);  Again, I don't want JS playing midi, not even with an external player.js, and I don't want to pre-process midi to json.  These are both "workarounds"... and subject to JS garbage-collection delays, right?

Ideally, I just need Windows Media Player (or a similar NON-JS, non-browser-based midi player)... to send each midi event to the browser event system.  Each event sent from these SYSTEM midi players... is just like an onKeyDown.  In the eventListener, we check whether it's Ctrl, Shift, Alt, etc., and check which key was pressed.

Why not same for events arriving from midi players... (if the Gods of security ever allow it).  Sure, you're free to use JS-based midi players, but I prefer fast'n'sure-footed SYSTEM midi players that aren't subject-to the GC delays of the browser.  I need FULL CPU power in JS... to react-to midi events, not create them.  Besides, how useless is a convert-to-json thing?  Pre-process?  yech.

I'd rather easily change to a different url in my <object> or <audio> element, and run my JS code again.  Even if automated, JS needs to be used to pre-process midi file to json.  Not good for me.   I prefer no JS involvement in music making... not even external js files.

Let's pretend I'm listening to events on midi channel 10... almost always the drum track.  Just change the DOS-based URL, and the mimetype .mid is mapped to a system midi player, and IT... sends every midi event... to the browser event system. (as an option?)
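For reference, filtering for channel 10 from raw MIDI bytes is tiny (a sketch; the function name is mine): the low nibble of a channel-voice status byte is the 0-based channel, so "channel 10" in musician-speak is nibble 9.

```javascript
// Invented helper: does this raw MIDI message belong to channel 10 (drums)?
function isDrumChannelEvent(data) {
  const status = data[0];
  if (status >= 0xF0) return false; // system messages (incl. SysEx) have no channel
  return (status & 0x0F) === 9;     // low nibble 9 = MIDI channel 10 (1-based)
}
```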

Easy as pie.  Piece of cake(walk).  Remember a Windows thing called "midi mapper"?  I think it went obsolete, but... that sounded like something that could have been used for this.

I think this is something that would need to be enabled by computer owner/admin, not by default.  Midi players would need an option to "Send midi data to browser?" and browsers would have to allow input from midi mapper (just like they allow input from computer keyboards).  Sigh. 

SO close, yet SO FAR across Security Gulch.  :)  It's almost as if browsers suddenly decided to HATE midi, and OS's sand-boxed midi streams... from being piped-around.  Why?  (Wingnut starts crying)  Thanks for the replies, guys.  I think this is going to take a major attitude change at W3C.  But once they see what I can do with this added power.... they'll be wondering WHY they took so long to allow this piping... to us webGL artists.


  • 3 months later...

Hi gang.  Well, I've been out searching for reasons WHY a browser is not allowed to "hear" MIDI SysEx messages coming from an OS-played .mid song.  I have failed.

Then I went searching for somebody who understood Windows/OS's enough... to tell me WHY browsers disallow addEventListener(sysexIn, onSysEx).  And, WHY hardware/OS midi players such as WinAmp and Windows Media Player... disallow sending sysEx/noteOn/noteOff/whatever messages... TO a place where a browser COULD listen to them.

So, I'm still here... confounded, dumbfounded, and likely unfounded.  :)

I REALLY need this done.

<object> and <param> elements (formerly <embed>) can launch OS midi players, playing pre-made midi songs WITH silent SYSEX messages inside them.  I NEED the browser... to allow eventListening of those messages.

Whether or not I understand how computers work... is not important.  That's why I am asking for help, or maybe more-correctly, I am asking for help convincing the W3C that this is a wise thing to do... for choreographed "dancing-to-the-beat" scenes.  This HAS TO happen.  It is really quite beyond an "imperative".  It's a giant DUH! 

SysEx used in a 6-hour-long "empty" midi song... can be used as a time-based "event sequencer" for a 6-hour webGL "movie"... as wanted, too. (mega-cool)   It can also trigger sampled sound-effects at the perfect times... during the webGL movie.

We "show-control" mesh and lighting choreographers... will EXPLODE the world of webGL... once "we" open that sysex-to-browser door. 

No, we're not talking about playing midi songs with JS.  Don't let your mind go there.  We want to listen-to OS players... which have been told to "pass-along" their midi events... to listening browsers.  It's really no different than listening for a mouse, right?  Right.

So, if ANYONE has power... within W3C, within MS, within hardware folk, within ANYTHING that can help get this done... we HAVE TO try.  I have to try. 

I'm willing to accept external JS to get it patched-in, but I don't know if WMP, WinAmp, RealPlayer, etc... will ever allow midi-event "passing-along-to-browser".  (Is that called piping?  Streaming?  Whatever it's called.)

Can anyone help get this done?  I even have some money to apply... if needed.  Let's lift this blockade... soon, if possible... and even if NOT possible.  It's just a stupid wall.  Let's knock it down.  Thx. 

Also, here, once again, is @gryff's cool Christmas scene.  It's niiiiiice.  Thx Gryff!  'o' toggles music, 's' and 'f' adjust walk/fly speed.


Quote

to tell me WHY browsers disallow addEventListener(sysexIn, onSysEx)

Browsers allow listening to any MIDI events. See https://webaudio.github.io/web-midi-api/

Quote

midi players such as WinAmp and Windows Media Player... disallow sending sysEx/noteOn/noteOff/whatever messages

Ask the developers of WinAmp and Windows Media Player.

Quote

I REALLY need this done

You don't need this. 

Quote

SysEx used in a 6-hour-long "empty" midi song... can be used as a time-based "event sequencer" for a 6-hour webGL "movie"

Use a JS timer. You don't need MIDI.

See http://tinyurl.com/y7y5hbge 

You can watch this time-based WebGL "event sequencer" for 6 hours or more, if you wish.
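That JS-timer idea could be sketched as a tiny cue list (all names here are mine, not from the demo): cues are sorted by time, and each animation tick fires everything whose time has passed.

```javascript
// Invented helper: return cues due at or before `now`, plus the new cursor.
function dueCues(cues, now, cursor) {
  const fired = [];
  while (cursor < cues.length && cues[cursor].time <= now) {
    fired.push(cues[cursor]);
    cursor++;
  }
  return { fired, cursor };
}

// Usage against the audio clock, which doesn't drift the way setTimeout does:
// const ctx = new AudioContext();
// const cues = [{ time: 1.0, action: "lightsOn" }, { time: 2.5, action: "spin" }];
// let cursor = 0;
// (function tick() {
//   const r = dueCues(cues, ctx.currentTime, cursor);
//   cursor = r.cursor;
//   r.fired.forEach((c) => { /* notify the scene, e.g. via an Observable */ });
//   requestAnimationFrame(tick);
// })();
```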

 



Hi.  What .mid file is being played in that demo?  Can you show me its URL?  thx.

I need this to stay on subject... .mid file playing, and listening for sysex msgs/showControl embedded in that .mid file.

Nothing more, nothing less.  That is what is sought.


Yes, I understand what YOU speak-of, but the project uses .mid files with/without sysex embedded.  The objective is to have the browser event-listen for sysex and/or midi channel 10, which is often the drums.

The .mid songs are already in existence (millions of them), and these are the source of the browser events which I want to capture/observe.

Definitely, I do NOT need this...

<script src="js/tools.js"></script>
<script src="https://surikov.github.io/webaudiofont/npm/dist/WebAudioFontPlayer.js"></script>

<script src="https://surikov.github.io/webaudiofontdata/sound/0300_LesPaul_sf2_file.js"></script>
<!--<script src="https://surikov.github.io/webaudiofontdata/sound/0280_LesPaul_sf2_file.js"></script>-->
<script src="https://surikov.github.io/webaudiofontdata/sound/0280_LesPaul_sf2_file.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/0250_Chaos_sf2_file.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/0170_JCLive_sf2_file.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/0000_Chaos_sf2_file.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/0330_SoundBlasterOld_sf2.js"></script>
<!--<script src="https://surikov.github.io/webaudiofontdata/sound/0340_Aspirin_sf2_file.js"></script>-->
<script src="https://surikov.github.io/webaudiofontdata/sound/0390_GeneralUserGS_sf2_file.js"></script>
<!--<script src="https://surikov.github.io/webaudiofontdata/sound/0480_GeneralUserGS_sf2_file.js"></script>-->
<script src="https://surikov.github.io/webaudiofontdata/sound/0480_Aspirin_sf2_file.js"></script>

<script src="https://surikov.github.io/webaudiofontdata/sound/12835_0_Chaos_sf2_file.js"></script>
<!--<script src="https://surikov.github.io/webaudiofontdata/sound/12840_26_JCLive_sf2_file.js"></script>-->
<script src="https://surikov.github.io/webaudiofontdata/sound/12838_22_FluidR3_GM_sf2_file.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/12841_26_JCLive_sf2_file.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/12842_26_JCLive_sf2_file.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/12845_26_JCLive_sf2_file.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/12846_26_JCLive_sf2_file.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/12849_26_JCLive_sf2_file.js"></script>
<script src="https://surikov.github.io/webaudiofontdata/sound/12851_26_JCLive_sf2_file.js"></script>

That looks like a synth, and AGAIN, no JS-commanded sound-playing is wanted/used.  If you can help with the objective, great.  If not, I will wait-for and hope-for others to reply/help.

No JS sound-playing/generating.  <--  :)  OS midi players with soundcard wavetables ONLY.  No converting of midi to something else (like json).  Must use html <object>, <embed>, or <audio> elements to play web-published .mid files.  (sorry, I don't explain things very well).  Also, I think browser GC will bog JS-driven sound playing.  The song MUST maintain meter/tempo, no matter what the browser does.  thx, be well.

