I was inspired by Animusic's animations and decided to try making one myself.
I made the song using only the default plugins from LMMS, then transformed it to MIDI using this converter.
I then converted the MIDI to a JSON file containing only what I need, using a program in my GitHub repo (see below), and imported the generated JSON into Blender animations using an extension (same GitHub repo).
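The repo isn't documented, so here's a rough sketch of what that MIDI-to-JSON step could look like, assuming the mido library. The output schema (track/note/start/duration fields) is a guess for illustration, not the repo's actual format:

```python
# Rough sketch of a MIDI-to-JSON converter using the `mido` library.
# The field names ("track", "note", "start", "duration") are hypothetical;
# the actual converter in the repo may use a different schema.
import json
import mido

def midi_to_json(midi_path, json_path):
    mid = mido.MidiFile(midi_path)
    notes = []
    for i, track in enumerate(mid.tracks):
        time = 0
        active = {}  # note number -> start time, for pairing on/off events
        for msg in track:
            time += msg.time  # delta time in ticks
            if msg.type == "note_on" and msg.velocity > 0:
                active[msg.note] = time
            elif msg.type in ("note_off", "note_on"):
                # note_on with velocity 0 also ends a note
                start = active.pop(msg.note, None)
                if start is not None:
                    notes.append({
                        "track": i,
                        "note": msg.note,
                        "start": start,
                        "duration": time - start,
                    })
    with open(json_path, "w") as f:
        json.dump({"ticks_per_beat": mid.ticks_per_beat, "notes": notes}, f)

midi_to_json("song.mid", "song.json")
```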
Once I had the animations, I used drivers to get the animated values into Geometry Nodes to launch and simulate the balls, vibrate the xylophone/marimba, oscillate the bass strings, and scale the trumpets. The hi-hat, snare, drumstick, and trombone were animated through drivers in the Properties editor.
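For anyone curious what that hookup looks like in Python, here's a minimal sketch, not my actual setup. It assumes an object with a keyframed custom property "note_on" (produced by the JSON import) and a Geometry Nodes modifier input exposed as "Socket_2" (Blender 4.x naming; older versions use "Input_N"); all of these names are hypothetical:

```python
# Minimal sketch: drive a Geometry Nodes modifier input from a keyframed
# custom property. Object, property, and socket names are hypothetical.
import bpy

obj = bpy.data.objects["Xylophone"]

# Geometry Nodes inputs are stored on the modifier under keys like
# "Socket_2" (Blender 4.x) or "Input_2" (3.x); adding a driver on that
# path feeds the animated value into the node tree every frame.
fcurve = obj.driver_add('modifiers["GeometryNodes"]["Socket_2"]')
driver = fcurve.driver
driver.type = 'AVERAGE'  # with one variable, this just passes its value through

var = driver.variables.new()
var.name = "note_on"
var.type = 'SINGLE_PROP'
var.targets[0].id = obj
var.targets[0].data_path = '["note_on"]'  # the keyframed custom property
```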
Rendered with Cycles. All materials are the default material; I just changed the color, metallic, and roughness values, since I decided I didn't want to spend more time on this.
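In bpy terms, those tweaks amount to something like the following sketch (the material name is hypothetical; the socket names are the Principled BSDF's standard inputs):

```python
# Sketch of the only material edits involved: color, metallic, and
# roughness on the default Principled BSDF. "BallMaterial" is a
# hypothetical material name.
import bpy

mat = bpy.data.materials["BallMaterial"]
mat.use_nodes = True  # ensure the node tree (and Principled BSDF) exists
bsdf = mat.node_tree.nodes["Principled BSDF"]
bsdf.inputs["Base Color"].default_value = (0.8, 0.2, 0.1, 1.0)  # RGBA
bsdf.inputs["Metallic"].default_value = 1.0
bsdf.inputs["Roughness"].default_value = 0.25
```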
Source for the MIDI-to-JSON converter and Blender extension: https://github.com/alansartorio/Music3DAnimation (disclaimer: not documented)
The video appears to be broken.
True, thanks for letting me know; it doesn't work on my phone either. Switched to a YouTube URL.
Neat.
The original did have a locally run demo (ATI's Pipe Dream), which is better because it uses less data, and the savings add up across multiple songs/stages. It makes even more sense now with portable software, and better yet, web exports.
BGE seems like it may have been a fine option for exporting a live-rendered animation at one time, but it was removed in Blender 2.80. Though I guess frameworks like Raylib, Ogre, libGDX, etc. could work (or Godot).
It could also probably be branched off into a benchmark or maybe even a rhythm game.
Oh, I didn't know they released a demo like that. My first idea was to make a program to play it using a game engine (Bevy), but I switched to Blender because I didn't need it to run in real time.