Writing
in polymeter can be very complicated. Often it's hard to visualize how
all the patterns and modulators are interacting with each other to
create the music. It might help if the software included more and
better visualizations. Here's a list of visualization ideas I'm thinking
about.
The core problem is that our ears are amazingly good at integrating multidimensional information in real time, whereas our eyes are not. People can comprehend a complex piece of music even at a fast tempo, but if that same piece is modeled visually it just looks like a blur. That's partly because vision processing is relatively slow compared to hearing, but mostly because the brain evolved to handle the demands of language and thus can easily process pitches, rhythm, and timbre all at once in real time.
Polymeter visualization ideas
Modulation graph. Display a “boxes and arrows” type graph that captures the song’s modulations, including all tracks that are participating in modulation, connected by arrows that show the direction and type of the modulation. The graph would be dynamic in the sense that updating the document would automatically redraw the graph if necessary. The drawing could be done using GraphViz for example, as I did for the FFRend project. The user would have to install GraphViz for this feature to work, so the feature would have to be optional. Implementation would be fairly easy since GraphViz does all the heavy lifting.
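As a rough sketch of how such a graph could be emitted for GraphViz, the function below builds DOT text from (source, target, type) triples. The function name, track names, and modulation types are invented for illustration; in practice the document model would supply them.

```python
def modulation_graph_dot(modulations):
    """Build a GraphViz DOT digraph from (source, target, mod_type) triples."""
    lines = ["digraph modulations {", "  rankdir=LR;"]
    for source, target, mod_type in modulations:
        # One labeled arrow per modulation, pointing from modulator to target.
        lines.append(f'  "{source}" -> "{target}" [label="{mod_type}"];')
    lines.append("}")
    return "\n".join(lines)

# Example: a mute modulator and a note modulator both driving a lead track.
dot = modulation_graph_dot([
    ("MuteLFO", "Lead", "mute"),
    ("Scale", "Lead", "note"),
])
print(dot)
```

The resulting text could be piped to the `dot` executable to render the graph, which is why the user would need GraphViz installed.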
Active modulation list. Display a table that shows each modulated parameter and its current state. Since modulation type is currently limited to mute, note, velocity, and duration, only these properties would need to be shown, and only for tracks that are modulated. Thus the table could have five columns (track name, plus one for each modulation type), and there could be one row for each track that’s modulated. The table would update dynamically while the song is playing, allowing the user to visualize the modulation state and how it corresponds to the audio output. This idea only works during playback. One issue is that for complex songs with many tracks, it won’t be possible to see the entire modulation state at once, because the table will be too long to fit on the screen without scrolling. The biggest challenge would be keeping the table adequately synchronized with the audio output. This view could also incorporate the song dubs, but at the cost of showing every track, not only the modulated ones.
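A minimal sketch of building the table's rows, assuming each track is a dictionary with a hypothetical "mods" entry mapping modulation type to its current state (the real track representation would differ):

```python
MOD_TYPES = ("mute", "note", "velocity", "duration")

def modulation_table(tracks):
    """Return one row per modulated track: (name, mute, note, velocity, duration).

    Unmodulated tracks are skipped, matching the idea of showing only
    tracks that participate in modulation.
    """
    rows = []
    for track in tracks:
        mods = track.get("mods", {})
        if mods:
            # None in a column means that modulation type isn't applied.
            rows.append((track["name"],) + tuple(mods.get(t) for t in MOD_TYPES))
    return rows
```

During playback, the view would rebuild or patch these rows as the sequencer position advances, which is where the synchronization challenge lies.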
Piano roll. Display a dynamic piano roll view of the sequencer’s output for a specified MIDI channel. The view would take all modulations into account, allowing the user to visualize the cumulative effect of modulations on an instrument. The main limitation is that it would only work for notes, and only for one channel at a time. A visualization for controllers would have to be implemented separately. The view could scroll automatically while the song is playing, but one disadvantage of this is that at fast tempos the data may scroll too quickly to be comprehensible, though this wouldn’t be an issue when playback is stopped. This idea would be fairly easy to implement since it’s non-interactive, unlike piano roll editing which is much harder.
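The core of a non-interactive piano roll is pairing note-ons with note-offs to get drawable rectangles. A sketch, assuming a simplified event stream of (time, kind, pitch) tuples sorted by time (the actual sequencer output format would differ):

```python
def piano_roll_rects(events):
    """Convert note events into (start, duration, pitch) rectangles.

    events: iterable of (time, kind, pitch), kind is "on" or "off",
    sorted by time. Orphaned note-offs are ignored.
    """
    pending = {}  # pitch -> time of its unmatched note-on
    rects = []
    for time, kind, pitch in events:
        if kind == "on":
            pending[pitch] = time
        elif kind == "off" and pitch in pending:
            start = pending.pop(pitch)
            rects.append((start, time - start, pitch))
    return rects
```

Each rectangle maps directly to a horizontal bar in the view: x from start, width from duration, y from pitch.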
Piano keyboard. Display a dynamic piano keyboard that shows the sequencer’s output for a specified MIDI channel, or for all channels. The view would take all modulations into account, allowing the user to visualize the cumulative effect of modulations. The main limitation is that it would only work for notes; controllers wouldn’t be shown. Unlike the piano roll view described above, the piano keyboard would only work during playback. Showing all channels at once will be less effective if the parts on the different channels have overlapping ranges. Implementation would be fairly easy since much of the code could be reused from the ChordEase project, though again synchronization may be problematic. Performance impact depends on whether the event data is obtained from the sequencer callback, or recreated. Recreating the data is more complex, but safer as it avoids multithreading problems.
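The keyboard view only needs to track which pitches are currently sounding. A sketch of that state update, using a simplified (kind, pitch, velocity) event (real MIDI parsing would also handle the note-on-with-zero-velocity convention, as shown):

```python
def apply_event(held, event):
    """Update the set of currently sounding pitches from one note event."""
    kind, pitch, velocity = event
    if kind == "on" and velocity > 0:
        held.add(pitch)
    else:
        # Note-off, or note-on with velocity 0, which MIDI treats as note-off.
        held.discard(pitch)
    return held
```

On each redraw, the keys in the set are painted as pressed; this is why the view only makes sense during playback.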
MIDI dump. Display a dynamic list of all output MIDI events, or only those for a specified channel. The main limitations are that it would only work during playback, and in many cases the information will scroll too fast to be useful. The latter issue can be addressed by implementing filtering by event type. Some of the code could be reused from the ChordEase project.
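Filtering by event type and channel falls out of the MIDI status byte, whose high nibble is the event type and low nibble the channel. A sketch over raw (status, data1, data2) tuples:

```python
# Common MIDI status nibbles (high nibble of the status byte).
NOTE_OFF, NOTE_ON, CONTROL_CHANGE = 0x80, 0x90, 0xB0

def filter_events(events, event_type=None, channel=None):
    """Keep only events matching the given type and/or channel (None = any)."""
    out = []
    for status, d1, d2 in events:
        if event_type is not None and (status & 0xF0) != event_type:
            continue
        if channel is not None and (status & 0x0F) != channel:
            continue
        out.append((status, d1, d2))
    return out
```

For example, filtering a dump down to note-ons makes a fast-scrolling list far more readable than showing every controller and clock message.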