Sunday, April 22, 2018

Initial alpha release of Polymeter available for download

The initial alpha release of Polymeter is now available for download, at both SourceForge and GitHub. It's a fully functional MIDI step sequencer optimized for polymeter. Note however that this version is ALPHA software: many planned features are incomplete or missing entirely. In particular, while this version is fine for drums or other percussive samples, it's less useful for melodic lines, because it lacks the crucial "Legato" feature that ties consecutive steps together. Please check the Features list before requesting features, as the thing you want may already be planned.
https://polymeter.sourceforge.io/download.html
https://victimofleisure.github.io/Polymeter/download.html
https://polymeter.sourceforge.io/features.html

Tuesday, April 17, 2018

Input device and MIDI Thru

I got MIDI Thru working! No MIDI sequencer is complete without it. The input side was easy; I pretty much ripped it straight out of ChordEase.
The output side was tricky, however, because you can't write events directly to a MIDI device that's been opened for synchronized streaming (as opposed to immediate output, as in ChordEase). Instead, your callback function (which Windows calls at regular intervals) has to add the events to its next buffer, which it subsequently queues to the output device. My solution was to add a special input queue for "live" events. At the start of each callback, I check the live input queue, and if there are events in it, I dequeue them and add them to the start of my next MIDI buffer, before any events that may get added from the song's tracks. This is quite safe (provided a thread-safe queue is used) and it works pretty well. It does have some limitations though.
  1. MIDI thru is only operational while the sequencer is playing, not while it's stopped or paused. This is potentially annoying, and the only solution is to keep the MIDI output device open all the time, not only during playback. Doing so would have the additional benefit of reducing the lag between pressing "Play" and playback actually starting. Most of that lag is caused by opening the device, though some devices are worse than others. It's a cool idea and it's on the list, but I'm not going to deal with it right away.
  2. The delay can be longer than one might like, depending on how the sequencer's latency is set. At the default latency of 10ms, it's probably fine for controlling parameters, but Chopin is out of the question. The sequencer's latency can be set lower, all the way down to 1ms, but the lower it is, the greater the risk of timing glitches in song playback, due to the callback taking too long and not being able to keep up. This also depends on other factors such as tempo and the density of the sequence.
  3. The delay isn't constant. This is because I grab all the live input events that have occurred since the last callback and write them all to the start of the buffer. They probably didn't all happen at the same time, but they're all output at the same time. In other words, the live input can get "bunched up" into discrete packets.
Whether "bunching" is a problem depends on how fast and dense the input is. If the input is sparse, there might only be one input event per callback. But if there are multiple events per callback, the callback could try to approximate their original timing. The events have timestamps, which can presumably be used to space out the events in time. Note however that doing so makes the events even LATER than they already are. A nasty trade-off! This is a hard problem and I intend to ignore it for now. Possible bunching aside, the current scheme is straightforward and minimizes delay.
The live input queue also makes it possible to do live patch changes, along with volume and panning. This is already implemented in the Channels bar. The addition of a MIDI input device is a prerequisite to other useful things, such as MIDI mapping of all parameters, which is also on the list.

Sunday, April 15, 2018

Time to repeat: longer than the universe has existed

After a day spent studying number theory, my Polymeter sequencer has something I always wanted: a "Time to Repeat" command! The command computes how long it would take for the selected tracks to repeat, in both beats and time (along with the greatest prime factor), and displays the results. I tried it on my test track that uses all prime meters less than 32 (except 23). "Only" 22,428 days. That's longer than I've been alive! That's some serious polymeter. During stress testing I made some combinations that would take trillions of years. Thomas Wilfred would be impressed.
https://en.wikipedia.org/wiki/Thomas_Wilfred
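Under the hood, the repeat length is presumably just the least common multiple of the selected track lengths. A toy sketch along those lines follows; the tempo and step resolution are made-up assumptions, so it won't reproduce the 22,428-day figure exactly.

#include <cstdint>
#include <iostream>
#include <numeric>   // std::lcm (C++17)
#include <vector>

int main() {
    // track lengths in steps: all primes below 32 except 23, as in the post
    std::vector<uint64_t> lengths = {2, 3, 5, 7, 11, 13, 17, 19, 29, 31};
    uint64_t steps = 1;
    for (uint64_t n : lengths)
        steps = std::lcm(steps, n);   // overflow left unchecked in this sketch
    // convert steps to wall-clock time; these figures are assumptions
    const double stepsPerBeat = 4.0;  // e.g. 16th-note steps
    const double tempoBPM = 120.0;
    double minutes = steps / stepsPerBeat / tempoBPM;
    std::cout << steps << " steps until the pattern repeats, roughly "
              << minutes / (60 * 24) << " days\n";
}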

Saturday, April 14, 2018

The road ahead: lots of body work

Last night I got variable track length more fully implemented. It's now unlimited in the sequencer, but the UI only allows 256 steps, which is similar to Bongo's limit (384) and good enough for now. Unlimited steps in the UI will have to wait until the UI redesign that's coming next. Right now the UI is implemented as an array of track dialogs, but that's still no good. The step area needs to be implemented as a single window (a scrolling view), and the configuration area also needs to be implemented as a single window (a grid control). The two windows need to be children of a splitter window, with synchronized vertical scrolling. Probably! But this is where things start getting really hard.
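For what it's worth, here is a fragment of how that layout might look in MFC. The class names are invented, the declarations and message maps are omitted, and this is only one of several ways to wire up the synchronized scrolling.

// m_wndSplitter is a CSplitterWnd member of the hypothetical frame class
BOOL CPolymeterFrame::OnCreateClient(LPCREATESTRUCT, CCreateContext* pContext)
{
    if (!m_wndSplitter.CreateStatic(this, 1, 2))    // one row, two columns
        return FALSE;
    if (!m_wndSplitter.CreateView(0, 0, RUNTIME_CLASS(CTrackGridView),
            CSize(300, 0), pContext))               // track configuration grid
        return FALSE;
    if (!m_wndSplitter.CreateView(0, 1, RUNTIME_CLASS(CStepView),
            CSize(0, 0), pContext))                 // scrolling step view
        return FALSE;
    return TRUE;
}

// one way to keep the panes aligned: whenever the step view scrolls
// vertically, mirror its position onto the grid pane (assumed here to
// also be a CScrollView-derived class)
void CStepView::OnVScroll(UINT nSBCode, UINT nPos, CScrollBar* pScrollBar)
{
    CScrollView::OnVScroll(nSBCode, nPos, pScrollBar);
    CSplitterWnd* pSplit = STATIC_DOWNCAST(CSplitterWnd, GetParent());
    CScrollView* pGrid = STATIC_DOWNCAST(CScrollView, pSplit->GetPane(0, 0));
    pGrid->ScrollToPosition(CPoint(pGrid->GetScrollPosition().x,
        GetScrollPosition().y));
}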
I know it seems like a lot of progress has been made, but I expect it will take at least a year from now before the program has all of Bongo, Jock, and Stencil's features. But the good news is, I've generally been proceeding in descending order of risk, and most of the stuff that I had serious doubts about is already working. The epic risk was making a Windows MIDI sequencer that uses streaming mode and doesn't glitch or crash. I no longer have doubts about that. That was tentatively proved in October 2016, just before my mother died, but then the project got derailed for more than a year and didn't get underway again until March 2018. Overall I figure I've put roughly three months into the Polymeter port so far, maybe a bit more if you count thinking about it in the bathtub but not coding anything. It's still a brand-new project, not even an alpha: pre-alpha.
The sequencer engine is high risk stuff, fun to develop and relatively quick. But most of the road ahead isn't that, it's UI design. UI design is slow and tedious and design mistakes are expensive because it takes so long to start over. Essentially it's the difference between engine work and body work. You need both but they require different temperaments. Engine work is more mercurial and adventurous, with easily defined goals--the engine runs or it doesn't--whereas body work is more of a slow steady grind with goals that are highly subjective.

Friday, April 13, 2018

Sequencer fully operational, alpha release imminent

I just finished Polymeter's MIDI export. The export reuses the exact same code that's used for playback, eliminating parallel maintenance along with the ultra-nasty possibility of an export that differs subtly from what's heard during playback. Such behavior would be evil and bad, and preventing it at all costs was one of the holy grails of this project.
DOS Bongo also shared the core event processing code between playback and export, though via a relatively crude C-style method. The new version can even export DURING playback, without disrupting playback at all, thanks to crafty pointer magic. I plan to release this version shortly so that people can take it out for a spin.
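The post doesn't show how the sharing is arranged, but one plausible shape (with invented names) is a generator that emits events into an abstract sink, so that the same loop can feed either the streaming buffer or a file writer.

#include <cstdint>
#include <vector>

struct MidiEvent {
    uint32_t time;   // absolute time in ticks
    uint32_t msg;    // packed MIDI short message
};

// the single code path that both playback and export consume
class CEventSink {
public:
    virtual ~CEventSink() {}
    virtual void OutputEvent(const MidiEvent& evt) = 0;
};

class CSequencerCore {
public:
    // generate all track events from startTick up to (but excluding) endTick
    void GenerateEvents(uint32_t startTick, uint32_t endTick, CEventSink& sink) {
        for (uint32_t t = startTick; t < endTick; t++) {
            // evaluate each track's steps at tick t and emit any notes due;
            // the real logic is omitted, only the call shape matters here:
            // sink.OutputEvent({t, packedMessage});
        }
        (void)sink;   // unused in this stub
    }
};

// playback: accumulates one callback period's worth of events, which the
// callback then hands to the streaming output device
class CStreamSink : public CEventSink {
public:
    void OutputEvent(const MidiEvent& evt) override { m_buffer.push_back(evt); }
    std::vector<MidiEvent> m_buffer;
};

// export: converts the same events to delta times and writes a MIDI file
class CFileSink : public CEventSink {
public:
    void OutputEvent(const MidiEvent& evt) override {
        uint32_t delta = evt.time - m_lastTime;   // file writing itself omitted
        (void)delta;
        m_lastTime = evt.time;
    }
private:
    uint32_t m_lastTime = 0;
};

Exporting during playback would then just mean running the generator a second time over its own position, leaving the playback state alone; whether that's what the actual pointer magic does is a guess.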
The sequencer is rock solid timing-wise, and handles most core Bongo features, notably excepting legato, masking, velocity editing, controller tracks, and patch changes. It also adds new capabilities that Bongo didn't have, e.g. unlimited track count and length. The UI is lagging behind, however, and has a long way to go. Much of the UI will need to be completely redesigned, as my first approach, while expedient, is clumsy and won't scale. On the plus side, the new UI can optionally indicate the current position within each track during playback, which is handy and fun to watch.
I'm in the process of writing a track based on that drum demo I posted a few weeks back, which used all prime meters less than 32 (excepting 23). It has a piano part that's masked (i.e. a loop in one meter that's being muted and unmuted by a loop in a different meter), but because the app doesn't support masking yet, the masking has to be hard-coded down in the guts of the sequencer. Hard-core. But it sounds very groovy and trancey.
I intend to re-appropriate the word trance. For my European friends, the word "trance" has unsavory connotations, as they associate it with the ubiquity of cheesy psytrance and its accompanying hippy vibe, but for me, trance means something totally different and positive. For me trance is the hypnotic experience of hearing or seeing polymeter or phase shift.

Heptatonic scales with a minor third

Which heptatonic scales consist entirely of semitones, whole tones, and a single minor third, without having two semitones in a row?
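As a quick sanity check, the candidates can be enumerated by brute force; the sketch below does exactly that (it isn't necessarily the method of the full post), treating the scale as cyclic and counting rotations separately.

#include <array>
#include <iostream>

int main() {
    int count = 0;
    std::array<int, 7> iv{};
    // 3^7 = 2187 candidate interval patterns; small enough to check them all
    for (int code = 0; code < 2187; code++) {
        int n = code, sum = 0, minorThirds = 0;
        for (int i = 0; i < 7; i++) {
            iv[i] = n % 3 + 1;   // interval of 1, 2 or 3 semitones
            n /= 3;
            sum += iv[i];
            if (iv[i] == 3)
                minorThirds++;
        }
        if (sum != 12 || minorThirds != 1)   // must span an octave, one minor third
            continue;
        bool adjacentSemitones = false;
        for (int i = 0; i < 7; i++)
            if (iv[i] == 1 && iv[(i + 1) % 7] == 1)   // cyclic adjacency
                adjacentSemitones = true;
        if (adjacentSemitones)
            continue;
        for (int x : iv)
            std::cout << x << ' ';
        std::cout << '\n';
        count++;
    }
    std::cout << count << " patterns (rotations counted separately)\n";
}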