There is, for one, the idea of an "acoustic incense candle" or a "musical screen saver", built around music-composing artificial intelligence. The system is designed to be modular: I can develop separate components that each produce part of a complete composition and can be plugged together in the player app.
I have started extensive work on a "development studio" for music AI components, project name "Audiowerk". DSP nodes can be created and routed comfortably, and there are base classes for generating MIDI events.
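I haven't published the code, but the core idea of pluggable, routable DSP nodes can be sketched roughly like this. All class and method names here are illustrative stand-ins, not the actual Audiowerk API:

```python
# Minimal sketch of pluggable DSP nodes; not the actual Audiowerk code.
class Node:
    """A DSP node whose inputs can be wired to other nodes' outputs."""
    def __init__(self):
        self.inputs = []  # upstream nodes feeding into this one

    def connect(self, upstream):
        """Route the output of another node into this one."""
        self.inputs.append(upstream)
        return self  # allow chaining: sink.connect(a).connect(b)

    def process(self, n_frames):
        """Produce n_frames of samples; subclasses override this."""
        raise NotImplementedError


class Constant(Node):
    """Trivial source emitting a constant value (stand-in for an oscillator)."""
    def __init__(self, value):
        super().__init__()
        self.value = value

    def process(self, n_frames):
        return [self.value] * n_frames


class Mixer(Node):
    """Sums the output buffers of all connected upstream nodes."""
    def process(self, n_frames):
        buffers = [node.process(n_frames) for node in self.inputs]
        return [sum(frame) for frame in zip(*buffers)]


# Wiring: two sources routed into one mixer.
mix = Mixer()
mix.connect(Constant(0.25)).connect(Constant(0.5))
print(mix.process(4))  # → [0.75, 0.75, 0.75, 0.75]
```

A graphical router on top of this only needs to call `connect` in response to the user dragging a wire between two node ports; a MIDI-generating component would be the same kind of node, just emitting events instead of sample buffers.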
|DSP routing for my music AI|
You've probably already heard about the other idea on Twitter: a new kind of creative-environment UI for graphic design, music, animation and coding, preferably on touch-screen devices, codename "Lightboard". A few weeks ago, I posted this:
It was meant to be a battle cry to motivate myself. I also noticed that organizing thoughts in Inkscape like this is fun, and started a similar mind map for application features:
and I implemented a basic prototype for the UI in Python:
That's where I am right now, and again, I'm stuck. As it is, the application cannot be ported to Android devices, because the front end is written in Python, and Python is poorly supported there. But I could also just wait for a Linux tablet with a multi-touch display and run my stuff on that, which is probably the better choice, considering the CPU power I'd need for decent DSP.
The other issue is that the UI view code already needs a small rewrite, e.g. to support infinite zooming and to distribute and handle input events more cleanly. I feel I could save some time if I had a finished design for the user interface that I could just follow. Unfortunately, this is research and development; I don't deal in finished designs ;)
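The zooming part is at least mathematically simple: a view transform that zooms around an arbitrary focal point (say, the midpoint between two touch points) and never clamps its scale. A sketch of how I'd structure it, with illustrative names rather than the actual Lightboard code:

```python
# Sketch of an "infinite zoom" view transform; not the actual Lightboard code.
class ViewTransform:
    """Maps world coordinates to screen coordinates via scale and offset."""
    def __init__(self):
        self.scale = 1.0
        self.offset_x = 0.0
        self.offset_y = 0.0

    def to_screen(self, wx, wy):
        return (wx * self.scale + self.offset_x,
                wy * self.scale + self.offset_y)

    def zoom(self, factor, fx, fy):
        """Zoom by factor, keeping the screen point (fx, fy) fixed.

        "Infinite" zooming just means the scale is never clamped; content
        is always rendered relative to the current transform.
        """
        self.offset_x = fx - factor * (fx - self.offset_x)
        self.offset_y = fy - factor * (fy - self.offset_y)
        self.scale *= factor


view = ViewTransform()
view.zoom(2.0, 100.0, 100.0)          # zoom in around screen point (100, 100)
print(view.to_screen(100.0, 100.0))   # → (100.0, 100.0): the focus stays put
print(view.to_screen(50.0, 50.0))     # → (0.0, 0.0): everything else moves
```

The input-event side is the harder rewrite: deciding which view under the fingers receives a gesture, and when a drag becomes a pan versus a zoom, doesn't reduce to one tidy formula like this.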
So that's where both projects currently stand. I should look into whether I can fuse both ideas and rewrite Audiowerk to run within Lightboard, so that both projects progress in parallel and Lightboard's infrastructure is grounded in real requirements.
But Audiowerk was written for a GTK desktop, with extensive keyboard support and virtually no mouse support. DSP routing and event editing in a multi-touch environment will have to look and feel considerably different.
Update: I believe taking this too seriously is not going to help me much. I'm going to lay aside both ideas for a while and work on something else instead.