iPhone/Android as Organelle UI extension?


Guessing I’m not the first person to ask, but I couldn’t find other threads.

Is it possible to link an iPhone/Android to the Organelle via CCK and build an app that allows a kind of ‘patch partner’ mobile extended controller/UI to be used?

For example, if someone built a synth app with 6 pages of params that works standalone on the Organelle, they could also share a ‘patch partner’ file for use in the mobile/tablet app to make navigating params/pages easier and more expressive. The app could have a browseable directory of Patch Partner UIs to load, or ideally it would automatically load whatever patch is on the Organelle, if that info can be sent/received?

Not even sure it’s something I’d use; I quite like the focused nature of the Organelle by itself in terms of UI, and I’m not a huge fan of touchscreens, but I’m curious whether it’s feasible. It could potentially be faster and less blind than page switching on the Organelle for stuff like step sequencers, patches with lots of transport/sampling functions, etc. though… Just thinking out loud really…


if you’re not into programming, then I’d suggest the simplest solution is to look into Lemur and use OSC or MIDI (the first works over WiFi, the second via USB/CCK)… with Lemur you could write a tailored UI for particular patches without much difficulty.
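To make the OSC route concrete, here is a minimal sketch of what the controller side sends over WiFi. It hand-encodes a single-float OSC message with the standard library so nothing needs installing; the `/cutoff` address, the Organelle's IP, and port 4001 are just placeholder assumptions, and on the Pd side the patch would need a matching OSC receiver.

```python
import socket
import struct

def osc_pad(b: bytes) -> bytes:
    """Pad to a multiple of 4 bytes with at least one null, per the OSC spec."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, value: float) -> bytes:
    """Build a minimal OSC message carrying one float argument."""
    return (osc_pad(address.encode("ascii"))  # address pattern
            + osc_pad(b",f")                  # type tag: one float
            + struct.pack(">f", value))       # big-endian float32

def send_param(sock: socket.socket, host: str, port: int,
               address: str, value: float) -> None:
    """Fire the OSC packet over UDP, the transport Lemur uses over WiFi."""
    sock.sendto(osc_message(address, value), (host, port))

# Hypothetical usage: set a /cutoff parameter on the Organelle.
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# send_param(sock, "192.168.1.50", 4001, "/cutoff", 0.75)
```

The padding rule (every string null-terminated, then rounded up to 4 bytes) is what most hand-rolled OSC senders get wrong, hence the dedicated helper.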

as for the rest, well, anything is possible, let’s wait and see :wink:


Another possibility is MobMuPlat, which will actually run Pd patches as well as a custom GUI. I usually end up just making entirely separate instruments for this, though, and have them all networked together.

Lemur is nice because it already has a lot of amazing sequencer preset GUI items and functions built into it.


Running a web-based UI is also possible. I’ve been experimenting with these:

This requires running a web server on the Organelle which passes the messages received from the browser to Pd. It’s a clunky approach and wouldn’t work for anything expressive (latency is huge and jittery), but it could work for configuration or things like step sequencers or routing. It’s also cool because there isn’t anything to install or configure on the client side: an iPhone, iPad, laptop, or whatever has a web browser will just work.
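A minimal sketch of that relay idea, assuming the patch contains a `[netreceive]` listening on TCP: a tiny HTTP server takes query parameters from the browser and forwards them to Pd using Pd's FUDI framing (atoms separated by spaces, message terminated by `;`). The port numbers and the `/set` URL shape are assumptions for illustration, not anything Organelle-specific.

```python
import socket
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.parse import urlparse, parse_qsl

PD_HOST, PD_PORT = "127.0.0.1", 3000  # assumed [netreceive 3000] in the patch

def fudi(*atoms) -> bytes:
    """Frame atoms as one FUDI message for Pd's netreceive."""
    return (" ".join(str(a) for a in atoms) + ";\n").encode()

class Relay(BaseHTTPRequestHandler):
    def do_GET(self):
        # e.g. GET /set?param=cutoff&value=0.75 from the browser UI
        query = dict(parse_qsl(urlparse(self.path).query))
        msg = fudi(query.get("param", ""), query.get("value", ""))
        with socket.create_connection((PD_HOST, PD_PORT)) as s:
            s.sendall(msg)  # Pd receives: "cutoff 0.75;"
        self.send_response(200)
        self.end_headers()

# To run the relay on the Organelle (blocks forever):
# HTTPServer(("", 8080), Relay).serve_forever()
```

Each request opens a fresh TCP connection, which adds to the latency the post mentions, but it keeps the sketch stateless and simple; a long-lived socket or WebSocket would be the next step for anything faster.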


This sounds like it would be a really amazing approach for interactive performances! So much fun


in a similar vein, when we discussed this on the Blokas forum, someone mentioned ‘Open Stage Control’


I’ve not played with it, but it does look nice, and it seems to be in active development.