I thought I’d share this. I’ve been working on it since I got the Organelle last month. It’s a pretty simple Arduino-based LDR (light-dependent resistor) interface, but I found it really adds expressiveness to the Organelle. In the video I’m using it with a modified version of the Quad Delay patch and the simple strings patch by msghmr. It’s at a relatively early stage, but it’s (just about) usable for performing.
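For anyone curious how an interface like this typically works: the Arduino reads the LDR on an analog pin and the readings get smoothed and scaled before driving a patch parameter. Here's a rough stdlib-Python sketch of that smoothing-and-scaling step; all the names, ranges, and the smoothing constant are my own illustration, not taken from the actual project.

```python
# Illustrative only: smooth raw LDR readings (e.g. 10-bit ADC values,
# 0-1023, as an Arduino's analogRead would give) and map them into a
# 0-1 parameter range suitable for a Pd patch parameter.

def smooth(readings, alpha=0.2):
    """Exponential smoothing to tame sensor jitter."""
    out = []
    level = readings[0]
    for r in readings:
        level = alpha * r + (1 - alpha) * level
        out.append(level)
    return out

def to_param(value, lo=0, hi=1023):
    """Map a raw ADC value into the 0-1 range, clamped."""
    x = (value - lo) / (hi - lo)
    return max(0.0, min(1.0, x))

if __name__ == "__main__":
    raw = [512, 530, 900, 880, 100, 120]
    params = [to_param(v) for v in smooth(raw)]
    print([round(p, 3) for p in params])
```

A light hand-wave over the sensor then becomes a gradual parameter sweep rather than a jumpy one, which is probably a big part of why it feels expressive.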
I was wondering though if anyone has tried to use these machine learning externals on the Organelle yet?
I think they could work really well with interfaces like the one in the video.
I also use the footswitch on the Organelle as a calibration tool: it sets the threshold from the ambient light, so you can use it in a range of lighting environments.
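The calibration idea is simple enough to sketch: on a footswitch press, sample the ambient level for a moment and put the trigger threshold a fixed margin below it. The function names, the margin value, and the trigger direction below are all my own guesses for illustration, not the project's actual code.

```python
# Hypothetical sketch of footswitch calibration: derive a trigger
# threshold from the current ambient light level.

def calibrate(ambient_samples, margin=0.15):
    """Return a threshold a fixed margin below the ambient average.

    A shadow over the LDR lowers the reading, so anything below the
    threshold counts as a hand-over-the-sensor event.
    """
    ambient = sum(ambient_samples) / len(ambient_samples)
    return ambient * (1 - margin)

def is_triggered(reading, threshold):
    return reading < threshold

if __name__ == "__main__":
    threshold = calibrate([800, 790, 810, 805])
    print(threshold)                     # ~681 for this ambient level
    print(is_triggered(400, threshold))  # hand over the sensor -> True
```

Because the threshold is relative to whatever the ambient level happens to be, the same gesture works on a bright stage or in a dim room.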
This is cool! I’ve been wanting to play with the gesture recognition toolkit and machine learning externals.
With some fiddling I compiled the ml-lib externals for Organelle:
The documentation is a little light; do you know of any example patches to test with?
I realized that sending a [help] message to any of the objects prints out useful info (attributes and methods). Since ml-lib is a wrapper around GRT (the Gesture Recognition Toolkit), a lot of the info here is probably useful too:
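As I understand the ml-lib interface from its help output, the basic flow is: send [add( messages with a class label plus a feature list, then [train(, then [map( messages with new features to get a prediction. Since a Pd patch doesn't paste well as text, here's a toy Python analogy of that message flow, with a trivial nearest-neighbour classifier standing in for GRT; the Python side is purely illustrative.

```python
# Rough analogy of the ml-lib add -> train -> map message flow,
# with a toy 1-nearest-neighbour classifier in place of GRT.
import math

class ToyClassifier:
    def __init__(self):
        self.examples = []  # (label, feature_vector) pairs

    def add(self, label, features):
        # Pd: [add <label> <f1> <f2> ...( message
        self.examples.append((label, list(features)))

    def train(self):
        # Pd: [train( -- a real model fits here; 1-NN just keeps the data
        return len(self.examples) > 0

    def map(self, features):
        # Pd: [map <f1> <f2> ...( -- outputs the predicted label
        return min(self.examples,
                   key=lambda ex: math.dist(ex[1], features))[0]

if __name__ == "__main__":
    clf = ToyClassifier()
    clf.add("dark", [0.1, 0.2])
    clf.add("bright", [0.9, 0.8])
    clf.train()
    print(clf.map([0.85, 0.9]))  # -> bright
```

For the lightbox, the feature vector would presumably be a short window of LDR readings, with the labels being the gestures you want to recognize.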
Great, I’ll get to work trying to use them with the lightbox. I don’t know of any specific patches, but I was at a talk by Ali Momeni, who compiled them for Max and Pure Data and showed some cool patches; I can’t seem to find them anywhere, though. There is also a Kadenze course on machine learning that looks pretty good.