I was wondering if it's possible to do real-time convolution on the Organelle with live input. I'm aware of the bsaylor partconv~ external and also the Huddersfield HISSTools stuff, but I'm not sure how to implement them.
I want to write a patch that convolves live input with a number of pre-loaded sounds that can be cycled through, sort of like the DJ patches, so it can be used for multiple reverbs or for non-IR convolution.
@scumvullage recently said he'd got bsaylor's partconv~ to work… but it consumed nearly all of the CPU.
Generally, convolution reverbs are pretty CPU-intensive.
Not sure if there are any that take enough shortcuts to run on smaller devices.
That said… the Bela FAQ says:
Can it do convolution reverb?
“A few years ago a student in our lab implemented a convolution reverb on a standard, unoptimized Beaglebone Black, with impulse responses as long as 12 seconds. So yes, the processing power is there.”
and the Organelle has the same kind of processing power, so perhaps the right algorithm can make it happen.
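For what it's worth, the usual shortcut here (and what partitioned convolvers like partconv~ are built around) is uniformly partitioned FFT convolution: split the IR into short partitions, FFT each input block once, and multiply-accumulate the partition spectra against a delay line of recent input spectra. You get the result of a long convolution for roughly one FFT/IFFT per block instead of thousands of multiplies per sample. A minimal offline sketch in Python/NumPy — just the algorithm, not Organelle code:

```python
import numpy as np

def partitioned_convolve(x, h, block=64):
    # Split the impulse response h into `block`-sample partitions and
    # pre-compute their spectra once (FFT size 2*block avoids circular wrap).
    n_parts = -(-len(h) // block)          # ceil division
    fft_len = 2 * block
    H = np.stack([np.fft.rfft(h[i * block:(i + 1) * block], fft_len)
                  for i in range(n_parts)])
    # Frequency-domain delay line holding the most recent input-block spectra.
    X = np.zeros((n_parts, fft_len // 2 + 1), dtype=complex)
    total = len(x) + len(h) - 1
    xp = np.concatenate([x, np.zeros(total - len(x))])  # zeros flush the tail
    n_blocks = -(-total // block)
    out = np.zeros(n_blocks * block)
    overlap = np.zeros(block)
    for b in range(n_blocks):
        X = np.roll(X, 1, axis=0)          # age the stored spectra by one block
        X[0] = np.fft.rfft(xp[b * block:(b + 1) * block], fft_len)
        # One FFT, one multiply-accumulate over all partitions, one IFFT per block.
        y = np.fft.irfft((X * H).sum(axis=0), fft_len)
        out[b * block:(b + 1) * block] = y[:block] + overlap
        overlap = y[block:]                # carry the second half forward (overlap-add)
    return out[:total]
```

The per-block cost scales with the number of partitions, not the IR length in samples, which is why a 12-second IR becomes feasible on small hardware.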
Ok, great! I'll start experimenting and see what works. I suspect cycling through files might be a bit much for the CPU. I had the idea of running two separate convolution channels so that transitions between impulse responses would be smooth, but that might be too CPU-intensive.
I’ll try it out though and let you know how it goes.
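On the two-channel idea: the second convolver only needs to run during a transition, not all the time. The standard move is an equal-power crossfade between the two convolved outputs so the level doesn't dip mid-switch. A rough sketch (the function name is mine, not from any existing patch):

```python
import numpy as np

def equal_power_crossfade(wet_a, wet_b, n):
    """Fade from convolver A's output to convolver B's over n samples.
    cos/sin gains keep the summed power roughly constant mid-fade,
    unlike a linear crossfade, which dips about 3 dB in the middle."""
    t = np.linspace(0.0, 1.0, n)
    gain_a = np.cos(t * np.pi / 2)   # 1 -> 0
    gain_b = np.sin(t * np.pi / 2)   # 0 -> 1
    return wet_a[:n] * gain_a + wet_b[:n] * gain_b
```

In Pd terms this is just two ramped [*~] gains (driven by [line~]) after the two convolvers, summed.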
Yeah, the bsaylor stuff works great on my PC, but not so much on the Organelle. There's another convolution external floating around too, but I haven't tried it on the Organelle yet.
I'm working on some separate research at the moment, but I'm hoping to return to the idea of convolution and get it running on the Organelle. I'm also going to work on a patch that implements an interesting delay network for room simulation. I'll start with basic Pd objects and hope it doesn't eat the entirety of the Organelle's CPU; if it doesn't, I'll be more than happy to share.
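I don't know which specific network that is, but the classic cheap structure in this family is the feedback delay network (FDN): a few parallel delay lines fed back through an orthogonal mixing matrix (e.g. a scaled Hadamard), which gives a smooth decay for a tiny fraction of convolution's cost. If it's something along those lines, a toy sketch looks like this (delay lengths and gain are arbitrary choices of mine, purely illustrative):

```python
import numpy as np

def fdn_reverb(x, delays=(149, 211, 263, 293), g=0.7):
    """4-line feedback delay network. The feedback matrix is a Hadamard
    matrix scaled to orthonormal (factor 0.5), then by g < 1 so every
    round trip loses energy and the tail decays rather than rings."""
    A = 0.5 * g * np.array([[1,  1,  1,  1],
                            [1, -1,  1, -1],
                            [1,  1, -1, -1],
                            [1, -1, -1,  1]], dtype=float)
    bufs = [np.zeros(d) for d in delays]   # circular delay-line buffers
    idx = [0] * len(delays)
    out = np.zeros(len(x))
    for t in range(len(x)):
        taps = np.array([bufs[i][idx[i]] for i in range(len(delays))])
        out[t] = taps.sum()                # sum of delay-line outputs
        fb = A @ taps + x[t]               # mix the taps, inject the input
        for i, d in enumerate(delays):
            bufs[i][idx[i]] = fb[i]
            idx[i] = (idx[i] + 1) % d
    return out
```

Mutually prime delay lengths keep the echoes from piling up on a common period; in Pd this maps onto a handful of [delwrite~]/[delread~] pairs and [*~] objects, which should be far lighter than convolution.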