C&G surveyed me after my EYESY purchase. One question asked what I'd like C&G to do next. I'd like to amend my answer:
Eurorack Ableton Link Module. Top to bottom:
- USB via WiFi
- LED (switches from red to green when Link is achieved and phase is locked)
- Jacks for 1. Gate Out (24 PPQN/click) and 2. Reset
- An Ableton FX or instrument (with delay compensation) provides synced recording, Euro -> Ableton
I did try video on the OTC (so, the same as the old ETC hardware). It didn't have enough processing power, and pygame was pretty inefficient at handling video; the lower frame rate was also a bit of an issue.
But it did 'work' … it just wasn't really usable. (Here is my post about it.)
The EYESY is more powerful (based on the CM3), so hopefully that will make it more viable, and I'd certainly expect openFrameworks to be more efficient.
Of course, we have to limit our expectations here: the CM3 is not a desktop-level processor, so there are limits with something like video, which runs at 30-60fps with a lot of data!
If you're interested in what level of performance you might achieve, you could run pygame / openFrameworks on an rPi 3B and see what you get.
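Something like this rough pygame fill-rate test would give you a baseline number. It's just a sketch assuming pygame is installed; the SDL "dummy" driver line is only there so it also runs headless, so drop it when you try it on real hardware with a display attached:

```python
# Rough pygame fill-rate benchmark: blit a full-screen surface
# as fast as possible for a few seconds and report the achieved fps.
import os
import time

os.environ.setdefault("SDL_VIDEODRIVER", "dummy")  # headless-safe; remove on real hardware
import pygame

pygame.init()
W, H = 1280, 720  # 720p, roughly what the ETC/EYESY outputs
screen = pygame.display.set_mode((W, H))
frame = pygame.Surface((W, H))
frame.fill((40, 120, 200))  # a flat colour is the cheapest possible "content"

start = time.time()
frames = 0
while time.time() - start < 3.0:  # measure for ~3 seconds
    screen.blit(frame, (0, 0))
    pygame.display.flip()
    frames += 1

elapsed = time.time() - start
print(f"blitted {frames} frames in {elapsed:.1f}s -> {frames / elapsed:.0f} fps")
pygame.quit()
```

Note this measures raw blit/flip throughput only; any real patch does per-frame drawing on top of this, so treat the number as a ceiling, not a prediction.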
As for why it's not 'advertised'?
I'd assume that at launch the EYESY's first goal is to be an upgraded ETC, since the ETC is very popular as-is, and I'm sure that in itself is quite a task for C&G (they are not a big company).
But C&G and the community have a reputation of taking things further after release!
Thanks! It’s a good point!
That's great news. I guess I'll experiment with this when my EYESY arrives; I'm very interested in mixing videos and visuals. For the OTC/ETC, maybe it's just a matter of a workaround (codec/driver), because in theory the Raspberry Pi is capable of handling video at 720p/60fps and 1080p/30fps. Anyway, I guess the EYESY with its CM3 will do better, given the faster CPU.
The Organelle-1 and ETC don't use the same processor as the rPi (unlike the Organelle-M/EYESY).
I've a feeling I could probably get an Organelle-1 to play video (e.g. through ffmpeg), but I think python/pygame is just not efficient enough to process video signals.
Bear in mind, in my experiment above all I was trying to do was render frames, not do any real processing.
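For anyone curious what "play video through ffmpeg" might look like in practice, here is a hedged sketch of one common approach: let ffmpeg do the decoding and pipe raw RGB frames into pygame, which only has to blit them. `clip.mp4` is a hypothetical input file, ffmpeg must be on the PATH, and the SDL "dummy" driver line is just there so the sketch also runs headless:

```python
# Sketch: ffmpeg decodes "clip.mp4" to raw RGB24 frames on stdout;
# pygame reads one frame's worth of bytes at a time and blits it.
import os
import subprocess

os.environ.setdefault("SDL_VIDEODRIVER", "dummy")  # headless-safe; remove on real hardware
import pygame

W, H = 640, 360
FRAME_BYTES = W * H * 3  # one RGB24 frame

pygame.init()
screen = pygame.display.set_mode((W, H))
clock = pygame.time.Clock()

frames_shown = 0
try:
    proc = subprocess.Popen(
        ["ffmpeg", "-i", "clip.mp4", "-f", "rawvideo",
         "-pix_fmt", "rgb24", "-s", f"{W}x{H}", "-"],
        stdout=subprocess.PIPE, stderr=subprocess.DEVNULL)
except FileNotFoundError:
    proc = None  # ffmpeg not installed on this system

while proc is not None:
    raw = proc.stdout.read(FRAME_BYTES)
    if len(raw) < FRAME_BYTES:  # end of stream (or missing input file)
        break
    surf = pygame.image.frombuffer(raw, (W, H), "RGB")
    screen.blit(surf, (0, 0))
    pygame.display.flip()
    clock.tick(30)  # pace playback at ~30 fps
    frames_shown += 1

if proc is not None:
    proc.wait()
pygame.quit()
print(f"showed {frames_shown} frames")
```

Even with ffmpeg doing the heavy lifting, the per-frame `frombuffer`/`blit` loop is exactly the part that was too slow on the old ETC hardware, which is why the CM3 is interesting here.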
The other issue with the Organelle-1/ETC is that it's a bit problematic to get newer versions of software onto it, as it's a pretty old Linux build.
That's not an issue with the Organelle-M/EYESY.
So there are quite a few reasons for optimism about the EYESY - but as always, we won't know till we try it.
WOO! Mine just arrived today! Does anyone know where we can go for info on developing on this thing? I know there's a wireless interface, but how can we enter our wireless SSID and WEP key so we can get to it?
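In case it helps while waiting for an official answer: on generic Raspberry Pi-based Linux boxes, WiFi credentials usually live in `/etc/wpa_supplicant/wpa_supplicant.conf`, something like the fragment below. The EYESY may well wrap this in its own setup flow, so check C&G's docs first; the SSID/passphrase values here are placeholders.

```
# /etc/wpa_supplicant/wpa_supplicant.conf (generic Raspberry Pi practice;
# the EYESY may have its own setup flow - check C&G's docs)
country=US
ctrl_interface=DIR=/var/run/wpa_supplicant GROUP=netdev
update_config=1

network={
    ssid="YourNetworkName"
    psk="YourPassphrase"
}
```

Note this is the WPA form; a WEP network needs a different `network` stanza, but WPA is what almost every modern router uses.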
I’m thinking about buying a video capture device for a project that will involve my Eyesy’s visuals. To get the best quality possible to match the Eyesy, the minimum specs I should be looking for out of my capture device is 720p, 60fps, correct?
Probably when they get close to shipping all the original Kickstarter pledges. They're averaging 100 every 2 weeks now, but they said they hope to speed that up. They just announced 200, and I'm #459.