New Video Synthesizer: EYESY!

We just launched a Kickstarter for our newest video synthesizer, the EYESY.

Check out the Kickstarter page for more details!!

The EYESY is a close relative of our older ETC video synth, and adds some cool new features like composite video output.


Hi, moving the ETC platform to a faster main board was sorely needed. Since you are moving to a Raspberry Pi: will it be possible to upgrade the Raspberry Pi mainboard in the EYESY? That would be perfect!

And will the audio signals from L and R be available separately in the modes? (I'm longing for oscilloscope music…)

Hey guys.
I am really looking forward to this project, but I’ve got a few questions.

  1. I can see the Raspberry Pi Compute Module 3 has a GPU. Does the EYESY use it? It's not software rendering?
  2. If so, is there support for OpenGL? Can I write custom shaders?
  3. Is it also based on Pygame?
    Sorry if I'm asking something stupid, but I couldn't find any detailed information about the specs.

Wow, this is so cool and exciting! Is there any footswitch capability through MIDI? I use my ETC in my live set, but since I play guitar, the only way I can change scenes is via footswitch. Is that a thing? Sorry if this is a stupid question!

I’m also really hoping that Audio L and R can be processed separately! I’ll almost definitely be backing this if so.

Hi, I've just supported this with a reward on Kickstarter.
I hope to find enough documentation to develop our own stuff for it. 3D support would be great. In any case, I've found resources on the net for simulating 3D in Pygame… I don't know if that would work, but I'll take the chance to test :smiley:
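As a sketch of how faking 3D in Pygame usually works: project each 3D point to 2D yourself, then hand the result to Pygame's ordinary 2D drawing calls. The numbers below (field of view, viewer distance, a 720p screen center) are arbitrary illustrative choices, not anything from the EYESY API:

```python
def project(point, fov=256.0, viewer_dist=4.0, cx=640, cy=360):
    """Perspective-project a 3D point onto a 2D screen.

    Scale x and y by fov / (z + viewer_dist), then offset to the
    screen center (y is flipped because screen y grows downward).
    """
    x, y, z = point
    scale = fov / (z + viewer_dist)
    return int(cx + x * scale), int(cy - y * scale)
```

You could then connect the projected vertices of, say, a rotating cube with `pygame.draw.line` each frame; points with larger z shrink toward the screen center, which is what sells the depth illusion.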

I hope this corona crisis doesn't delay the shipments too much.


@nokulture Thanks for your support!

@Petajaja Yes, L and R can be processed separately.
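To sketch the oscilloscope-music idea in plain Python (the sample lists here are stand-ins; how an EYESY mode actually receives the L and R buffers is up to its API):

```python
def scope_points(left, right, width=1280, height=720):
    """Map paired 16-bit L/R samples (-32768..32767) to screen
    coordinates: the left channel drives x, the right drives y."""
    points = []
    for l, r in zip(left, right):
        x = int((l + 32768) / 65536 * width)
        y = int((r + 32768) / 65536 * height)
        points.append((x, y))
    return points
```

Drawing these points (or lines between consecutive ones) every frame gives the classic X/Y scope picture that oscilloscope music is composed for.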

@chrisfarren Not a stupid question, and something we should point out. The EYESY no longer has a foot switch input, but you can use a MIDI footswitch to send program change messages and make selections that way.
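For anyone wiring this up: a program change is just a two-byte MIDI message, so any footswitch or library that can emit one should work. A rough sketch of the raw bytes (illustrative only, per the MIDI 1.0 spec):

```python
def program_change(program, channel=0):
    """Build a raw MIDI Program Change message: status byte
    0xC0 | channel, followed by the program number (0-127)."""
    if not (0 <= program <= 127 and 0 <= channel <= 15):
        raise ValueError("program must be 0-127, channel 0-15")
    return bytes([0xC0 | channel, program])
```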

@RandomUser Out of the gate, the EYESY will do everything that our older ETC product did, using Pygame to render graphics in software. But yes, the EYESY has much better 3D / OpenGL capabilities that we are excited about exploring.

@fanwander It should be possible to upgrade the Compute Module if a new one comes out. Raspberry Pi has always said the modules will remain pin-compatible, but we won't really know until they release something new.


Hi @oweno

May I redirect your interest ;-):
In Korg nanopad2 with ETC I pointed out how to change the ETC core to read all MIDI CCs instead of only five. Maybe you could carry this over to the EYESY. It would also be cool if you exposed more of the Raspberry Pi board's USB ports on the outside; then it would be possible to attach hardware controllers directly to the EYESY.
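The core of that change is small; sketched in plain Python (the dict and the 3-byte message tuple are illustrative, not the actual ETC internals):

```python
def update_ccs(cc_state, message):
    """Store every incoming Control Change in a dict keyed by CC number.

    A CC message is (status, cc_number, value); status bytes
    0xB0-0xBF mark a Control Change on MIDI channels 1-16.
    """
    status, cc_number, value = message
    if status & 0xF0 == 0xB0:
        cc_state[cc_number] = value
    return cc_state
```

A mode could then read any of the 128 controllers out of the dict instead of being limited to a fixed five.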


I have many custom-created ETC modes. How easy will it be to port them over to this?

Hey guys, awesome project.
Do I have access to the MIDI messages in the custom Python scripts?
I was thinking of using the MIDI outs of my drum machines and sequencers as triggers for the visual events, instead of the audio signal.

If I understand it right, the EYESY is 95% ETC with a few add-ons, so I'm quite sure the modes are compatible.

This is already possible in the ETC, so it should be possible with the EYESY too.
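For the trigger idea, the relevant check is tiny; a sketch in plain Python (the 3-byte tuple stands in for however the mode actually receives MIDI):

```python
def is_note_on(message):
    """True if a raw MIDI message is a Note On with non-zero velocity.

    Status 0x90-0x9F is Note On; a velocity of 0 conventionally
    means Note Off, so it should not fire a visual event.
    """
    status, note, velocity = message
    return (status & 0xF0) == 0x90 and velocity > 0
```

Each sequencer step that passes this check could then kick off the same visual event the audio trigger normally would.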

That is correct: the idea is that anything that runs on the ETC should run on the EYESY with little or no modification. One difference is that the EYESY can do multiple resolutions (NTSC and PAL in addition to the 720p of the ETC). We are adding some variables to the API so that modes know what resolution they are running at and can adjust their drawing accordingly. If you don't make these changes, your mode will look different depending on the screen resolution, but it should still work.
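A sketch of the resolution-independence idea: keep a mode's geometry in normalized 0–1 coordinates and convert to pixels only at draw time. The function and variable names here are illustrative, not the final API:

```python
def to_pixels(nx, ny, xres, yres):
    """Convert normalized 0.0-1.0 coordinates to pixel coordinates
    for whatever output resolution the mode is running at."""
    return int(nx * xres), int(ny * yres)
```

The same normalized point then lands proportionally on every output: the screen center maps to (640, 360) at 720p and to (360, 288) at PAL's 720×576, so the mode looks the same either way.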

Hi @oweno:

Will you release the code on github again?


Kickstarter newsletter today:



Any idea if this new hardware platform will be able to work with JPGs or GIFs, for example to create frame-by-frame animation in sync with a beat? The ETC was not suitable for this use, but this hardware upgrade gives me hope!


Can it be done on a Raspberry Pi? If so, then it's likely…
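The Pygame side of this is straightforward in principle: pre-load the frames (e.g. with `pygame.image.load`) and advance an index once per beat. A minimal sketch of the trigger-to-frame bookkeeping, with the actual image loading left out:

```python
class FrameAnimator:
    """Step through a pre-loaded frame sequence, one frame per trigger."""

    def __init__(self, num_frames):
        self.num_frames = num_frames
        self.index = 0

    def on_trigger(self):
        # Advance on each beat / audio trigger, wrapping at the end
        # so the animation loops.
        self.index = (self.index + 1) % self.num_frames
        return self.index
```

Each time the mode detects a trigger it calls `on_trigger()` and blits the frame at the returned index, so the animation stays locked to the beat.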


@thetechnobear - glad you saw that video of our EYESY experiments with OpenGL/openFrameworks/Lua!


So cool!
I hope the experiments turn into a stable and functional framework.
That opens up a whole new dimension of possibilities!


This is an extremely exciting development. I’m looking forward to the EYESY regardless as my programming skills have improved radically since the ETC came out, but I have been recently working with Lua for developing Monome software. Being able to harness that knowledge to build crazy video experiments is the stuff of my dreams!