New Video Synthesizer: EYESY!

This looks like lots of fun, with some significant upgrades over the ETC. It looks like the audio line-in attenuator knob is gone. Is there still a way to control the audio input volume?

Good eye! This will be done in software.

That’s interesting @chrisk - will one be able to tweak the line-in level while using the device? Or would you need to go into the code and adjust the audio level response?

I ask because I find that attenuator on the ETC useful for creative effects while performing. Of course, one could add a mixer or another device to control the line-in volume, or control it at the instrument’s output.
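
For illustration, here is a minimal sketch of how a software line-in attenuator could work inside a mode, assuming the ETC/EYESY-style Python API where `etc.audio_in` is a list of signed 16-bit samples and `etc.knob5` returns a value between 0.0 and 1.0 (the choice of knob is hypothetical, not something announced for the EYESY):

```python
import pygame

def setup(screen, etc):
    pass

def draw(screen, etc):
    # Hypothetical software attenuator: scale the raw input buffer by a knob.
    gain = etc.knob5
    samples = [s * gain for s in etc.audio_in]

    # Use the attenuated peak level to drive something visible,
    # here the radius of a centred circle.
    peak = max(abs(s) for s in samples) / 32768.0 if samples else 0.0
    w, h = screen.get_width(), screen.get_height()
    radius = int(peak * min(w, h) / 2)
    if radius > 0:
        pygame.draw.circle(screen, (255, 255, 255), (w // 2, h // 2), radius)
```

Because the scaling happens in the mode, the response curve could also be reshaped (e.g. squared for a more gradual taper), which a fixed hardware pot can’t do.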

The ETC can do this. See Using Giphy Stickers to create animation - #6 by fanwander
The only restriction is/was the low RAM.
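
For anyone curious, the basic idea from that thread is to pre-load a sequence of small image frames and blit one per draw call - a rough sketch, assuming a pygame-based mode, the ETC-style `etc.mode_root` path attribute, and a hypothetical `frames/` folder of PNGs inside the mode directory:

```python
import glob
import os
import pygame

frames = []
index = 0

def setup(screen, etc):
    # Pre-load every frame once; on a low-RAM device keep them few and small.
    # etc.mode_root (the mode's folder) is how the ETC API exposes the path;
    # assumed to work the same way here.
    global frames
    folder = os.path.join(etc.mode_root, "frames")  # hypothetical subfolder
    for path in sorted(glob.glob(os.path.join(folder, "*.png"))):
        frames.append(pygame.image.load(path).convert_alpha())

def draw(screen, etc):
    # Blit one frame per draw call and advance to the next.
    global index
    if not frames:
        return
    screen.blit(frames[index], (0, 0))
    index = (index + 1) % len(frames)
```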

yeah, this is really exciting… I’ve been considering trying out openFrameworks for a while.
I’d also heard of the Lua ofx bridge but wasn’t sure how stable it was… this definitely makes ofx more accessible :slight_smile:

Combined with stereo input and the more powerful processing (which opens the door to FFT-based frequency analysis), it really allows for more elaborate visualisations that are much more closely aligned with the musical input. :heart:
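
As a rough illustration of the FFT idea, a mode could reduce the current audio buffer to a handful of band levels and draw them as bars - something like the sketch below, which assumes numpy is available on the device and that `etc.audio_in` holds signed 16-bit samples (both assumptions):

```python
import numpy as np
import pygame

def setup(screen, etc):
    pass

def draw(screen, etc):
    # Normalise the buffer, window it, and take the magnitude spectrum.
    samples = np.asarray(etc.audio_in, dtype=np.float32) / 32768.0
    spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))

    # Collapse the spectrum into a few coarse bands.
    bands = np.array_split(spectrum, 8)
    levels = [float(np.mean(b)) for b in bands]

    # Draw one bar per band; the scale factor is arbitrary, just for visibility.
    w, h = screen.get_width(), screen.get_height()
    bar_w = w // len(levels)
    for i, level in enumerate(levels):
        bar_h = int(min(1.0, level * 4.0) * h)
        pygame.draw.rect(screen, (0, 255, 128),
                         (i * bar_w, h - bar_h, bar_w - 2, bar_h))
```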

really excited about the new video synth!

Quick question, as I’m new to all this: is there a technical reason the Raspberry Pi 3 was used in the EYESY and not the Raspberry Pi 4?

The Raspberry Pi Foundation has not released a CM4 (Compute Module 4) yet, and the compute module is what’s typically used to embed a Raspberry Pi inside another product.

The CM is preferred over the full rPi board because it doesn’t dictate a form factor and doesn’t include components you don’t need, which would consume power (and in turn generate heat, lower battery life, etc.).

It’s unclear if/when Raspberry Pi will do a CM4, given that the Pi 4 consumes quite a lot more power, to the point that passive cooling might be difficult… so I suspect it’s not going to be ‘any time soon’ - but who knows what the future will bring :slight_smile:

ok, I understand! thanks for taking the time to answer the question :slight_smile:

@Petajaja Here’s an example of using the Left & Right inputs for separate oscilloscopes:


(if you are curious about the graphics, these were made with OpenGL/openFrameworks on the EYESY)
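
For a sense of what that might look like as a plain Python/pygame mode rather than OpenGL, here is a sketch that draws one trace per channel - note that `audio_in_l` / `audio_in_r` are hypothetical attribute names, not a documented EYESY API, so the sketch falls back to the mono buffer if they don’t exist:

```python
import pygame

def setup(screen, etc):
    pass

def _scope(screen, samples, y_center, color):
    # Draw one oscilloscope trace across the full width of the screen.
    w, h = screen.get_width(), screen.get_height()
    n = len(samples)
    if n < 2:
        return
    points = [(int(i * w / (n - 1)),
               int(y_center + samples[i] / 32768.0 * h / 4))
              for i in range(n)]
    pygame.draw.lines(screen, color, False, points, 2)

def draw(screen, etc):
    h = screen.get_height()
    # Hypothetical stereo buffers; the attribute names are assumptions.
    left = getattr(etc, "audio_in_l", etc.audio_in)
    right = getattr(etc, "audio_in_r", etc.audio_in)
    _scope(screen, left, h // 4, (255, 80, 80))        # left channel, upper half
    _scope(screen, right, 3 * h // 4, (80, 160, 255))  # right channel, lower half
```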

I love these new features around OpenGL so much.

Can’t wait to get my hands on it :slight_smile:

OMG! So happy for this!! Can’t wait to get my Eyesy!!

OpenGL + openFrameworks + Lua is what I’ve been hoping for!!

With composite out!! I love HDMI, but a lot of my equipment uses composite, so happy!

Thank you to everyone at Critter and Guitari for all the awesome hard work they do!!

Cheers!

Our EYESY Kickstarter has ended - Thank you to all the backers!

For those interested in purchasing an EYESY in the future, please:

  1. Stay tuned to the forum for news about availability, and/or
  2. We can send you an email when it’s available on our site. Please contact us to be included in this announcement.

Thanks!

I was so excited to see the Kickstarter close successfully (and way over goal!). That bodes well for an active community once these are released into the wild :slight_smile:

I’m looking forward to learning more about video programming and creating modes – is there already a place to get started learning that will be applicable to Eyesy once it’s released?

Hi,
I wonder whether EU backers will get the EU power adapter. Do you know anything about this?

Thanks!

I hope so too. Please make it happen!
We already have to pay 65 dollars more, and we don’t know whether we’ll run into tax problems. If that comes on top, we’ll end up paying even more than the EYESY will cost later, and then we’d also have to invest in a separate adapter… that would be sad.

I’m very new to video synthesis, but I’ve heard that video feedback is one of the easier ways to get into it. I own a CRT and a DSLR camera, but I’m not sure if it’s possible to connect the two. Any help/advice appreciated!! :slight_smile:
Regards.

That depends on whether your camera has an S-video or composite video output.

But be aware that feedback alone is not very exciting. You need at least one device that manipulates colours, contrast, resolution or similar parameters - old video mixers can do this. And strictly speaking, that is not video synthesis; it is video manipulation.

The EYESY cannot handle incoming video signals, but it can create new video signals.

Interesting! Does that mean the EYESY would use video files as a source to create textures or backgrounds or whatever?

No, it does not use video files; it creates video signals. It can, however, use static image files for that.
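
For example, a mode could load a static image once in `setup()` and blit it as a background each frame - a minimal pygame sketch, where the file name and the ETC-style `etc.mode_root` path attribute are assumptions:

```python
import pygame

background = None

def setup(screen, etc):
    global background
    # Load a static image shipped with the mode (file name is hypothetical).
    background = pygame.image.load(etc.mode_root + "/background.png").convert()

def draw(screen, etc):
    screen.blit(background, (0, 0))
    # ...draw generated shapes on top of the static image here...
```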

Ah, ok. Thanks.