Eyesy Test Harness for PC

Someone asked this on the Patchstorage Discord and I was hoping it would get more visibility here. Is anyone aware of a test harness for rapid prototyping of EYESY modes?

I'm looking for a fast way to iterate on a PC without having to bounce to the hardware.

I’ve looked through the manuals and done some searching but either there’s not one readily available or I have completely missed it.

I have not heard of one, but this would be extremely useful! I was thinking it would be relatively easy to make one, but I just started learning Python for the EYESY, so I'm sure I won't be the first to make it. There isn't that much going on behind the curtain, so an approximation of those functions would probably be enough to be usable.
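
For what it's worth, here's a minimal sketch of what such a harness could look like. The FakeEtc class and its attributes are my guesses based on how ETC/EYESY modes are written (a mode defines setup(screen, etc) and draw(screen, etc), and reads things like etc.knob1 through etc.knob5 and etc.audio_in), so treat it as a starting point rather than a faithful emulation:

import importlib.util
import random

import pygame


class FakeEtc:
    """Rough stand-in for the etc object the ETC/EYESY passes to modes."""

    def __init__(self):
        self.knob1 = self.knob2 = self.knob3 = self.knob4 = self.knob5 = 0.5
        self.audio_in = [0] * 100          # the hardware hands modes 100 audio samples per frame
        self.audio_trig = False
        self.xres, self.yres = 1280, 720
        self.bg_color = (0, 0, 0)

    def color_picker(self, val):
        # crude stand-in for the real color picker
        v = int(max(0.0, min(1.0, val)) * 255)
        return (v, 255 - v, 128)

    def color_picker_bg(self, val):
        # EYESY-style background helper; just records a color for the main loop to use
        self.bg_color = self.color_picker(val)
        return self.bg_color


def load_mode(path):
    # load a mode file even if its name has a hyphen (e.g. "sine-wave.py")
    spec = importlib.util.spec_from_file_location("mode", path)
    mode = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(mode)
    return mode


pygame.init()
etc = FakeEtc()
screen = pygame.display.set_mode((etc.xres, etc.yres))
mode = load_mode("sine-wave.py")           # path to the mode you want to test
mode.setup(screen, etc)

clock = pygame.time.Clock()
running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False
    etc.audio_in = [random.randint(-300, 300) for _ in range(100)]   # fake quiet audio
    screen.fill(etc.bg_color)
    mode.draw(screen, etc)
    pygame.display.flip()
    clock.tick(30)

pygame.quit()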

@okyeron may have something along those lines. I know he was using something other than Eyesy hardware to test the oF modes he’s working on.

I don’t have much to suggest on the Python side unfortunately.

I’m doing all my oF dev on a RasPi or EYESY itself.

The main things you’re missing are the sound inputs (which might need to be configured differently on a PC) and knob inputs - although those can be emulated with OSC
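
For example, something along these lines can stand in for the knobs using the python-osc package. The host, port, and /knob address pattern here are assumptions on my part, so check the manual (or whatever harness you end up with) for the real ones:

# pip install python-osc
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("eyesy.local", 4000)          # assumed host and port
for i, value in enumerate([0.1, 0.5, 0.9, 0.0, 1.0], start=1):
    client.send_message("/knob%d" % i, value)          # assumed address pattern, values 0.0-1.0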

I found a lead: KP Kaiser on YouTube had some cool tutorials for the ETC, and in the last one he briefly talks about a test program. There's a link to all of his code in the video description, and I think it would do exactly what we're hoping for (it even lets you control the knobs with your keyboard!), but I'm not experienced enough with Python and don't understand how he is able to type the two file names and BOOM, it's running. I'm using PyCharm and I'm on Windows, so maybe it's different enough between the two?
Can one of you whizzes get this working? (And then give me a hint on what I'm not understanding.)

I got it working thanks to mitcheo's find! Here were my steps:

1. Go to KP Kaiser's GitHub; there's a link to it on the webpage in the video description.
2. Download his critter-and-guitari-etc-master zip file.
3. Extract it to a familiar location.
4. Get your computer able to run it: I downloaded Python 3.9 from the Windows Store, and PyCharm Community Edition (a free Python IDE) from the PyCharm homepage.

I started up PyCharm and made a new project with Python 3.9 as the interpreter. PyCharm makes a handy virtual environment (venv) in the folder where the new project is created. Take the contents of KP Kaiser's zip file and paste them into that same project folder, next to the venv folder (so you end up with C:/pathtoprojectfolder/venv and the contents of critter-and-guitari-etc-master directly in C:/pathtoprojectfolder).
Once I did that, I installed the pygame module in PyCharm. This can be done from the terminal in the bottom left (using the command pip install pygame) or by going to File, Settings, Project: <your project name>, Interpreter, then clicking the + icon and searching for pygame.

Once you have pygame installed and the critter-and-guitari-etc-master files copied in, open etc-test.py in PyCharm by clicking it in the list of files in the project. This will let you see all the code that's about to be run. To keep it simple, you can click the etc-test.py drop-down menu in the top right and go to "Edit Configurations". From there, there is a box labeled "Parameters". Type the name of one of the effects included in the original zip file from KP Kaiser, such as "sine-wave.py".

After adding the parameter, click the run button and it should make a popup window running whatever effect you put in the parameters text box.
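
(The Parameters box is just supplying a command-line argument to etc-test.py, so if you'd rather skip the IDE, you should be able to do the same thing from a terminal opened in that folder, which is presumably what KP Kaiser is typing in the video:)

python etc-test.py sine-wave.py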

If you want to run an effect that wasn't originally in KP Kaiser's critter-and-guitari-etc-master zip, simply copy it into the same location (the same folder where etc-test.py and sine-wave.py are), and then type the filename in the Parameters text box like before.

The etc-test file has some neat controls. Out of the box, you can change the values of the five knobs by holding down the number key of the ETC/EYESY knob you want to change, then pressing the up or down key to increase/decrease the knob value. Press Spacebar to start an audio trigger and z to end the audio trigger. Press q to quit. You can see these controls near the bottom of the etc-test.py file (they're a bunch of if statements).
I added a couple of controls to simulate audio if anyone is interested in including them. All you would do is copy this text and paste it after the other if statements:

if key[pygame.K_x]:
    # "loud" input: fill the 100-sample buffer with full-scale random values
    etc.audio_in = [random.randint(-32768, 32767) for i in range(100)]
if key[pygame.K_c]:
    # "quiet" input: small random values around zero
    etc.audio_in = [random.randint(-300, 300) for i in range(100)]

These controls let you simulate audio coming in by pressing the 'x' key, and no/low audio by pressing the 'c' key. (If etc-test.py doesn't already import random at the top, add that too.)

If you try running an EYESY effect and get an error related to the background: I've had a little difficulty getting the code that changes the background color to work as intended (I think because the EYESY and ETC have different background color control methods). So I just commented out that line in the effect I was trying to run, i.e. changed it to "#etc.color_picker_bg(etc.knob5)", and that got around the error.
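
Alternatively, and this is just a sketch on my part, you could give the harness's etc object an EYESY-style color_picker_bg() so those modes run unmodified. This assumes etc-test.py's etc object already has a color_picker() method and that the harness fills the background itself; adjust the names to whatever the file actually uses:

# add this method to the class behind the etc object in etc-test.py
def color_picker_bg(self, val):
    # mimic the EYESY call by reusing the ETC-style color picker
    self.bg_color = self.color_picker(val)
    return self.bg_color

You'd then have the harness fill the screen with etc.bg_color each frame before the mode's draw() runs.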

Wicked!

This is what I wasn't getting: how to pass the parameter to the test program.
Thanks for figuring it out! The audio simulation is an amazing addition; we basically have all the controls from the EYESY now.

I'm halfway done making a compatibility library for the EYESY's ofLua mode, so modes written for the beta firmware can be run using loaf, a live-coding environment from the same author as ofxLua (which powers the EYESY's Lua mode). I've found it much quicker to iterate locally, since loaf reloads everything on every file save, and then just do final touches/audio reactivity on the EYESY.

loaf also integrates with OSC so it’ll be possible to simulate knob[1-5] with a MIDI controller. As discussed above, I guess random values are good enough for simulating inL and inR, but I haven’t implemented that yet.

If there is interest in this kind of thing, I could try to put more effort into making it easier to use for others. Just to be clear, this is for the openFrameworks/Lua version, not the stock Python one.

In the meantime though, for anyone else developing modes locally, you can use scp and curl instead and avoid the web browser if you like (change the mode name and paths depending on your setup/language):

scp -r mode_name music@eyesy.local:/sdcard/Modes/Python
curl -d 'name="..."' http://eyesy.local:8080/reload_mode
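
If you get tired of running those by hand, here's a rough sketch (not part of my actual workflow, just an idea) of a small watcher that re-runs the scp/curl pair whenever a file in the mode folder changes. MODE_DIR, the remote path, and the curl payload are the same placeholders as above; fill them in for your setup:

import os
import subprocess
import time

MODE_DIR = "mode_name"                                   # local mode folder
REMOTE = "music@eyesy.local:/sdcard/Modes/Python"
RELOAD_URL = "http://eyesy.local:8080/reload_mode"

def newest_mtime(path):
    # most recent modification time of any file under path
    return max(os.path.getmtime(os.path.join(root, f))
               for root, _, files in os.walk(path) for f in files)

last = newest_mtime(MODE_DIR)
while True:
    time.sleep(1)
    current = newest_mtime(MODE_DIR)
    if current != last:
        last = current
        subprocess.run(["scp", "-r", MODE_DIR, REMOTE], check=True)
        subprocess.run(["curl", "-d", 'name="..."', RELOAD_URL], check=True)  # same name payload as the curl above
        print("pushed and reloaded")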

Any updates on this? I’d like to learn another language outside of python and this would be a cool opportunity.