I’d like to make an ETC patch that scrubs through a short video clip, making surreal, jittery loops. Is that within the realm of possibility? (Does it support playing back and manipulating video?)
I haven’t seen any ‘video’ stuff in the ETC videos I’ve watched so far, only the graphics kind of stuff…
I’d jump on it in a second if it could do some of the stuff you see people doing with the Tachyons video hardware. I don’t know enough about this stuff to know whether that’s even possible without using VHS/tube-TV feedback etc., though…
Luminancer on iPhone does an OK job for some video-synth stuff, but it doesn’t react to music and feels like it has some other limitations too compared to the real deal.
The ETC doesn’t officially support video playback. @distropolis mentioned trying out the moviepy module:
We haven’t tried the moviepy module ourselves, however, and can’t speak to its capabilities.
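For anyone curious what using moviepy looks like, here is a minimal sketch of opening a clip and pulling one frame. This assumes moviepy 1.x is installed; the filename and the `demo` helper are placeholders of mine, not anything from the ETC codebase:

```python
def clamp_time(t, duration):
    """Keep a requested frame time inside the clip's bounds."""
    return max(0.0, min(t, duration))

def demo(path="loop.mp4"):
    """Open a clip and pull one frame (run this where moviepy is installed)."""
    from moviepy.editor import VideoFileClip  # assumption: moviepy 1.x API
    clip = VideoFileClip(path)
    # get_frame(t) returns a numpy array of shape (height, width, 3)
    return clip.get_frame(clamp_time(2.5, clip.duration))
```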
Yes, I’ve got it working on OTC… (sorry, no access to an ETC, but it should also work there)
Unfortunately, pygame.movie has been removed, so you do indeed need to use moviepy with pygame, and that needed quite a few packages installed to get it working (and took about 20-30 minutes to compile/install).
(We would also need to cross-reference this against an ETC image to see what it already has and doesn’t.)
On a small file it was scrubbing fine… though it was a bit slow going backwards…
That said, it’s the first time I’ve looked at moviepy, so I may find some ways to improve performance.
I also need to do a bit more testing, with some properly sized files, to see if it’s usable.
But that’s going to wait until I get a small projector for the studio at the back end of next week (and assuming it works), as I got pretty bored wandering back and forth to a TV to test it.
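For what it’s worth, the scrubbing mentioned above is mostly a matter of mapping a control to a frame-aligned time before asking moviepy for a frame. A rough sketch of that mapping (the 0..1 knob range and 30 fps default are assumptions based on the ETC’s draw loop; `scrub_time` is my own name):

```python
def scrub_time(knob, duration, fps=30.0):
    """Map a 0..1 knob position to the start time of the nearest frame,
    so repeated reads land on frame boundaries rather than in between."""
    knob = max(0.0, min(knob, 1.0))
    n_frames = int(duration * fps)
    index = min(int(knob * n_frames), n_frames - 1)
    return index / fps

# Inside a mode's draw(), this would look something like:
#   frame = clip.get_frame(scrub_time(etc.knob1, clip.duration, clip.fps))
```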
I’m also on your OTC release. Looking forward to hearing whether you get moviepy working more smoothly.
Yeah, I’ll know a bit more next week…
One thing: I tried a full 720p (1280x720) @ 30 fps MPEG-4 video last night, and OTC was not keeping up. Not sure if compression was a factor, or whether there are other optimisations to be had.
But it started me thinking:
What would it be useful for? What are modes likely to do?
I know I don’t want it for straight playback (if I did, there are better ways to do that)… so I guess I’m thinking something lo-res (in keeping with other ETC modes) that reacts to music…
There might be some other interesting ideas to play with too, e.g. moviepy allows static images to be generated from video.
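Pulling stills is indeed straightforward: clips have a `save_frame` method. A sketch that grabs evenly spaced stills from a clip (assuming moviepy 1.x; the filenames, the still count, and the `grab_times` helper are illustrative choices of mine):

```python
def grab_times(duration, count):
    """Evenly spaced times for pulling `count` stills from a clip."""
    if count < 1:
        return []
    step = duration / count
    return [i * step for i in range(count)]

def save_stills(path="loop.mp4", count=5):
    """Write `count` still images from the clip (run where moviepy is installed)."""
    from moviepy.editor import VideoFileClip  # assumption: moviepy 1.x API
    clip = VideoFileClip(path)
    for i, t in enumerate(grab_times(clip.duration, count)):
        clip.save_frame("still_%02d.png" % i, t=t)
```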
It’s probably quite a balancing act, as you don’t want to use the entire CPU budget playing the video; you need some left over to apply effects/overlays…
At the moment, I should say, I haven’t decided whether I will ‘release’ this, as I’ve some concerns:
As expressed before, I really don’t want OTC to diverge from ETC too much, partly due to technical concerns (code divergence; we might need some small changes to make this work), and partly because I don’t want to upset C&G/ETC users by OTC going further, which seems a bit ‘unfair’ (bad choice of word, not quite what I mean, but I can’t think of a better one). I think there should be ‘feature parity’ as much as possible.
Related to the above: who would write the video modes? I would write a few, but I really don’t have the time/bandwidth to start explaining the API, or to help people when their modes don’t work…
Even worse, if no one else writes any video modes, then my efforts are pretty much in vain.
It’s a lot of work to create an installer. Bear in mind just one test install takes around 45 minutes, and whilst developing I have to do a lot of these… so realistically I’m likely to end up spending 6-8 hours on this, especially if you take into account ‘initial release support’ and writing some tutorial modes. This is time I could spend making music.
All that said, I’ve not made a decision yet; I just want to highlight some concerns (and, I guess, limit expectations)… I’ll see how I feel next week once I’ve played with it a bit more and seen how much fun there is to be had.
Agree about lo-fi video, that was my thought too. I’d be excited to turn a short clip of shitty video into something interesting. I feel like that takes it from posh iTunes visualiser (fine for VJing but not much else) into territory where it could be published as a music video or a piece of visual art in its own right, but with more happy accidents and human touch than automating stuff in After Effects.
Also understood about not undermining the other products. I may dig into this on my own in time if nobody else does. I’m finishing a record just now, so I can’t get deep into it, and I’ve never used Linux, so I’d be starting fresh. Perhaps in the new year.
I’d like to hear more about what you have in mind…
Perhaps I can come up with something (simple) to test whether what you have in mind is possible.
E.g. do you have resolutions in mind? Sizes of videos?
Here is what I’ve seen so far about moviepy:
- it’s built around a concept of clips
- initially the clip is the whole movie, but you can create new clips by taking subclips, compositing with other clips, applying fx to clips, etc…
- then, during the pygame draw, I grab a frame from the clip and render it (similar to normal drawing)
- this means we are limited to the 30 fps that we have in the main pygame rendering (unless I change etc_mother)
- I’m not sure why, but when I accessed frames in sequential order it was quicker… so I’m guessing there must be some optimisations
- I’ve not tested it, but I believe I could also still draw onto the screen during this render, using the pygame API
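The points above can be sketched as a minimal loop. This assumes moviepy 1.x and pygame; the filename, resolution, and function names are placeholders of mine, not the actual ETC/OTC internals. The `swapaxes` call is needed because moviepy frames are (row, col) arrays while pygame surfaces are addressed (x, y):

```python
def advance(t, duration, fps=30.0):
    """Step the playhead one frame forward, looping at the end of the clip."""
    return (t + 1.0 / fps) % duration

def play(path="loop.mp4"):
    """Looping playback at 30 fps (run where moviepy and pygame are installed)."""
    import pygame
    import pygame.surfarray
    from moviepy.editor import VideoFileClip  # assumption: moviepy 1.x API

    clip = VideoFileClip(path)
    pygame.init()
    screen = pygame.display.set_mode((1280, 720))  # placeholder resolution
    clock = pygame.time.Clock()
    t = 0.0
    running = True
    while running:
        for event in pygame.event.get():
            if event.type == pygame.QUIT:
                running = False
        frame = clip.get_frame(t)  # numpy array, shape (h, w, 3)
        surface = pygame.surfarray.make_surface(frame.swapaxes(0, 1))
        screen.blit(surface, (0, 0))
        # other pygame drawing could still happen here, on top of the frame
        pygame.display.flip()
        t = advance(t, clip.duration, clip.fps)
        clock.tick(30)  # stay in step with the main 30 fps rendering
    pygame.quit()
```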
What I’m not sure of yet is: when you ‘create new clips’, does this really take copies / pre-process the clip, or does it just form a ‘render chain’ on a modified timeline? I think the latter is the case, but hopefully with some optimisations.
e.g. one test I’d like to do is:
render a clip in reverse by asking for frames from N down to 1… (I’ve done this, and it’s ‘slow’),
then compare this to using a clip fx that asks for a reversed timeline: is this effectively the same, or does it do something extra?
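That comparison could be sketched like this, assuming moviepy 1.x, where `vfx.time_mirror` produces a clip whose timeline runs backwards; the `compare_reverse` name and the filename are my own placeholders:

```python
def reverse_times(duration, fps):
    """Frame start times in reverse order (last frame first)."""
    n = int(duration * fps)
    return [(n - 1 - i) / fps for i in range(n)]

def compare_reverse(path="loop.mp4"):
    """Time manual backwards access vs. a reversed-timeline clip."""
    import time
    from moviepy.editor import VideoFileClip, vfx  # assumption: moviepy 1.x API

    clip = VideoFileClip(path)

    start = time.perf_counter()
    for t in reverse_times(clip.duration, clip.fps):
        clip.get_frame(t)                   # backwards: one seek per frame
    manual = time.perf_counter() - start

    mirrored = clip.fx(vfx.time_mirror)     # reversed timeline as a new clip
    n = int(mirrored.duration * mirrored.fps)
    start = time.perf_counter()
    for i in range(n):
        mirrored.get_frame(i / mirrored.fps)  # forward access, reversed content
    fx_time = time.perf_counter() - start

    return manual, fx_time
```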
Also, can we do a certain amount of the rendering ‘up front’?
This also leads to the question of where we are bound: the speed of the SD card read (I’m reading off the SD card, not USB, fortunately!) and/or the processing required, e.g. decompressing the data stream… is there a way of doing this ‘upfront’?
(Or, put another way, why was the 720p video I tried so much slower than the small video I initially tested with?)
I also need to think about the possibilities of ‘upfront rendering’.
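One simple form of ‘upfront rendering’ is decoding every frame once into memory, trading RAM for CPU at draw time. A sketch (the function names are mine; memory is the open question here, since a raw 720p RGB frame is roughly 2.8 MB, so this really only suits short, low-res clips):

```python
def predecode(get_frame, duration, fps):
    """Decode every frame up front so playback is a plain list lookup."""
    n = int(duration * fps)
    return [get_frame(i / fps) for i in range(n)]

def predecode_clip(path="loop.mp4"):
    """E.g. frames = predecode_clip(); then blit frames[i % len(frames)] each draw."""
    from moviepy.editor import VideoFileClip  # assumption: moviepy 1.x API
    clip = VideoFileClip(path)
    return predecode(clip.get_frame, clip.duration, clip.fps)
```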
One other thing to note: there is a ‘cost’ to loading a movie, so if you had a lot of movies in modes it might take a few seconds for the ETC to start… I also need to check memory consumption to see if this is viable.
Also, I’d like to look at the ETC/OTC use of Python/pygame; in particular, can I give it a higher priority, or set up the graphics rendering to be more efficient?
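On the priority question, one low-effort experiment is `os.nice` (Unix only; going below a nice value of 0 needs root, and whether the ETC/OTC scripts run with that privilege is an assumption that would need checking):

```python
import os

def bump_priority(delta=-5):
    """Try to lower this process's nice value (i.e. raise its CPU priority).
    Falls back to reporting the current niceness if not permitted."""
    try:
        return os.nice(delta)  # needs root to go below 0 on Linux
    except OSError:
        return os.nice(0)      # unchanged; we weren't allowed to raise priority
```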
This is all rather experimental, as moviepy is not very well documented and there are very few examples of pygame with moviepy, hence the many questions.
So, as I said… I’ve a suspicion this is going to be not so much about ‘is X possible?’ but rather about thinking creatively about the constraints we find, and seeing what we can make within them…
EDIT: OK, got ETC running on my desktop now; it’s much easier to develop the graphics modes here…
Though I think I now need to develop something for the Organelle so I can use it as a control surface.