ETC: Modify output resolution to 640x480?

Hi folks!

First post from a long-time user and fan. I’ve been running my ETC through a cheap HDMI-to-composite converter for the past year or so, as part of a larger analog video rig that outputs to a series of CRT screens at 640x480. The converter itself (Kanexpro HDRCA) will accept any resolution it’s given and cram it down to fit the smaller screen. Aside from the animations looking squashed, the video mixer (Panasonic MX10) that I feed the converted signal into also seems to lose colour. One suggestion I’ve seen is to make sure the source feed (in this case the converted HDMI signal) is already 640x480.

This brings me to my question: is there a way for me to get into the system files and force 640x480 output so that the converter doesn’t need to “squash” the image? I’ve seen conversations about increasing the resolution, but I don’t see anything that discusses reducing the output resolution of the ETC. Would it suffice to simply update the code in the modes to reflect the lower resolution, or is the output resolution of the ETC defined somewhere within the OS?

Or, am I perhaps overthinking/off-base here and there is a simpler solution?

Thanks in advance for any suggestions that you may have!


Hello

There is an easy part and a difficult part about changing the resolution:

1.) The resolution is defined in /root/ETC_Mother/etc_system.py on line 19:
RES = (1280,720)
So much for the easy part.
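
Changing it to 640x480 would then presumably just mean editing that line (untested, but it is the straightforward reading of the above):

RES = (640,480)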

2.) Now for the difficult part.
Many, many other settings rely on these dimensions, and you would have to change all of them.
I don’t want to go into the other system settings for now, but presumably each and every mode relies on these sizes. A simple search for the number 1280 in the standard modes brings up the following:

$ find . -type f -exec grep -H 1280 {} \;
./0 - Sound & MIDI Evaluation/main.py:    pygame.draw.rect(screen, (255,255,255), ((0,0),(1280,720)), 0)
./0 - Sound & MIDI Evaluation/main.py:    pygame.draw.line(screen, (255,0,0), (0,239), (1280,239), 1)
./0 - Sound & MIDI Evaluation/main.py:    pygame.draw.line(screen, (255,0,0), (0,481), (1280,481), 1)
./S - Aquarium/main.py:            modSpeed = countList[i]%(1280+width*2)
./S - Bits Horizontal/main.py:xpos = [random.randrange(-200, 1280) for i in range(0, lineAmt+2)]
./S - Bits Horizontal/main.py:        xpos = [random.randrange(-200, 1280) for i in range(0, lineAmt+2)]
./S - Bits Vertical/main.py:width = 1280
./S - Circle Row/main.py:    space = (1280/circles)
./S - Classic Vertical/main.py:    position = int(etc.knob1*1280)
./S - Concentric/main.py:    x = int(etc.knob1*1280)
./S - Cone Scope/main.py:    x0 = (int(etc.knob1*1280))
./S - Cone Scope/main.py:    #pygame.draw.line(screen, color, [x0 + (640 - (int(etc.knob1*1280))), y + i], [x1 + (640 - (int(etc.knob2 *1280))), y+10], 10)
./S - Connected Scope/main.py:        pygame.draw.rect(screen, etc.color_picker(), [random.randrange(0,1280),random.randrange(0,720),5,5], 0)
./S - Connected Scope/main.py:    xoffset = (1280 - (99*10)) // 2
./S - Connected Scope/main.py:    xoffset = (1280 - (99*10)) // 2
./S - Feynman/main.py:    pygame.gfxdraw.filled_polygon(screen, [(640-offset3,0),(offset1,offset1),(1280-offset1,offset1),(640+offset3,0)], (etc.bg_color[0]*.8,etc.bg_color[1]*.8,etc.bg_color[2]*.8))
./S - Feynman/main.py:    pygame.gfxdraw.filled_polygon(screen, [(1280,0+offset2),(1280-offset1,360),(1280,720-offset2)], (etc.bg_color[0]*.7,etc.bg_color[1]*.7,etc.bg_color[2]*.7))
./S - Feynman/main.py:    pygame.gfxdraw.filled_polygon(screen, [(640+offset3,720),(1280-offset1,720-offset1),(offset1,720-offset1),(640-offset3,720)], (etc.bg_color[0]*.6,etc.bg_color[1]*.6,etc.bg_color[2]*.6))
./S - Feynman/main.py:        x = 1280
./S - Football Scope/main.py:    x1 = (int(etc.knob1 * 1280) ) + (etc.audio_in[i] / 64)#random.randrange(0,1920)
./S - Grid Circles/main.py:        xoffset = int(etc.knob1*(1280/8))
./S - Grid Circles/main.py:            x = (j*(1280/8))-(1280/8)
./S - Grid Circles/main.py:                x = j*(1280/8)-(1280/8)+xoffset
./S - Grid Circles - Filled/main.py:        xoffset = int(etc.knob1*(1280/8))
./S - Grid Circles - Filled/main.py:            x = (j*(1280/8))-(1280/8)
./S - Grid Circles - Filled/main.py:                x = j*(1280/8)-(1280/8)+xoffset
./S - Grid Circles - Filled/main.py:            #else : x = j*(1280/8)
./S - Grid Polygons Zoom/main.py:        xoffset = int(etc.knob1*(1280/8))
./S - Grid Polygons Zoom/main.py:            x = (j*(1280/8))-(1280/8)
./S - Grid Polygons Zoom/main.py:                x = j*(1280/8)-(1280/8)+xoffset
./S - Grid Random Polygons/main.py:        xoffset = int(etc.knob1*(1280/8))
./S - Grid Random Polygons/main.py:            x = (j*(1280/8))-(1280/8)
./S - Grid Random Polygons/main.py:                x = j*(1280/8)-(1280/8)+xoffset
./S - Grid Random Polygons - Filled/main.py:        xoffset = int(etc.knob1*(1280/8))
./S - Grid Random Polygons - Filled/main.py:            x = (j*(1280/8))-(1280/8)
./S - Grid Random Polygons - Filled/main.py:                x = j*(1280/8)-(1280/8)+xoffset
./S - Grid Squares/main.py:        xoffset = int(etc.knob1*(1280/8))
./S - Grid Squares/main.py:            x = (j*(1280/8))-(1280/8)
./S - Grid Squares/main.py:                x = j*(1280/8)-(1280/8)+xoffset
./S - Grid Squares - Filled/main.py:        xoffset = int(etc.knob1*(1280/8))
./S - Grid Squares - Filled/main.py:            x = (j*(1280/8))-(1280/8)
./S - Grid Squares - Filled/main.py:                x = j*(1280/8)-(1280/8)+xoffset
./S - Grid Triangles/main.py:        xoffset = int(etc.knob1*(1280/8))
./S - Grid Triangles/main.py:            x = (j*(1280/8))-(1280/8)
./S - Grid Triangles/main.py:                x = j*(1280/8)-(1280/8)+xoffset
./S - Grid Triangles - Filled/main.py:        xoffset = int(etc.knob1*(1280/8))
./S - Grid Triangles - Filled/main.py:            x = (j*(1280/8))-(1280/8)
./S - Grid Triangles - Filled/main.py:                x = j*(1280/8)-(1280/8)+xoffset
./S - Interfering Lines/main.py:    #x = int(etc.knob1*1280)
./S - Interfering Lines/main.py:            pygame.draw.aalines(screen, color, True, [[0, R-i*point], [640, L-i*fan], [1280, T-i*point], [960, E-i*fan], [340, F-i]], 1)
./S - Interfering Lines/main.py:            pygame.draw.aalines(screen, color, True, [[0, R+i*point], [640, L+i*fan], [1280, T+i*point], [960, E+i*fan], [340, F+i]], 1)
./S - Mirror Grid/main.py:        #space = int(1280/lines)
./S - Mirror Grid/main.py:        #space = int(1280/lines)
./S - Mirror Grid/main.py:        pygame.draw.line(screen, color, (0,space), (1280,space), linewidth)
./S - Mirror Grid Inverse/main.py:        pygame.draw.line(screen, color, (-1,j*spacevert), (1280,j*spacevert), linewidth)
./S - Perspective Lines/main.py:    last_point = [(int(etc.knob1*1280)), (int(etc.knob2*720))]
./S - Sound Jaws/main.py:    teethwidth = int(1280-128*teeth)
./S - Sound Jaws/main.py:    if teethwidth == 0 : teethwidth = 128#1280-(teeth*51)
./S - Square Shadows/main.py:    x = int(etc.knob2*1280) + (etc.audio_in[i * 4] / 35)
./S - Tall Towers/main.py:    #pygame.draw.rect(screen, colorr, [random.randrange(0,1280),random.randrange(0,720),5,5], 0)
./S - Triangle Row/main.py:    space = (1280/triangles)
./S - Two Scopes/main.py:        x0 = int(etc.knob1*1280) 
./S - Two Scopes/main.py:        x0 = int(etc.knob2*1280)
./S - Zoom Scope/main.py:    offy = int(etc.knob3 * 1280) 
./T - Ball of Mirrors/main.py:last_screen = pygame.Surface((1280,720))
./T - Ball of Mirrors/main.py:        x = random.randrange(0,1280)
./T - Ball of Mirrors/main.py:    thing = pygame.transform.scale(image, (int(etc.knob3 * 1280), int(etc.knob4 * 720) ) )
./T - Ball of Mirrors/main.py:    screen.blit(thing, (int(etc.knob1 * 1280), int(etc.knob2 * 720)))
./T - Bits H/main.py:width = 1280
./T - Bits H/main.py:xpos = [random.randrange(-200,1280) for i in range(0, lineAmt + 2)]
./T - Bits H/main.py:        xpos = [random.randrange(-200,1280) for i in range(0, lineAmt + 2)]
./T - Draws Hashmarks/main.py:        pygame.draw.line(screen, color, (0,(j*linespace)+linespace/2), (1280,(j*linespace)+linespace/2), linewidth)
./T - Fonts Grid/main.py:vwidth = 1280 / vlines
./T - Fonts Grid/main.py:        pygame.draw.line(screen,gridColor, [hlength, (i * hwidth) -1], [1280 - hlength, (i * hwidth) - 1 ], linewidth)
./T - Fonts Grid/main.py:        pygame.draw.line(screen, gridColor, [0,720],[1280,720], linewidth)
./T - Fonts Grid/main.py:        pygame.draw.line(screen, gridColor, [1280,0],[1280,720], linewidth)
./T - Holzer Scroll/main.py:scrollX = 1280 
./T - Holzer Scroll/main.py:        scrollX = 1280 + text.get_width() 
./T - Line Rotate/main.py:    L = etc.knob2*1280 + linewidth*1.78
./T - Line Rotate/main.py:    if L > 1280 : L = 1280
./T - Origami Triangles/main.py:    posx = int(etc.knob1*1280)
./T - Reckie/main.py:    if myRect.center[0] >= 1280 : myRect.center = (640,360)
./T - Rotation Grids/main.py:vwidth = 1280/lines

And this is only the list of places where the screen width appears as a literal number. There are also things like objects 320 pixels wide that are drawn four times side by side.

I think you get an idea of what a change of resolution would mean :slight_smile:
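
To give an idea of the direction, a mode could read its dimensions from the surface it draws to instead of hardcoding 1280 and 720. A rough, untested sketch (the circle count is made up for illustration, and I’m assuming the usual draw(screen, etc) mode entry point):

    import pygame

    # untested sketch: derive dimensions from the surface instead of hardcoding 1280x720
    def draw(screen, etc):
        width = screen.get_width()    # 1280 today, 640 after a resolution change
        height = screen.get_height()  # 720 today, 480 after a resolution change
        circles = 8                   # made-up count, purely for illustration
        space = width / circles       # instead of space = (1280/circles)
        for i in range(circles):
            x = int(i * space + space / 2)
            pygame.draw.circle(screen, etc.color_picker(), (x, height // 2), int(space / 8), 0)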

Florian


Hi Florian, thanks for your response!

Looks like it would indeed be a lot of work to properly re-scale the various modes. I’m not afraid of putting in some tedious work to get it done, but I’ll first try adjusting the resolution to see whether it actually fixes the colour problem that I’ve been having. If it does, then it will definitely be worth editing the modes accordingly.

Regarding the setting in /root/ETC_Mother/etc_system.py, I have a beginner’s question. I tried removing the microSD card from the ETC to make these changes, but I get an error stating that the memory card is not formatted and cannot be read. Is there a proper way to gain access to its contents? I’m using macOS.

Thanks again for your help so far!

Dan


This won’t work, because the SD card is formatted with a Linux filesystem.

Establish SSH access to the ETC (it is described somewhere here on the forum).
Then use scp on a Mac or WinSCP on Windows to transfer the file, edit it, and transfer it back (or edit it directly on the ETC using vi or emacs, if you are used to Linux); a rough sketch of that workflow follows below.
Always make a backup copy of every edited file in its directory on the ETC before editing/transferring it.
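
Roughly like this from a Mac terminal (only a sketch: the IP address is made up, and I’m assuming a root login since the file lives under /root):

    # example only: 192.168.1.77 stands in for the ETC’s actual address
    ssh root@192.168.1.77 "cp /root/ETC_Mother/etc_system.py /root/ETC_Mother/etc_system.py.bak"
    scp root@192.168.1.77:/root/ETC_Mother/etc_system.py .
    # ...edit etc_system.py locally, then copy it back...
    scp etc_system.py root@192.168.1.77:/root/ETC_Mother/etc_system.py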

I may try this on my laptop version of the ETC program, but I won’t be able to do it before next week. Nevertheless, I am quite interested in it, as I also use the ETC with an HDMI-to-composite converter.


Aaah, here it is
https://forum.critterandguitari.com/t/etc-as-running-ftp-or-ssh-instead-of-webserver/
(Check the later questions from the user MungoBBQ in that thread too.)


OK, thank you very much! I’ll take a look at the thread and will post back once I’ve had the time to try these changes.

Much appreciated!! :smile:

Hi

OK, I quickly tried it out: it works as I described. There is a slight increase in colour intensity when I drive the HDMI-to-composite converter with 640x480, but not enough that it would be worth all the changes in the modes - at least in my opinion (my converter looks quite similar to your Kanexpro HDRCA). I also had the impression that the converter took longer to find the right sync, and that it sometimes had brief trouble when the picture switched from complete black to a colour.

I think it is easier to increase the colour intensity on the projector or the screen.

best regards
Florian

Oh very cool! Thanks for the update! I will still give it a shot eventually, as the issue that I’m having seems to be with the video mixer not producing any colour at all when I route the converter to it.

Previously, I was running the converter through an old DVD player for its time-base correction, but that introduced a lot of latency. When I run the converter straight to the mixer, the latency is gone, but the image is only in B&W.

I’ll look into some other options as well, and will probably still try adjusting the resolution at some point to see if the MX10 responds better to direct input that way.

Thanks so much for your input and tips, I greatly appreciate it!

All the best,
Dan

I thought a little about your (and indeed my) problem, and I think what we need is a video buffer amplifier that provides a correct 75 Ohm, 1 Vpp signal.

I found a page that describes the electronics and also provides two example pictures, one without the buffer and one with it:
https://www.raphnet.net/electronique/videobuf/index_en.php

He writes: “The vertical stripes are much less pronounced. In fact, they are as faint as they were with the original.”

I think it is also clearly visible in those pictures that the colours are much better.

Unfortunately I only found DIY pages, but none with a device one could actually buy.

Hold it! :wink:

I found something:

https://www.camboard.de/elektronik-kramer-extron-gefen/kramer-electronic-produkte/kramer-composite-video-schalter/kramer-pt-102-v.html

(That’s fine for me as I live in Germany; I don’t know where you’re from.)

Hmmm, very interesting! So do you also have the same problem of losing colour in the ETC -> converter -> CRT configuration?

I’ve found that same Kramer amplifier on a Canadian site (I’m from Montreal), but it’s a little more expensive than I would want to pay. However, if it’s the only solution, then it will be worth it. Do you think you will pick one up? If you do, I’d love to know whether it works!

I may have access to another means of amplifying the signal. I won’t be able to test until next week, but when I do I’ll definitely advise you of the outcome!

I’m not sure if you still need to go to 640x480; if you do, a couple of thoughts…

Also be aware that the hardware framebuffer is configured in uEnv.txt. I think it’s very likely that it will need updating, to ensure the buffer is initialised correctly both in the kernel and in fsquares, which is run automatically just before Python to set up the buffer.

I’ll be honest, my experience with changing resolution on the SOM used by the ETC has not been that great. From what I can tell, the (fairly old) kernel driver is not reacting properly to the HDMI information sent by the device (EDID?), or at least not from some devices, which I think is why it acts up a bit on some HDMI hardware.

For testing purposes I would not go around changing the modes.
What you should be able to do is change things so that the modes still draw to a 1280x720 buffer, but that buffer is then blitted down to 640x480 (either by scaling or cropping).
Look in etc.py for hwscreen and screen: where they are initialised, and then where blit is used later in the main render loop.

If you get that working, then for sure updating the modes is a good idea (in fact I’d just remove the hardcoding), as you can get better performance at a lower resolution, so perhaps extend them a bit.


Note: I don’t have an ETC, I have an Organelle, but it’s the same basic hardware, and my tests on it with a much newer kernel show that the HDMI support is much better.
(Unfortunately, getting the newer kernel requires rebuilding the entire OS image, so it’s a pretty big task.)

Hey @thetechnobear, thanks for the suggestion! I’ll check this out as a potential test for sure, but as a Python novice, it’ll probably take me some time to figure all of this out. You guys have been incredibly helpful, and I greatly appreciate your input! Over the next little while I’ll try some of these solutions out and will post back here with the results.

Thanks again!

Hello Dan

In the meantime I bought a used Kramer amplifier in an eBay auction for ~50 euros.

There are two trimmers in the box: LEVEL and EQ.
The LEVEL trimmer simply makes the overall signal brighter - meaning a black background becomes greyer with more level. Presumably this is needed to compensate for incorrect loads or long cables.
The EQ trimmer is described in the manual as “Adjusts the cable compensation equalization level”.
My impression is that it does not change anything over most of the trimmer’s range, but there is a very narrow spot where the picture seems to become more intense. Unfortunately there isn’t any further explanation of this parameter in the manual. I will have to test it under different conditions (various cable materials and lengths).

All in all:
After my test with a short video cable (3 metres) the result is OK. The picture gets somewhat sharper (which confirms what the author of the raphnet link describes), but the colour intensity did not change that much.
I think it will be quite useful with the longer cables that are typical in stage installations.

Regards
Florian

EDIT: I found https://cdn.kramerav.com/web/downloads/manuals/cable.pdf, which explains the “EQ” parameter and also provides a lot of information about cables in the video realm.

So I gave this a bit of a go on OTC, but it should be the same (or very similar) for the ETC.
The lines you need to change in etc.py are below.
(Note: each commented-out line shows the original, so you can search for it :wink:, and the line after it is the new replacement.)

Here we change the resolution of the hardware display to 640x480:

    #hwscreen = pygame.display.set_mode(etc.RES,  pygame.FULLSCREEN | pygame.DOUBLEBUF, 32)
    hwscreen = pygame.display.set_mode((640,480),  pygame.FULLSCREEN | pygame.DOUBLEBUF, 32)

Create a surface at the full resolution, so the modes are ‘unaffected’:

#screen = hwscreen.copy()
screen = pygame.Surface(etc.RES,0,32)

Scale the ‘virtual screen’ that the modes drew to, then blit it to the hardware screen:

    #hwscreen.blit(screen, (0,0))
    hwscreen.blit(pygame.transform.scale(screen, (640,480)),(0,0))

Performance seemed fine on my Organelle.

Note: 640x480 has a different aspect ratio from 1280x720, so the image looks squashed. You could scale the image down further to maintain the aspect ratio and then shift the blit to centre it - see the sketch below.
Depending on the modes you use, that may or may not be necessary; I only really noticed because I was drawing circles and they came out slightly oval :wink:
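
Something along these lines should do it, as an untested sketch building on the replacement lines above:

    # scale 1280x720 down to 640x360 to keep the 16:9 aspect ratio,
    # then centre it vertically in the 640x480 frame with black bars top and bottom
    hwscreen.fill((0, 0, 0))
    scaled = pygame.transform.scale(screen, (640, 360))
    hwscreen.blit(scaled, (0, (480 - 360) // 2))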


(*) I was trying this to see if I could get better performance for remote X11 display, and better frame rates from direct frame grabbing on the Organelle… I can get about 10 fps at 720p, but thought that at 480p I might well be closer to 24-30.

I’ll throw my screen capture command here too, just so I can find it later :wink:

ffmpeg -cpuflags vfpv3 -f fbdev -i /dev/fb0 -r 24 -c:v libx264rgb  -preset ultrafast -crf 0 -b:v 500k /sdcard/output.avi

It gets about 10 fps on my quad-core Organelle at full resolution…
Note: I record to the SD card, which I suspect is quite a bit faster than USB, but the fps seems to be limited by the CPU, even on its own core. I think I might just have to run this on a desktop to capture the visuals :wink:
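
If the CPU really is the limit, scaling down during capture might buy a few frames; an untested variation of the same command:

    ffmpeg -cpuflags vfpv3 -f fbdev -i /dev/fb0 -r 24 -vf scale=640:360 -c:v libx264rgb -preset ultrafast -crf 0 /sdcard/output_small.avi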

Hey Florian! Thanks, it’s good to know what the amplifier will and won’t achieve! We’ve been finding ourselves in situations with long RCA runs more and more frequently, so it’s good to know that this will help with some distant CRTs. Much appreciated!!

Hi @danofsteel

After various experiments I finally ended up spending money on an Atomos ATOMCCNSA1. It is a high-quality HDMI (and SDI) to analogue video converter. Using the control software you can select between various colour profiles, set brightness, contrast, luminance and hue, and store the settings on the device. If required, you can change the basic settings without a computer using DIP switches.

It costs 360 to 400 euros (presumably similar numbers in dollars in the US). It’s a lot of money, but the result now is really OK.

Hey Florian, that’s super cool - seems like an effective solution, albeit an expensive one, haha! For the time being I’ve had to put the ETC work on the back burner, but I’ll look into some of these options down the line. I can imagine that having some analogue control over contrast would be nice for the ETC, since I’ve noticed that the blacks aren’t as deep as they could be with the converter that I currently have.

Cheers!