RenderTarget to UDP broadcast

Hi, I currently use ffmpeg to do MPEG-TS H.264 UDP broadcast on the LAN by piping ffmpeg into socat.

Can anybody point me in the direction of how to get a RenderTarget encoded with ffmpeg, but with UDP output?

Hi @supernihil! Welcome to the forum :slight_smile:

You can probably use the VideoWriter which has a frame() method that takes a colorBuffer. The RenderTarget has a colorBuffer, so you could call myVideoWriter.frame(myRenderTarget.colorBuffer(0)).
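A minimal sketch of that wiring (assuming the VideoWriter builder API — create() / size() / output() / start() / frame() — from the openrndr ffmpeg module; exact names may differ between versions, and the drawing is just placeholder content):

```kotlin
import org.openrndr.application
import org.openrndr.color.ColorRGBa
import org.openrndr.draw.isolatedWithTarget
import org.openrndr.draw.renderTarget
import org.openrndr.ffmpeg.VideoWriter

fun main() = application {
    configure { width = 768; height = 576 }
    program {
        val rt = renderTarget(width, height) {
            colorBuffer()
            depthBuffer()
        }
        val writer = VideoWriter.create()
            .size(width, height)
            .output("out.mp4")
            .start()

        extend {
            drawer.isolatedWithTarget(rt) {
                clear(ColorRGBa.BLACK)
                fill = ColorRGBa.PINK
                circle(width / 2.0, height / 2.0, 100.0)
            }
            // feed the render target's color attachment to the writer
            writer.frame(rt.colorBuffer(0))
            // also show it on screen
            drawer.image(rt.colorBuffer(0))
        }
    }
}
```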

Maybe you could create a VideoWriterProfile like the ones found in orx/orx-jvm/orx-video-profiles/src/main/kotlin at master · openrndr/orx · GitHub to broadcast your graphics, basically by overriding arguments and setting all the ffmpeg arguments you use for piping things.
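For instance, a hedged sketch of such a profile (assuming VideoWriterProfile exposes an overridable arguments(): Array&lt;String&gt; and a fileExtension, like the profiles in that orx folder; the flags are the streaming ones from this thread and are untested in this combination):

```kotlin
import org.openrndr.ffmpeg.VideoWriterProfile

// Hypothetical profile: emit low-latency MPEG-TS instead of an .mp4 file.
class MPEGTSProfile : VideoWriterProfile() {
    // probably ignored when the "filename" is a pipe or URL
    override val fileExtension = "ts"

    override fun arguments(): Array<String> = arrayOf(
        "-c:v", "libx264",
        "-tune", "zerolatency",
        "-muxdelay", "0",
        "-flags2", "+fast",
        "-f", "mpegts"
    )
}
```

You could then assign it to the recorder, e.g. `extend(ScreenRecorder()) { profile = MPEGTSProfile() }`, assuming the extension exposes a profile property.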

Might that work?


Yup! That sounds like it could work. Could I then avoid calling the .output part, as that would be taken care of in the args ("pipe:1" | socat etc.)?

Looking at VideoWriter I wonder if it will need some tweaking…

Calling output sets the filename. The filename is appended automatically as the last argument, which may break your command. As a workaround, I would try calling output("#foo.mp4"), which adds a harmless bash comment as the last argument.

First I thought about calling output(""), but line 191 adds the extension if it’s not present, so the filename might become “.mp4”.

Let us know how it goes :slight_smile:

ps. Alternatively, download VideoWriter.kt and add it to your project, making any needed changes. Later I could change it upstream to take into account cases like this.


I gave it a try, but it’s a bit more convoluted than I thought:

  • In VideoWriter.kt, the preamble arguments currently cannot be overridden. You can discover the ffmpeg arguments used by the ScreenRecorder by opening log4j2.yaml and changing the log level on line ~18 to level: debug. I see the following: /usr/bin/ffmpeg, -y, -f, rawvideo, -vcodec, rawvideo, -s, 768x576, -pix_fmt, rgba, -r, 30, -i, -, -an, #foo.mp4
  • Throwing a copy of VideoWriter.kt into your project to customize it sounded like a nice idea, but it requires dealing with these dependencies:
    import io.github.oshai.kotlinlogging.KotlinLogging
    import org.bytedeco.ffmpeg.ffmpeg
    import org.bytedeco.javacpp.Loader
    import org.lwjgl.BufferUtils

As things stand, I think the simplest option for now is to clone the openrndr repo, customize VideoWriter.kt and build a local snapshot which you can then use in your template.


Those preamble arguments are not bad at all; they only deal with getting the pixel format from OPENRNDR into ffmpeg's stdin.

Currently I use this for demo purposes:

# sender
ffmpeg -hide_banner -threads 1 -filter_threads 1 -f lavfi -i \
"testsrc=size=hd1080:rate=60,drawtext=text='%{localtime\:%S-%6N}':fontsize=144:box=1:boxcolor=black:fontcolor=yellow:y=(main_h/2)-text_h,format=pix_fmts=yuv420p" \
-threads 0 -frame_drop_threshold -1 -g 1 -fps_mode:v vfr \
-c:v libx264 -tune zerolatency -muxdelay 0 -flags2 '+fast' \
-f mpegts "pipe:1" | socat - udp-sendto:255.255.255.255:12345,broadcast

#receiver (can be many since we use udp broadcast)
socat -u udp-recv:12345,reuseaddr - | mpv --no-cache --untimed --profile=low-latency -no-correct-pts --fps=60 --osc=no -

So with your preamble it would then be nice to say:

/usr/bin/ffmpeg, -y, -f, rawvideo, -vcodec, rawvideo, -s, 768x576, -pix_fmt, rgba, -r, 30, -i, -, -an, 

# and somehow append the following:

-threads 0 -frame_drop_threshold -1 -g 1 -fps_mode:v vfr \
-c:v libx264 -tune zerolatency -muxdelay 0 -flags2 '+fast' \
-f mpegts "pipe:1" | socat - udp-sendto:255.255.255.255:12345,broadcast
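Concatenating the two halves, the full pipeline might look like this (an untested sketch; whether VideoWriter tolerates "pipe:1" in place of a filename is exactly the open question above):

```shell
ffmpeg -y -f rawvideo -vcodec rawvideo -s 768x576 -pix_fmt rgba -r 30 -i - -an \
  -threads 0 -frame_drop_threshold -1 -g 1 -fps_mode:v vfr \
  -c:v libx264 -tune zerolatency -muxdelay 0 -flags2 '+fast' \
  -f mpegts "pipe:1" | socat - udp-sendto:255.255.255.255:12345,broadcast
```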

I made something work :slight_smile:

I made ffmpeg send via UDP directly, instead of using a pipe.

Maybe there’s a simpler way?

I tried to view the output in VLC (didn’t work), ffplay (worked with fast frame rate) and obs (worked with slow frame rate).

Basically I added includeDefaultArguments to allow fully customizing the arguments when set to false.
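As a sketch of what that could look like (includeDefaultArguments is the prototype flag described here, not upstream API; where it lives and its exact shape are guesses, and the broadcast address is an example):

```kotlin
import org.openrndr.ffmpeg.VideoWriterProfile

// Hypothetical: a profile that supplies *all* ffmpeg arguments itself when the
// prototype includeDefaultArguments flag is false, sending MPEG-TS over UDP
// directly instead of piping into socat.
class UDPStreamProfile : VideoWriterProfile() {
    override val fileExtension = "ts" // likely unused in this mode

    // prototype flag (not upstream API): when false, VideoWriter would skip
    // the rawvideo preamble and the auto-appended filename
    val includeDefaultArguments = false

    override fun arguments(): Array<String> = arrayOf(
        "-y", "-f", "rawvideo", "-vcodec", "rawvideo",
        "-s", "768x576", "-pix_fmt", "rgba", "-r", "30", "-i", "-", "-an",
        "-c:v", "libx264", "-tune", "zerolatency", "-muxdelay", "0",
        "-f", "mpegts", "udp://192.168.1.255:12345" // example address
    )
}
```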

It’s a prototype :slight_smile:

ps. Replacing the preamble maybe wasn’t necessary.

update: it does work without changing the framework, but in that case it produces an mp4 video file (even when streaming) and I don’t know how to avoid that.

Yeah, I tried that as well, basically running my own ffmpeg process, but the issue then is that ffmpeg's UDP output is not "broadcast" in the sense of one-to-many. It's something else (undocumented, it seems …). This is why I use socat.

I haven't tried it, but for running pipes one might need one of these approaches?
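One JDK-only way to run such a pipeline from Kotlin without invoking a shell is ProcessBuilder.startPipeline (JDK 9+). A hedged sketch reusing the arguments from the commands above (binary paths may differ on your system, and the frame-writing loop is left out):

```kotlin
fun main() {
    // ffmpeg reads raw RGBA frames on stdin and emits MPEG-TS on stdout
    val ffmpeg = ProcessBuilder(
        "ffmpeg", "-y", "-f", "rawvideo", "-vcodec", "rawvideo",
        "-s", "768x576", "-pix_fmt", "rgba", "-r", "30", "-i", "-", "-an",
        "-c:v", "libx264", "-tune", "zerolatency", "-muxdelay", "0",
        "-f", "mpegts", "pipe:1"
    )
    // socat broadcasts ffmpeg's stdout over UDP
    val socat = ProcessBuilder(
        "socat", "-", "udp-sendto:255.255.255.255:12345,broadcast"
    ).redirectErrorStream(true)

    // wires ffmpeg's stdout to socat's stdin, like a shell "|"
    val processes = ProcessBuilder.startPipeline(listOf(ffmpeg, socat))

    // write raw RGBA frames into the head of the pipeline from the draw loop
    val ffmpegStdin = processes.first().outputStream
}
```

Alternatively, `ProcessBuilder("bash", "-c", fullCommand)` hands the whole pipe to the shell, at the cost of quoting headaches.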


Amazing that you made it work! Congratulations :slight_smile: Did you use the ScreenRecorder / VideoWriter? Or did you write your own?

Ah I see at

:slight_smile: Great!!