Render Target video exporting question

I have successfully written a simple program inside my main program that exports a video off-screen using Render Targets.
The problem is: my computer sucks. Therefore, the video output sucks as well, with lots of lag and frameskipping.
I want to ensure that the exported video has 100% of the frames exported, even if it takes a long time. I guess this means decoupling video exporting from video playback. I was able to do this in Processing (but it was slooow and it was pretty much a sloppy workaround).

Here’s my current simple exporting program:

import org.openrndr.application
import org.openrndr.color.ColorRGBa
import org.openrndr.draw.renderTarget
import org.openrndr.draw.isolatedWithTarget
import org.openrndr.ffmpeg.VideoPlayerFFMPEG
import org.openrndr.ffmpeg.VideoWriter
import utils.MediaManager
import utils.MediaType
import java.io.File
import java.util.logging.Logger

fun main() = application {
    configure {
        width = 960
        height = 540
        title = "Off-Screen Video Export with Progress Counter"
    }

    program {
        val logger = Logger.getLogger("SimpleVideoPlayer")
        val mediaManager = MediaManager()
        val videoFile = File("c:\\videos\\godgangs\\playlist\\01.mp4")

        var videoPlayer: VideoPlayerFFMPEG? = null
        val playDuration = 10_000L // 10 seconds in milliseconds
        var startTime: Long? = null
        var videoStopped = false
        var elapsedTime = 0L

        // Set up the video writer
        val videoWriter = VideoWriter()
        videoWriter.size(width, height)
        videoWriter.output("video/output.mp4")
        videoWriter.start()

        // Set up the render target
        val videoTarget = renderTarget(width, height) {
            colorBuffer()
            depthBuffer()
        }

        extend {
            // Render video frames off-screen
            drawer.isolatedWithTarget(videoTarget) {
                clear(ColorRGBa.BLACK)

                if (videoPlayer == null) {
                    logger.info("Loading video: ${videoFile.absolutePath}")
                    videoPlayer = mediaManager.loadMedia(videoFile, MediaType.VIDEO) as? VideoPlayerFFMPEG
                    videoPlayer?.play()
                    startTime = System.currentTimeMillis()
                }

                val currentTime = System.currentTimeMillis()
                startTime?.let {
                    elapsedTime = currentTime - it
                    if (elapsedTime >= playDuration && !videoStopped) {
                        videoPlayer?.pause()
                        videoPlayer?.seek(0.0)
                        videoStopped = true
                        logger.info("Stopping video after 10 seconds")
                    } else if (!videoStopped) {
                        videoPlayer?.draw(drawer, 0.0, 0.0, width.toDouble(), height.toDouble())
                    }
                }

                // Write the frame to the video
                videoWriter.frame(videoTarget.colorBuffer(0))
            }

            // Display a black screen with a progress counter
            drawer.clear(ColorRGBa.BLACK)
            drawer.fill = ColorRGBa.WHITE
            val timeString = String.format("%.1f", elapsedTime / 1000.0)
            drawer.text("Exporting... Time: $timeString s", width / 2.0 - 100.0, height / 2.0)

            // Stop the video writer after the play duration
            if (elapsedTime >= playDuration) {
                videoWriter.stop()
                application.exit()
            }
        }

        ended.listen {
            logger.info("Application ending")
            mediaManager.disposeAll()
        }
    }
}

It uses this MediaManager, which handles media throughout my application:

package utils

import org.openrndr.draw.ColorBuffer
import org.openrndr.ffmpeg.VideoPlayerFFMPEG
import java.io.File
import java.util.logging.Logger

private val logger = Logger.getLogger(Constants.MEDIA_MANAGER_LOGGER_NAME)

enum class MediaType {
    IMAGE, VIDEO
}

class MediaManager {
    private val loadedImages = mutableMapOf<File, ColorBuffer>()
    private val loadedVideos = mutableMapOf<File, VideoPlayerFFMPEG>()

    fun loadMedia(file: File, mediaType: MediaType): Any? {
        return when (mediaType) {
            MediaType.IMAGE -> loadImage(file)
            MediaType.VIDEO -> loadVideo(file)
        }
    }

    private fun loadImage(file: File): ColorBuffer? {
        return loadedImages[file] ?: tryWithLogging("load image ${file.name}", logger) {
            logger.info("Loading image: ${file.name}")
            val image = org.openrndr.draw.loadImage(file)
            loadedImages[file] = image
            logger.info("Image loaded successfully: ${file.name}")
            image
        }
    }

    private fun loadVideo(file: File): VideoPlayerFFMPEG? {
        return loadedVideos[file] ?: tryWithLogging("load video ${file.name}", logger) {
            logger.info("Loading video: ${file.name}")
            val video = VideoPlayerFFMPEG.fromFile(file.absolutePath)
            video.ended.listen { video.restart() }
            loadedVideos[file] = video
            logger.info("Video loaded successfully: ${file.name}")
            video
        }
    }

    fun disposeAll() {
        loadedVideos.values.forEach { it.dispose() }
        loadedVideos.clear()
        loadedImages.values.forEach { it.destroy() }
        loadedImages.clear()
    }
}

If anybody can point me in the right direction I’ll be glad and thankful!
(Abe Pazos’ photo will be loaded at the starting screen of my app with bells and whistles around him already)


Hi! 🙂 Short time no see! XD

I think this might help:

val v = VideoPlayerFFMPEG.fromFile(path,
    clock = { frameCount / 60.0 / 30.0 },
    mode = PlayMode.VIDEO,
    configuration = conf
)

The thing behind clock = is a function. The normal approach is clock = { seconds } so the time used for the video is the current time in seconds. By making it depend on frameCount you can calculate which frame you want to display. If the frame rate goes down, the video location should still match.
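For instance, a minimal sketch of that idea (assuming a 30 fps export target; the path and the rest of the program are just placeholders):

import org.openrndr.application
import org.openrndr.ffmpeg.VideoPlayerFFMPEG

fun main() = application {
    program {
        // A clock driven by the frame counter instead of wall-clock time:
        // frame 0 maps to 0.0 s, frame 30 to 1.0 s, and so on, no matter
        // how slowly the program actually runs.
        val exportFrameRate = 30.0
        val player = VideoPlayerFFMPEG.fromFile(
            "data/in.mp4",
            clock = { frameCount / exportFrameRate }
        )
        player.play()

        extend {
            player.draw(drawer)
        }
    }
}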

I don’t know the effect of using conf, but here is one based on the source code:

val conf = VideoPlayerConfiguration()
conf.let {
    it.videoFrameQueueSize = 5
    it.packetQueueSize = 1250
    it.displayQueueSize = 2
    //it.synchronizeToClock = false
}

I don’t know if this conf is appropriate in this case. Just leaving it there in case it’s useful. I would first try without any custom configuration.

Looking forward to seeing what you create with all of this 🙂


Thanks again, pal. I tried to incorporate your suggestion, but I get this:

Unresolved reference: PlayMode

Don’t know what I’m missing.

import org.openrndr.application
import org.openrndr.color.ColorRGBa
import org.openrndr.draw.renderTarget
import org.openrndr.draw.isolatedWithTarget
import org.openrndr.ffmpeg.VideoPlayerFFMPEG
import org.openrndr.ffmpeg.VideoWriter
import utils.MediaManager
import utils.MediaType
import java.io.File
import java.util.logging.Logger

fun main() = application {
    configure {
        width = 960
        height = 540
        title = "Off-Screen Video Export with Progress Counter"
    }

    program {
        val logger = Logger.getLogger("SimpleVideoPlayer")
        val mediaManager = MediaManager()
        val videoFile = File("c:\\videos\\godgangs\\playlist\\01.mp4")

        var videoPlayer: VideoPlayerFFMPEG? = null
        val playDuration = 10_000L // 10 seconds in milliseconds
        var startTime: Long? = null
        var videoStopped = false
        var elapsedTime = 0L

        // Set up the video writer
        val videoWriter = VideoWriter()
        videoWriter.size(width, height)
        videoWriter.output("video/output.mp4")
        videoWriter.start()

        // Set up the render target
        val videoTarget = renderTarget(width, height) {
            colorBuffer()
            depthBuffer()
        }

        extend {
            // Render video frames off-screen
            drawer.isolatedWithTarget(videoTarget) {
                clear(ColorRGBa.BLACK)

                if (videoPlayer == null) {
                    logger.info("Loading video: ${videoFile.absolutePath}")
                    videoPlayer = VideoPlayerFFMPEG.fromFile(
                        videoFile.absolutePath,
                        clock = { frameCount / 60.0 / 30.0 }, // Synchronize video with frame count
                        mode = VideoPlayerFFMPEG.PlayMode.VIDEO
                    )
                    videoPlayer?.play()
                    startTime = System.currentTimeMillis()
                }

                val currentTime = System.currentTimeMillis()
                startTime?.let {
                    elapsedTime = currentTime - it
                    if (elapsedTime >= playDuration && !videoStopped) {
                        videoPlayer?.pause()
                        videoPlayer?.seek(0.0)
                        videoStopped = true
                        logger.info("Stopping video after 10 seconds")
                    } else if (!videoStopped) {
                        videoPlayer?.draw(drawer, 0.0, 0.0, width.toDouble(), height.toDouble())
                    }
                }

                // Write the frame to the video
                videoWriter.frame(videoTarget.colorBuffer(0))
            }

            // Display a black screen with a progress counter
            drawer.clear(ColorRGBa.BLACK)
            drawer.fill = ColorRGBa.WHITE
            val timeString = String.format("%.1f", elapsedTime / 1000.0)
            drawer.text("Exporting... Time: $timeString s", width / 2.0 - 100.0, height / 2.0)

            // Stop the video writer after the play duration
            if (elapsedTime >= playDuration) {
                videoWriter.stop()
                application.exit()
            }
        }

        ended.listen {
            logger.info("Application ending")
            mediaManager.disposeAll()
        }
    }
}

import org.openrndr.ffmpeg.PlayMode ?

I hit alt+enter on top of the highlighted word to import it.
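In other words, assuming PlayMode is the top-level enum in org.openrndr.ffmpeg (rather than nested inside VideoPlayerFFMPEG), the assignment in the program above would look roughly like this:

import org.openrndr.ffmpeg.PlayMode
import org.openrndr.ffmpeg.VideoPlayerFFMPEG

// Rough sketch: the mode argument uses the top-level PlayMode enum,
// not VideoPlayerFFMPEG.PlayMode.
videoPlayer = VideoPlayerFFMPEG.fromFile(
    videoFile.absolutePath,
    clock = { frameCount / 60.0 / 30.0 },
    mode = PlayMode.VIDEO
)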

Yeah, I had already tried that but it doesn’t work ☹️


Did you manage to make it work? I’m not sure if it was pressing alt+enter that didn’t work, or the import itself.

Yeah, I don’t remember how I solved it, but I did. In fact I have advanced a bit on my VideoExporter; it is exporting what I want it to, but I’m still struggling with exporting video consistently.

package utils

import org.openrndr.Program
import org.openrndr.draw.*
import org.openrndr.ffmpeg.VideoWriter
import models.Project
import org.openrndr.color.ColorRGBa
import renderers.TimelineRenderer
import renderers.drawProgramContents
import java.util.logging.Logger

class VideoExporter(
    private val program: Program,
    private val project: Project,
    private val timelineRenderer: TimelineRenderer,
    private val imageManager: ImageManager,
    private val videoManager: VideoManager
) {
    private val logger = Logger.getLogger(Constants.VIDEO_EXPORTER_LOGGER_NAME)
    private var isExporting = false
    private var progress = 0.0
    private var renderTarget: RenderTarget? = null
    private var videoWriter: VideoWriter? = null
    private var currentFrame = 0
    private var totalFrames = 0
    private lateinit var exportProject: Project
    private val frameRate = 30 // Frames per second

    fun exportVideo(outputFilePath: String) {
        if (isExporting) {
            logger.warning("Export already in progress")
            return
        }

        isExporting = true
        progress = 0.0
        currentFrame = 0

        renderTarget = renderTarget(program.width, program.height) {
            colorBuffer()
            depthBuffer()
        }

        videoWriter = VideoWriter().apply {
            size(program.width, program.height)
            output(outputFilePath)
            frameRate = this@VideoExporter.frameRate // Set the frame rate
            start()
        }

        totalFrames = (project.totalDuration / 1000.0 * frameRate).toInt()
        exportProject = project.copy(playing = true, startTime = 0L)

        logger.info("Starting video export: $outputFilePath")

        // Start the export process
        exportAllFrames()
    }

    private fun exportAllFrames() {
        while (currentFrame < totalFrames) {
            renderExportFrame()
            currentFrame++
            progress = currentFrame.toDouble() / totalFrames
        }
        finishExport()
    }

    private fun renderExportFrame() {
        val currentTime = (currentFrame / frameRate.toDouble()) * 1000.0 // Convert frame to milliseconds
        exportProject.startTime = System.currentTimeMillis() - currentTime.toLong()
        exportProject.currentSceneIndex = exportProject.scenes.indexOfLast { it.startTime <= currentTime }

        renderTarget?.let { rt ->
            program.drawer.isolatedWithTarget(rt) {
                clear(ColorRGBa.TRANSPARENT)
                program.drawProgramContents(exportProject, timelineRenderer, "Exporting...", imageManager, videoManager, isExporting = true)
            }
            videoWriter?.frame(rt.colorBuffer(0))
        }
    }

    private fun finishExport() {
        videoWriter?.stop()
        renderTarget?.destroy()
        renderTarget = null
        videoWriter = null
        isExporting = false
        progress = 1.0
        logger.info("Video export completed")
    }

    fun cancelExport() {
        if (isExporting) {
            videoWriter?.stop()
            renderTarget?.destroy()
            renderTarget = null
            videoWriter = null
            isExporting = false
            progress = 0.0
            logger.info("Video export cancelled")
        }
    }

    fun getProgress(): Double = progress

    fun isExporting(): Boolean = isExporting
}

It uses an MP3 file to set the duration of the playback, then loads images and videos sequentially based on Scene Numbers, which are determined by a timestamped transcript .txt file. (It’s not exporting audio; for now I’m happy syncing things with FFmpeg later.)
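(To illustrate the idea, a hypothetical sketch of that kind of transcript parsing, assuming one “mm:ss text” line per scene, could look like the snippet below; the real format and field names may differ.)

import java.io.File

// Hypothetical sketch only: turn a timestamped transcript into scene start times.
// Assumes one "mm:ss text" line per scene, which may not match the real file format.
data class SceneMarker(val startTimeMs: Long, val text: String)

fun parseTranscript(file: File): List<SceneMarker> =
    file.readLines()
        .filter { it.isNotBlank() }
        .map { line ->
            val (stamp, text) = line.split(" ", limit = 2)
            val (min, sec) = stamp.split(":").map { it.toInt() }
            SceneMarker((min * 60 + sec) * 1000L, text)
        }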

The problem is that in the exported video, the videos included in the playback are lagging, with frame skips and weird behavior.

Here’s the video output exported by my OPENRNDR program.
Here’s one of the videos used in the playback.
Here’s another one used in the playback.

I’m trying to find a way to ensure a smooth video export. I’m quite sure it would be better if I had a nicer PC, but the point here is to make sure a slow computer exports the same video as a fast computer.

This is a lightweight video editor I’m working on; it will apply shaders and other effects to the playback much like someone would apply styles to a page using CSS (with an fx.txt text file that applies shaders to scenes).

I had a workaround in mind, which was to merge the original video files in the export process instead of loading them into the program and re-exporting them.

Problem is, this way I won’t be able to apply shaders to the videos, which is the whole point.

Maybe the exporting process can be streamlined in a similar fashion… process the videos one at a time to ensure consistency, then merge them back?
I don’t know what benefits could come out of this; I feel I’m missing something.
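(For the merging step itself, something like ffmpeg’s concat demuxer driven from Kotlin could work; a rough sketch, assuming every per-scene clip was exported with identical codec settings so stream copy is possible:)

import java.io.File

// Rough sketch: concatenate already-exported clips with ffmpeg's concat demuxer.
// Assumes identical codec settings across clips so "-c copy" works without re-encoding.
fun concatClips(clips: List<File>, output: File) {
    // The concat demuxer reads a list file with one "file '<path>'" line per clip.
    val listFile = File.createTempFile("concat", ".txt").apply {
        writeText(clips.joinToString("\n") { "file '${it.absolutePath}'" })
    }
    ProcessBuilder(
        "ffmpeg", "-y",
        "-f", "concat", "-safe", "0",
        "-i", listFile.absolutePath,
        "-c", "copy",
        output.absolutePath
    ).inheritIO().start().waitFor()
    listFile.delete()
}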

Just saw this:

Using the Screenshots extension to export individual frames of the visual output to generate the video. Something I may try; my only concern is performance when capturing screenshots of video rendered inside OPENRNDR on my machine.
The key would be ensuring it takes all the necessary screenshots of a Render Target.
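The basic usage would be something like the sketch below (by default it saves a PNG when a key is pressed); whether it can be driven to capture every single frame of a Render Target is exactly what I’d need to verify.

import org.openrndr.application
import org.openrndr.extensions.Screenshots

fun main() = application {
    program {
        // Screenshots extension: by default it writes a PNG of the window
        // when the space bar is pressed. Capturing every frame of an
        // off-screen render target automatically would need extra work.
        extend(Screenshots())
        extend {
            // drawing code goes here
        }
    }
}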

Speculations

I came up with a different approach: avoid using the video player and extract all video frames. Then you can be sure you get them all. It assumes all video files have the same frame rate. This can potentially be faster too, as it saves the frames as fast as possible, detached from the program’s frame rate.

import org.openrndr.application
import org.openrndr.color.ColorRGBa
import org.openrndr.draw.*
import org.openrndr.extra.imageFit.FitMethod
import org.openrndr.extra.imageFit.imageFit
import org.openrndr.ffmpeg.VideoWriter
import org.openrndr.math.IntVector2
import java.io.File

fun getVideoDimensions(path: File): IntVector2 {
    val pbGetVideoDimensions = ProcessBuilder(
        "ffprobe",
        "-v", "error",
        "-select_streams", "v:0",
        "-show_entries", "stream=width,height",
        "-of", "csv=s=x:p=0",
        path.absolutePath
    )

    val videoDimensions = String(pbGetVideoDimensions.start().inputStream.readAllBytes()).trim().split("x")
    return IntVector2(
        videoDimensions[0].toInt(),
        videoDimensions[1].toInt()
    )
}

/**
 * A class to provide access to all the frames of a video file.
 * It extracts all frames of the [path] video file into a subfolder in PNG format.
 */
class VideoFrames(path: File, private val drawer: Drawer) {
    private val videoDimensions = getVideoDimensions(path)
    private var rt = renderTarget(videoDimensions.x, videoDimensions.y) {
        colorBuffer()
    }

    // A folder with the same name as the video file with ".frames" appended
    private val folder = File(path.absolutePath + ".frames")

    // Don't throw random files into the generated folder, it will mess up
    // the frameCount calculation
    var frameCount = folder.listFiles()?.size ?: 0
        private set

    // Used to avoid regenerating a frame if it was just requested.
    // Useful if we draw the same frame multiple times (aka pause).
    private var requestedFrame = -1

    // Returns a frame loaded from disk, or TRANSPARENT if out of range
    operator fun get(frame: Int): ColorBuffer {
        if (frame != requestedFrame) {
            drawer.isolatedWithTarget(rt) {
                ortho(rt)
                if (frame < 0 || frame >= frameCount) {
                    clear(ColorRGBa.TRANSPARENT)
                } else {
                    val imagePath = "$folder/frame${String.format("%05d", frame + 1)}.png"
                    val loadedImage = loadImage(imagePath)
                    image(loadedImage)
                    loadedImage.destroy()
                }
            }
            requestedFrame = frame
        }
        return rt.colorBuffer(0)
    }

    // Create the frame folder if it doesn't exist
    // and fill it with extracted frames
    init {
        if (!folder.exists()) {
            folder.mkdir()
        }
        if (frameCount == 0) {
            val args = listOf(
                "ffmpeg",
                "-r", "1",
                "-i", path.absolutePath,
                "-r", "1",
                "$folder/frame%05d.png"
            )
            val pb = ProcessBuilder(args)
            pb.start().waitFor()
            frameCount = folder.listFiles()?.size ?: 0
            println("$frameCount frames extracted")
        } else {
            println("$frameCount frames found")
        }
    }

    fun destroy() = rt.destroy()
}

// A sample program making use of VideoFrames.
// It just draws all the frames from in.mp4 into out.mp4, adding a text on top.
fun main() = application {
    program {
        val videoWriter = VideoWriter()
        videoWriter.size(width, height)
        videoWriter.output("/tmp/out.mp4")
        videoWriter.start()

        val canvas = renderTarget(width, height) {
            colorBuffer()
            depthBuffer()
        }

        val frames = VideoFrames(File("/tmp/in.mp4"), drawer)

        val font = loadFont("data/fonts/default.otf", 200.0)

        for (i in 0 until frames.frameCount) {
            val frame = frames[i]
            drawer.isolatedWithTarget(canvas) {
                imageFit(frame, bounds, fitMethod = FitMethod.Cover)
                fill = ColorRGBa.CYAN
                fontMap = font
                text("$i", 10.0, height - 20.0)
            }
            // Save the canvas into the video file
            videoWriter.frame(canvas.colorBuffer(0))
            println("Saved frame $i")
        }
        videoWriter.stop()
        frames.destroy()

        println("Done")

        extend {
        }
    }
}

In the example we play the frames in the exact same order (creating an output matching the input), but we could easily go backwards, or jump to random frames. We could create a video longer or shorter than the original, apply effects, etc.
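For example, playing the clip backwards is just a small variation of the loop above:

// Variation of the export loop above: iterate the frame indices in reverse
// to write a backwards version of the input video.
for (i in frames.frameCount - 1 downTo 0) {
    drawer.isolatedWithTarget(canvas) {
        imageFit(frames[i], bounds, fitMethod = FitMethod.Cover)
    }
    videoWriter.frame(canvas.colorBuffer(0))
}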

One has to be careful with the sizes of the produced frames folders, and maybe delete them when done.
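A small cleanup helper could simply delete the generated folder once the export is finished, for example:

import java.io.File

// Remove the extracted ".frames" folder to reclaim disk space.
fun deleteFrameFolder(videoPath: File) {
    File(videoPath.absolutePath + ".frames").deleteRecursively()
}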

Wow, thanks a lot. I will try to incorporate this into my program and I will come back with the results. There are situations where 3 videos are being rendered on screen at the same time, but assuming all are at the same frame rate it shouldn’t be a problem. As soon as I have results I will return here. Thanks again.
