Video mapping 2d

Hi, I am trying to figure out how to do the following:

  1. a video is played using ffmpeg into a render target (A)
  2. four coordinates (Q) describe a polygon inside the render target (A)
  3. four other coordinates (E) describe a polygon inside an “output render target” (B)
  4. use the Q coords to extract texture data from (A) and use it as a texture inside polygon (E) on render target (B)

Or in other terms: how do I imitate MadMapper in order to have input textures, texture polygons, surface polygons, and map from texture coords to surface coords?

Here is an example taken from ofxPiMapper:

My problem is that I am new to GLSL and can't figure out how to specify the texture coordinates and the surface coordinates inside OPENRNDR; the terminology doesn't quite stick for me even though I have tried hard.

I am thinking about how to have a texture plus 4 coordinates, each a corner of a rectangle-like polygon, and then use that info as “texcoords” to tell where to get texture data for another polygon.

I have now learned that vertex buffers contain the coordinates of the shape as well as the normalized coordinates of the texture (where in the texture each shape coordinate samples from), so I guess I need to find a way to edit those vectors separately.

My first goal is to have a texture and a triangle shape, be able to edit the texture coordinates as well as the shape coordinates separately with the mouse, and see the textured shape being rendered in the output.

It sounds like you’re on the right path :slight_smile:

I quickly wrote this, maybe it helps. The write method expects 4 screen coordinates and 4 texture coordinates. The deformation looks odd because the quad is made out of two triangles. But at least it should show the components you might need.

import org.openrndr.application
import org.openrndr.draw.*
import org.openrndr.math.Polar
import org.openrndr.math.Vector2
import org.openrndr.math.Vector3
import kotlin.math.sin

class QuadWriter(val tex: ColorBuffer) {
    // interleaved layout: xyz position followed by a normalized uv texture coordinate
    private val vf = vertexFormat {
        position(3)
        textureCoordinate(2)
    }
    // 6 vertices: the quad is drawn as two triangles
    private val quad = vertexBuffer(vf, 6)
    // fill each fragment by sampling the texture at the interpolated texture coordinate
    private val shader = shadeStyle {
        fragmentTransform = "x_fill = texture(p_tex, va_texCoord0.xy);"
        parameter("tex", tex)
    }

    fun write(positions: List<Vector3>, textureCoords: List<Vector2>) {
        quad.put {
            // first triangle: corners 0, 1, 3
            write(positions[0])
            write(textureCoords[0])
            write(positions[1])
            write(textureCoords[1])
            write(positions[3])
            write(textureCoords[3])

            // second triangle: corners 1, 2, 3
            write(positions[1])
            write(textureCoords[1])
            write(positions[2])
            write(textureCoords[2])
            write(positions[3])
            write(textureCoords[3])
        }
    }

    fun draw(drawer: Drawer) {
        drawer.shadeStyle = shader
        drawer.vertexBuffer(quad, DrawPrimitive.TRIANGLES)
    }
}

fun main() = application {
    program {
        val tex = loadImage("data/images/pm5544.png")
        val quadWriter = QuadWriter(tex)

        extend {
            val center = drawer.bounds.center
            quadWriter.write(
                List(4) {
                    (center + Polar(it * 90.0, sin(seconds + it) * 100.0 + 150.0).cartesian).xy0
                }, listOf(
                    Vector2(0.1, 0.1),
                    Vector2(0.3, 0.2),
                    Vector2(0.4, 0.6),
                    Vector2(0.2, 0.5)
                )
            )
            quadWriter.draw(drawer)
        }
    }
}

ps. The quad coordinates are clockwise.

Thank you so much! I am not great at math, and since my departure from software development (due to stress) I am having a hard time grasping new concepts :slight_smile: so it means a lot to get some code I can fiddle about with until I understand.

I have tried to read up on the issue of the texture being warped down the diagonal, but the closest I've come to a definition of the issue is this Stack Overflow question: Quad texture stretching on OpenGL - Stack Overflow
It talks about the interpolation of each of the two triangles in the mesh being done on its own, which is why you get a skewed quad texture.

Can anybody point me in a direction of getting the quad texture to be “not warped down the diagonal”?

This article also talks about the issue, reading it now! :slight_smile:

This repo also talks about the “non-affine” transformations needed to get a properly mapped quad texture going.
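
As far as I can tell from those links, the trick is to give each corner a projective texture coordinate (u·q, v·q, q) and divide by q again in the fragment shader, so the interpolation is no longer purely affine per triangle. Something like this untested sketch, assuming the vertex format allows a 3-component texture coordinate (textureCoordinate(3)) so that va_texCoord0 arrives in the shader as a vec3:

// untested sketch of the projective-texcoord idea from the links above;
// assumes textureCoordinate(3) in the vertex format, so va_texCoord0 is a
// vec3 holding (u*q, v*q, q). The per-corner q values would come from the
// ratio of the two segments of the quad's diagonals.
val projectiveFill = shadeStyle {
    fragmentTransform = """
        vec3 uvq = va_texCoord0.xyz;
        x_fill = texture(p_tex, uvq.xy / uvq.z);
    """.trimIndent()
    // parameter("tex", colorBuffer) as in the examples above
}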

Very interesting issue :slight_smile:

There’s also a part 2:

I have learned that a cheap way to get around the issue is to use more triangles. The subject is called tessellation, and apparently it can be done in a shader.

That gave me an idea: orx/orx-shapes at master · openrndr/orx · GitHub has a bezier patch that could be used for this. Unfortunately I discovered issues 350 and 351. Once they are fixed I think that might be an easy way to map irregular (even curved) shapes.

Mmm… would it be enough for you if only the target is irregular, but the source is rectangular? Or do you want both to be irregular?

Meanwhile, I think it would be simple to create a 3D Plane generator with configurable x and y resolution and uv texture coords, which can be used in 2D (and then add it to orx-shapes).
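
Roughly what I have in mind, as an untested sketch (gridQuad is a made-up name, and it only uses the vertexBuffer calls from the snippet above, so it should be called from inside program { }):

import org.openrndr.draw.*
import org.openrndr.math.Vector2

// untested sketch: a unit plane subdivided into xRes * yRes cells,
// with uv running 0..1, so the positions can later be scaled or warped.
fun gridQuad(xRes: Int, yRes: Int): VertexBuffer {
    val vf = vertexFormat {
        position(3)
        textureCoordinate(2)
    }
    val vb = vertexBuffer(vf, 6 * xRes * yRes)
    vb.put {
        for (y in 0 until yRes) {
            for (x in 0 until xRes) {
                val u0 = x.toDouble() / xRes
                val u1 = (x + 1.0) / xRes
                val v0 = y.toDouble() / yRes
                val v1 = (y + 1.0) / yRes
                // two triangles per cell
                listOf(
                    Vector2(u0, v0), Vector2(u1, v0), Vector2(u1, v1),
                    Vector2(u0, v0), Vector2(u1, v1), Vector2(u0, v1)
                ).forEach { uv ->
                    write(uv.xy0) // position on the unit plane
                    write(uv)     // matching texture coordinate
                }
            }
        }
    }
    return vb
}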

Feels like I'm nearly there. My solution (albeit a rookie one) is to first calculate interpolated points on the vertical sides of the quad, and then between those calculate and interpolate the horizontal points.
I do it twice, once for position and once for texture.

import org.openrndr.application
import org.openrndr.draw.*
import org.openrndr.ffmpeg.VideoPlayerConfiguration
import org.openrndr.ffmpeg.VideoPlayerFFMPEG
import org.openrndr.math.Polar
import org.openrndr.math.Vector2
import org.openrndr.math.Vector3
import kotlin.math.sin
import org.openrndr.ffmpeg.PlayMode
import org.openrndr.math.mix


class QuadWriter(val vid: VideoPlayerFFMPEG) {
    private val vf = vertexFormat {
        position(3)
        textureCoordinate(2)
    }
    // subdivide the quad into segments x segments cells, two triangles each
    private val segments = 8
    private val quad = vertexBuffer(vf, 6 * segments * segments)

    fun write(positions: List<Vector3>, textureCoords: List<Vector2>) {
        //        b
        //   1   --   2
        //   |      / |
        // a |   /    | c
        //   | /      |
        //   0   --   3
        //        d
        quad.put {
            for (v in 0 until segments) {
                for (u in 0 until segments) {

                    // vertical first

                    val p0y = mix(positions[0], positions[1], (1.0 * v) / segments)
                    val p3y = mix(positions[3], positions[2], (1.0 * v) / segments)
                    val p1y = mix(positions[0], positions[1], (1.0 * (v+1)) / segments)
                    val p2y = mix(positions[3], positions[2], (1.0 * (v+1)) / segments)

                    val p0 = mix(p0y, p3y, (1.0 * u) / segments)
                    val p3 = mix(p0y, p3y, (1.0 * (u+1)) / segments)
                    val p1 = mix(p1y, p2y, (1.0 * u) / segments)
                    val p2 = mix(p0y, p3y, (1.0 * (u+1)) / segments)

                    val t0y = mix(textureCoords[0], textureCoords[1], (1.0 * v) / segments)
                    val t3y = mix(textureCoords[3], textureCoords[2], (1.0 * v) / segments)
                    val t1y = mix(textureCoords[0], textureCoords[1], (1.0 * (v+1)) / segments)
                    val t2y = mix(textureCoords[3], textureCoords[2], (1.0 * (v+1)) / segments)

                    val t0 = mix(t0y, t3y, (1.0 * u) / segments)
                    val t3 = mix(t0y, t3y, (1.0 * (u+1)) / segments)
                    val t1 = mix(t1y, t2y, (1.0 * u) / segments)
                    val t2 = mix(t0y, t3y, (1.0 * (u+1)) / segments)



                    write(p2)
                    write(t2)
                    write(p1)
                    write(t1)
                    write(p0)
                    write(t0)

                    write(p0)
                    write(t0)
                    write(p3)
                    write(t3)
                    write(p2)
                    write(t2)

                }
            }
        }


    }

    fun draw(colorBuffer: ColorBuffer, drawer: Drawer) {
        drawer.shadeStyle = shadeStyle {
            fragmentTransform = "x_fill = texture(p_tex, va_texCoord0.xy);"
            parameter("tex", colorBuffer)
        }
        drawer.vertexBuffer(quad, DrawPrimitive.TRIANGLES)
    }
}

fun main() = application {
    program {
        val videoPlayer = VideoPlayerFFMPEG.fromFile("data/video2.mov",
            PlayMode.VIDEO,
            VideoPlayerConfiguration().apply {
                useHardwareDecoding = false
                videoFrameQueueSize = 500
                displayQueueCooldown = 5
                //synchronizeToClock = false
            })
        videoPlayer.play()
        videoPlayer.ended.listen {
            videoPlayer.restart()
        }
        val renderTarget = renderTarget(width, height) {
            colorBuffer()
        }
        val quadWriter = QuadWriter(videoPlayer)

        extend {
            drawer.withTarget(renderTarget) {
                videoPlayer.draw(drawer)
            }
            val center = drawer.bounds.center
            quadWriter.write(
                listOf(
                    Vector3(50.0, 450.0, 0.0),
                    Vector3(50.0, 100.0, 0.0),
                    Vector3(550.0, 50.0, 0.0),
                    Vector3(300.0, 400.0, 0.0)
                ), listOf(
                    Vector2(0.0, 0.0),
                    Vector2(0.0, 1.0),
                    Vector2(1.0, 1.0),
                    Vector2(1.0, 0.0)
                )
            )
            quadWriter.draw(renderTarget.colorBuffer(0), drawer)
        }
    }
}

Now I just need to figure out why only half of my triangles are showing…?

Any ideas @abe?

Figured it out :slight_smile:
Here is the diff gist:

Nice!! Congratulations :tada:

About performance: if you are not animating the target shape, you can move the quadWriter.write() out of extend so it only calculates the vertexBuffer once (maybe useful if you draw hundreds of such shapes :slight_smile: )
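
For example, the extend block above could become something like this (same corner values as before, just with write() moved out of the draw loop):

// sketch of the idea: build the vertexBuffer once, outside extend,
// and only draw inside extend. Re-call write() only when a corner moves.
val quadWriter = QuadWriter(videoPlayer)
quadWriter.write(
    listOf(
        Vector3(50.0, 450.0, 0.0),
        Vector3(50.0, 100.0, 0.0),
        Vector3(550.0, 50.0, 0.0),
        Vector3(300.0, 400.0, 0.0)
    ),
    listOf(
        Vector2(0.0, 0.0),
        Vector2(0.0, 1.0),
        Vector2(1.0, 1.0),
        Vector2(1.0, 0.0)
    )
)

extend {
    drawer.withTarget(renderTarget) {
        videoPlayer.draw(drawer)
    }
    quadWriter.draw(renderTarget.colorBuffer(0), drawer)
}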

Is this for doing projection mapping with a projector?

Yup, it only needs to be recalculated when either the texture quad or the surface quad changes its coords.

Yeah, I am making a projection mapping VJ app for Linux:

  • 4 video channels
  • 1 output via UDP broadcast to mpv (video player clients), each client taking a quadrant of the UDP video (think video splitter)
  • a mapping editor where you create texture quads and surface quads

Basically like MadMapper but with UDP network output instead.

On that note @abe, is there an example of a GUI creating objects that can then be selected in a dropdown?

I need some way to create new texture/surface quad pairs, be able to delete them, and edit their corners with the mouse.
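
For the corner editing I'm imagining something along these lines (an untested sketch that would live inside program { }; corners is just a stand-in mutable list of the four surface points, using the same values as above):

// untested sketch: drag the quad corner closest to the mouse.
// After a drag the vertexBuffer would be rewritten, e.g. via quadWriter.write().
val corners = mutableListOf(
    Vector2(50.0, 450.0),
    Vector2(50.0, 100.0),
    Vector2(550.0, 50.0),
    Vector2(300.0, 400.0)
)
var selected = -1

mouse.buttonDown.listen { event ->
    // pick the corner nearest to the click, but only if it is close enough
    selected = corners.indices.minByOrNull { corners[it].distanceTo(event.position) } ?: -1
    if (selected >= 0 && corners[selected].distanceTo(event.position) > 20.0) selected = -1
}

mouse.dragged.listen { event ->
    if (selected >= 0) {
        corners[selected] = event.position
        // here: rebuild the quad with the updated corners (as Vector3, via .xy0)
    }
}

mouse.buttonUp.listen {
    selected = -1
}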

I can see that you can create data-driven UIs like this here:

orx-gui has a dropdown, but it uses an enum and I don't think elements can be added / removed. Maybe we should add one based on a mutableList.

code: init · sloev/fremkalder@413d522 · GitHub

@abe I have now made some progress, but I am lost as to how to make the whole “edit surface points” thing work correctly, as can be seen in the video. Do you maybe have some ideas?

Nice progress :slight_smile:

By looking at the video it seems like the bottom one is working correctly, but the top one is inverted vertically.

If I do

...
val t0 = mix(t0y, t3y, (1.0 * u) / segments).vFlip()
val t3 = mix(t0y, t3y, (1.0 * (u + 1)) / segments).vFlip()
val t1 = mix(t1y, t2y, (1.0 * u) / segments).vFlip()
val t2 = mix(t1y, t2y, (1.0 * (u + 1)) / segments).vFlip()
...

private fun Vector2.vFlip() = Vector2(x, 1 - y)

or

            fragmentTransform = """
                vec2 c = va_texCoord0.xy;
                x_fill = texture(p_tex, vec2(c.x, 1.0 - c.y));
            """.trimIndent()

Then the image on the bottom looks as expected, but flipped vertically.
It’s almost there but not sure how to fix this right now.

I played a bit with the code. Most of it is just simplifying things, but I think one should not create many vertexBuffers but reuse them. Also I believe they should be destroyed when no longer needed.

Surface.kt (3.2 KB)
Surfaces.kt (3.4 KB)
TemplateProgram.kt (7.3 KB)
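
The pattern I mean, roughly, as an untested minimal sketch (this is not the code from the attached files; the class and method names are made up, and the single-quad body stands in for the subdivision loop above):

import org.openrndr.draw.*
import org.openrndr.math.Vector2
import org.openrndr.math.Vector3

// untested sketch of the reuse pattern: one vertexBuffer per surface,
// allocated once, rewritten in place when the corners change, and
// destroyed when the surface is removed from the editor.
class ReusableQuad {
    private val vf = vertexFormat {
        position(3)
        textureCoordinate(2)
    }
    // allocated once; put {} overwrites its contents on every update
    val quad: VertexBuffer = vertexBuffer(vf, 6)

    fun update(positions: List<Vector3>, textureCoords: List<Vector2>) {
        quad.put {
            // two triangles, same corner order as QuadWriter.write;
            // a subdivided version would loop like in the code above
            for (i in listOf(0, 1, 3, 1, 2, 3)) {
                write(positions[i])
                write(textureCoords[i])
            }
        }
    }

    // free the GPU resource when the surface is deleted
    fun remove() = quad.destroy()
}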

Thank you so much, @abe.
I've only written Kotlin since I first wrote on this forum some weeks ago, so I'm still getting my hands dirty with structuring the codebase plus how OOP works in Kotlin. So your splitting up, cleaning and reorganizing of my code is much appreciated.
I've added your files with a link back to here in this commit: add files from abe :-) see: https://openrndr.discourse.group/t/video-… · sloev/fremkalder@4df47f2 · GitHub
