Trying to Use the Alpha Channel When Writing to an ArrayTexture

Hello everyone,

I’m trying to create an ArrayTexture by writing values to a ByteBuffer as suggested by the guide, but the alpha channel seems to be treated as binary, i.e. when I draw the texture as an image, it is completely transparent if the value is zero and completely opaque if it’s any positive number. I’m using the defaults (ColorFormat.RGBa, etc.) when creating the ArrayTexture.

Is the problem in the texture or could it be in the way that I draw it? Thanks in advance for any support!

Hello there, and welcome! :slight_smile:
If possible, could you paste the part of the code used to fill the ArrayTexture? I can then look into it and try to reproduce what is happening.


Sure, thanks for the fast reply! This is my code for creating the texture:

        val arrayTexture = arrayTexture(width, height, 1)
        val buffer = ByteBuffer.allocateDirect(width * height * 4)

        for (y in 0 until height) {
            for (x in 0 until width) {
                val color = getColor(x, y)
                buffer.put((color.r * 255).toInt().toByte())
                buffer.put((color.g * 255).toInt().toByte())
                buffer.put((color.b * 255).toInt().toByte())
                buffer.put((color.alpha * 255).toInt().toByte())
            }
        }
        buffer.rewind()
        arrayTexture.write(0, buffer)

getColor returns a ColorRGBa, obviously. Instead of using the color’s alpha channel, I also tried (y % 256), which led me to the discovery that the alpha channel seems to be interpreted as binary.
For drawing the texture, I just invoke drawer.image.
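As an aside, one thing worth checking in the fill loop: `(value * 255).toInt().toByte()` wraps around for channel values outside [0, 1], which can produce surprising alpha results. A small sketch of a clamping conversion (`channelToByte` is my own name, not an OPENRNDR helper):

```kotlin
// Hypothetical helper: converts a channel value in [0.0, 1.0] to an
// unsigned byte, clamping first so out-of-range values don't wrap.
fun channelToByte(value: Double): Byte =
    (value.coerceIn(0.0, 1.0) * 255.0).toInt().toByte()

fun main() {
    // In-range values map as expected.
    println(channelToByte(0.5).toInt() and 0xFF)  // 127
    // Without clamping, 1.2 * 255 = 306 would wrap; here it clamps to 255.
    println(channelToByte(1.2).toInt() and 0xFF)  // 255
    println(channelToByte(-0.1).toInt() and 0xFF) // 0
}
```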

It seems to be an issue with drawer.image rather than with the ArrayTexture itself: could you substitute that line with the following bit of code and see how it goes?

    drawer.stroke = null
    drawer.shadeStyle = shadeStyle {
        fragmentTransform = """
            vec2 uv = c_boundsPosition.xy;
            uv.y = 1.0 - uv.y;
            vec4 col = texture(p_img, vec3(uv, 0.0));
            x_fill = col;
        """.trimIndent()
        parameter("img", arrayTexture)
    }

    drawer.rectangle(drawer.bounds)
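In case it helps to decode the snippet, here is the same fragmentTransform with my own annotations, based on OPENRNDR’s shade-style conventions:

```glsl
// c_boundsPosition: normalized position inside the shape being drawn
vec2 uv = c_boundsPosition.xy;
// flip vertically so the buffer's first row ends up at the top
uv.y = 1.0 - uv.y;
// p_img is the parameter registered via parameter("img", arrayTexture);
// for an array texture, the third coordinate (0.0) selects the layer
vec4 col = texture(p_img, vec3(uv, 0.0));
// x_fill is the fragment's output fill color
x_fill = col;
```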

Yes, that fixed it! Thanks a lot! :tada:

I guess in order to understand the fragmentTransform code I need to learn GLSL, right? :sweat_smile:


You are welcome. :slight_smile:

Kinda, yeah :sweat_smile: May I ask why you want to use an ArrayTexture? I can’t quite tell from the example, but possibly there is some context that I am missing.


Sure! I’m working on a Mastodon bot for procedural image generation (a successor of this Twitter bot), and I use textures generated from OpenSimplex noise for various purposes, e.g. cosmic nebulae and moon surfaces. I got the impression that ArrayTexture is an appropriate (and fast) way to generate the textures.


I see, interesting. :slight_smile:
In my experience, ArrayTexture shines when I need to switch between textures in real time (video effects, etc.), since it reduces the cost of uploading to and downloading from the GPU. In batched situations where real-time performance isn’t needed, though, I usually stick with colorBuffers.
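For reference, the colorBuffer route looks roughly like this (a sketch, assuming the same width, height and getColor as in the code above; adjust to your setup):

```kotlin
// Sketch: same pixel data via a colorBuffer instead of an ArrayTexture.
// Assumes width, height and getColor(x, y) from the earlier snippet.
val cb = colorBuffer(width, height)
val buffer = ByteBuffer.allocateDirect(width * height * 4)

for (y in 0 until height) {
    for (x in 0 until width) {
        val color = getColor(x, y)
        buffer.put((color.r * 255).toInt().toByte())
        buffer.put((color.g * 255).toInt().toByte())
        buffer.put((color.b * 255).toInt().toByte())
        buffer.put((color.alpha * 255).toInt().toByte())
    }
}
buffer.rewind()
cb.write(buffer)

// colorBuffers draw directly with drawer.image, no shade style needed:
drawer.image(cb)
```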


Good to know! I just started using OPENRNDR on Friday, so I’m pretty new to the APIs, but I will look into color buffers if they are easier to use or otherwise better suited for what I need.
