In this one I used the Rectangle.grid method recursively: first applied to drawer.bounds, then to the resulting rectangles, until they were either too small or out of luck.
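A minimal sketch of that recursion (assuming the Rectangle.grid extension from orx-shapes; the split counts, minimum size and "luck" probability are illustrative, not the values used in the actual piece):

import org.openrndr.application
import org.openrndr.extra.shapes.grid
import org.openrndr.shape.Rectangle
import kotlin.random.Random

// Recursively subdivide a rectangle until it is too small or "out of luck".
fun subdivide(rect: Rectangle, minSize: Double = 40.0, luck: Double = 0.7): List<Rectangle> {
    val tooSmall = rect.width < minSize || rect.height < minSize
    val outOfLuck = Random.nextDouble() > luck
    if (tooSmall || outOfLuck) return listOf(rect)
    // grid() returns rows of cells; flatten them and recurse into each cell
    return rect.grid(2, 2).flatten().flatMap { subdivide(it, minSize, luck) }
}

fun main() = application {
    program {
        val rects = subdivide(drawer.bounds)
        extend {
            drawer.rectangles(rects)
        }
    }
}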
Now I have 6 different fill patterns for plotting. The last one is based on noise and it has 5 configurable parameters:
I plan to use it tomorrow with this kind of design and plot it on paper.
Usually I use OPENRNDR for gluing together my GLSL with input/output devices. This time it was extremely useful to output frame-perfect video from my shader:
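For reference, one way to get frame-perfect video out of OPENRNDR is the ScreenRecorder extension from openrndr-ffmpeg, which encodes every rendered frame rather than capturing the screen in real time. A minimal sketch (the frame rate and the empty draw block are placeholders for the actual shader program):

import org.openrndr.application
import org.openrndr.ffmpeg.ScreenRecorder

fun main() = application {
    program {
        // every rendered frame is written to the video file, so the output
        // stays frame-perfect even if rendering is slower than real time
        extend(ScreenRecorder()) {
            frameRate = 60
        }
        extend {
            // GLSL / drawing code goes here
        }
    }
}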
Researching some data visualization for Krisenchat focused on comorbidities among suicidal children.
http://www.instagram.com/p/CfXGOwQOmKY/
Hi, my first openrndr test! excited to be here (:
Very nice! I also found super interesting the image you shared in Slack with the hardware setup you used to produce the video, with layers of glass, objects and lighting. Quite special to mix analog and digital that way
The original is animated. An endless loop of dashed lines. The shape is made by creating a grid of hobby curves, then connecting those hobby curves with closed hobby curves.
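Not the original code, but a rough sketch of that construction, assuming the hobbyCurve helper from orx-shapes (the exact import path can differ between orx versions); the grid size and jitter amounts are made up:

import org.openrndr.application
import org.openrndr.extra.noise.uniform
import org.openrndr.extra.shapes.hobbyCurve
import org.openrndr.math.Vector2

fun main() = application {
    configure {
        width = 800
        height = 800
    }
    program {
        // a jittered grid of control points
        val grid = List(6) { y ->
            List(8) { x ->
                Vector2(60.0 + x * 100.0, 120.0 + y * 110.0) + Vector2.uniform(-25.0, 25.0)
            }
        }
        // one open hobby curve per grid row...
        val rowCurves = grid.map { hobbyCurve(it) }
        // ...connected by a closed hobby curve through the rows' endpoints
        val connector = hobbyCurve(
            grid.map { it.first() } + grid.reversed().map { it.last() },
            closed = true
        )
        extend {
            drawer.contours(rowCurves)
            drawer.contour(connector)
        }
    }
}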
Using compute shaders and the Physarum algorithm.
Hey, Lukas here.
I wrote a module for BPM-based control that I think might be a cool addition to OPENRNDR EXTRA.
Not sure how well it fits into the existing (vast) ecosystem of orx’s.
Showcased it with a video:
Btw, I hit the keys 1 to 4 when I feel like it.
A transition takes place and modulates from envelope A to B, similar to crossfading.
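(This isn't Lukas' module, whose code isn't shown in the thread; just a hypothetical, plain-Kotlin sketch of the idea: a phase driven by the BPM plus a beat-synced crossfade from one envelope to another when a key is triggered. All names, envelope shapes and constants are made up.)

import kotlin.math.abs
import kotlin.math.min

// an envelope maps a beat phase in [0, 1) to a control value
typealias Envelope = (Double) -> Double

val envelopeA: Envelope = { phase -> 1.0 - phase }            // falling ramp
val envelopeB: Envelope = { phase -> abs(phase * 2.0 - 1.0) } // triangle

class BpmCrossfade(var bpm: Double = 120.0, var transitionBeats: Double = 4.0) {
    private var from: Envelope = envelopeA
    private var to: Envelope = envelopeA
    private var transitionStartBeat = 0.0

    private fun beats(seconds: Double) = seconds * bpm / 60.0

    // called e.g. from a key press: start modulating towards a new envelope
    fun trigger(target: Envelope, seconds: Double) {
        from = to
        to = target
        transitionStartBeat = beats(seconds)
    }

    // current control value: crossfade between the two envelopes over transitionBeats
    fun value(seconds: Double): Double {
        val b = beats(seconds)
        val phase = b % 1.0
        val t = min(1.0, (b - transitionStartBeat) / transitionBeats)
        return from(phase) * (1.0 - t) + to(phase) * t
    }
}

fun main() {
    val fade = BpmCrossfade(bpm = 120.0)
    fade.trigger(envelopeB, seconds = 2.0)
    println(fade.value(3.0)) // halfway through a 4-beat transition at 120 BPM
}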
Happy to see new posts here! Thank you for posting and welcome to the forum!
About the orx… hard to say where it can fit without seeing the code. I think this one might be in the same direction: orx/orx-time-operators at master · openrndr/orx · GitHub
I use several interpolators but I never made an orx out of them. Something to consider
Hello there, here is the first experiment with compute shaders, which I'm happy to share.
It's a system of 300k particles which react with their own trails and with a radial force. The particles come in two types, and have opposite behaviour with respect to how they interact with the trail and force fields. The trails are decayed and diffused at each step, and at the moment there is no external noise injected, apart from the initial conditions of the system. What you see in the video are the trails themselves, or rather a colorized version of them using the particle type information, and there's no parameter modulation. I really like seeing the formation of those wavefronts, and the transport of particles that happens around them. Next step is to modulate the various parameters with noise, I guess.
If there’s interest in the code let me know and I’ll share it: it’s quite basic, but it’s maybe a useful starting point for how to (and probably how not to as well!) use compute shaders.
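(Not Alessandro's compute shader, which isn't posted here, but a plain-Kotlin, CPU-side sketch of the trail step described above: particles deposit into a grid, which is then diffused and decayed each iteration. The grid size, blur kernel and decay factor are arbitrary.)

// Grid of trail values; particles deposit into it, then it is diffused and
// decayed, mirroring the "trails are decayed and diffused at each step" idea.
class TrailField(val width: Int, val height: Int) {
    var values = DoubleArray(width * height)

    fun deposit(x: Int, y: Int, amount: Double) {
        if (x in 0 until width && y in 0 until height) values[y * width + x] += amount
    }

    // one simulation step: 3x3 box blur (diffusion) followed by exponential decay
    fun diffuseAndDecay(decay: Double = 0.95) {
        val next = DoubleArray(width * height)
        for (y in 0 until height) {
            for (x in 0 until width) {
                var sum = 0.0
                var count = 0
                for (dy in -1..1) for (dx in -1..1) {
                    val nx = x + dx
                    val ny = y + dy
                    if (nx in 0 until width && ny in 0 until height) {
                        sum += values[ny * width + nx]
                        count++
                    }
                }
                next[y * width + x] = (sum / count) * decay
            }
        }
        values = next
    }
}

fun main() {
    val field = TrailField(256, 256)
    field.deposit(128, 128, 1.0)
    repeat(10) { field.diffuseAndDecay() }
    println(field.values[128 * 256 + 128]) // the deposited trail has spread and faded
}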
Very nice colors @Alessandro! Somehow it seems very organic to me. Reminds me of dark rounded stones from a river mixed with autumn leaves.
Here's a still from the project I've been working on in recent months. Supported by BKM, NEUSTART KULTUR and Deutscher Kuenstlerbund.
@abe Thanks! I was indeed going for something organic-feeling. I also tried using Voronoi triangulation, which gave some soothing results.
I really like the still of the project, super cool. I think the lighting comes out really nicely.
Nice! How did you bring the shape into Blender?
@abe I used a little OBJ exporter class that I made since I wanted to try the File I/O you added. Here it is with a little example:
import org.openrndr.application
import org.openrndr.extra.noise.uniform
import org.openrndr.math.Vector3
import java.io.File

// Collects polylines and writes them out as OBJ "v"/"l" statements.
class PolyLineExporter {
    private data class Curve(val points: List<Vector3>, val closed: Boolean)

    private val curves = mutableListOf<Curve>()

    fun add(points: List<Vector3>, toClose: Boolean = false) {
        curves.add(Curve(points, toClose))
    }

    fun toFile(fname: String) {
        val output = buildString {
            // OBJ vertex indices are global and 1-based, so keep a running offset
            var vertexCount = 0
            curves.forEach { c ->
                val verts = c.points.size
                // one "v x y z" line per point
                c.points.forEach {
                    append("v ${it.x} ${it.y} ${it.z}\n")
                }
                // one "l" (polyline) statement referencing the vertices above
                append("l ")
                (0 until verts).forEach {
                    append("${it + 1 + vertexCount} ")
                }
                // closed curves repeat the first vertex index at the end
                if (c.closed) append("${1 + vertexCount}\n") else append("\n")
                vertexCount += verts
            }
        }
        File("$fname.obj").writeText(output)
        println("File saved")
    }
}

fun main() = application {
    configure {
        width = 1000
        height = 1000
    }
    program {
        // 10 random points in the [-1, 1] cube (Vector3.uniform comes from orx-noise)
        val pointsA = List(10) {
            Vector3.uniform(-1.0, 1.0)
        }
        val exporter = PolyLineExporter()
        exporter.add(pointsA)
        exporter.toFile("test")
        extend {
        }
    }
}
It allows adding multiple polylines, both open and closed. Fairly basic, but it does its job. Then in Blender you can convert the mesh to a curve and bevel it.
Motivated by something I saw at the latest Berlin Creative Coding meeting, which was a CPU implementation of this, I made a GPU implementation, more precisely using a fragment shader and a backbuffer.
This approach does not suffer from having to recompute the various shapes, and it's very reactive (the gif doesn't do it justice). As with any image warping process, it does suffer, though, from blurring due to repeated iterations. One idea to move it forward could be to use an image flow approach. The key difference here is that since elements are added, one would probably need to work with layers, warp each one individually by iterating the advection, and then recombine them at the end in the given order. Maybe this is the time I finally get comfortable with ArrayTexture …
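(The shader itself isn't included in the post, so here is only a bare-bones backbuffer / ping-pong skeleton in OPENRNDR that such an approach could sit on: two render targets are swapped every frame, the previous frame is drawn back in, slightly faded, and a new element is added on top. The actual warping/advection step is left as a placeholder, and all constants are made up.)

import org.openrndr.application
import org.openrndr.color.ColorRGBa
import org.openrndr.draw.isolatedWith
import org.openrndr.draw.renderTarget
import org.openrndr.draw.tint
import kotlin.math.cos
import kotlin.math.sin

fun main() = application {
    configure {
        width = 640
        height = 640
    }
    program {
        var front = renderTarget(width, height) { colorBuffer() }
        var back = renderTarget(width, height) { colorBuffer() }
        // start from a known state in both buffers
        listOf(front, back).forEach { rt ->
            drawer.isolatedWith(rt) { clear(ColorRGBa.BLACK) }
        }

        extend {
            drawer.isolatedWith(front) {
                clear(ColorRGBa.BLACK)
                // feedback: draw the previous frame back in, slightly faded.
                // The warping/advection would happen here instead of a plain copy.
                drawStyle.colorMatrix = tint(ColorRGBa.WHITE.opacify(0.98))
                image(back.colorBuffer(0))
                drawStyle.colorMatrix = tint(ColorRGBa.WHITE)
                // add a new element each frame
                fill = ColorRGBa.PINK
                stroke = null
                circle(width / 2.0 + 200.0 * cos(seconds), height / 2.0 + 200.0 * sin(seconds), 20.0)
            }
            drawer.image(front.colorBuffer(0))
            // swap buffers for the next iteration
            val tmp = front
            front = back
            back = tmp
        }
    }
}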