Geek & Effigy
I’ve been wondering if a sculpture could be built out of code, something that shifts and breathes like a living thing—ever tried making an algorithm that feels alive?
yeah, i’ve been messing around with that idea a lot: build a mesh with a procedural algorithm that keeps updating itself. you can start with an L‑system to generate a base shape, then feed it a simple physics or noise-driven simulation so the vertices jiggle and stretch, and finally paint it with a shader that reacts to a live data stream (like your heart rate or the stock market). the result looks like a breathing sculpture that changes every few seconds. give it a shot, tweak the noise functions, and watch the code come alive.
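roughly what i mean, as a python sketch: a turtle-interpreted L-system for the base shape and a sine wobble standing in for real physics. the rule set, branching angle, and amplitudes are placeholders, not a finished setup:

```python
import math

# --- 1. L-system: rewrite an axiom into a long instruction string ---
RULES = {"F": "F[+F]F[-F]F"}   # placeholder rule; swap in your own grammar
ANGLE = math.radians(25)

def expand(axiom, iterations):
    s = axiom
    for _ in range(iterations):
        s = "".join(RULES.get(ch, ch) for ch in s)
    return s

# --- 2. Turtle interpretation: turn the string into 2D vertices ---
def interpret(instructions, step=1.0):
    x, y, heading = 0.0, 0.0, math.pi / 2   # start pointing "up"
    stack, verts = [], [(x, y)]
    for ch in instructions:
        if ch == "F":
            x += step * math.cos(heading)
            y += step * math.sin(heading)
            verts.append((x, y))
        elif ch == "+":
            heading += ANGLE
        elif ch == "-":
            heading -= ANGLE
        elif ch == "[":
            stack.append((x, y, heading))
        elif ch == "]":
            x, y, heading = stack.pop()
            verts.append((x, y))   # a real mesh would start a new strip here
    return verts

# --- 3. "Breathing": displace each vertex with a time-varying wobble ---
def breathe(verts, t, amplitude=0.15):
    out = []
    for i, (x, y) in enumerate(verts):
        # cheap noise stand-in: phase-shifted sines per vertex
        dx = amplitude * math.sin(t * 2.0 + i * 0.7)
        dy = amplitude * math.cos(t * 1.3 + i * 0.4)
        out.append((x + dx, y + dy))
    return out

base = interpret(expand("F", 3))
frame_0 = breathe(base, t=0.0)
frame_1 = breathe(base, t=0.5)   # same mesh, slightly different pose
```

swap `breathe` for a real spring or soft-body solver once the shape reads well; the point is just that the same base mesh gets re-posed every frame.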
That sounds like a living sculpture in code—let the vertices breathe like a heartbeat, then let the shader paint the pulse in real time. I’d add a little randomness to the L‑system so it never quite repeats and maybe a color shift that follows the data trend, just to keep it unpredictable. Try it out and see the mesh dance with the data.
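Something like a stochastic L-system would give that never-quite-repeats quality: each symbol picks one of several productions by weight, so every expansion is related but not identical. The rules and weights below are only illustrative:

```python
import random

# stochastic rules: each symbol maps to (production, weight) options
STOCHASTIC_RULES = {
    "F": [("F[+F]F[-F]F", 0.60),   # the "usual" branch pattern
          ("F[+F]F",      0.25),   # occasionally drop a branch
          ("F[-F]F[+F]",  0.15)],  # or flip the branching order
}

def expand_stochastic(axiom, iterations, rng=None):
    rng = rng or random.Random()
    s = axiom
    for _ in range(iterations):
        out = []
        for ch in s:
            options = STOCHASTIC_RULES.get(ch)
            if options:
                productions, weights = zip(*options)
                out.append(rng.choices(productions, weights=weights)[0])
            else:
                out.append(ch)
        s = "".join(out)
    return s

# two seeds give related-but-different shapes
print(expand_stochastic("F", 2, random.Random(1)))
print(expand_stochastic("F", 2, random.Random(2)))
```

Seed it differently each run, or nudge the weights slowly over time, and the silhouette drifts instead of looping.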
yeah, i’m all in for that—add a small mutation rate to the L‑system so each iteration looks slightly different, and hook the color into a live API so the hue flips as the data swings. the mesh will literally pulse with the input, like a digital heartbeat. i’ll fire up the shader and see what kind of visual jazz it throws back. stay tuned for the live demo!
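the color hookup could be as simple as this: a fake random-walk stream standing in for whatever live API gets polled, mapped onto a hue sweep. the lo/hi range and the blue-to-red mapping are arbitrary choices:

```python
import colorsys
import random

def fake_data_stream():
    """Stand-in for a live feed (heart rate, ticker, sensor...)."""
    value = 100.0
    while True:
        value += random.uniform(-2.0, 2.0)   # random walk
        yield value

def value_to_rgb(value, lo=80.0, hi=120.0):
    """Map the data value onto a hue sweep: low = cool blue, high = hot red."""
    t = max(0.0, min(1.0, (value - lo) / (hi - lo)))
    hue = (1.0 - t) * 0.66   # 0.66 ~ blue, 0.0 ~ red in HSV
    return colorsys.hsv_to_rgb(hue, 1.0, 1.0)

stream = fake_data_stream()
for _ in range(5):
    v = next(stream)
    r, g, b = value_to_rgb(v)
    print(f"value={v:6.1f}  rgb=({r:.2f}, {g:.2f}, {b:.2f})")
```

swap `fake_data_stream` for real polling and push the rgb into the shader as a uniform each frame.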
I love the idea of a living algorithmic heartbeat. Throw in a mutation rate and let the colors swing with the data—like a pulse that feels both organic and electric. When you drop that demo, just remember: the mesh will be a mirror, reflecting whatever you feed it. Ready to see your digital heartbeat?
i’m on it—cooking up a tiny L‑system with a mutation factor so it never repeats exactly, then gluing it to a data stream for the color pulse. once the mesh starts breathing and the shader flickers with the numbers, you’ll see a digital heartbeat in action. stay tuned, the demo’s coming soon!
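here's the glue, more or less: one update loop that fakes the feed, shapes a heartbeat envelope, and spits out the scale and color a renderer would take as uniforms. the bpm, decay, and value range are all made up:

```python
import colorsys
import math
import random

def heartbeat(t, bpm=72.0):
    """Pulse in (0, 1]: a quick spike on each beat, then an exponential decay."""
    phase = (t * bpm / 60.0) % 1.0
    return math.exp(-6.0 * phase)

def run(frames=10, dt=0.1):
    value = 100.0                                   # stand-in for the live feed
    for i in range(frames):
        t = i * dt
        value += random.uniform(-2.0, 2.0)          # fake random-walk data
        scale = 1.0 + 0.1 * heartbeat(t)            # mesh "inhales" on the beat
        norm = max(0.0, min(1.0, (value - 80.0) / 40.0))
        r, g, b = colorsys.hsv_to_rgb((1.0 - norm) * 0.66, 1.0, 1.0)
        # a real renderer would push scale + (r, g, b) as shader uniforms here
        print(f"t={t:4.1f}  scale={scale:.3f}  rgb=({r:.2f}, {g:.2f}, {b:.2f})")

run()
```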