IronClad & PixelDevil
IronClad
Hey PixelDevil, got a minute to talk about turning an old Arduino into a live‑action glitch generator? I’ve got a stack of cheap LCDs and some spare sensors that could let us distort footage in real time.
PixelDevil
Yeah, bring that Arduino over. Feed the sensor data into the LCD buffer, hack a shader that reads the framebuffer and spits out distorted pixel blocks, then stream that over HDMI into the camera. Don’t waste time on “analog” talk, just fire up the code and let the screens glitch.
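Roughly what I mean by "feed the sensor data into the LCD buffer" — a minimal sketch in plain Python, with the sensor read stubbed out (`read_sensor`, `WIDTH`, `HEIGHT`, `BLOCK` are all made-up names, and the real thing would run against the Arduino's actual ADC):

```python
# Hypothetical sketch: map a raw sensor reading onto horizontal row
# shifts in a small framebuffer, the way sensor data pushed into the
# LCD buffer would displace pixel blocks.
import random

WIDTH, HEIGHT = 16, 8
BLOCK = 4  # rows shift together in blocks of this height

def read_sensor():
    # Stand-in for an Arduino analogRead(); returns 0..1023.
    return random.randint(0, 1023)

def glitch_rows(frame, sensor):
    """Shift each block of rows horizontally by an amount derived
    from the sensor value -- a crude buffer-level glitch."""
    out = []
    for y, row in enumerate(frame):
        shift = (sensor >> (y // BLOCK)) % WIDTH
        out.append(row[-shift:] + row[:-shift] if shift else row[:])
    return out

frame = [[x for x in range(WIDTH)] for _ in range(HEIGHT)]
glitched = glitch_rows(frame, read_sensor())
```

A zero reading leaves the frame untouched; any nonzero reading rotates whole row blocks, which is about the cheapest glitch you can get out of a buffer.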
IronClad
Sure thing. I’ll load the sensor data into the LCD buffer, write a small OpenGL fragment shader to pull that buffer as a texture, distort it, and stream the output over HDMI. No fluff, just a working prototype in under an hour. Let’s get it running.
PixelDevil
Sounds good, just keep the code clean and the shaders tight. Once the texture is in the pipeline, a single-pass distortion will give you the live-action glitch you're after. Let's fire it up.
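The single-pass idea, sketched in Python rather than GLSL so it's easy to eyeball: each output pixel samples the input "texture" at a displaced coordinate, the same shape as a fragment shader doing `texture2D(tex, uv + offset)`. All names and the sine displacement are illustrative, not the final shader:

```python
# Single-pass distortion: one loop over the buffer, each output pixel
# reads the source at a sine-displaced x, wrapping like GL_REPEAT.
import math

def distort(src, amp=2.0, freq=0.8):
    """output[y][x] = input at a per-scanline displaced x -- the
    Python analogue of a one-pass fragment-shader distortion."""
    h, w = len(src), len(src[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        dx = int(round(amp * math.sin(freq * y)))  # per-scanline offset
        for x in range(w):
            out[y][x] = src[y][(x + dx) % w]       # wrap at the edge
    return out

tex = [[x + 10 * y for x in range(8)] for y in range(6)]
warped = distort(tex)
```

With `amp=0` it degenerates to a copy, so you can dial the glitch in live off a sensor value instead of the fixed constants here.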
IronClad
Got it. I’ll pull the sensor data straight into the LCD buffer, hook it up to a tight, single‑pass distortion shader, and push the result over HDMI to the camera. I’ll keep the code lean and the bugs out. Let's boot it.
PixelDevil
Boot it. I’ll sit back, watch the pixels shatter, and when the camera feeds back the mess, we’ll tweak the glitch until it feels like a glitching reality. Let’s see that code breathe.