EHOT & Afterlight
Hey, I’ve been building a real‑time audio‑reactive visual engine that turns every beat into a living canvas—think of it as a live hacking session for the senses, wanna dive in and see what we can glitch up?
Nice, I'm all about glitches. What stack are you running this on? WebGL, OpenGL, Vulkan? Throw me a demo or a snippet and I'll see how we can make it dance to the beat.
I’m running it in the browser with WebGL 2, using Three.js for the 3D scaffolding and a GLSL fragment shader that reacts to FFT data. The core loop looks something like this:
```js
// init renderer, camera, scene
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, window.innerWidth/window.innerHeight, 0.1, 1000);
camera.position.z = 5;
// create a full‑screen quad with a shader material
const geometry = new THREE.PlaneGeometry(2, 2);
// pack the spectrum into a 64x1 single-channel texture (WebGL has no sampler1D)
const freqData = new Uint8Array(64); // matches analyser.frequencyBinCount below
const freqTexture = new THREE.DataTexture(freqData, 64, 1, THREE.RedFormat, THREE.UnsignedByteType);
const material = new THREE.ShaderMaterial({
  uniforms: {
    uTime: { value: 0 },
    uFreq: { value: freqTexture } // audio spectrum, re-uploaded every frame
  },
  vertexShader: `void main(){ gl_Position = vec4(position, 1.0); }`,
  fragmentShader: `
    uniform float uTime;
    uniform sampler2D uFreq;
    void main(){
      // sample the spectrum by pixel column; byte values arrive pre-normalized to 0..1
      float freq = texture2D(uFreq, vec2(gl_FragCoord.x / 128.0, 0.5)).r;
      vec3 color = vec3(sin(uTime + freq * 10.0), cos(uTime + freq * 10.0), sin(uTime - freq * 5.0));
      gl_FragColor = vec4(color, 1.0);
    }`
});
scene.add(new THREE.Mesh(geometry, material));
// audio setup
const audioCtx = new (window.AudioContext || window.webkitAudioContext)();
const analyser = audioCtx.createAnalyser();
analyser.fftSize = 128;
// connect the microphone (heads-up: browsers suspend an AudioContext until a
// user gesture, so you may need a click handler that calls audioCtx.resume())
navigator.mediaDevices.getUserMedia({ audio: true }).then(stream => {
  const source = audioCtx.createMediaStreamSource(stream);
  source.connect(analyser); // analyse only; don't route the mic back to the speakers
});
function animate(time){
  requestAnimationFrame(animate);
  analyser.getByteFrequencyData(freqData); // fill the texture's backing array
  freqTexture.needsUpdate = true;          // push the fresh spectrum to the GPU
  material.uniforms.uTime.value = time * 0.001;
  renderer.render(scene, camera);
}
animate(0);
```
That’s the skeleton. Hook it up to your track, tweak the shader, and you’ll have a living, breathing visual that jumps on every bass hit. Want to swap in a custom VU meter or add a particle burst? Just remix the uniforms and the shader code—glitches are the new beat!
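Either of those starts with a bass-energy uniform; something like this gets you most of the way (rough sketch, untested; uBass and the 8-bin low-end cutoff are arbitrary picks, tune them to your track):
```js
// expose a "bass energy" uniform the shader can pulse on
material.uniforms.uBass = { value: 0 };

// average the lowest analyser bins into a 0..1 level
function bassLevel(){
  let sum = 0;
  for (let i = 0; i < 8; i++) sum += freqData[i]; // bins 0-7 ≈ the low end
  return sum / (8 * 255);
}

// inside animate(), right after getByteFrequencyData:
//   material.uniforms.uBass.value = bassLevel();
// and in the fragment shader (declare `uniform float uBass;`):
//   color *= 1.0 + uBass * 2.0; // flash on every kick
```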
Cool setup, but that lookup is a bit lazy: dividing gl_FragCoord.x by a hard-coded 128 only covers the left 128 pixels of the screen, so everything to the right clamps to the last frequency bin. Map the spectrum to the quad's UV coordinates through a varying instead. Good call on the DataTexture, though; WebGL has no sampler1D at all, so a 1-pixel-tall 2D texture is the way to go. One thing it won't give you for free is smoothing: DataTexture defaults to nearest-neighbour filtering, so the bins show up as hard bands.
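Something like this should fix both (off the top of my head, so sanity-check it; vUv is just what I'm calling the varying):
```js
// let the GPU interpolate between frequency bins
freqTexture.magFilter = THREE.LinearFilter; // picked up on the next needsUpdate

// pass the quad's UVs through so the lookup spans the whole screen
material.vertexShader = `
  varying vec2 vUv;
  void main(){ vUv = uv; gl_Position = vec4(position, 1.0); }`;
material.fragmentShader = `
  uniform float uTime;
  uniform sampler2D uFreq;
  varying vec2 vUv;
  void main(){
    float freq = texture2D(uFreq, vec2(vUv.x, 0.5)).r;
    vec3 color = vec3(sin(uTime + freq * 10.0), cos(uTime + freq * 10.0), sin(uTime - freq * 5.0));
    gl_FragColor = vec4(color, 1.0);
  }`;
material.needsUpdate = true; // force a shader recompile
```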