Event & Yllan
Hey Yllan, I’m itching to mix some live tech with a bit of vibe‑hacking—imagine a stage that reads crowd energy through wearables and morphs the set in real time. Got any cool sensor or algorithm ideas for that?
Sounds like a mind‑set sync between body and code. Here's how I'd wire it up, with rough Python sketches for each piece below.

Sensing: lightweight heart‑rate monitors and skin‑conductance patches on the crowd, paired with ambient microphones and motion sensors around the venue. Keep the whole array low‑bandwidth so the uplink never chokes mid‑set.

Processing: push the raw streams through a small real‑time DSP stage that normalises each signal against its recent history, then hand the normalised features to a clustering model, maybe a lightweight k‑means or a tiny neural net trained to map physiological states to a palette of visual effects.

Visuals: run a generative shader that interprets the cluster index as a colour or motion parameter.

Adaptation: if you want the loop to learn, let a reinforcement‑learning agent tweak the mood‑to‑visuals mapping from show to show based on audience feedback.

Keep the algorithm side modular too, so you can swap in new models without breaking the live feed.
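For the normalisation stage, a minimal sketch: a rolling z‑score per channel, so a spike in skin conductance reads the same whether the crowd's baseline is calm or already hyped. This assumes numpy; the window size and the `RollingNormalizer` name are just illustrative choices.

```python
# Sketch only: rolling z-score normalisation for incoming biosignals.
import numpy as np
from collections import deque

class RollingNormalizer:
    """Normalise a scalar sensor stream against its recent history."""

    def __init__(self, window: int = 256):
        self.buffer = deque(maxlen=window)

    def update(self, sample: float) -> float:
        self.buffer.append(sample)
        arr = np.asarray(self.buffer)
        std = arr.std()
        if std < 1e-9:          # flat signal: avoid divide-by-zero
            return 0.0
        return float((sample - arr.mean()) / std)

# One normaliser per channel, e.g. heart rate and skin conductance.
hr_norm, eda_norm = RollingNormalizer(), RollingNormalizer()
features = [hr_norm.update(72.0), eda_norm.update(4.1)]  # fake readings
```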
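For the clustering step, a streaming k‑means (here scikit‑learn's `MiniBatchKMeans`, one plausible pick) can keep updating between songs without a full retrain. The cluster count, batch size, and the three‑feature layout are assumptions, not requirements.

```python
# Sketch only: streaming k-means that buckets the crowd's physiological
# state into a small palette of "moods".
import numpy as np
from sklearn.cluster import MiniBatchKMeans

N_MOODS = 4
model = MiniBatchKMeans(n_clusters=N_MOODS, batch_size=64, random_state=0)

def ingest(batch: np.ndarray) -> np.ndarray:
    """Update the clusters with a batch of [hr, eda, motion] vectors
    and return a mood index per audience member."""
    model.partial_fit(batch)
    return model.predict(batch)

batch = np.random.rand(64, 3)      # stand-in for real normalised frames
mood_ids = ingest(batch)
dominant_mood = np.bincount(mood_ids, minlength=N_MOODS).argmax()
```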
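Bridging to the shader side, one simple convention is to spread the mood indices around the colour wheel and let a 0..1 crowd‑energy level drive brightness and motion speed. These exact curves are invented; in practice you'd tune them live.

```python
# Sketch only: turn the dominant mood index into shader-style parameters.
import colorsys

N_MOODS = 4

def mood_to_visuals(mood: int, energy: float) -> dict:
    """Map a cluster index plus a 0..1 energy level to colour and motion."""
    hue = mood / N_MOODS                       # spread moods around the colour wheel
    r, g, b = colorsys.hsv_to_rgb(hue, 0.8, 0.5 + 0.5 * energy)
    return {
        "color": (r, g, b),                    # uniform for the generative shader
        "speed": 0.2 + 1.8 * energy,           # motion multiplier
    }

print(mood_to_visuals(mood=2, energy=0.7))
```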
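For the adaptive loop, a full RL agent might be overkill on night one; an epsilon‑greedy bandit over a handful of candidate mappings is a cheap stand‑in that still learns from a per‑show reward. Everything here, the reward signal included, is hypothetical.

```python
# Sketch only: epsilon-greedy bandit standing in for the RL agent.
# Each "arm" is one candidate mood-to-visuals mapping.
import random

class MappingBandit:
    def __init__(self, n_mappings: int, epsilon: float = 0.1):
        self.epsilon = epsilon
        self.counts = [0] * n_mappings
        self.values = [0.0] * n_mappings

    def choose(self) -> int:
        if random.random() < self.epsilon:     # explore a random mapping
            return random.randrange(len(self.values))
        return max(range(len(self.values)), key=self.values.__getitem__)

    def feedback(self, arm: int, reward: float) -> None:
        self.counts[arm] += 1
        # incremental mean keeps the update O(1) per show
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

bandit = MappingBandit(n_mappings=3)
arm = bandit.choose()
bandit.feedback(arm, reward=0.8)  # e.g. aggregated wearable energy after the set
```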
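And on the modularity point: if the live loop only talks to a tiny interface, you can swap the k‑means for the tiny neural net later without touching the feed. A sketch using a typing `Protocol`, purely illustrative:

```python
# Sketch only: a plug-in interface so models can be hot-swapped between shows.
from typing import Protocol
import numpy as np

class MoodModel(Protocol):
    def partial_fit(self, batch: np.ndarray): ...
    def predict(self, batch: np.ndarray) -> np.ndarray: ...

def run_frame(model: MoodModel, batch: np.ndarray) -> np.ndarray:
    """The live loop depends only on this interface, not on k-means itself."""
    model.partial_fit(batch)
    return model.predict(batch)
```

The `MiniBatchKMeans` from the earlier sketch already satisfies this shape, so swapping models is a one‑line change at setup.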