
I've been working on a texture/video synthesis framework for VJing and music visualization on a large LED installation. It uses a node-editor visual-scripting approach (no relation to Node.js) to pipe data between shaders (like a fluid simulation) and signal generators (like MIDI devices or Xbox controllers).

Fluid simulation: https://www.youtube.com/watch?v=C1JzOv4w65w

Audio reactive: https://www.youtube.com/watch?v=KyDpnzfSg_o

It's done with the Unity game engine, and is open source! https://github.com/SotSF/canopy-unity
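To make the node-editor idea concrete, here is a minimal sketch of a pull-based dataflow graph: a signal-generator node drives a parameter of a "shader" node, the way a MIDI knob might drive a shader uniform. This is an illustrative stand-in, not the project's actual Unity/C# node API; the `Node`, `lfo`, and `shader` names are all hypothetical.

```python
import math

class Node:
    """One node in a pull-based dataflow graph (hypothetical sketch)."""
    def __init__(self, fn, *inputs):
        self.fn = fn          # function computing this node's value
        self.inputs = inputs  # upstream nodes feeding this node

    def evaluate(self, t):
        # Pull values from upstream nodes, then apply this node's function.
        return self.fn(t, *(n.evaluate(t) for n in self.inputs))

# Signal generator: a low-frequency oscillator standing in for a
# MIDI knob or controller axis (any 0..1 signal would do).
lfo = Node(lambda t: 0.5 + 0.5 * math.sin(t))

# "Shader" stand-in: scales a base RGB color by a brightness parameter
# driven by the generator node, like a shader uniform would be.
shader = Node(
    lambda t, brightness: [brightness * c for c in (1.0, 0.4, 0.1)],
    lfo,
)

print(shader.evaluate(0.0))  # RGB at t=0, where the LFO outputs 0.5
```

In a real node editor the graph is built interactively and evaluated per frame, but the core idea is the same: each node recomputes its output from whatever is wired into its inputs.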

