Oculizer 
Over the past couple of years, I've been developing a DMX lighting automation system that creates real-time, music-reactive lighting. Oculizer combines Spotify metadata with live audio analysis, using a mel-scaled FFT to analyze frequency components and map them to DMX values through configurable scenes. Currently, I am training a predictive model to select lighting scenes from the audio signal itself rather than from Spotify metadata, letting the predictable structure of music modulate the mapping from sound to light.
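At the heart of the pipeline is a small amount of signal processing: window an audio frame, take its FFT, pool the spectrum into mel-scaled bands, and scale each band's energy into a DMX channel value. Here's a minimal sketch of that idea in Python; the sample rate, window size, band count, and normalization are my own illustrative choices, not the parameters Oculizer actually uses:

```python
import numpy as np
import librosa

SR = 44100    # assumed sample rate
N_FFT = 2048  # assumed FFT window size
N_BANDS = 8   # assumed number of mel bands, one per fixture group

# Mel filterbank, shape (N_BANDS, 1 + N_FFT // 2)
MEL_FB = librosa.filters.mel(sr=SR, n_fft=N_FFT, n_mels=N_BANDS)

def frame_to_dmx(frame: np.ndarray, gain: float = 1.0) -> np.ndarray:
    """Map one audio frame to N_BANDS DMX channel values in [0, 255]."""
    windowed = frame * np.hanning(len(frame))
    spectrum = np.abs(np.fft.rfft(windowed, n=N_FFT))
    band_energy = MEL_FB @ spectrum        # mel-scaled band energies
    # Log compression keeps quiet passages visible without blowing out peaks
    level = np.log1p(gain * band_energy)
    level /= max(level.max(), 1e-9)        # normalize to [0, 1]
    return (level * 255).astype(np.uint8)  # DMX channel values

# Example: a burst of noise should light up every band
dmx_values = frame_to_dmx(np.random.randn(N_FFT).astype(np.float32))
print(dmx_values)
```

A real scene would then route those band values to specific fixtures and channels, which is where the configurable scene definitions come in.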
My hope is that this project is a precursor to systems that facilitate fruitful social interactions. Our brains are highly sensitive to the spectra of ambient light, sound, and odor in the environment, but the ways these factors shape our feelings and behavior often evade conscious awareness. By systematically testing the relationships between these factors and human behavior, we can build the knowledge needed to optimize the spaces we inhabit for connection, cooperation, and a really fun time.
The project is open source and available on GitHub.
Core Features
- Real-time audio reactivity using mel-scaled FFT analysis
- Spotify integration for metadata and playback control
- Support for RGB lights, dimmers, strobes, and lasers
- Live scene switching and MIDI control support (sketched below)
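For the MIDI side, a live scene switcher can be as simple as a note-to-scene lookup on incoming note-on messages. The sketch below assumes the `mido` library and a made-up note mapping; the actual control-surface handling in the repo may look quite different:

```python
import mido  # assumed MIDI backend; the project may use a different library

# Hypothetical mapping from MIDI notes to scene names
NOTE_TO_SCENE = {
    36: "bass_pulse",
    38: "strobe_build",
    40: "laser_sweep",
}

def listen_for_scene_changes(set_scene):
    """Block on the default MIDI input and switch scenes on note-on."""
    with mido.open_input() as port:
        for msg in port:
            if msg.type == "note_on" and msg.velocity > 0:
                scene = NOTE_TO_SCENE.get(msg.note)
                if scene is not None:
                    set_scene(scene)

# Example: print switches instead of driving DMX hardware
# listen_for_scene_changes(lambda name: print(f"switching to: {name}"))
```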