Motion
Real-time Art
Real-time visuals, audio-reactive experiments, generative art, and motion design tests. Where sound becomes image and code becomes performance.
What I Play With
TouchDesigner
Main: Primary tool for real-time visuals and interactive systems.
Ableton Live
Audio: Audio source and MIDI control for reactive visuals.
Blender
Learning: 3D modeling and animation (exploring).
After Effects
Secondary: Motion graphics and video post-production.
Experiments
Audio-Reactive Particles
Particle systems driven by audio frequency analysis.
Generative Patterns
Algorithmic pattern generation and noise-based visuals.
Live VJ Sets
Real-time visual performance synced to DJ sets.
Abstract Motion
Experimental motion graphics and loop explorations.
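The "Generative Patterns" experiments above lean on noise functions. As a rough illustration (not the actual project code), here is a minimal value-noise sketch in Python: random values at integer lattice points, smoothly interpolated between them, rendered as an ASCII scanline.

```python
import math
import random

def value_noise_1d(x: float, seed: int = 0) -> float:
    """Smooth 1D value noise: a fixed random value at each integer
    lattice point, cosine-interpolated between neighbors."""
    def lattice(i: int) -> float:
        # Deterministic pseudo-random value per lattice point.
        return random.Random(i * 1000003 + seed).random()
    i = math.floor(x)
    frac = x - i
    t = (1 - math.cos(frac * math.pi)) / 2  # cosine ease curve
    return lattice(i) * (1 - t) + lattice(i + 1) * t

# Render one scanline of the pattern as ASCII shades.
shades = " .:-=+*#%@"
line = "".join(
    shades[int(value_noise_1d(x * 0.25) * (len(shades) - 1))]
    for x in range(64)
)
print(line)
```

The same idea extends to 2D/3D (and to the built-in noise operators in TouchDesigner) by interpolating over a lattice in more dimensions.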
Gallery
Short looping videos and visual experiments coming soon.
How It Works
Most of my visual work connects to music. The typical setup: Ableton Live sends audio and MIDI data to TouchDesigner, which generates real-time visuals that react to frequency, amplitude, and beat information.
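The frequency, amplitude, and beat information mentioned above can be sketched in plain Python (this is an illustrative stand-in, not TouchDesigner code; the sample rate, block size, and band ranges are assumptions): one audio block goes through an FFT and comes back as a handful of control signals that could drive visual parameters.

```python
import numpy as np

SAMPLE_RATE = 44100  # assumed audio sample rate
BLOCK_SIZE = 1024    # samples per analysis frame

def analyze(block: np.ndarray) -> dict:
    """Turn one audio block into control signals:
    overall amplitude plus low/mid/high band energy."""
    windowed = block * np.hanning(len(block))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(block), 1 / SAMPLE_RATE)

    def band(lo: float, hi: float) -> float:
        mask = (freqs >= lo) & (freqs < hi)
        return float(spectrum[mask].mean())

    return {
        "amplitude": float(np.sqrt(np.mean(block ** 2))),  # RMS level
        "low": band(20, 250),       # bass  -> e.g. particle size
        "mid": band(250, 2000),     # mids  -> e.g. color shift
        "high": band(2000, 16000),  # highs -> e.g. emission rate
    }

# Feed a 440 Hz test tone through the analyzer:
t = np.arange(BLOCK_SIZE) / SAMPLE_RATE
features = analyze(np.sin(2 * np.pi * 440 * t))
```

In the real setup these per-frame values arrive continuously while the music plays, so each one becomes a knob the sound turns in real time.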
This creates a feedback loop between sound and image — the music shapes the visuals, but the visuals also influence how I approach the music. It's performance art made with code.
I'm currently exploring 3D workflows with Blender and thinking about how these tools can enhance live DJ sets and installations.
Sound Meets Image
The visual world is deeply connected to the music world. They're designed to work together — in live sets, in recordings, and in the creative process.
Explore the Music →
Shop This World
Abstract prints, generative art posters, and visual experiment outputs.
Coming soon to the Inksky Store →