
AUDIO-REACTIVE PARTICLES

About the Project

The core idea behind this project was to explore the interaction between audio signals and visual elements. The particle system dynamically changes its behavior, shape, and color based on the audio frequencies and amplitudes it receives. Lower frequencies create smoother, flowing motions, while higher frequencies generate sharper and more energetic movements, resulting in a captivating visual symphony that mirrors the rhythm and tone of the sound.

Software Used


TouchDesigner

Project & Videos

To achieve this behavior, several TouchDesigner components were combined.


Audio analysis nodes were implemented to extract key data points, such as frequency ranges and amplitude levels, from the sound input.
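In the project itself this analysis happens inside TouchDesigner's audio CHOPs (an Audio Spectrum CHOP, for instance). The underlying idea — splitting a spectrum into frequency bands and measuring their energy — can be sketched in plain Python. Everything below (function names, the bass/mid/treble split) is illustrative, not TouchDesigner's API:

```python
import cmath
import math

def band_energies(samples, sample_rate, bands=((20, 250), (250, 2000), (2000, 8000))):
    """Naive DFT: return the average magnitude in each frequency band.

    `samples` is a list of floats in [-1, 1]; `bands` are (low_hz, high_hz)
    ranges, here a rough bass / mid / treble split.
    """
    n = len(samples)
    spectrum = []
    for k in range(n // 2):  # positive frequencies only
        s = sum(samples[t] * cmath.exp(-2j * math.pi * k * t / n) for t in range(n))
        spectrum.append(abs(s) / n)
    bin_hz = sample_rate / n
    energies = []
    for low, high in bands:
        mags = [m for k, m in enumerate(spectrum) if low <= k * bin_hz < high]
        energies.append(sum(mags) / len(mags) if mags else 0.0)
    return energies

# A pure 440 Hz tone should concentrate its energy in the mid band.
rate = 8000
tone = [math.sin(2 * math.pi * 440 * t / rate) for t in range(256)]
bass, mid, treble = band_energies(tone, rate)
```

A real-time setup would use an FFT rather than this O(n²) DFT, but the band-bucketing step is the same.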


This data was then mapped to control parameters of the particle system, including speed, direction, color gradients, and density.


The use of optical flow and transformations further enhanced the fluidity of the visuals, creating a seamless connection between the auditory and visual elements.
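Optical flow itself runs on the GPU inside TouchDesigner, but the smoothing half of this step — keeping parameters gliding instead of jumping frame to frame — can be sketched with a one-pole low-pass filter, the same idea as a Lag CHOP. This standalone class is an illustration, not the project's code:

```python
class Lag:
    """One-pole low-pass filter, similar in spirit to TouchDesigner's Lag CHOP.

    Smooths a noisy control signal so particle parameters glide toward their
    targets; `smoothing` in [0, 1), where higher means a slower response.
    """
    def __init__(self, smoothing=0.9, initial=0.0):
        self.smoothing = smoothing
        self.value = initial

    def step(self, target):
        self.value = self.smoothing * self.value + (1 - self.smoothing) * target
        return self.value

# Feed a step input (0 -> 1) and watch the value ease toward the target.
lag = Lag(smoothing=0.8)
trace = [lag.step(1.0) for _ in range(20)]
```

Running one filter per mapped parameter is what makes abrupt audio transients read as fluid motion rather than flicker.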


The final result is a mesmerizing visual experience where the particles move, swirl, and interact in harmony with the audio input. This project demonstrates a deep understanding of procedural design, real-time data processing, and creative coding. It also highlights the potential of TouchDesigner as a powerful tool for audio-reactive visualizations.


This project is an interactive particle system designed in TouchDesigner that dynamically reacts to sound, voice, or audio input in real time. The core concept revolves around synchronizing visual elements with audio, creating a harmonious blend of motion and sound that captivates the senses.


The particle system is programmed to respond to varying frequencies and amplitudes within the audio input. Low frequencies generate smooth and flowing particle movements, while high frequencies produce sharp, energetic bursts. This interaction results in a fluid and ever-changing visual display that mirrors the rhythm, tone, and intensity of the sound. The use of vibrant colors and fluid dynamics enhances the immersive quality of the visuals, offering an engaging and responsive experience.
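One way to make that low-versus-high distinction concrete is a spectral centroid: the magnitude-weighted average frequency of the spectrum. A low centroid (bass-heavy sound) can drive smooth, flowing motion, and a high centroid (bright, percussive sound) can drive sharp bursts. This is a hedged sketch of the idea; the function names and the 1 kHz split point are assumptions, not values from the project:

```python
import math

def spectral_centroid(magnitudes, bin_hz):
    """Magnitude-weighted average frequency of a spectrum, in Hz."""
    total = sum(magnitudes)
    if total == 0:
        return 0.0
    return sum(k * bin_hz * m for k, m in enumerate(magnitudes)) / total

def motion_style(centroid_hz, split_hz=1000.0):
    """Blend factor in [0, 1]: 0 = smooth/flowing, 1 = sharp/energetic.

    A sigmoid gives a soft threshold around `split_hz` instead of a hard
    switch between the two movement styles.
    """
    return 1 / (1 + math.exp(-(centroid_hz - split_hz) / 250.0))

bright = motion_style(4000.0)  # bright sound -> energetic bursts
dark = motion_style(100.0)     # bass-heavy sound -> flowing motion
```

The blend factor can then crossfade between two force fields, so the style shifts continuously with the music rather than snapping.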


The development process involved leveraging TouchDesigner’s audio analysis tools to extract real-time data from the sound input. Key metrics such as amplitude and frequency ranges were mapped to control various particle attributes, including speed, direction, density, and color gradients. Optical flow and transformation techniques were employed to create a seamless and dynamic flow of particles that react naturally to the audio.


This project demonstrates the creative potential of combining technology and art. It showcases expertise in procedural design, real-time interactivity, and audio-visual synchronization. The outcome is a stunning and versatile system suitable for live performances, music videos, or interactive installations. By translating sound into motion, this work delivers a unique sensory experience, embodying the synergy between visual and auditory art.
