"Isochronic Void #5" represents a shift from purely visual aesthetics to Psychoacoustic Simulation. It moves away from the concept of a passive image and into the realm of neurological entrainment and bio-digital feedback.
Like its predecessors, this piece relies on Raymarching. It does not use 3D models made of triangles or polygons. Instead, the GPU calculates the scene pixel by pixel using a Signed Distance Function (SDF).
The "Tunnel" Constraint: The code forces the camera down an infinite, mathematically generated tube. As you hold your connection, the camera moves forward along the Z axis (Time), simulating a journey at the speed of light through a cellular structure. The "fog" you see is not a texture, but a calculation of light absorption over distance, creating a sense of vast, terrifying scale.
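The piece itself presumably runs as a GLSL fragment shader, but the core loop can be sketched in Python. The names `tunnel_sdf` and `raymarch`, and the fog density value, are illustrative assumptions, not the work's actual source; the fog follows the standard Beer-Lambert absorption model implied by "light absorption over distance."

```python
import math

def tunnel_sdf(x, y, z, radius=1.0):
    # Distance from a point inside an infinite tube (axis = Z) to its wall.
    # Positive while inside the tube, zero at the wall.
    return radius - math.hypot(x, y)

def raymarch(origin, direction, sdf, max_steps=64, eps=1e-3):
    # Sphere tracing: step forward by the distance the SDF guarantees is safe.
    t = 0.0
    for _ in range(max_steps):
        p = [origin[i] + t * direction[i] for i in range(3)]
        d = sdf(*p)
        if d < eps:
            return t          # hit the tunnel wall at distance t
        t += d
    return None               # marched past max_steps without hitting anything

def fog(distance, density=0.15):
    # Beer-Lambert absorption: fraction of light surviving over `distance`.
    return math.exp(-density * distance)
```

A ray fired from the tunnel's center straight at the wall hits at exactly the tube radius, and the fog factor decays toward zero with distance, which is what produces the sense of scale the text describes.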
The organic, sponge-like structure lining the tunnel is a Gyroid.
The Math of Nature: The Gyroid is an isosurface defined by the trigonometric equation: sin(x)cos(y) + sin(y)cos(z) + sin(z)cos(x) = 0.
This geometry creates a structure that divides space into two opposing domains without ever intersecting itself. It is the same mathematical architecture found in butterfly wings, mitochondrial membranes, and lipid systems. The code generates this biological perfection in real-time, creating a world that feels simultaneously alien and strangely organic.
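The defining property above, one continuous surface separating two interpenetrating domains, falls directly out of the equation: the implicit field is positive on one side, negative on the other, and zero on the surface. A minimal sketch (function name is illustrative):

```python
import math

def gyroid(x, y, z):
    # Implicit gyroid field: zero on the surface,
    # opposite signs in the two domains it separates.
    return (math.sin(x) * math.cos(y)
            + math.sin(y) * math.cos(z)
            + math.sin(z) * math.cos(x))
```

Evaluating it at nearby points shows the sign flip that marks the two domains: the origin lies on the surface, while points a quarter-period away on either side along X land in opposite domains. In a shader, `abs(gyroid(p))` scaled by a small constant serves as an approximate distance bound for the raymarcher.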
While other generative art uses FM synthesis for texture, this system uses Amplitude Modulation (AM) to create a specific psychoacoustic phenomenon known as Isochronic Tones.
Frequency Following Response: The audio engine generates a carrier wave (the drone) and rapidly modulates its volume with a Low-Frequency Oscillator (LFO), creating distinct, rhythmic pulses. The brain tends to synchronize its electrical cycles to this rhythm, a process called entrainment. By mathematically locking the visual "breathing" of the tunnel to the audio pulse (uPulse), the piece creates a multi-sensory lock, attempting to synchronize your visual and auditory cortices.
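The AM scheme can be sketched per-sample in Python (the piece itself presumably uses Web Audio gain modulation; the carrier and pulse frequencies here are assumed placeholders). The key isochronic property is that the gate swings the gain fully between 0 and 1, so the tone switches cleanly on and off rather than merely wavering:

```python
import math

def isochronic_sample(t, carrier_hz=110.0, pulse_hz=6.0):
    # Carrier drone.
    carrier = math.sin(2 * math.pi * carrier_hz * t)
    # LFO gate: a raised sine spanning the full 0..1 gain range,
    # so the tone is periodically silenced (the isochronic signature).
    gate = 0.5 * (1.0 + math.sin(2 * math.pi * pulse_hz * t))
    return carrier * gate

# 0.1 s of audio at 44.1 kHz.
samples = [isochronic_sample(n / 44100.0) for n in range(4410)]
```

At 6 Hz the pulse sits in the theta band; sliding `pulse_hz` is exactly the temporal control the next section describes.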
This piece transforms the user interface into a neurological control panel. It utilizes an XY Modulation Scheme to put the observer in control of the simulation's intensity.
X-Axis (Temporal Shift): Moving horizontally adjusts the frequency of the LFO. You are manually sliding the simulation from a Delta state (sleep/deep trance) on the left, to a Beta state (high-alert/anxiety) on the right.
Y-Axis (Spectral Shift): Moving vertically adjusts the pitch of the carrier wave, shifting the resonance from a deep, physical rumble to a high-frequency signal.
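The two axes above reduce to a mapping from normalized pointer coordinates to frequencies. A sketch, assuming exponential interpolation (so equal pointer movements feel like equal perceptual steps) and assumed band edges, roughly Delta (~0.5 Hz) at the left to Beta (~30 Hz) at the right:

```python
def pointer_to_params(x, y,
                      lfo_range=(0.5, 30.0),      # assumed: Delta to Beta, Hz
                      carrier_range=(40.0, 880.0)):  # assumed: rumble to high signal, Hz
    # Exponential interpolation between lo and hi for t in [0, 1].
    def lerp_exp(lo, hi, t):
        return lo * (hi / lo) ** t
    lfo_hz = lerp_exp(*lfo_range, x)         # X axis: pulse (entrainment) rate
    carrier_hz = lerp_exp(*carrier_range, y)  # Y axis: drone pitch
    return lfo_hz, carrier_hz
```

With the pointer at the far left and bottom this yields the slowest, deepest state; dragging to the top right sweeps both the pulse rate and the carrier pitch upward together.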
Chromatic Aberration: When the system is active, the shader splits the RGB channels based on the intensity of the audio pulse. The harder the audio "hits," the more the red and blue light waves separate, visually simulating the distortion of a lens—or an optic nerve—under high energy stress.
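The channel split amounts to sampling red and blue at UV coordinates offset in opposite directions, with the offset scaled by the audio pulse. A Python sketch of the shader logic (function name and the `strength` constant are illustrative; `sample` stands in for a texture or scene lookup returning an RGB tuple):

```python
def chromatic_aberration(sample, u, v, pulse, strength=0.01):
    # Offset the red and blue lookups in opposite directions along U;
    # the offset grows with the audio pulse, so silence means no split.
    off = strength * pulse
    r = sample(u + off, v)[0]
    g = sample(u, v)[1]
    b = sample(u - off, v)[2]
    return (r, g, b)
```

When `pulse` is zero all three channels sample the same point and the image is clean; at full pulse the red and blue fringes separate by `2 * strength` in UV space.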
This is an Audiovisual Bio-Loop. The code generates the pulse, your brain perceives the rhythm, your hand adjusts the intensity based on your comfort, and the code updates the math instantly. It is a closed loop between silicon and neuron.