Live visualization takes a live music performance as input and programmatically generates a video that matches it. The video might pulse along with the beat of the music. Or maybe it changes color or shape depending on the "mood" of the audio. The computer driving the visualization receives audio from the live performance via a line-in, which Apple treats as an active microphone.
A simple live visualization would be to render the waveform of the live audio feed as it arrives.
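As a rough sketch of how that feed could be read on macOS, the snippet below taps the default input device with AVAudioEngine and prints a crude peak-level bar for each buffer of samples. The buffer size and bar scaling are arbitrary choices for illustration, and a real visualizer would hand the samples to a drawing layer rather than the console; because the line-in counts as a microphone, macOS will prompt for microphone access the first time this runs.

```swift
import AVFoundation

// Tap the default audio input (the line-in feed) and turn each
// incoming buffer into one "column" of a console waveform.
let engine = AVAudioEngine()
let input = engine.inputNode
let format = input.outputFormat(forBus: 0)

// The tap delivers raw PCM buffers as audio arrives. 1024 frames
// per buffer is an arbitrary choice for this sketch.
input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
    guard let samples = buffer.floatChannelData?[0] else { return }

    // Peak amplitude of this buffer, in the range 0.0...1.0.
    var peak: Float = 0
    for i in 0..<Int(buffer.frameLength) {
        peak = max(peak, abs(samples[i]))
    }

    // Render the peak as a bar of asterisks (scale of 50 is arbitrary).
    print(String(repeating: "*", count: Int(peak * 50)))
}

do {
    try engine.start()
} catch {
    fatalError("Could not start audio engine: \(error)")
}

// Keep the process alive while audio flows in.
RunLoop.main.run()
```

Running this while music plays through the line-in produces rows of asterisks that grow and shrink with the loudness of the signal, which is the waveform idea in its most primitive form.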