The distortion associated with jitter is not the usual generation of spurious harmonics, which can be measured with a conventional distortion analyzer. Instead, it comprises sidebands surrounding every musical tone: the level of those sidebands is proportional both to the amplitude of the jitter and to the frequency of the tone, while their spacing from the tone is set by the frequency content of the jitter. Measuring the effect of jitter is therefore far from a simple matter.
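To put rough numbers on that proportionality, the sketch below uses the standard small-index FM approximation: for sinusoidal jitter with peak time error tau on a tone of frequency f, the modulation index is beta = 2*pi*f*tau, and each first-order sideband sits at roughly beta/2 relative to the tone. The function name and the example figures are illustrative only.

```python
import math

def jitter_sideband_level_db(tone_freq_hz, jitter_peak_s):
    """Approximate level of each jitter sideband relative to the tone, in dB.

    Assumes sinusoidal jitter and the small-angle FM approximation:
    beta = 2*pi*f*tau, with each first-order sideband at about beta/2.
    """
    beta = 2 * math.pi * tone_freq_hz * jitter_peak_s
    return 20 * math.log10(beta / 2)

# Example: 500ps peak sinusoidal jitter on an 11.025kHz tone
print(round(jitter_sideband_level_db(11_025, 500e-12), 1))  # about -95.2dB
# The same jitter on a 1kHz tone gives sidebands about 21dB lower,
# illustrating why the sideband level scales with the tone's frequency.
```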
The J-Test, where J stands for jitter, was developed in the mid-1990s by the late Julian Dunn of Prism Sound to investigate the jitter rejection of digital datalinks in which the clock is embedded in the audio data: the balanced AES/EBU or AES3 link, for example, or the unbalanced S/PDIF link. The test signal comprises a high-level tone at exactly one-quarter the sample frequency, or Fs/4, to which is added a squarewave at the level of the least significant bit (LSB), at Fs/192. With the two's-complement PCM encoding used by CD data, this low-level squarewave exercises all the bit transitions simultaneously, which is very much the worst case for stimulating jitter. The high-level, high-frequency tone is thus modulated by the jitter, and sidebands spaced at the frequency of the Fs/192 squarewave and its harmonics will appear in the reconstructed analog signal's noise floor.
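The following is a minimal sketch of a J-Test-style stimulus built from that description: an Fs/4 tone plus an LSB-level squarewave at Fs/192, quantized to 16-bit two's-complement samples. The -3dBFS tone level and the squarewave phase are assumptions here; the published J-Test's exact amplitude and phase conventions may differ.

```python
import numpy as np

def j_test_16bit(fs=44_100, seconds=1.0, tone_dbfs=-3.0):
    """Sketch of a J-Test-style stimulus: an Fs/4 tone plus an LSB-level
    squarewave at Fs/192, quantized to 16-bit two's-complement samples."""
    n = np.arange(int(fs * seconds))
    full_scale = 2**15 - 1                      # 16-bit signed full scale
    tone_amp = full_scale * 10 ** (tone_dbfs / 20)
    tone = tone_amp * np.sin(2 * np.pi * (fs / 4) * n / fs)  # Fs/4 tone
    # Squarewave with a 192-sample period (frequency Fs/192), toggling the LSB
    square = np.where((n // 96) % 2 == 0, 1, -1)
    samples = np.round(tone).astype(np.int32) + square
    return np.clip(samples, -2**15, 2**15 - 1).astype(np.int16)

stimulus = j_test_16bit()
print(stimulus[:8])  # first few samples of the repeating pattern
```

Because the Fs/4 tone repeats every four samples and the squarewave every 192, the whole pattern is exactly periodic, which is what lets the jitter-induced sidebands be resolved cleanly in a high-resolution FFT of the reconstructed analog output.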