Jitter period is the interval between two times of maximum effect (or minimum effect) of a signal characteristic that varies regularly with time. Jitter frequency, the more commonly quoted figure, is its inverse. ITU-T G.810 classifies deviation frequencies below 10 Hz as wander and frequencies at or above 10 Hz as jitter. [2]
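As a worked illustration of the inverse relationship and the G.810 boundary, here is a minimal Python sketch; the function name and constant are illustrative, with only the 10 Hz threshold taken from the text above.

```python
WANDER_JITTER_BOUNDARY_HZ = 10.0  # ITU-T G.810 boundary between wander and jitter

def classify_deviation(period_s: float) -> tuple[float, str]:
    """Return (deviation frequency in Hz, 'wander' or 'jitter') for a jitter period."""
    frequency_hz = 1.0 / period_s  # jitter frequency is the inverse of the period
    label = "wander" if frequency_hz < WANDER_JITTER_BOUNDARY_HZ else "jitter"
    return frequency_hz, label

print(classify_deviation(0.5))    # (2.0, 'wander')   -- a 0.5 s period is 2 Hz
print(classify_deviation(0.001))  # (1000.0, 'jitter') -- a 1 ms period is 1 kHz
```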
For example, the unit interval (UI) is used to measure timing jitter in serial communications or in on-chip clock distribution. The unit is used extensively in the jitter literature; examples can be found in various ITU-T Recommendations [1] and in the tutorial by Ransom Stephens. [2]
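To make the unit concrete, the sketch below converts absolute jitter in seconds into unit intervals for a given line rate; the 10 Gb/s figure and all names are illustrative assumptions, relying only on the standard definition of one UI as one symbol period.

```python
def jitter_in_ui(jitter_s: float, bit_rate_hz: float) -> float:
    """Express absolute timing jitter as a fraction of the unit interval (UI),
    where one UI is one symbol period, i.e. 1 / bit_rate for a binary link."""
    unit_interval_s = 1.0 / bit_rate_hz
    return jitter_s / unit_interval_s

# Illustrative values: 5 ps of jitter on a 10 Gb/s serial link,
# where one UI is 100 ps, amounts to 0.05 UI.
print(jitter_in_ui(5e-12, 10e9))  # -> 0.05
```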
Jitter is the undesired deviation from true periodicity of an assumed periodic signal in electronics and telecommunications, often in relation to a reference clock source. Jitter may be observed in characteristics such as the frequency of successive pulses, the signal amplitude, or the phase of periodic signals.
During an interval of time τ, as measured by the reference clock, the clock under test advances by τy, where y is the average (relative) clock frequency over that interval. If we measure two consecutive intervals, we can form the quantity (y′ − y)², where y and y′ are the average frequencies over the first and second intervals; a smaller value indicates a more stable and precise clock.
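The quantity above is the building block of the two-sample (Allan) variance; the sketch below estimates it from a series of consecutive fractional-frequency averages, under the assumption (not stated in the snippet) that the conventional estimator σ²y(τ) = ½⟨(y′ − y)²⟩ is intended.

```python
def allan_variance(y: list[float]) -> float:
    """Two-sample (Allan) variance estimate from consecutive fractional-
    frequency averages y[0], y[1], ... over equal intervals tau:
    one half of the mean squared difference of consecutive samples."""
    diffs_sq = [(y[i + 1] - y[i]) ** 2 for i in range(len(y) - 1)]
    return sum(diffs_sq) / (2 * len(diffs_sq))

# A clock whose average frequency changes little between consecutive
# intervals yields the smaller (better) value.
stable = [1.0e-9, 1.1e-9, 0.9e-9, 1.0e-9]
noisy  = [1.0e-9, 5.0e-9, -3.0e-9, 2.0e-9]
print(allan_variance(stable) < allan_variance(noisy))  # True
```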
The most straightforward scheme uses a digital counter and a free-running crystal oscillator to time intervals with 1-clock ambiguity, resulting in output edge jitter of one clock period peak-to-peak relative to an asynchronous trigger. This technique is used in the Quantum Composers and Berkeley Nucleonics instruments.
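One way to see where the one-period peak-to-peak figure comes from is to simulate an asynchronous trigger being recognized at the next clock edge; the model below is a hypothetical sketch, not the firmware of the instruments named above.

```python
import random

CLOCK_PERIOD_NS = 10.0  # hypothetical 100 MHz free-running oscillator

def realized_delay(trigger_ns: float, programmed_delay_ns: float) -> float:
    """Counter-based delay generator model: the trigger is recognized only
    at the next clock edge, then the delay is counted in whole periods."""
    next_edge_ns = -(-trigger_ns // CLOCK_PERIOD_NS) * CLOCK_PERIOD_NS  # ceil to edge
    counts = round(programmed_delay_ns / CLOCK_PERIOD_NS)
    return next_edge_ns + counts * CLOCK_PERIOD_NS - trigger_ns

# Triggers arriving at random phases relative to the clock: the realized
# delay spreads over one full clock period (10 ns peak-to-peak here).
samples = [realized_delay(random.uniform(0.0, 1e6), 500.0) for _ in range(10_000)]
print(max(samples) - min(samples))  # approaches CLOCK_PERIOD_NS
```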
The Sega 32X uses PWM to play sample-based sound in its games. More recently, the Direct Stream Digital sound encoding method was introduced; it uses a generalized form of pulse-width modulation called pulse-density modulation, at a sampling rate high enough (typically on the order of MHz) to cover the whole range of acoustic frequencies ...
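As a sketch of pulse-density modulation, the first-order delta-sigma loop below turns a sampled waveform into a 1-bit stream whose local density of 1s tracks the input level; this is a generic textbook modulator, offered as an illustration rather than the DSD or 32X implementation.

```python
import math

def pdm_encode(samples: list[float]) -> list[int]:
    """First-order delta-sigma modulator: emit a 1-bit stream whose local
    density of 1s tracks the input signal (expected range -1.0 .. +1.0)."""
    bits, integrator = [], 0.0
    for x in samples:
        integrator += x          # integrate the input
        if integrator > 0:       # 1-bit quantizer
            bits.append(1)
            integrator -= 1.0    # feedback of +1 for a 1 bit
        else:
            bits.append(0)
            integrator += 1.0    # feedback of -1 for a 0 bit
    return bits

# One cycle of a sine wave: the density of 1s is highest in the positive
# half-cycle and lowest in the negative half-cycle.
wave = [math.sin(2 * math.pi * n / 256) for n in range(256)]
stream = pdm_encode(wave)
print(sum(stream[:128]), sum(stream[128:]))  # first half dense, second half sparse
```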
In that approach, the measurement is an integer number of clock cycles, so it is quantized to one clock period; finer resolution requires a faster clock. The accuracy of the measurement depends on the stability of the clock frequency, so a TDC typically uses a crystal oscillator reference for good long-term stability.
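The counter-based model below makes the quantization concrete: the interval is truncated to whole clock cycles, so the error stays below one period, and a faster clock reduces it proportionally; the interval and clock rates are illustrative.

```python
def tdc_measure(interval_s: float, clock_hz: float) -> float:
    """Counter-based TDC model: count whole clock cycles within the
    interval, then convert the count back to time; resolution is one period."""
    counts = int(interval_s * clock_hz)  # integer number of cycles (truncated)
    return counts / clock_hz

interval_s = 1.2345e-6  # illustrative input interval
for clock_hz in (1e6, 100e6, 1e9):  # 1 MHz, 100 MHz, 1 GHz references
    measured = tdc_measure(interval_s, clock_hz)
    print(f"{clock_hz:9.0e} Hz clock: error {interval_s - measured:.2e} s")
```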
[Figure: clock signal and legend] In electronics, and especially in synchronous digital circuits, a clock signal (historically also known as a logic beat) [1] is an electronic logic signal (voltage or current) that oscillates between a high and a low state at a constant frequency and is used like a metronome to synchronize the actions of digital circuits.