Ask the Experts: Eye & Jitter


Why should TV broadcast engineers be interested in adding Eye and Jitter measurement to their T & M facilities?
The call for Eye and Jitter measurement as part of overall video test and measurement stems from the widespread adoption of serial digital interface (SDI) standards for broadcast. Unlike analog transmission, in which the image quality gradually degrades as the signal quality diminishes, the image quality of a digital transmission remains essentially perfect as the signal quality falls, until it reaches a critical level at which the picture fails completely. Passing through this critical level is often described as ‘falling off the cliff’.
The value of Eye diagrams and Jitter measurements is that they allow you to see how close to the edge of this cliff your system is running.
There are applications for this in many areas of video production and broadcast from evaluating new equipment, through identifying the causes of performance issues in the rigging for an outside broadcast (for instance, kinks where someone has driven over your cables), to long-term monitoring for equipment failure.
What is an Eye diagram? How is it generated?
An Eye diagram is an oscilloscope-like plot of the analog signal that carries the digital video data, but it is produced in a rather different way. An oscilloscope can overlay the waveform directly. In a video T & M system, the waveform is built up by overlaying many samples of the repetitive signal in a process known as equivalent-time sampling, giving an effective sample rate many times the underlying signal rate. The resulting waveform, however, is exactly the same.
The digital signal comprises a sequence of 0s (signal low) and 1s (signal high) in pretty much any combination, but overall the chances of the signal being either low or high at any time are approximately equal. The basic form of the resulting plot can readily be seen by superimposing the waveforms associated with the four possible combinations of two successive bits (00, 01, 10 and 11).
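The superposition described above can be sketched in a few lines of NumPy. This is purely illustrative (the edge model, 0.2 UI rise and helper names are assumptions, not anything from a real analyser): overlaying the traces for the four two-bit combinations produces the familiar eye shape.

```python
import numpy as np

def edge(t, rise=0.2):
    """Simple linear-ramp transition model: 0 -> 1 over 'rise' UIs, clipped."""
    return np.clip(t / rise + 0.5, 0.0, 1.0)

def bit_pair_trace(b0, b1, n=100):
    """Waveform for one Unit Interval centred on the b0 -> b1 transition."""
    t = np.linspace(-0.5, 0.5, n)           # time in UIs, transition at t = 0
    return b0 + (b1 - b0) * edge(t)

# Overlaying all four two-bit combinations gives the basic eye shape:
traces = {bits: bit_pair_trace(*bits) for bits in [(0, 0), (0, 1), (1, 0), (1, 1)]}
```

Plotting all four traces on the same axes (e.g. with matplotlib) shows the steady high and low levels with the rising and falling edges crossing between them.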
What information can be obtained from an Eye diagram?
A number of signal carrier parameters can be determined from an Eye diagram. The obvious ones are the signal amplitude (normally taken as the height of the Eye halfway between two crossover points) and the so-called Unit Interval (UI) between one crossover and the next. In addition, it is possible to determine the rise-time of the signal (defined as the time taken to rise from 20% of the signal amplitude to 80%) and its equivalent fall-time. Values can also be determined for signal overshoot and undershoot, while the thickness of the crossover gives a measure of the amount of jitter in the signal.
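The 20%–80% rise-time measurement mentioned above is straightforward to automate once the edge has been captured. The sketch below assumes a clean, monotonic rising edge and hypothetical figures (an 800 mV swing over 100 ps); it is not any particular analyser's algorithm.

```python
import numpy as np

def rise_time_20_80(t, v):
    """20%-80% rise time of a single rising edge sampled as (t, v).
    Assumes v is monotonically rising so np.interp can invert it."""
    lo, hi = v.min(), v.max()
    amp = hi - lo
    t20 = np.interp(lo + 0.2 * amp, v, t)   # time at which v crosses 20%
    t80 = np.interp(lo + 0.8 * amp, v, t)   # time at which v crosses 80%
    return t80 - t20

# A hypothetical linear edge rising 0 -> 800 mV over 100 ps:
t = np.linspace(0.0, 100.0, 1001)           # time in ps
v = 800.0 * t / 100.0                       # voltage in mV
rt = rise_time_20_80(t, v)                  # ~60 ps for a linear ramp
```

The same function applied to an inverted edge gives the fall-time, and the `amp` value is the signal amplitude itself.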
The other important feature of the Eye diagram is the way that the eyes close up as the signal quality degrades. With a little bit of experience, you can readily tell how close you are to the ‘edge of the cliff’ (and hence how much headroom you have) from how closed the eyes are.
If Eye diagrams can also be produced on an oscilloscope, what is the benefit of having this facility as an integral part of the test and measurement package?
One advantage of generating Eye diagrams within your overall T & M package is the scope the package offers for manipulating and recording the data for you. Where the Eye diagram is produced by a T & M package, it can also determine such things as the Eye height, the Unit Interval, the rise-time and the fall-time automatically – probably with greater precision than can be achieved using manual methods.
Another advantage is cost. A scope capable of producing Eye diagrams for 3G video can easily cost £100k.
Give me an example of where the Eye diagram would be particularly useful.
The way the eyes close up as the signal degrades gives a simple but effective way of judging when the signal degradation at a particular point in the signal chain is worse than expected. An outside broadcast team could use this to spot, for example, where a 100m cable has been inserted rather than the 50m cable that is normally used, or where a 50 ohm cable has been used instead of a 75 ohm cable.
If Jitter can be measured on an Eye diagram, what is the value of having a separate Jitter waveform display?
Measuring the width of the Eye waveform at the crossover point does indeed give a measure of the amount of jitter in the signal, but it is not an easy measurement to make, even for a reasonably skilled person. First, two cursors need to be positioned, one either side of the crossover. This measures a time interval corresponding roughly to the maximum jitter in the signal, depending on how well the cursors are positioned; the value measured is also subject to factors such as signal persistence. SMPTE standards, however, quote the acceptable jitter level as a proportion of the UI, so as well as measuring the jitter you also have to measure the UI. Overall, there is considerable potential for operator error and imprecise results.
The Jitter waveform is generated by comparing the clock extracted from the carrier signal against a perfect clock. The displacement of the extracted clock from the reference clock is directly proportional to the jitter, and gives rise to a continuous time-based plot from which the exact amount of jitter can be determined. Any dominant frequencies are also visible in this plot.
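The clock-comparison idea above amounts to computing the Time Interval Error (TIE): the displacement of each observed clock edge from where an ideal clock says it should be. The sketch below is a minimal illustration, assuming a fixed UI and a best-fit offset for the reference clock; the 3G-SDI UI and the 10 MHz sinusoidal jitter are example figures, not measured data.

```python
import numpy as np

def jitter_tie(edge_times, ui):
    """Time Interval Error: displacement of each observed clock edge from
    an ideal clock (fixed UI, offset fitted to the data)."""
    n = np.arange(len(edge_times))
    ideal = n * ui
    offset = np.mean(edge_times - ideal)     # align the reference clock
    return edge_times - ideal - offset       # jitter waveform (same units)

# Edges of a hypothetical 3G-SDI clock (UI = 1/2.97 GHz) carrying
# 0.1 UI peak-to-peak sinusoidal jitter at 10 MHz:
ui = 1.0 / 2.970e9
k = np.arange(2000)
tie_true = 0.05 * ui * np.sin(2 * np.pi * 10e6 * k * ui)
edges = k * ui + tie_true
tie = jitter_tie(edges, ui)
pp_ui = (tie.max() - tie.min()) / ui         # peak-to-peak jitter in UIs
```

Because the TIE is a continuous time-based record, dominant jitter frequencies show up directly as periodicity in the `tie` array, exactly as described for the Jitter waveform display.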
The other advantage of Jitter waveform data is that it converts the jitter into a simple number that can both be logged and shown on a Jitter ‘meter’. Such meters are typically divided into green (safe), yellow (warning) and red (failure) sections, to make it easy to spot when the level of jitter has suddenly increased.
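Reducing the jitter to a single number makes the meter logic itself trivial. The sketch below shows the green/yellow/red mapping described above; the 0.2 UI and 0.3 UI thresholds are illustrative placeholders, not SMPTE limits.

```python
def jitter_zone(jitter_ui, warn=0.2, fail=0.3):
    """Map a peak-to-peak jitter reading (in UIs) onto the meter's
    green/yellow/red zones. Thresholds here are illustrative only."""
    if jitter_ui < warn:
        return "green"
    if jitter_ui < fail:
        return "yellow"
    return "red"
```

Logging this value over time is what makes a sudden jitter increase easy to spot after the event as well as live.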
How do I determine the source of jitter in my system?
Jitter typically originates from the mains, from the video content itself or from a switch-mode power supply – or from a mixture of these sources.
The traditional way to determine the source of jitter is to apply a succession of filters to the jitter waveform display. If you introduce, say, a 100kHz high-pass filter and the jitter disappears, this tells you that the jitter is all below 100kHz, whereas if the jitter doesn’t appear affected by the filter, it tells you that the jitter is all above 100kHz. By working through a sequence of such filters, noting each time how the jitter display changes, it is possible to isolate different bands within the jitter and hence determine the main jitter source(s).
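The filter-sweep idea can be illustrated numerically. The sketch below applies a crude FFT-based high-pass filter to a synthetic jitter waveform containing hypothetical 50 Hz and 2 kHz components; real instruments would use analogue or DSP filters, but the logic of the test is the same: if the jitter survives a high-pass filter at a given cutoff, it lies above that frequency.

```python
import numpy as np

def high_pass(tie, sample_rate, cutoff_hz):
    """Crude FFT-based high-pass filter on a jitter (TIE) waveform."""
    spec = np.fft.rfft(tie)
    freqs = np.fft.rfftfreq(len(tie), d=1.0 / sample_rate)
    spec[freqs < cutoff_hz] = 0.0            # zero everything below the cutoff
    return np.fft.irfft(spec, n=len(tie))

# Hypothetical jitter with a 50 Hz mains component and a 2 kHz component:
fs = 20480.0                                 # samples per second
t = np.arange(4096) / fs
tie = 0.2 * np.sin(2 * np.pi * 50 * t) + 0.05 * np.sin(2 * np.pi * 2000 * t)
residual = high_pass(tie, fs, 1000.0)        # only the 2 kHz part survives
```

Here the 1 kHz high-pass removes the mains component entirely, so the residual jitter must come from the higher-frequency source.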
A better approach (and the one used in high-end systems such as the OTM 1000) is to apply a fast Fourier transform (FFT) to the jitter waveform. This produces a jitter spectrum, from which the main jitter components can easily be picked out.
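The FFT approach can be sketched in a few lines. This is an illustration of the general technique, not the OTM 1000's implementation; the 50 Hz mains-like component and sample rate are assumed figures.

```python
import numpy as np

def jitter_spectrum(tie, sample_rate):
    """FFT of the jitter (TIE) waveform; returns (freqs, magnitudes)."""
    mags = np.abs(np.fft.rfft(tie - tie.mean())) / len(tie)
    freqs = np.fft.rfftfreq(len(tie), d=1.0 / sample_rate)
    return freqs, mags

# Hypothetical jitter waveform dominated by a 50 Hz mains component:
fs = 10_000.0                                 # samples per second
t = np.arange(4096) / fs
rng = np.random.default_rng(0)
tie = 0.1 * np.sin(2 * np.pi * 50 * t) + 0.01 * rng.standard_normal(len(t))
freqs, mags = jitter_spectrum(tie, fs)
dominant = freqs[np.argmax(mags)]             # close to 50 Hz
```

The dominant peak in the spectrum points straight at the jitter source (here, mains), with no filter sweep required.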
How might the Jitter display help in a TV studio?
In a TV studio, it is by no means unknown for the engineers working on one broadcast to switch cables around to fix a particular problem, and then fail to return the cables to their previous configuration for the next team. The effect of this can be to add extra links into the path taken by the broadcast signal.
A good way of spotting such issues is to monitor the jitter at the output, because every piece of equipment en route adds some jitter. So if the jitter at the output is higher than expected, the chances are that either the signal is taking a longer route than intended or some piece of equipment along the route is not functioning correctly.
At the time of its launch, OmniTek emphasised how their Eye and Jitter card was the first SDI physical layer analysis system to cater for 3G signals. What particular design challenges did this introduce?
There were many challenges, but the main one was bandwidth. The basic clock rate for 3G is 2970 MHz, which makes the data transition rate 1485 MHz. 1485 MHz is not a problem to handle, but digital signals are square waves and SMPTE 424 limits the slew rate on the edges. To reproduce a square wave sufficiently accurately, we have to handle the 3rd, 5th, 7th and higher harmonics, which means that the sampler used to produce the Eye diagram in the OTM 1000 has to have a flat response from DC to beyond 14 GHz. This was a major design challenge.
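The arithmetic behind the 14 GHz figure is simple to check: a square wave is built from odd harmonics of its fundamental, so the sampler must pass the odd multiples of 1485 MHz.

```python
# Odd harmonics of the 1485 MHz fundamental that a 3G-SDI eye sampler
# must pass to reproduce the square-wave edges faithfully:
fundamental_mhz = 1485.0
harmonics_ghz = {n: n * fundamental_mhz / 1000.0 for n in (1, 3, 5, 7, 9)}
# 3rd = 4.455 GHz, 5th = 7.425 GHz, 7th = 10.395 GHz, 9th = 13.365 GHz
```

Keeping response flat up to the 9th harmonic (about 13.4 GHz) is what drives the "DC to beyond 14 GHz" requirement.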

Tags: eye pattern | eye and jitter pattern | iss032 | omnitek | test and measurement | eye diagram | oscilloscope | waveform | tv studio

Article Copyright tv-bay limited. All trademarks recognised.
Reproduction of the content strictly prohibited without written consent.
