Ask the experts: Eye & Jitter

Why should TV broadcast engineers be interested in adding Eye and Jitter measurement to their T & M facilities?
The call for Eye and Jitter as part of overall video test and measurement lies in the widespread adoption of serial digital interface (SDI) standards for broadcast. Unlike analog transmission in which the image quality gradually degrades as the signal quality diminishes, the image quality of a digital transmission stays pretty much perfect as the signal quality falls until it reaches a critical level at which the picture totally fails. Moving through this critical level is often described as ‘falling off the cliff’.
The value of Eye diagrams and Jitter measurements is that they allow you to see how close to the edge of this cliff your system is running.
There are applications for this in many areas of video production and broadcast from evaluating new equipment, through identifying the causes of performance issues in the rigging for an outside broadcast (for instance, kinks where someone has driven over your cables), to long-term monitoring for equipment failure.
What is an Eye diagram? How is it generated?
An Eye diagram is an oscilloscope-like plot of the analog signal that transports the digital video signal, but produced in a rather different way. An oscilloscope can overlay successive sweeps of the waveform directly. In a video T & M system, the waveform is built up by accumulating samples taken at slightly different offsets across many repetitions of the signal, a process known as equivalent-time sampling. The resulting waveform, however, is exactly the same.
The digital signal comprises a sequence of 0s (signal low) and 1s (signal high) in pretty much any combination but overall the chances of the signal being either low or high at any time are approximately equal. The basic form of the resulting plot can be readily seen by superimposing the waveforms associated with the various possible combinations of two 0s or 1s (00, 01, 10 and 11).
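The overlaying step can be sketched in a few lines of Python: every sample is folded onto a window a couple of Unit Intervals wide, so that all the 00/01/10/11 transitions land on top of one another. The NRZ source and the crude low-pass below are purely illustrative stand-ins, not a model of real sampler hardware.

```python
import numpy as np

def fold_into_eye(t, v, unit_interval, span=2.0):
    """Fold a sampled waveform onto a repeating window `span` UIs wide,
    which is how the overlaid Eye plot is built from individual samples."""
    phase = np.mod(t, span * unit_interval)  # x-position within the eye window
    return phase, v                          # scatter-plot (phase, v) to see the eye

# Toy source: a 1 Gb/s NRZ stream (UI = 1 ns) with artificially softened edges
ui = 1e-9
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 200)
t = np.linspace(0, len(bits) * ui, len(bits) * 50, endpoint=False)
levels = bits[(t / ui).astype(int)].astype(float)     # ideal NRZ levels
v = np.convolve(levels, np.ones(8) / 8, mode="same")  # crude low-pass for finite rise time

phase, samples = fold_into_eye(t, v, ui)
```

Plotting `samples` against `phase` as a scatter gives the familiar eye opening; the more traces accumulated, the better populated the eye becomes.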
What information can be obtained from an Eye diagram?
A number of signal carrier parameters can be determined from an Eye diagram. The obvious ones are the signal amplitude (normally taken as the height of the Eye half-way between two crossover points) and the so-called Unit Interval (UI) between one crossover and the next. In addition, it is possible to determine rise-time for the signal (defined as the time taken to rise from 20% of the signal amplitude to 80%) and its equivalent fall-time. Values can also be determined for signal overshoot and undershoot, while the thickness of the crossover gives a measure of the amount of jitter in the signal.
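As a concrete illustration, the 20%/80% rise-time definition above can be applied directly to a sampled edge. The linear-ramp edge here is an invented example, not real SDI data.

```python
import numpy as np

def rise_time_20_80(t, v):
    """20%-80% rise time of a single, monotonically rising edge."""
    lo, hi = v.min(), v.max()
    amp = hi - lo                            # signal amplitude over this edge
    t20 = np.interp(lo + 0.2 * amp, v, t)    # time at which 20% of amplitude is reached
    t80 = np.interp(lo + 0.8 * amp, v, t)    # time at which 80% is reached
    return t80 - t20

# Illustrative edge: a linear ramp from 0 to 800 mV over 100 ps
t = np.linspace(0, 100e-12, 101)
v = np.linspace(0.0, 0.8, 101)
rt = rise_time_20_80(t, v)   # for a pure ramp, 60% of 100 ps = 60 ps
```

Fall-time is measured the same way on a falling edge, with the 80% threshold crossed first.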
The other important feature of the Eye diagram is the way that the eyes close up as the signal quality degrades. With a little bit of experience, you can readily tell how close you are to the ‘edge of the cliff’ (and hence how much headroom you have) from how closed the eyes are.
If Eye diagrams can also be produced on an oscilloscope, what is the benefit of having this facility as an integral part of the test and measurement package?
One advantage of generating Eye diagrams within your overall T & M package is the scope the package offers for manipulating and recording the data. Where the Eye diagram is produced by a T & M package, it can also determine such things as the Eye height, the Unit Interval, the rise-time and the fall-time automatically, probably with greater precision than can be achieved using manual methods.
Another advantage is cost. A scope capable of producing Eye diagrams for 3G video can easily cost £100k.
Give me an example of where the Eye diagram would be particularly useful.
The way the eyes close up as the signal degrades gives a simple but effective way of judging when the signal degradation at a particular point in the signal chain is worse than expected. An outside broadcast team could use this to spot, for example, where a 100m cable has been inserted rather than the 50m cable that is normally used or where a 50ohm cable has been used instead of a 75ohm cable.
If Jitter can be measured on an Eye diagram, what is the value of having a separate Jitter waveform display?
Measuring the width of the Eye waveform at the crossover point does indeed give a measure of the amount of jitter in the signal; however, it is not an easy measurement to make, even for a reasonably skilled operator. First, two cursors need to be positioned, one on either side of the crossover. This measures a time interval corresponding roughly to the maximum jitter in the signal, depending on how well the cursors are positioned; the value measured is also subject to factors such as display persistence. SMPTE standards, however, quote the acceptable jitter level as a proportion of the UI, so as well as measuring the jitter you also have to measure the UI. Overall, there is considerable potential for operator error and imprecise results.
The Jitter waveform is generated by comparing the clock extracted from the carrier signal against a perfect clock. The displacement of the extracted clock from the reference clock is directly proportional to the jitter, and gives rise to a continuous time-based plot from which the exact amount of jitter can be determined. Any dominant frequencies are also visible in this plot.
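A minimal sketch of that idea, assuming the recovered clock is available as a list of edge timestamps: the "perfect" reference clock is fitted from the edges themselves, and the residual is the Jitter waveform. The 1 MHz sinusoidal jitter injected below is purely illustrative.

```python
import numpy as np

def jitter_waveform(edge_times):
    """Deviation of each recovered-clock edge from an ideal ('perfect') clock,
    with the ideal clock's period and phase fitted from the edges themselves."""
    n = np.arange(len(edge_times))
    period, phase = np.polyfit(n, edge_times, 1)  # least-squares ideal clock
    ideal = period * n + phase
    return edge_times - ideal                     # the Jitter waveform, in seconds

# Toy example: edges of a recovered clock (UI = 1 ns) carrying
# 10 ps of sinusoidal jitter at 1 MHz
ui = 1e-9
n = np.arange(5000)
edges = n * ui + 10e-12 * np.sin(2 * np.pi * 1e6 * n * ui)

jw = jitter_waveform(edges)
peak_to_peak_ui = (jw.max() - jw.min()) / ui      # jitter as a fraction of the UI
```

Expressing the peak-to-peak value as a fraction of the UI, as in the last line, is what allows direct comparison against SMPTE limits.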
The other advantage of Jitter waveform data is that it converts the jitter into a simple number that can both be logged and shown on a Jitter ‘meter’. Such meters are typically divided into green (safe), yellow (warning) and red (failure) sections, to make it easy to spot when the level of jitter has suddenly increased.
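A toy version of such a meter is just a threshold map. The 0.2 UI and 0.3 UI values below are illustrative placeholders, not figures taken from a SMPTE standard.

```python
def jitter_zone(jitter_ui, warn=0.2, fail=0.3):
    """Map a peak-to-peak jitter reading (as a fraction of the UI) onto the
    meter's colour zones. The warn/fail thresholds here are placeholders."""
    if jitter_ui < warn:
        return "green"   # safe
    if jitter_ui < fail:
        return "yellow"  # warning
    return "red"         # failure
```

Logging the numeric value alongside the zone gives both the at-a-glance indication and the long-term trend data mentioned above.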
How do I determine the source of jitter in my system?
Jitter typically originates from the mains supply, from the video content itself, or from a switch-mode power supply, or from a mixture of these sources.
The traditional way to determine the source of jitter is to apply a succession of filters to the jitter waveform display. If you introduce, say, a 100kHz high-pass filter and the jitter disappears, this tells you that the jitter is all below 100kHz, whereas if the jitter doesn’t appear affected by the filter, it tells you that the jitter is all above 100kHz. By working through a sequence of such filters, noting each time how the jitter display changes, it is possible to isolate different bands within the jitter and hence determine the main jitter source(s).
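The filter-sweep idea can be mimicked in a few lines. The moving-average high-pass and the 50 Hz/1 MHz test tones below are simplifications for illustration; a real analyser uses properly specified filters.

```python
import numpy as np

def highpass(jitter, fs, cutoff_hz):
    """Crude high-pass: subtract a moving-average low-pass whose window
    roughly matches the cutoff frequency."""
    win = max(1, int(fs / cutoff_hz))
    low = np.convolve(jitter, np.ones(win) / win, mode="same")
    return jitter - low

# Synthetic jitter: a 5 ps mains-style 50 Hz term plus a 2 ps 1 MHz
# switch-mode-style term, sampled at 10 MHz for 40 ms
fs = 10e6
t = np.arange(400000) / fs
jit = 5e-12 * np.sin(2 * np.pi * 50 * t) + 2e-12 * np.sin(2 * np.pi * 1e6 * t)

hp = highpass(jit, fs, 100e3)       # apply a 100 kHz high-pass
core = hp[1000:-1000]               # ignore filter edge effects
pp_after = core.max() - core.min()  # only the 1 MHz term survives the filter
```

Here most of the jitter disappears through the 100 kHz high-pass, which is exactly the observation that tells you the dominant component sits below 100 kHz, i.e. it is mains-related rather than switch-mode-related.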
A better approach (and the one used in high-end systems such as the OTM 1000) is to apply a fast Fourier transform (FFT) to the jitter waveform. This produces a jitter spectrum, from which the main jitter components can easily be picked out.
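A sketch of the FFT approach, using synthetic jitter with a dominant 50 kHz component; the tone frequencies and amplitudes are invented for the example.

```python
import numpy as np

def jitter_spectrum(jitter, fs):
    """Single-sided amplitude spectrum of the jitter waveform; peaks in this
    spectrum reveal the dominant jitter frequencies."""
    spec = 2 * np.abs(np.fft.rfft(jitter)) / len(jitter)
    freqs = np.fft.rfftfreq(len(jitter), d=1 / fs)
    return freqs, spec

# Synthetic jitter: a dominant 50 kHz component plus a weaker 2 MHz one
fs = 10e6
t = np.arange(200000) / fs
jit = 5e-12 * np.sin(2 * np.pi * 50e3 * t) + 1e-12 * np.sin(2 * np.pi * 2e6 * t)

freqs, spec = jitter_spectrum(jit, fs)
dominant_hz = freqs[np.argmax(spec)]   # picks out the 50 kHz component directly
```

Where the filter sweep needs many passes and some interpretation, the spectrum identifies every significant component, and its amplitude, in one step.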
How might the Jitter display help in a TV studio?
In a TV studio, it is by no means unknown for the engineers working on one broadcast to switch cables around in order to fix a particular problem they had, and fail to return the cables to their previous configuration for the next team to use. The effect of this can be to add extra links into the path taken by the broadcast signal.
A good way of spotting such issues is to monitor the jitter at the output, because every piece of equipment en route adds some jitter. So if the jitter at the output is higher than expected, the chances are that either the signal is taking a longer route than intended or some piece of equipment along the route is not functioning correctly.
At the time of its launch, OmniTek emphasised how their Eye and Jitter card was the first SDI physical layer analysis system to cater for 3G signals. What particular design challenges did this introduce?
There were many challenges, but the main one was bandwidth. The basic clock rate for 3G is 2970MHz, which makes the maximum data transition rate 1485MHz. 1485MHz is not in itself a problem to handle, but digital signals are square waves and SMPTE 424 limits the slew rate of the edges. To reproduce a square wave sufficiently accurately, we have to handle the 3rd, 5th, 7th and higher odd harmonics, which means that the sampler used to produce the Eye diagram in the OTM 1000 has to have a flat response from DC to beyond 14GHz. This was a major design challenge.
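The arithmetic behind that bandwidth figure is easy to check; stopping at the 9th harmonic below is our choice of cut-off for the illustration.

```python
# SMPTE 424 (3G-SDI) bit rate and the odd harmonics a sampler must pass.
# The worst-case pattern is alternating 0101..., a square wave whose
# fundamental is half the bit rate.
bit_rate = 2.97e9                       # 2.97 Gb/s
fundamental = bit_rate / 2              # 1.485 GHz
odd_harmonics_ghz = [fundamental * k / 1e9 for k in (1, 3, 5, 7, 9)]
# -> [1.485, 4.455, 7.425, 10.395, 13.365] GHz: following the square wave
# up to the 9th harmonic already demands a response flat to beyond 13 GHz
```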

Tags: eye pattern | eye and jitter pattern | iss032 | omnitek | test and measurement | eye diagram | oscilloscope | waveform | tv studio

Article Copyright tv-bay limited. All trademarks recognised.
Reproduction of the content strictly prohibited without written consent.

