3D Diaries - Fix it in post? Don't go there


Bob Pank
The more I look into it, the scarier 3D becomes. Researching for 3D Diaries shows there are so many pitfalls and traps that would be all too easy to fall into; enough to put people off the whole idea! The answer is simply to learn about the subject before seriously jumping in with both feet. Clearly 3D adds a whole new layer to every part of the scene-to-screen chain. In each link there are traps to avoid, and avoid them you must, because getting it wrong produces unwatchable results.
That’s enough gloom! And in any case, there is a growing army of 3D-capable professionals to make sure things work well every step of the way. Even so, it would be sensible to know for yourself how things should be done, so welcome to Part Three of 3D Diaries! We know that television is an illusion that works very well and is accepted by our perception. Stereo 3D makes all those calls on our brain, and then a whole lot more. It is an even bigger illusion that can work well so long as the material is presented correctly. Go outside the rules and the illusion is instantly shattered.
Rewind not many years and nearly all 3D production was on film and, typically, many months were spent just correcting the mismatches of the left and right cameras. Today television can already do a good job of live 3D coverage, meaning that ‘mismatch’ errors are corrected, or avoided, on the fly. But for non-live productions the camera mismatches will be left to be corrected in post. In truth, this allows time for more accuracy and for adjusting the 3D for dramatic effect, and should produce the best possible result.
Knowledge and the right equipment are needed to succeed. For the first part, here are some useful tips.
Shoot it right. All 3D post people agree that this is the number-one priority. Conversely, aiming to ‘fix it in post’, as you might in 2D, is absolutely the wrong approach for 3D. That third dimension can describe a hole so deep that you may never get out of it! It is true there are some 3D issues that can be fixed in post, but be very sure exactly what these are and go no further. Finding out the rights and wrongs requires the knowledge of a 3D expert, and they are still relatively thin on the ground; the good news is that their number is growing fast.
3D quality. You really must know the quality of the 3D in the footage you are committing to post produce; otherwise you could be landed with endless hours of corrective 3D work that you had not allowed for, and so could be seriously out of pocket. In practice you can check the quality by having that rare, knowledgeable 3D specialist monitoring the shoot on set, or by carefully analysing the recorded footage in 3D before quoting for the job.
Allow plenty of time. Even well-shot 3D takes longer in post than 2D, and how much longer obviously depends on the two points above.
Offline and S-3D grammar. If the offline was run from one camera ‘eye’ then the resulting EDL could, and almost certainly would, break the rule of S-3D grammar that says, ‘avoid cutting between large changes in convergence’. Breaking the rule creates an unnatural situation and is uncomfortable to watch. Another rule says there should not be as many cuts as in 2D. This is because our brains have to work out the 3D information of each new shot, and fast cutting is really bad as we cannot keep up. These rules are important – as I know to my cost after one visit to a 3D cinema!
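Purely as an illustration of that first rule (the article does not describe any specific tool), here is a minimal Python sketch of the kind of check a stereographer might run over a cut list, flagging edits where the convergence depth jumps sharply from the outgoing to the incoming shot. The shot data and the threshold are assumed example values.

# Illustrative sketch only: flag cuts where the depth the viewer's eyes are
# converged on jumps sharply from one shot to the next.
# The shot list and the 1.5%-of-screen-width threshold are assumed values.

shots = [
    # (shot name, screen parallax at the cut point as a % of screen width;
    #  negative = in front of the screen, positive = behind it)
    ("wide establisher", +1.0),
    ("close-up", -2.5),
    ("two-shot", -2.0),
    ("cutaway", +1.8),
]

MAX_JUMP_PERCENT = 1.5  # assumed comfort threshold for a single cut

for (name_out, depth_out), (name_in, depth_in) in zip(shots, shots[1:]):
    jump = abs(depth_in - depth_out)
    if jump > MAX_JUMP_PERCENT:
        print(f"Check cut: '{name_out}' -> '{name_in}' "
              f"(depth jump of {jump:.1f}% of screen width)")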
Another weakness of a one-eyed offline is that there could be glaring differences between left and right footage, such as lens flares, highlights and reflections that affect each eye differently. Also, footage from mirror rigs produces slightly different colours in the left and right images, so again this could not be checked. Some 3D post packages, such as Quantel’s Pablo, have extensive 3D tools to provide quick fixes for such disparities, which saves time by not having to offload the footage to another area for correction.
Obey the 3D rules. This could be a book, but here are two things that many get wrong: misjudging the limits of disparity and mixing up depth cues. Disparity is what you see when you take your 3D glasses off – the double image showing the different left-eye and right-eye horizontal positions of an object. The bigger it is, the greater the depth. Its value depends on variables including the lenses, the inter-axial distance (between lenses), convergence and the range of depth within the scene. Beyond a certain limit, viewers have trouble accepting the image (headache!). Measuring and managing disparity is a key concern for both stereographers and post finishers. This is where high-end tools such as Mistika come in, mapping disparity through a scene and changing its values in various zones.
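To make that arithmetic concrete, here is a minimal Python sketch of the standard rule-of-thumb geometry for a parallel rig with horizontal image translation. It is my own illustration rather than anything drawn from Mistika or Pablo, and the lens, interaxial, screen and 65 mm divergence figures are all assumed example values.

# Rule-of-thumb parallax estimate for a parallel stereo rig with horizontal
# image translation. All names and numbers are illustrative assumptions.

EYE_SEPARATION_MM = 65.0  # typical adult interocular distance

def screen_parallax_mm(focal_mm, interaxial_mm, convergence_m, object_m,
                       sensor_width_mm, screen_width_mm):
    """Approximate on-screen parallax of an object at object_m metres.
    Positive values place the object behind the screen plane, negative in front."""
    magnification = screen_width_mm / sensor_width_mm
    sensor_disparity_mm = focal_mm * interaxial_mm * (
        1.0 / (convergence_m * 1000.0) - 1.0 / (object_m * 1000.0))
    return magnification * sensor_disparity_mm

# Example: 35 mm lens, 60 mm interaxial, converged at 4 m, shown on a 1.2 m-wide TV
for distance_m in (2.0, 4.0, 20.0):
    p = screen_parallax_mm(35.0, 60.0, 4.0, distance_m,
                           sensor_width_mm=24.0, screen_width_mm=1200.0)
    diverging = p > EYE_SEPARATION_MM  # eyes would have to diverge: never acceptable
    print(f"{distance_m:5.1f} m -> parallax {p:+7.1f} mm, diverging: {diverging}")

Shown on a 12 m-wide cinema screen instead, every one of those parallax values is multiplied by ten, which is why the disparity budget has to be set with the target screen size in mind.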
Depth cues are what your brain latches onto to create a stereoscopic impression of a scene. There are several, but they all need to agree with each other, otherwise stereopsis, our 3D perception process, fails (headache!). So placing a title optically in front of an object but stereoscopically behind it creates a conflict: occlusion tells your brain the title is in front, because it covers the object, while stereopsis says it is behind. The result is a title that is strangely but powerfully difficult to read!
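As a concrete, and again purely illustrative, version of that check, the sketch below assumes you already have disparity values (from a disparity map) for the region a title will cover, using the same sign convention as above, and confirms that the title’s own disparity puts it in front of everything it occludes. The function, margin and numbers are invented for the example.

# Illustrative only: a graphic that covers part of the scene must also sit
# stereoscopically in front of it, otherwise occlusion and stereopsis disagree.
# Disparity convention: negative = in front of the screen, positive = behind.

def title_placement_ok(title_disparity_px, scene_disparities_px, margin_px=2.0):
    """True if the title is stereoscopically nearer than every scene pixel it
    covers, by at least margin_px of safety (an assumed margin)."""
    nearest_scene = min(scene_disparities_px)  # most negative = nearest point
    return title_disparity_px <= nearest_scene - margin_px

# Example: the covered scene region spans -6 px (nearest) to +12 px (furthest)
region = [-6.0, -1.0, +4.0, +12.0]
print(title_placement_ok(-10.0, region))  # True: title sits safely in front
print(title_placement_ok(-3.0, region))   # False: part of the scene pokes out in front of it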
Size (of viewing screen) matters. Another behind-in-front conflict is created when objects come out of the screen and are cropped by the screen edge, or when an object moves off the edge of the screen. In either case there will be an anomaly where part, or the whole, of the object exists in one eye but not in the other. This is more noticeable on small TV screens, where the viewing angle is relatively small, whereas cinema screens should look better as the picture’s edges are further away from our centre of vision.
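One last hedged sketch, with invented data: an object in front of the screen (negative parallax) whose image touches the left or right frame edge in either eye is a likely candidate for exactly this kind of conflict.

# Illustrative only: detect a likely 'edge violation' - an object placed in
# front of the screen whose image is cropped by the left or right frame edge.
# The frame width and the object boxes are assumed example values.

FRAME_WIDTH_PX = 1920

def edge_violation(parallax_px, left_box, right_box, frame_width=FRAME_WIDTH_PX):
    """left_box and right_box are (x_min, x_max) of the object in each eye."""
    if parallax_px >= 0:
        return False  # at or behind the screen plane: cropping by the frame looks natural
    touches_edge = any(x_min <= 0 or x_max >= frame_width
                       for x_min, x_max in (left_box, right_box))
    return touches_edge

# An object floating in front of the screen and clipped at the right-hand edge
print(edge_violation(-12.0, left_box=(1700, 1920), right_box=(1712, 1920)))  # True
print(edge_violation(-12.0, left_box=(800, 1000), right_box=(812, 1012)))    # False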
Equipment
Get a 3D editing system. It might be possible to get a 3D upgrade to a 2D system; otherwise you really need a new one. Making 3D work properly means getting everything right and this makes a strong case for the all-in-one-box approach to editing and post where all the elements can be assembled, viewed and adjusted in one place.
Video quality. It is best to have two in-sync, real-time, full resolution video channels, as then you can see all the problems that need fixing, such as left-right sync errors (the quickest way to induce a headache!), and check the pace and convergence ‘depth’ over edits. Accessing and processing two streams of 2K, or even 4K, in real time requires a powerful ‘post engine’, but it can avoid rendering time that otherwise slows the whole process. Using low resolution images is a compromise that can spoil the 3D perception, or rather illusion, as ‘screen effects’ such as aliasing and lens flares look weird in 3D because they have no reason to be there.
View in 3D. For finishing you need to watch in 3D, as this is what the end product is. Also, left and right streams from digital cameras are almost never correct, with issues of geometry, lens inaccuracies, convergence adjustments and colour balance, and it is hard to deal with these while viewing just one channel. And checking depth continuity can only be done in 3D.
All of the above has little or nothing to do with 2D post, which has not been mentioned as it is taken for granted. 3D editing, grading and other post activities mentioned are all supplementary to 2D. So now, perhaps, it is easier to understand exactly why ‘fix it in post’ could easily go seriously wrong in 3D. Don’t go there.
Writing 3D Diaries has given me the excuse to tap the knowledge of manufacturers and industry experts. This article owes much to the help given by Roger Thornton at Quantel and David Cox, a 3D post specialist.

