the broadcast head-end, where it was injected into the appropriate place by a caption server during broadcast. However, the growing demand for multi-platform delivery, catch-up services and international versioning has brought captions and subtitles back up the audio and video workflow chain and into the file domain.

In the caption server, captions and subtitles were stored in authoring formats, with SCC and STL respectively becoming the de facto interchange standards. There are many other old, proprietary and less well-known formats, and, more recently, variations of TTML (Timed Text Markup Language) as defined by the W3C, SMPTE and the EBU are becoming common to ease standardised interchange. SCC, STL and TTML therefore typically represent the bulk of upstream “insert/extract” caption operations.

The downstream side is less constrained. In HD, most carriage will be in the ancillary data packets. For MXF, the storage of ANC data is now well defined by the SMPTE ST 436 specification, providing a solid framework for caption/subtitle workflows. Beyond MXF, formats like ProRes have proprietary methods to store (US) captions, while legacy files may use other specifications.

A significant challenge today is understanding archived legacy SD files. For IMX files, VBI is encoded in the video stream and caption/subtitle data is therefore preserved. For other formats it may be less obvious. ST 436 and the VAUX element of DV files are valid homes for timed text but are only sporadically used, while proprietary VBI files, proprietary MPEG headers and A53 user data are also known to have been used for the storage of subtitles and captions.

In short, if you are a) writing a tender that includes subtitle preservation or processing, or b) writing a product specification that includes captions, make sure you know where to stick them!
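To make the TTML interchange point concrete, here is a minimal sketch of pulling timed cues out of a TTML document using only the Python standard library. The two-cue sample document and the `extract_cues` helper are hypothetical illustrations, and real profiles such as SMPTE-TT and EBU-TT layer styling, regions and metadata on top of this core structure; this is only the bare W3C timed-text skeleton.

```python
# Sketch: extract (begin, end, text) cues from a core-profile TTML string.
import xml.etree.ElementTree as ET

TTML_NS = "{http://www.w3.org/ns/ttml}"

# Hypothetical two-cue subtitle document for illustration only.
SAMPLE = """<?xml version="1.0" encoding="UTF-8"?>
<tt xmlns="http://www.w3.org/ns/ttml" xml:lang="en">
  <body>
    <div>
      <p begin="00:00:01.000" end="00:00:03.000">Hello, world.</p>
      <p begin="00:00:04.000" end="00:00:06.500">Captions travel in files now.</p>
    </div>
  </body>
</tt>"""

def extract_cues(ttml_text):
    """Return a list of (begin, end, text) tuples from a TTML string."""
    root = ET.fromstring(ttml_text)
    cues = []
    # Each <p> in the TTML body carries one timed subtitle cue.
    for p in root.iter(TTML_NS + "p"):
        text = "".join(p.itertext()).strip()
        cues.append((p.get("begin"), p.get("end"), text))
    return cues

if __name__ == "__main__":
    for begin, end, text in extract_cues(SAMPLE):
        print(f"{begin} --> {end}: {text}")
```

Because TTML is plain namespaced XML, the same pattern extends to the SMPTE and EBU variants by handling their additional namespaces and attributes, which is much of why it has become a favoured standardised interchange format.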