AI in Media and Entertainment


David Candler TV-Bay Magazine

Artificial Intelligence (AI) is a term appearing everywhere these days. What is happening in media and entertainment (M&E) that makes the industry ripe for AI? In other words, why does the M&E industry need AI?

In virtually every industry, AI is claiming a growing stake in the supply chain, creating both operational enhancements and business efficiencies at an amazing rate. In the M&E industry, AI is starting to make its way into a wide range of conversations about everything from projects to products. With an initial focus on streamlining workflows and creating enhanced discovery experiences, the benefits are rapidly becoming a reality for many businesses.

The M&E content landscape continues to transform at a staggering rate. Organisations face increasing challenges to grow audiences, prove the effectiveness of advertising campaigns, index for quality and compliance, and increase revenue. In terms of asset management solutions, the sheer volume of content under management and the rate at which it is being created can make finding and retrieving content a challenge, even with the best metadata and logging workflows in place. With the explosion of content creation, from UGC (user-generated content) to multiversion studio releases, AI will have an increasing role to play in the critical task of discovering relevant content all along the digital supply chain, so that it can then be further utilised and monetised.

How can AI help M&E organisations (e.g. driving operational efficiencies, etc.)?

When you look at the M&E industry (and especially at how asset management solutions have traditionally been deployed alongside operational teams to create, distribute, and monetise content), the limiting factor in many cases has been human capacity. Much of the content produced by a media company never gets distributed or broadcast and ends up on the virtual cutting-room floor. Over time, this valuable content can grow into huge static repositories. To make content discoverable and usable, we have relied upon humans to tag it with metadata, with varying levels of accuracy and completeness. AI can vastly improve this whole scenario by interrogating a wide range of video and audio elements in every frame of an asset. AI technologies will dramatically improve how humans engage with content and how the M&E industry drives efficiency and value.

The concern that AI will eventually replace humans in the M&E supply chain somewhat misses the point. The real value lies in how AI can augment operational tasks by taking on the first phase of compiling, analysing, and delivering content as required. Many content repositories are simply not set up to address future consumption requirements, whether on-demand distribution or historical archive discovery and monetisation. AI will help automate the processes that connect content to consumers, from programme viewers to footage marketplace buyers. This is where AI can start to drive real value in the M&E industry.

In general, how does AI work in the M&E industry?

AI-enabled asset management solutions typically use their workflow orchestration layers to push assets through a third-party AI cognitive engine, or through several engines controlled by an AI operating system or orchestration layer. Metadata generated by the cognitive engines is then returned via API. Many solutions can submit single or multiple assets through both automated and manual workflow processes. Ideally, the asset management solution displays all returned metadata along an asset’s video timeline, with each engine’s results in its own timeline field. The metadata should also be fully indexed in the asset management solution’s search engine to enable discovery via advanced search features or timeline data search tools, which make it possible to jump straight to the moments identified by the engines. Analytical tools and dashboards provide the statistics and reports that the end user requires, or the API can feed the metadata into third-party systems for downstream use. The types of cognitive engines typically used by the M&E industry are as follows:

  • Transcription (converting spoken audio and video recordings into readable text)
  • Face Recognition (identifying and indexing the presence of individuals in video or still images)
  • Object Recognition (identifying multiple objects within video or still images)
  • Sentiment (discerning the tone behind a series of words and using it to understand the attitudes, opinions and emotions expressed)
  • A/V Fingerprinting (generating a condensed digital summary, deterministically generated as a reference clip, that can be used to quickly locate similar items across multiple media files)
  • Translation (translating written text from one language to another)
  • Geolocation (associating media with geolocation data points to enable search by location, displaying a map view of media file collections or other specialised functionality)
  • Optical Character Recognition (also known as text recognition, extracting text from an image, video, or document)
  • Logo Recognition (identifying specific companies based on their logos or brands in images and video)
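To make the orchestration pattern above concrete, here is a minimal sketch of routing one asset through several cognitive engines and keying each engine’s results to its own timeline field. The engine names, asset identifier, and payload shapes are purely illustrative, not any particular vendor’s API:

```python
# Hypothetical cognitive engines: a real engine would call out to a
# third-party service and return timed detections for the asset.

def transcription_engine(asset_id):
    # Timed text derived from the audio track.
    return [{"start": 0.0, "end": 2.5, "value": "welcome to the show"}]

def logo_engine(asset_id):
    # Timed detections of brand logos in the video frames.
    return [{"start": 12.0, "end": 15.0, "value": "AcmeCorp"}]

def run_engines(asset_id, engines):
    """Submit one asset to each engine; key results by engine name so
    each set of detections can occupy its own timeline field."""
    return {name: engine(asset_id) for name, engine in engines.items()}

timeline = run_engines("asset-001", {
    "transcription": transcription_engine,
    "logo_recognition": logo_engine,
})
```

In a production system the orchestration layer would submit assets asynchronously and receive metadata back via API callbacks, but the per-engine timeline keying shown here is the shape that lets each engine’s results be displayed and indexed separately.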

If a media company or a rights holder wants to put AI to work, what are some possible applications (e.g. asset management)?

All of the above cognitive engine types allow asset management solutions to build purpose-driven workflows that can help improve operational efficiencies, optimise ad and sponsorship verification, repurpose content, enhance competitive research, unlock hidden revenue streams, and more. By applying these key components, users can automatically create a searchable set of data along the content timeline, as opposed to relying on manual viewing and logging. Integrating an asset management solution with an AI operating system or orchestration layer opens access to hundreds of cognitive engines, enabling the user to try different engines and find the one that best fits a given project.

At its most basic, AI alongside asset management can help organisations analyse, share, and index their content automatically, which ultimately leads to streamlined workflows and enhanced discovery experiences. AI can automatically generate preconfigured and relevant metadata that can enhance advanced searches on vast archives, which in turn can reduce operational costs and raise the discoverability and usability of valuable content.
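As a sketch of the discovery side, timeline metadata returned by the engines can be indexed and searched so that a user jumps straight to an identified moment rather than scrubbing through footage. The data shapes and engine names here are illustrative assumptions:

```python
# Hypothetical indexed timeline metadata: one list of timed entries
# per cognitive engine, as an asset management solution might store it.
timeline = {
    "transcription": [
        {"start": 0.0, "value": "welcome to the show"},
        {"start": 31.2, "value": "match point at wimbledon"},
    ],
    "logo_recognition": [{"start": 12.0, "value": "AcmeCorp"}],
}

def search_timeline(timeline, query):
    """Return (engine, start_time) hits sorted by time, so a player UI
    could jump to each moment the engines identified."""
    q = query.lower()
    hits = [(engine, entry["start"])
            for engine, entries in timeline.items()
            for entry in entries
            if q in entry["value"].lower()]
    return sorted(hits, key=lambda hit: hit[1])
```

A real deployment would back this with a full-text search engine over a vast archive, but the principle is the same: the generated metadata, not a human logger, is what makes the moment findable.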

Many content companies have turned to the cloud for things like archiving and distribution. Is it possible to build AI into a cloud workflow?

Absolutely. Many customers are now fully adopting cloud-native solutions that take full advantage of the inherent benefits of the cloud (scalability, resiliency, agility, and so on). In my own world, both the asset management and AI solution components are fully cloud-native, making integration and usage much easier. Also, as customers start to store their content in cloud-based object storage, it becomes easier for both asset management and AI solutions to work with the same content in the same location.
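One common shape for that cloud pattern is event-driven: when content lands in object storage, a handler enqueues an AI analysis job that simply references the object in place, so neither the asset management nor the AI solution has to move the file. This is a hypothetical sketch of the pattern, not any particular cloud provider’s event API:

```python
def handle_storage_event(event, job_queue):
    """On an 'object created' event, enqueue an AI analysis job that
    references the content where it already lives (no copy is made)."""
    if event["type"] != "object_created":
        return None  # ignore deletes, access events, etc.
    job = {
        "action": "analyse",
        "location": f"{event['bucket']}/{event['key']}",
    }
    job_queue.append(job)
    return job

queue = []
handle_storage_event(
    {"type": "object_created", "bucket": "media-archive", "key": "ep01.mxf"},
    queue,
)
```

Because both systems address the same bucket and key, the AI results can be written back against the same asset record without any content duplication.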

How can M&E asset management vendors bring AI to their customers?

Many traditional asset management vendors have failed over the years to deliver on their original promise to M&E customers: to transform critical business processes and enhance the overall value of their content through technology. In terms of AI, if a vendor fails to appreciate the value of understanding content at a cognitive level, there will be a strategy gap between the vendor and the industry. Without this understanding, humans will still be required to perform most of the critical functions along the digital supply chain, from content creation and distribution through to analysis and monetisation.

This lack of understanding will continue to prevent asset management vendors from taking full advantage of business process automation technologies, which industries that rely on structured data have used to deliver significant advances in efficiency and performance. The M&E industry generates a great deal of unstructured data, which is why cognitive AI solutions are a must, as opposed to purely computational strategies. If M&E asset management vendors fully integrate AI processes within their solutions, they will help transform the on-demand creation, delivery, and monetisation of content.

What are the benefits of such a collaboration?

M&E asset management vendors are already engaging with the likes of IBM Watson, Amazon Web Services, Microsoft Azure, Google, and Veritone to find tangible ways to use AI to help drive efficiencies and monetisation opportunities across their customers’ operations and content archives. In the M&E sector, early adopters of such technologies are already beginning to reap the benefits of operational efficiencies and revenue generation.

I believe the ideal scenario is to utilise all of these technologies through a single AI operating system or orchestration layer. This type of operation is going to become the norm in modular cloud infrastructure, where collaborating with other industry experts can lead to better, more flexible solutions than most vendors could ever build on their own. It is the best and most efficient way to give customers the problem-solvers they need. This type of collaboration will be the foundation of next-generation supply chains that reduce costs, improve efficiencies, and ultimately make content more readily available for use.

What are some possible use cases?

There are multiple scenarios in which AI can enhance an asset management solution’s value to the M&E industry. Some possible use cases are noted below:

  • Automated AI Captions: Automated captioning is on the increase as accuracy levels improve and the cost of creating captions for access regulation purposes remains high. Automated AI captioning not only helps with accessibility but also provides a searchable timeline reference for discoverability within the asset management solution.
  • Automated AI Highlights: We have all seen how Wimbledon started using IBM Watson AI to power highlights, analytics, and enriched fan experiences. This use case will become far more commonplace, especially with the ever-increasing demand for highlights across the sports, news, and entertainment verticals. These highlights are used across social media platforms to engage viewers and fans around the world. AI automation helps streamline the process of identifying relevant moments and getting them to the consumer as quickly as possible.
  • Automated AI Metadata Generation: One of the key functions of AI is to generate relevant metadata for downstream use cases, including search and classification. This enables deeper and more advanced discovery of content without needing a team to manually tag it (with the attendant chance of human error). The larger your archive, the more relevant automated AI metadata generation becomes.
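To make the captioning use case concrete, here is a small sketch that renders transcription segments into the common SRT caption format. The segment shape (start and end in seconds, plus text) is an assumption about what a transcription engine might return:

```python
def to_srt(segments):
    """Render transcription segments as numbered SRT caption blocks."""
    def ts(seconds):
        # SRT timestamps are HH:MM:SS,mmm
        ms = int(round(seconds * 1000))
        h, rem = divmod(ms, 3_600_000)
        m, rem = divmod(rem, 60_000)
        s, ms = divmod(rem, 1_000)
        return f"{h:02}:{m:02}:{s:02},{ms:03}"

    blocks = [
        f"{i}\n{ts(seg['start'])} --> {ts(seg['end'])}\n{seg['text']}"
        for i, seg in enumerate(segments, 1)
    ]
    return "\n\n".join(blocks)

captions = to_srt([{"start": 0.0, "end": 2.5, "text": "Welcome to the show."}])
```

Because the caption text carries its timings, the same data that satisfies the accessibility requirement doubles as the searchable timeline reference described above.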

Other typical use cases include:

  • Automatic quality assurance versus manual evaluation
  • Marketing and advertising (e.g., targeted campaigns)
  • Personalised services (e.g., content recommendation)
  • Experience innovation (e.g., new immersive visual content experiences, including virtual reality and augmented reality)
  • Sponsorship verification (e.g., using logo recognition to analyse the use of sponsorship)
  • And there are many, many more …

Can you give a real-world example of an M&E company using AI for media asset management today?

The use cases are vast and diverse. In one recent example, an international media conglomerate (home to premier global television, motion picture, gaming, and other brands) used an AI solution to underpin a broadcast compliance workflow. To comply with the U.S. Federal Communications Commission’s Children’s Television Act of 1990, the company is required to identify the talent used in any advertisements that run during children’s educational programmes. The solution leveraged automated facial recognition, speech-to-text, and enriched metadata within the asset management platform to identify the talent and return the data to the company. As a result, the company can be sure that its ads do not feature the same talent as the programme they run alongside, thereby ensuring compliance.
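The core of that compliance check can be sketched very simply: once facial recognition has returned the set of talent identified in the programme and in a candidate ad, flagging a conflict reduces to a set intersection. The names and data here are purely illustrative, not drawn from the actual deployment:

```python
def ad_conflicts_with_programme(programme_talent, ad_talent):
    """Return the talent recognised in both the children's programme and
    the ad; a non-empty result means the ad should be flagged for review."""
    return sorted(set(programme_talent) & set(ad_talent))

flagged = ad_conflicts_with_programme(
    {"Host A", "Character B"},       # faces recognised in the programme
    {"Character B", "Announcer C"},  # faces recognised in the ad
)
```

The hard part, of course, is the recognition itself; once the AI has produced reliable talent lists per asset, the compliance rule is trivial to automate and audit.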


Tags: iss134 | AI | wazee | David Candler
Article Copyright tv-bay limited. All trademarks recognised.
Reproduction of the content strictly prohibited without written consent.
