The media landscape has shifted dramatically in recent years. Content providers are under increasing pressure to deliver compelling content across multiple platforms, and the production and distribution of video is no longer limited to traditional broadcasters, which face rising competition from a range of media providers. We are also seeing a massive increase in broadcast-quality video produced by companies for other purposes, such as marketing, site surveys, and internal communications.
All this means that the way media is managed, processed, and distributed has changed, and it now varies from user to user. I believe we will see some very specific trends over the coming year, each with a knock-on effect on how media is managed.
The first major trend is the move to hybrid cloud. Many content providers remain wary of the cloud even while recognising its benefits, and a hybrid approach gives them the best of both worlds. Over the coming months I believe we will see more hybrid cloud deployments, as well as a slew of software releases to enable them. Hybrid cloud will be deployed in a number of different ways, including for content acquisition, file management, and storage. For the most part, and in the short term, content providers will likely keep their high-speed editing SANs but increasingly use the cloud to manage the movement of content between offices.
We are already seeing deployments from major broadcasters and content providers, and we have installed several hybrid cloud deployments of Cantemo Portal. This move to hybrid cloud means that MAM systems need to manage the flow of content even more precisely, ensuring that only the right content is sent to the cloud at the right point in its lifecycle.
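The idea of sending content to the cloud only at the right point in its lifecycle can be sketched as a simple policy rule. This is a hypothetical illustration, not Cantemo Portal's actual logic; the `Asset` class, stage names, and size threshold are all assumptions made for the example.

```python
from dataclasses import dataclass

# Hypothetical sketch: a rule deciding whether an asset should be pushed
# to cloud storage based on its lifecycle stage. The stage names and the
# Asset class are illustrative, not part of any real MAM API.

CLOUD_STAGES = {"approved", "distribution", "archive"}

@dataclass
class Asset:
    name: str
    stage: str        # e.g. "ingest", "editing", "approved", "archive"
    size_gb: float

def should_send_to_cloud(asset: Asset, max_size_gb: float = 500.0) -> bool:
    """Return True only when the asset has reached a lifecycle stage where
    cloud storage makes sense, keeping in-progress edits on the local
    high-speed SAN."""
    return asset.stage in CLOUD_STAGES and asset.size_gb <= max_size_gb

if __name__ == "__main__":
    rough_cut = Asset("promo_rough", stage="editing", size_gb=120.0)
    final = Asset("promo_final", stage="approved", size_gb=80.0)
    print(should_send_to_cloud(rough_cut))  # False: still being edited locally
    print(should_send_to_cloud(final))      # True: ready for cloud distribution
```

In practice such rules would be far richer, factoring in rights, regions, and bandwidth, but the principle is the same: the MAM, not the user, decides when content moves off the SAN.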
Deep learning is becoming increasingly important for media companies, enabling a whole slew of possibilities such as personalised content and recommendations. A number of different players, including Google, Microsoft, Facebook, and Nvidia, are bringing deep learning toolsets to the marketplace. As the technology progresses, and it is progressing very fast, we are seeing more and more applications that can perform object, face, and logo recognition, event recognition, and location identification, and do it well.
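From a MAM perspective, the value of these recognition tools is the metadata they generate. The sketch below shows the general flow of merging machine-generated labels into an asset's searchable tags; the `tag_asset` function, the asset dictionary shape, and the stub detector are all hypothetical, standing in for a call to a real vision service.

```python
# Hypothetical sketch: attaching machine-generated tags (objects, faces,
# logos) to a media asset's metadata. A real deployment would call a
# vision API; a stub detector stands in here so the flow is runnable.

from typing import Callable

def tag_asset(metadata: dict, detector: Callable[[str], list[str]]) -> dict:
    """Run the detector on the asset's proxy file and merge the resulting
    labels into its searchable tags, de-duplicating against existing ones."""
    labels = detector(metadata["proxy_path"])
    merged = sorted(set(metadata.get("tags", [])) | set(labels))
    return {**metadata, "tags": merged}

def stub_detector(path: str) -> list[str]:
    # Stand-in for an object/face/logo recognition service.
    return ["stadium", "crowd", "sponsor_logo"]

if __name__ == "__main__":
    asset = {"id": "clip-001",
             "proxy_path": "/proxies/clip-001.mp4",
             "tags": ["sport"]}
    print(tag_asset(asset, stub_detector)["tags"])
    # ['crowd', 'sponsor_logo', 'sport', 'stadium']
```

Keeping the detector pluggable matters because, as noted above, several vendors are competing in this space and a MAM should be able to swap between their toolsets.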
Walking around IBC, it was clear that Virtual Reality (VR) remains a hot topic and continues to attract a great deal of innovation. That said, when it comes to TV and film content, I believe it will be a while before we see any major traction, and I am not entirely convinced we will ever see it as an everyday experience in the majority of homes. That is partly because of the need for expensive hardware, but also because TV generally remains a social viewing experience, which is harder to share if everyone has to wear a headset.
What I do believe we will see over the coming months is increased use of VR in the b-to-b world. Some major firms are already using it to collaborate with colleagues in other parts of the world or to help customers visualise a building project, for example. We will begin to see others tapping into this potential across a range of different sectors.
As the use of VR content increases, whether that be for consumer entertainment or b-to-b collaboration, MAM systems will need to be able to cope with that content, ensuring that users can view and edit VR content within the MAM environment.
Traditional MAM is extinct
It is clear that in this ever-evolving market, traditional MAM systems are simply not flexible enough to deal with modern media management requirements. The market is very different from the days of big broadcasters distributing large volumes of content to the same channel, in the same format, and in the same way. With so many different formats and channels, traditional MAM systems will either need to adapt over the coming months or simply become extinct.