Over the last 20 years or so we have all become accustomed to the expression “asset management” and, as ever, familiarity has bred contempt. In this case, the expression has become a loose descriptor for almost anything with a bit of a database in it. We need to stop and think about what we are trying to achieve.
In any media enterprise, the content itself is the most important asset. You need to know what you have, where it is, and what you can do with it. Assets represent the value in a business, and the asset management system should be, in large part, about realising that value. That, in turn, has two elements: protecting the asset for long-term availability, and providing access to the asset.
A lot of vendors offering systems to the broadcast industry talk about “digital” asset management, and DAM is a common abbreviation. I like to think that those who talk about DAM are still living in the era when to describe something as digital was cool, but maybe I am being a bit harsh.
We have always had asset management, of course. In the old days it was a room with a large amount of shelving, a card index, and a librarian – usually with a comprehensive knowledge of what was in the archive – who would only hand a tape or a film can over the counter if the production assistant signed for it in blood.
When we moved to automated playout we needed to put the card index on computer, so that the robot knew which tape to pick up and load. Later, content stopped being in physical form at all and became files on a server, so we needed the database and disk management system simply to retrieve the content when we needed it.
This simplistic view of asset management still prevails among some vendors: it is a database to help you navigate the disk filing system to put content on air. That is an important requirement, of course, as far as it goes. But for virtually all broadcasters, production companies and facilities I would suggest that it needs to be much more sophisticated if it is to fulfil a worthwhile role.
First – and this is a point which is often neglected – such a system covers only digital content, stored on the server. What about assets which are either not yet ingested, or cannot be treated in the same way?
A broadcaster or production company that has been in existence more than a decade or so will almost certainly have some assets on tape which have not been “digitised”. Many will have material on film. These assets may still have residual value: a quick glance at the Sky EPG as I write this reveals a channel showing an episode of The Waltons from 40 years ago.
Even if the immediate monetary value of an asset is not obvious, there may be a historical or cultural reason for keeping it. National broadcasters will have an obligation to maintain a comprehensive record of their output. At TMD we also provide asset management systems for national archives, to maintain and give access to cultural and social history.
Our work with audiovisual archives has also driven us to develop ways of including material in the asset register which cannot truly be digitised. You can group together some of the assets in digital form, like the final cut, the elements that make up the soundtrack, the subtitle files and so on. You can include a Word file of the script, but what if your archive includes a working copy of the script that the director has scribbled notes on? That surely is worth preserving. If you produced Doctor Who, surely you would want to keep at least one original of each of the best monsters.
So actually the archive needs to cover three categories: digital stuff, physical stuff that will one day be digital stuff, and stuff that will remain physical.
Within this there are, of course, multiple sub-categories. To give one example, audio and visual content that is not yet ingested will need to be assessed for its state of preservation, how much restoration work will be needed to return it to the best possible quality, what the business case for that restoration is, and the timescales before it can be completed.
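One way to picture the three categories, and the assessment fields that apply to material awaiting ingest, is a minimal record type. This is only a sketch – every name and field here is illustrative, not TMD's actual schema:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class AssetCategory(Enum):
    DIGITAL = "digital"          # already a file on the server
    TO_DIGITISE = "to_digitise"  # tape or film awaiting ingest
    PHYSICAL_ONLY = "physical"   # will always remain a physical object

@dataclass
class AssetRecord:
    asset_id: str
    title: str
    category: AssetCategory
    location: str                              # shelf mark or storage path
    # Assessment fields apply only to material awaiting ingest:
    condition: Optional[str] = None            # state of preservation
    restoration_hours: Optional[float] = None  # estimated restoration effort
    digitisation_due: Optional[str] = None     # target completion date

item = AssetRecord("T-0042", "Archive tape", AssetCategory.TO_DIGITISE,
                   "Vault B, shelf 12", condition="mild vinegar syndrome",
                   restoration_hours=6.0)
```

The point of a structure like this is that one register covers all three categories, while the ingest-assessment fields stay optional for content that is already digital.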
So you see that the practical archive needs a sophisticated data model. You need to know what you have got, you need to be able to retrieve it, and you need to be able to distribute it to other users. Distribution may mean allowing researchers to search your holdings for themselves, and it will almost certainly mean realising the value of the assets by selling content on multiple platforms.
To maximise the revenues earned from these new opportunities you need to automate the delivery, which means that the metadata in the archive has to contain all the information required by any user as well as the means of finding the content. It also needs to support automated workflows to add the right metadata in the right places as the content is transcoded and wrapped. The most common reason for rejection by the iTunes Store is metadata errors.
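An automated delivery workflow can catch those errors before the package leaves, rather than after the platform bounces it. A minimal sketch – the required-field list below is invented for illustration, not any platform's actual specification:

```python
# Validate a delivery record's metadata before it is transcoded and
# wrapped; missing or malformed fields are a common cause of rejection.
# The required-field list is illustrative, not a real platform spec.
REQUIRED = {"title", "episode_number", "rating", "language", "provider_id"}

def missing_fields(record):
    """Return the required fields absent from a metadata record, sorted."""
    return sorted(REQUIRED - record.keys())

record = {"title": "Ep 1", "language": "en", "provider_id": "tmd01"}
print(missing_fields(record))  # fields to fix before delivery
```

In practice each delivery platform would have its own required set, and the check would sit at the head of the packaging workflow.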
There is no single simple answer. There are some standardised data frameworks, like the BBC SMEF (Standard Media Exchange Framework) and the Dublin Core, but these are partial solutions. You have to develop the metadata schema that is right for your enterprise, supported by a product that is flexible enough to allow complete tailoring. Getting the data model right is critical to the success of the asset management system, so at this stage you need to talk to someone with experience.
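To see why a framework like Dublin Core is only partial: its fifteen generic elements say nothing about ingest state, QC results or sales platforms, so an enterprise schema has to extend it. A sketch, with invented extension names:

```python
# A record built from standard Dublin Core elements (title, creator,
# date, type, format, identifier, language, rights are genuine DC
# element names) plus enterprise extensions DC cannot express.
dublin_core = {
    "title": "Example programme, episode 1",
    "creator": "Example Productions",
    "date": "1975-03-08",
    "type": "MovingImage",
    "format": "video/mxf",
    "identifier": "urn:example:asset:12345",
    "language": "en",
    "rights": "All rights reserved",
}
# Extension field names below are illustrative only:
extensions = {
    "ingest_state": "digitised",
    "qc_status": "passed",
    "sales_platforms": ["itunes", "vod"],
}
record = {**dublin_core, **extensions}
```

The tailoring the article describes is essentially deciding what goes in that second dictionary, and how it maps onto each downstream system.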
If you have read this far you could be forgiven for thinking that an asset management system needs a huge amount of metadata for each asset, and that this is so daunting a task that it is better to forget the idea altogether. Please believe me when I say that implementing a good asset management system, which you can trust with your archive, need not involve vast teams of people copy-typing information to create a database.
My advice – and I have worked on a lot of successful asset management projects – is to think of the problem the other way up. Rather than start by filling in every piece of information you can conceivably think of to describe an asset, just add the metadata you need as you go. Not only does that make the challenge much less scary, it means that the data is added by the most appropriate person, who understands and values it, at the right point in the workflow.
Integration with other systems will also ensure that, wherever possible, metadata is added automatically. For broadcasters, for example, production details and proposed transmission dates will be transferred from the planning system. Precise timings and technical metadata will come from automated quality control technology. And so on.
Downstream, other information can be added as it becomes available. The key here is to design the system not to “allow” this to happen but to encourage it. Give each operator screens tailored for their needs, so they only see the boxes they need to complete. Make the tagging simple where practical, using touch screens or links to external sources. If it is a bought-in programme or film, do not ask someone to type in the cast: pull it in from IMDb.
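The incremental approach above – each operator or integrated system adding only the fields it owns, at the right point in the workflow – might be sketched as a chain of enrichment steps. All step names, fields and values here are illustrative:

```python
# Incremental metadata enrichment: each step adds only the fields it
# owns, at the point the information becomes available.
def from_planning(record):
    # transferred automatically from the planning system
    record.update(production_id="P-981", planned_tx="2024-06-01")
    return record

def from_auto_qc(record):
    # precise timings and technical metadata from automated QC
    record.update(duration_s=2598.04, qc_status="passed")
    return record

def from_external_lookup(record):
    # for a bought-in title, pull the cast from an external source
    # rather than asking an operator to re-key it
    record.update(cast=["Richard Thomas", "Ralph Waite"])
    return record

record = {"asset_id": "A-100"}
for step in (from_planning, from_auto_qc, from_external_lookup):
    record = step(record)
```

The record starts nearly empty and accumulates detail; no one is ever asked to fill in every conceivable field up front.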
Finally, you need to ensure that the digital archive is secure. The TMD Mediaflex Archive module supports the LTFS data tape standard, which means that information can be retrieved on other systems if required. It is a sensible precaution, and should be considered as part of a comprehensive HSM (hierarchical storage management) policy.
There are specialist HSM businesses in the audiovisual asset management field that provide the links between online and deep archives, and offsite disaster recovery centres. TMD’s Archive module interfaces closely with these systems, simplifying implementation and reducing the operational cost of maintaining them.
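What an HSM policy actually decides can be sketched very simply: which storage tier holds a piece of content, based on how recently it was used, with a second copy offsite for disaster recovery. Tier names and thresholds below are illustrative, not any product's defaults:

```python
# Hedged sketch of a hierarchical storage management (HSM) decision:
# content migrates from online disk to LTFS data tape as it ages,
# keeping an offsite tape copy for disaster recovery.
def choose_tiers(days_since_last_access, copies_required=2):
    """Return the storage tiers a copy of the asset should live on."""
    if days_since_last_access <= 30:
        tiers = ["online_disk"]
    elif days_since_last_access <= 365:
        tiers = ["nearline_ltfs_tape"]
    else:
        tiers = ["deep_archive_ltfs_tape"]
    if copies_required > 1:
        tiers.append("offsite_dr_tape")   # disaster recovery copy
    return tiers
```

Because the tape tiers use the open LTFS format, any of those copies can be read back on other systems if the need arises – which is the safeguard the article describes.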
Remember that the key word here is “asset”. Protecting your assets must be the top priority, followed closely by accessing them readily and monetising them effectively. The technology to achieve this is proven and ready – but choose with care.