Broadcast News

10/11/2015

Migrating The Archive

As broadcast and media infrastructures become firmly file-based, with an eye on moving to the cloud, attention is increasingly turning to the archive. Finished programmes, rushes and clips all have to be preserved and kept accessible. By Tony Taylor, CEO, TMD.

Many organisations are now looking at a second generation archive. Most will have included at least some digital archiving capability as part of the asset management revolution of the last decade. The challenge now is to implement a second generation system in a modern, open way, one that makes it ready for cloud implementations in the future.

The core issue is that the initial system will almost certainly have been implemented using proprietary technology, and in particular proprietary storage formats. That locks the user into a single vendor, which in turn limits the prospects for future expansion.

There are now a number of open formats for content archiving. Some vendors advocate AXF, an archive format developed specifically for broadcast. It is a practical standard, but at TMD we argue against a broadcast-specific standard because it is, again, unnecessarily limiting at a time when so much of the rest of our infrastructure is moving to open IT platforms.

We prefer to recommend LTFS, the open IT standard. LTFS is a practical and portable standard, in which each LTO tape is a self-contained unit which can be read by any system with a suitable tape drive. In practice, a tape in the LTFS format looks like a disk drive to any computer. It is a highly cost-effective format which is completely future-proofed: in time LTFS archives could exist in holographic storage, or on virtual "tapes" in the cloud, with the same degree of transparency.
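
To illustrate how transparent that is in practice, here is a minimal sketch in Python, assuming an LTO tape mounted through an LTFS driver at a hypothetical mount point. Because the tape presents as an ordinary filesystem, standard file APIs are all that is needed; no vendor library is involved.

    from pathlib import Path
    import shutil

    LTFS_MOUNT = Path("/mnt/ltfs0")   # hypothetical LTFS mount point
    STAGING = Path("/srv/staging")    # hypothetical local staging area

    # Walk the mounted tape like any directory tree and copy media off it.
    for item in LTFS_MOUNT.rglob("*.mxf"):
        dest = STAGING / item.relative_to(LTFS_MOUNT)
        dest.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(item, dest)      # a plain file copy, nothing proprietary
        print(f"restored {item.name} ({item.stat().st_size} bytes)")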

The biggest issue becomes how to migrate from an expensive proprietary format to LTFS. This is an area in which TMD has a huge amount of experience.

The first part of the challenge is migrating the content, and here planning is critical. In hardware terms it means adding a new data tape robot to the existing infrastructure, to copy data from the old format to the new. The practicalities, though, are around managing that transfer without disrupting the day-to-day work of the archive.

How can transfers be made without taking up all the resources? What network bandwidth is available, and what is needed to accomplish the migration in a timely manner? What is the balance between a practical deadline for completing the project, the risks to live operation, and the costs of additional hardware and network capacity? These are not simple questions, and it is wise to seek expert support.
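
Even a rough calculation makes those trade-offs concrete. The sketch below uses illustrative figures only (the archive size, drive count, throughput and daily migration window are all assumptions) to estimate how long a migration would take:

    # Back-of-envelope migration estimate; all figures are illustrative.
    ARCHIVE_PB = 2.0                 # total content to migrate, in petabytes
    DRIVES = 4                       # tape drives dedicated to the migration
    MBPS_PER_DRIVE = 300             # sustained MB/s per drive (LTO-class)
    HOURS_PER_DAY = 16               # window that spares live operations

    total_mb = ARCHIVE_PB * 1024 ** 3                       # PB -> MB
    mb_per_day = DRIVES * MBPS_PER_DRIVE * HOURS_PER_DAY * 3600
    print(f"Estimated duration: {total_mb / mb_per_day:.0f} days")

With these numbers the answer is roughly a month; halving the drives or the daily window doubles it, which is exactly the kind of sensitivity that should be tested before committing to a deadline.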

Transferring the content is an issue of capacity: migrating the metadata is the side of the problem that calls for very careful system design. Again, this is not something that can be undertaken lightly, and using a system provider who has done it before is vital.

For a successful transfer, a new and carefully thought-through metadata schema needs to be defined, then the metadata in the old system mapped onto it. Once this mapping is clear and finalised, the transfer should be secure.
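
In practice that mapping can be kept declarative, so it can be reviewed and signed off before any data moves. A minimal sketch, using hypothetical field names on both sides:

    from datetime import datetime

    FIELD_MAP = {
        # legacy field -> (new schema field, transform)
        "TITLE":    ("title", str.strip),
        "TX_DATE":  ("first_broadcast",
                     lambda v: datetime.strptime(v, "%d/%m/%Y").date().isoformat()),
        "DUR_SECS": ("duration_seconds", int),
    }

    def migrate_record(legacy: dict) -> dict:
        """Map one legacy metadata record onto the new schema."""
        record = {}
        for old_field, (new_field, transform) in FIELD_MAP.items():
            if legacy.get(old_field) not in ("", None):
                record[new_field] = transform(legacy[old_field])
        return record

    print(migrate_record({"TITLE": " News at Ten ",
                          "TX_DATE": "10/11/2015",
                          "DUR_SECS": "1740"}))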

Our experience, though, is that first generation metadata schemes tend to be poorly defined, bordering on the ramshackle. You have to develop a clear understanding of the existing metadata and its variables before you can clean it for transfer.

Typical problems thrown up by such an exercise lie around free-text fields, especially where these have been used for metadata that would be better described using specific variables. When we migrated archive data for one of the world's biggest broadcast and production companies, for example, we discovered that the audio format – mono, stereo, music and effects and so on – was stored in a free-text field, and we found 8,400 different permutations of the description. That is the sort of unexpected challenge that can derail a project if you are not prepared.
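
Cleaning such a field typically means collapsing the permutations onto a controlled vocabulary and flagging whatever will not map for manual review. A minimal sketch, with a handful of illustrative legacy spellings standing in for the thousands found in practice:

    import re

    CANONICAL = {
        "mono": "MONO",
        "stereo": "STEREO",
        "m&e": "M&E",
        "music and effects": "M&E",
        "music & effects": "M&E",
    }

    def normalise_audio_format(raw: str) -> str:
        key = re.sub(r"\s+", " ", raw.strip().lower())
        return CANONICAL.get(key, "UNMAPPED")   # unmapped values go for review

    for raw in ["Stereo", "  MONO ", "Music and  Effects", "2x mono + stereo mix"]:
        print(f"{raw!r:28} -> {normalise_audio_format(raw)}")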

At TMD we have migrated many archives, including some of the largest in the world. In some ways, the technology is not the big issue – the sensible plan is to move to proven, open, IT standard hardware. The real issue is in understanding what needs to be done to achieve a clean, clear consistency of processing. You need to have that understanding on your project team to deliver a second generation archive which does not lock you into an individual vendor, but is open and future-proof.

www.tmd.tv

(JP/LM)
