Broadcast News

'The Cloud' Gets Less Fluffy, Part 2

Continued from Part One.

Serious data transfer
Much of the processing done on files involves taking a high bit rate source file and transcoding it to multiple lower bit rate delivery formats, with QC before and after the transcode. Video files comprise a lot of data, and high bit rate source files especially so: for example, mezzanine ProRes UHD files can exceed a data rate of 800 Mbit/s, i.e. 6 GB per minute of video (144 GB for a 24-minute TV programme, or 600 GB for a 100-minute movie). Uploading this to a remote site in a reasonable timeframe requires a very fat pipe, as even a 1 Gbit/s Internet connection will only just manage real-time. This is probably the worst case, as currently most mezzanine files are HD and at a much lower bit rate - perhaps 50 or 100 Mbit/s - but even so, when dealing with multiple programmes it still requires a high-speed, i.e. high-cost, Internet connection.
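To put rough numbers on the pipe required, here is a minimal sketch of the upload-time arithmetic, using the illustrative figures above (and ignoring protocol overhead):

```python
# Rough upload-time arithmetic for mezzanine files, using the
# illustrative figures above: an 800 Mbit/s UHD ProRes source and
# a 1 Gbit/s Internet connection. No protocol overhead assumed.

def upload_minutes(duration_min, media_mbits_per_sec, link_mbits_per_sec):
    """Minutes needed to upload a file of the given duration over a given link."""
    total_mbits = media_mbits_per_sec * 60 * duration_min
    return total_mbits / (link_mbits_per_sec * 60)

# 24-minute UHD programme at 800 Mbit/s over a 1 Gbit/s link:
t_uhd = upload_minutes(24, 800, 1000)   # ~19 minutes - barely faster than real time

# The same programme as a 100 Mbit/s HD mezzanine file:
t_hd = upload_minutes(24, 100, 1000)    # ~2.4 minutes

print(round(t_uhd, 1), round(t_hd, 1))
```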

So while there can be a substantial saving on PC costs, the cost of your Internet connection might increase enormously.

If the media being processed is short form this becomes a non-issue.

Lots of remote storage
A second aspect of dealing with this volume of data is that it has to be stored remotely as well. Most organisations have good amounts of online local disk storage, often with automated access to near-line or off-line (tape-based) storage. A proportion of the immediate-access local disk storage will need to be replicated on the remote setup, and this can be a significant volume with correspondingly significant costs. Because this remote storage is used on a continual basis, with new media swapping in and out, it is no substitute for the local storage: the remote storage is a direct additional cost. If the media being processed is short form, this is likely not significant.
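As a back-of-envelope illustration of why this replication is a direct additional cost, a quick sketch - the per-GB price and library size here are hypothetical assumptions, not quotes from any provider:

```python
# Hypothetical monthly cost of replicating part of a local library
# to remote storage. The price per GB-month and the mirrored volume
# are illustrative assumptions only.

def monthly_storage_cost(replicated_gb, price_per_gb_month):
    """Monthly cost of keeping the given volume on remote storage."""
    return replicated_gb * price_per_gb_month

# Say 50 TB of the immediate-access library is mirrored remotely
# at an assumed $0.03 per GB-month:
cost = monthly_storage_cost(50_000, 0.03)
print(f"${cost:,.0f} per month")  # on top of the local storage it duplicates
```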

A user could decide to put all their media solely on remote storage but many would consider this a risky approach for obvious reasons.

Security considerations
This brings me to the third aspect of remote processing: security. Many large broadcasters and content distributors currently have two separate networks within their facility. The first is the 'dirty' network which is connected to the Internet and is used for email traffic and general business use; the second one is the 'clean' network which only carries the video and audio media and is secured and separated from the dirty network.

This is for good reason: the video and audio content is the basis upon which the broadcast businesses are built. The video and audio media on the internal clean network is generally non-encrypted, so the pitfalls of this leaking out onto the Internet are obvious.

There are good solutions to the issue of secure transmission to remote locations, from companies such as Aspera, FileCatalyst and Signiant.
However, once the media arrives on the remote storage it is generally unencrypted. The well-known providers of remote processing services are well aware of this important issue and take steps to remove or minimise the risk, and some networks set up for media processing have many layers of security. I am not aware of any instances where unencrypted media has leaked out onto the general Internet from one of these providers, so perhaps I am being a 'nervous Nelly' in this regard.
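Accelerated transfer tools of this kind typically verify integrity at the receiving end. As a minimal illustration of that step only - not any vendor's actual mechanism, and not a substitute for encryption at rest - a checksum comparison might look like this:

```python
# Minimal sketch of post-transfer integrity checking: hash the file
# before upload, ship the digest alongside it, and re-hash on arrival.
# This verifies integrity only - it is not encryption, which is the
# separate at-rest concern discussed above.
import hashlib

def sha256_of(path, chunk_size=1 << 20):
    """SHA-256 digest of a file, read in 1 MB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def transfer_ok(local_path, remote_path):
    """True if the remote copy matches the local original byte for byte."""
    return sha256_of(local_path) == sha256_of(remote_path)
```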

And when there are errors in the media
The fourth issue with processing wholly remotely is perhaps not so obvious: what happens if the QC finds an error? Our experience is that roughly 5% of media at this stage of the workflow has an error, which may or may not need to be corrected.

With software that just does QC, the media needs to be returned from the remote location to be manually corrected, uploaded again and the process re-started. This is clearly long-winded and slow, and in our experience, when a file is re-edited in this way further errors are often introduced, requiring this round-trip to happen multiple times.

Vidcheck has a great solution for this problem: don't just do automated QC, but also do automated correction. We estimate that our automated correction will fix 80% or more of the common errors in media files.
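Under the figures above - roughly 5% of files failing QC, and automated correction fixing 80% or more of those failures - the share of files still needing a manual round-trip works out as follows. This is a sketch of the arithmetic, not a measurement:

```python
# Fraction of files needing a manual round-trip, with and without
# automated correction, using the approximate figures quoted above
# (5% QC failure rate, 80% of failures fixable automatically).

def manual_roundtrip_rate(error_rate, auto_fix_rate):
    """Share of all files still needing a manual correction round-trip."""
    return error_rate * (1 - auto_fix_rate)

without_fix = manual_roundtrip_rate(0.05, 0.0)   # QC only: 5% of files round-trip
with_fix = manual_roundtrip_rate(0.05, 0.80)     # QC + auto-correct: ~1%

print(f"{without_fix:.0%} -> {with_fix:.0%}")
```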

Hybrid local and remote
There is an extension to the remote processing scenario: a 'hybrid' approach where the local processing and storage in a post-production house or broadcaster is sufficient to deal with the normal level of week-in week-out media and the remote processing and storage only fires up to deal with the peak requirements.
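One way to picture the hybrid approach is a dispatcher that keeps jobs on local capacity and only bursts the overflow - the peak requirement - to the remote side. A sketch, with the slot count and job names entirely hypothetical:

```python
# Hypothetical hybrid dispatcher: jobs run locally up to a fixed
# number of local slots; any overflow (the peak requirement) is
# sent to remote processing instead.

LOCAL_SLOTS = 4  # assumed number of concurrent local QC/transcode slots

def dispatch(jobs, local_slots=LOCAL_SLOTS):
    """Split a list of job names into (local, remote) batches."""
    return jobs[:local_slots], jobs[local_slots:]

local, remote = dispatch([f"ep{n:02d}" for n in range(1, 7)])
print(local)   # first four episodes run locally
print(remote)  # the overflow bursts to the cloud
```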

This is clearly a more complex situation and I would say that the fluffy edges of the cloud are not yet well enough defined to cope with this in a good way.

Footnote: old-fashioned rental; bureau services
There are two alternative methods of managing cost that should be mentioned: rental, and hosted services.

The first of these is a straightforward rental of QC processing and other software for the time required e.g. to process the TV series as it is being made (Vidcheck already does this with a number of customers).

The second is a hosted/bureau service for QC, transcode, delivery etc. The method of doing this for delivery, e.g. of commercial advertisements, has already been well proven over a good period of time, and it seems logical to extend it, particularly for occasional users who need other services such as subtitling, transcode and QC. There are a number of companies setting up such offerings using Vidcheck's products, although it is early days yet and the first version 1.0 end-to-end services will probably be available in the first half of 2016.

Just the beginning
It is clear that these are still the very early days of cloud implementations: only in the last 6-12 months have solutions for the specific requirements of cloud processing (PAYG metering, auto-scalability, processing on local or remote servers) become available, and Vidcheck is leading in this regard for QC.

Our estimation is that the benefits to some smaller post-production houses, content distributors and broadcasters already outweigh the negatives, where the security issues are less of a concern – depending upon the media type being dealt with – and we anticipate a number of customers taking this approach in the next six months.

Similarly, the scalable approach for local clouds is a good solution that can be adopted soon and would be appropriate for some organisations.

This article is also available to read at BFV online.
