Broadcast News

28/03/2017

In Conversation With Nic Hatch

Nic Hatch, CEO of Ncam, tells us about the company, Augmented Reality, Virtual Reality, and their future within broadcast.

Tell me a little about Ncam...
We started Ncam because people were asking for on-set pre-visualisation; a way of seeing how live action and visual effects would fit together. We wanted to put the VFX back through the viewfinder in real time, matched to what the camera was doing. The core idea was to give the control back to directors and DoPs, even in effects-heavy movies.

Can you explain augmented reality, virtual reality and immersive graphics?
Virtual reality (VR) took off during the 1990s. The term refers to a 100% virtual environment which relies on headsets and computer-generated worlds. Augmented reality (AR) is the mixing of real and virtual: live, real-time images from a camera or cameras, augmented with virtual, computer-generated images.

The broadcast view – which was originally called the virtual set – was to replace physical set construction with computer graphics. News or sports magazine programmes, for instance, would have some real objects and some computer-generated ones.

Today this might be called augmented reality for broadcast. If we are talking about 3D graphics, with the freedom to move the camera and presenter within them, then this tends to be called immersive graphics.

Where does Ncam fit into this?
If you are going to have this freedom of movement in immersive graphics, there has to be a way of knowing what the camera is doing. Traditionally this was done with patterns on the walls or ceiling, but that is cumbersome and limiting. Ncam is a way of tracking camera movements – on a pedestal, a Steadicam or even handheld – using the real objects in the scene itself or on location.
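
Ncam's tracker itself is proprietary, but the underlying idea (recovering the camera's pose from known points observed in the scene) can be sketched in a few lines. The Python example below uses OpenCV's solvePnP with made-up feature positions and intrinsics; it illustrates the general perspective-n-point principle behind markerless tracking, not Ncam's actual algorithm.

```python
# Hypothetical illustration of markerless camera tracking: given the
# 3D positions of natural scene features and their 2D projections in
# the current frame, recover the camera pose. This is the general
# perspective-n-point principle, not Ncam's (proprietary) pipeline.
import numpy as np
import cv2

# Known 3D feature positions in the scene (metres). In a real tracker
# these would come from an initial survey or SLAM-style mapping.
object_points = np.array([
    [0.0, 0.0, 0.0],
    [1.0, 0.0, 0.0],
    [1.0, 1.0, 0.0],
    [0.0, 1.0, 0.0],
    [0.5, 0.5, 0.5],
    [0.2, 0.8, 0.3],
], dtype=np.float64)

# Where those features were detected in the current frame (pixels).
image_points = np.array([
    [320.0, 240.0],
    [480.0, 242.0],
    [478.0, 380.0],
    [322.0, 378.0],
    [400.0, 300.0],
    [355.0, 350.0],
], dtype=np.float64)

# Camera intrinsics (focal length, principal point). On a real system
# these are calibrated, and change as the lens zooms.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
if ok:
    R, _ = cv2.Rodrigues(rvec)     # rotation vector -> rotation matrix
    camera_position = -R.T @ tvec  # camera centre in scene coordinates
    print("camera position:", camera_position.ravel())
```

Run per frame, this yields the stream of position and rotation data that lets the graphics renderer keep its virtual camera locked to the real one.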

How does it work?
Bolted to the camera – and it can be any camera – is what we call the Ncam Camera Bar. It is a black bar that contains numerous sensors. It is light and discreet, so it does not change the feel of the camera.

We also need to know what the lens is doing. Broadcast lenses tend to have digital control, which we can read. For film lenses we add a follow focus like a Preston or C-Motion. All this data feeds to a breakout box, and there is a single ethernet cable to a server which does the heavy computation. We have two versions of our software: Ncam Reality for film, and Ncam Live for real-time broadcast.

The output of our software is accurate positional data which allows the graphics computer to control its view. We have standardised interfaces based on the open Free-D standard (originally developed by the BBC), links to common immersive graphics systems like Vizrt and Brainstorm, and an SDK for users to create their own interfaces.
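
The Free-D format is openly documented, so a short illustration is possible. The sketch below encodes a Free-D type D1 camera-position message; the field scalings (angles in degrees scaled by 32768, positions in millimetres scaled by 64, a checksum of 0x40 minus the byte sum) follow our reading of the published spec, and should be checked against the standard before being relied upon.

```python
# Hedged sketch of encoding a Free-D type D1 message, the BBC's open
# camera-tracking protocol. Field layout and scaling follow our
# reading of the published spec; verify against the standard.

def encode_24bit_signed(value: int) -> bytes:
    """Pack an integer as 24-bit big-endian two's complement."""
    return (value & 0xFFFFFF).to_bytes(3, "big")

def freed_d1_packet(camera_id: int,
                    pan_deg: float, tilt_deg: float, roll_deg: float,
                    x_mm: float, y_mm: float, z_mm: float,
                    zoom_raw: int, focus_raw: int) -> bytes:
    body = bytes([0xD1, camera_id & 0xFF])
    # Angles: degrees scaled by 32768 (15 fractional bits).
    for angle in (pan_deg, tilt_deg, roll_deg):
        body += encode_24bit_signed(round(angle * 32768))
    # Positions: millimetres scaled by 64 (6 fractional bits).
    for pos in (x_mm, y_mm, z_mm):
        body += encode_24bit_signed(round(pos * 64))
    # Zoom and focus: raw 24-bit counts from the lens encoders.
    for raw in (zoom_raw, focus_raw):
        body += (raw & 0xFFFFFF).to_bytes(3, "big")
    body += b"\x00\x00"                   # spare / user bytes
    checksum = (0x40 - sum(body)) & 0xFF  # per the Free-D spec
    return body + bytes([checksum])

pkt = freed_d1_packet(1, pan_deg=12.5, tilt_deg=-3.0, roll_deg=0.0,
                      x_mm=1500.0, y_mm=-250.0, z_mm=1800.0,
                      zoom_raw=0x1234, focus_raw=0x5678)
print(len(pkt), pkt.hex())  # a 29-byte message, one per video frame
```

One such message per frame, typically carried over serial or UDP, is all a graphics engine like Vizrt or Brainstorm needs to slave its virtual camera to the real one.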

How is it used?
There are lots of applications. In movies or commercial productions where you are integrating live action with graphics, it is so much better to be able to see how they will fit together on set. It means you frame the live action correctly, and the actors can understand how they interact with the virtual elements, get their eye-lines right, and so on. You can, of course, fix this in post, but it is better to get it right in the first place.

For live television, it means you can place virtual graphics into a scene and have them stay precisely in place, so they add to the viewer's enjoyment and do not wobble around and cause distraction. ESPN uses it on Monday Night Football, and Fox Sports is also a big user. Recently we implemented a system at The Weather Channel, which allows the presenter to talk about how weather systems form, interacting with really complex graphics. So the presenter can walk around a tornado in the studio, pointing out the elements which allow the system to build and where the greatest damage can occur.

What are the benefits to broadcasters and film-makers?
In movies, it gives the power back to the director and DoP because they can see what the final composite will look like. As well as putting the creative control where it rightfully belongs, it also saves money because it needs less work, and less re-work, in post. And you can actually make better shots, because you can compose the live action to accurately match the virtual elements. You are framing the whole shot, live and virtual.

Another benefit of Ncam is that the user can collect a complete set of metadata of what the camera is doing frame-by-frame. As well as the three-dimensional position and rotation of the camera, you also get focus, iris and zoom data from the lens. This means that if and when you have to start match-moving in post, you save a huge amount of processing and eliminate even more guesswork.
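
That per-frame metadata maps naturally onto a simple record. Here is a hypothetical sketch; the field names are illustrative, not taken from Ncam's SDK.

```python
# Hypothetical per-frame tracking record. Field names are
# illustrative, not Ncam's SDK. One record per frame is enough
# to drive a match-move in post.
from dataclasses import dataclass

@dataclass
class FrameTrackingData:
    frame: int             # frame number (or timecode)
    x: float               # camera position, metres
    y: float
    z: float
    pan: float             # camera rotation, degrees
    tilt: float
    roll: float
    focal_length: float    # from the zoom encoder, mm
    focus_distance: float  # from the focus encoder, metres
    iris: float            # f-stop

    def as_csv_row(self) -> str:
        return ",".join(str(v) for v in (
            self.frame, self.x, self.y, self.z,
            self.pan, self.tilt, self.roll,
            self.focal_length, self.focus_distance, self.iris))

# Example: one frame's worth of data, ready to export for post.
row = FrameTrackingData(1001, 1.50, -0.25, 1.80,
                        12.5, -3.0, 0.0, 35.0, 2.4, 2.8)
print(row.as_csv_row())
```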

In broadcast, you can put graphics and stats onto the playing surface of sports to make the experience better for the audience. You can also place advertising where previously it has been impractical. You could put sponsors' logos onto golf courses, for example. The inserted graphic is not fixed as it would be if it were a physical object: you can add animations or even live video. It really is more immersive, more flexible and more fun for the audience.

Image: Nic Hatch, CEO, Ncam.

www.ncam-tech.com

(JP)
