Ncam is a camera tracking system featuring a multi-sensor bar that mounts unobtrusively on the camera and provides real-time data to its associated tracking server.
Ncam provides complete position and rotation information, plus focal length and focus, via industry-standard protocols compatible with any virtual reality/augmented reality graphics system.
It is suitable for a wide range of applications both indoors and outdoors, across all camera rigs, even hand-held and Steadicam.
Nic Hatch, CEO of Ncam
My background was in VFX, particularly 3D modelling, texturing and lighting. I worked for The Mill, Moving Picture Company, Sony Pictures Imageworks and Warner Bros before concentrating on pre-visualisation. I started Nvizage in 2003, an effects company called Nvizible in 2009, and then Ncam in 2011.
Could you tell us a little about Ncam?
We started Ncam because people were asking for on-set pre-visualisation; a way of seeing how live action and visual effects would fit together. We wanted to put the VFX back through the viewfinder in real time, matched to what the camera was doing.
The core idea was to give control back to directors and DoPs, even on effects-heavy movies.
Where does Ncam fit into this?
If you are going to have this freedom of movement in immersive graphics, there has to be a way of knowing what the camera is doing. Traditionally this was done with patterns on the walls or ceiling, but that is cumbersome and limiting. Ncam is a way of tracking camera movements – on a pedestal, a Steadicam or even handheld – using the real objects in the scene itself or on location.
How does it work?
Bolted to the camera – and it can be any camera – is what we call the Ncam Camera Bar. It is a black bar that contains numerous sensors. It is light and discreet, so it does not change the feel of the camera.
We also need to know what the lens is doing. Broadcast lenses tend to have digital control, which we can read. For film lenses we add a follow focus like a Preston or C-Motion.
All this data feeds to a breakout box, and there is a single ethernet cable to a server which does the heavy computation. We have two versions of our software: Ncam Reality for film, and Ncam Live for real-time broadcast.
The output of our software is accurate positional data which allows the graphics computer to control its view. We have standardised interfaces based on the open Free-D standard (originally developed by the BBC), links to common immersive graphics systems like Vizrt and Brainstorm, and an SDK for users to create their own interfaces.
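To make the Free-D interface mentioned above concrete, here is a minimal Python sketch of packing camera pose and lens data into a Free-D-style D1 message. The 29-byte layout (message type, camera ID, pan/tilt/roll, X/Y/Z, zoom, focus, spare bytes, checksum) follows the published Free-D specification, but the exact fixed-point scalings used here are assumptions, not a statement of how Ncam's own implementation works.

```python
# Hedged sketch of a Free-D-style D1 tracking packet.
# Field scalings (1/32768 degree for angles, 1/64 mm for positions)
# are assumptions based on common Free-D descriptions.

def _int24(value: int) -> bytes:
    """Encode a signed integer as 3 bytes, big-endian two's complement."""
    return (value & 0xFFFFFF).to_bytes(3, "big")

def pack_freed_d1(camera_id, pan_deg, tilt_deg, roll_deg,
                  x_mm, y_mm, z_mm, zoom_raw, focus_raw):
    body = bytes([0xD1, camera_id & 0xFF])  # message type, camera ID
    # Angles: assumed 15 fractional bits (1/32768-degree resolution).
    for angle in (pan_deg, tilt_deg, roll_deg):
        body += _int24(round(angle * 32768))
    # Positions: assumed 1/64-mm resolution.
    for pos in (x_mm, y_mm, z_mm):
        body += _int24(round(pos * 64))
    # Zoom and focus: raw 24-bit counts from the lens encoders.
    body += _int24(zoom_raw) + _int24(focus_raw)
    body += bytes(2)  # spare/user bytes
    checksum = (0x40 - sum(body)) & 0xFF  # Free-D checksum convention
    return body + bytes([checksum])

packet = pack_freed_d1(1, 12.5, -3.0, 0.0, 1000.0, 250.0, 1800.0, 2048, 4096)
print(len(packet))  # 29-byte D1 message
```

A packet like this would be emitted once per video frame, which is what lets the downstream graphics engine lock its virtual camera to the real one.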
How is it used?
There are lots of applications. In movies or commercial productions where you are integrating live action with graphics, it is so much better to be able to see how they will fit together on set. It means you frame the live action correctly, and the actors can understand how they interact with the virtual elements and get their eye-lines right, and so on. You can, of course, fix this in post, but it is better to get it right in the first place.
For live television, it means you can place virtual graphics into a scene and have them stay precisely in place, so they add to the viewer’s enjoyment and do not wobble around and cause distraction. ESPN uses it on Monday Night Football, and Fox Sports is also a big user.
Recently we implemented a system at The Weather Channel, which allows the presenter to talk about how weather systems form, interacting with really complex graphics. So the presenter can walk around a tornado in the studio, pointing out the elements that allow the system to build and where the greatest damage can occur.
What are the benefits to broadcasters and film-makers?
In movies, it gives the power back to the director and DoP because they can see what the final composite will look like. As well as putting the creative control where it rightfully belongs, it also saves money because it needs less work, and less re-work, in post.
And actually you are able to make better shots, because you can compose the live action shoot to match accurately the virtual elements. You are framing the whole shot, live and virtual.
Another benefit of Ncam is that the user can collect a complete set of metadata describing what the camera is doing frame by frame. As well as the three-dimensional position and rotation of the camera, you also get focus, iris and zoom data from the lens. This means that if and when you have to start match-moving in post, you save a huge amount of processing time and eliminate even more guesswork.
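The kind of per-frame record described above can be sketched as a simple data structure. The field names and units below are illustrative assumptions for the sake of the example, not Ncam's actual export schema; the point is that each frame carries pose plus lens state, ready to hand off to a match-move package.

```python
from dataclasses import dataclass, asdict
import json

# Hedged sketch of a per-frame camera metadata record.
# Field names and units are illustrative assumptions.

@dataclass
class FrameMetadata:
    frame: int                       # frame number within the take
    position: tuple                  # camera X, Y, Z in metres
    rotation: tuple                  # pan, tilt, roll in degrees
    focal_length_mm: float           # lens zoom reading
    focus_distance_m: float          # lens focus reading
    iris_t_stop: float               # lens iris reading

def export_take(frames):
    """Serialise a take's metadata to JSON for downstream match-moving."""
    return json.dumps([asdict(f) for f in frames], indent=2)

take = [
    FrameMetadata(1, (0.0, 1.6, 3.0), (12.5, -3.0, 0.0), 35.0, 2.4, 2.8),
    FrameMetadata(2, (0.1, 1.6, 3.0), (12.6, -3.0, 0.0), 35.0, 2.4, 2.8),
]
print(export_take(take))
```

Because the solve already exists for every frame, a match-move artist starts from measured data rather than estimating the camera path from the pictures alone.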
In broadcast, you can put graphics and stats onto the playing surface of sports to make the experience better for the audience. You can also place advertising where previously it has been impractical. You could put sponsors’ logos onto golf courses, for example. The inserted graphic is not fixed as it would be if it was a physical thing: you can add animations or even live video. It really is more immersive, flexible and more fun for the audience.
Who is using it?
We are seeing people adopting Ncam all around the world. Our users today are in Japan, China, Australia, Thailand, Hong Kong and Latin America, as well as the big creative markets in the USA and Europe. We are able to support our users through various sales channels, and recently signed an agreement with Vitec Videocom which will support this further, as they currently sell into more than 100 countries.
What does the future hold for Ncam?
For augmented reality, depth will play a very big part. We need to understand three-dimensional space at a per-pixel level so we can really start interacting with the graphics.
I think too that the graphics companies will match that with photo-realistic rendering. Ultimately we will not be able to distinguish between the virtual and real elements of an image. Once we can do this, there will be no obstacles and it will be down to the creative imaginations of designers, directors and cinematographers.