AR Suite
LED Volume Workflow

Configure: nDisplay Workflow


Executive Overview

This document describes how the Ncam AR Suite Lite can be used to drive the camera viewpoint and frustum of Epic’s nDisplay framework.

Prerequisites

As of 08/06/2022, Epic has not updated its in-camera VFX project or documentation from UE4.27 to UE5; however, the principles remain the same.

This user guide has been written for Unreal Engine 5.X and requires version 2.0.0 or newer of the Ncam AR Suite Lite plug-in. It builds upon Epic’s InCameraVFX project, which is included with the Unreal Engine, to show how Ncam can be integrated into a scene that has already been configured for use with nDisplay. For this reason, before progressing further please read the nDisplay quick-start guide to become familiar with the InCameraVFX template project. Epic also provides a comprehensive guide to setting up an in-camera VFX project, which is worth studying for a more complete picture of how to bring all of the different elements together.

The nDisplay quick start guide can be found here.

The InCameraVFX project is included as part of UE5.

The In-Camera VFX Quick Start guide can be found here.

Hardware Setup

The hardware setup when using Ncam and nDisplay together is not too dissimilar to the standard configuration described in Epic’s nDisplay Quick Start guide. The only addition is that each node in the nDisplay cluster must be placed on the same network as the Ncam Reality Server that is providing the camera tracking information.

It is also recommended that the Ncam Datastream SDK is configured with Ncam Reality to sync with the video signal. This ensures that each new packet of tracking data is broadcast on the video's frame boundary which makes the latency between each packet less variable and easier for Unreal to synchronize with.
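
The effect of the "Sync To Video Signal" option can be illustrated with a short sketch (plain Python, not Ncam code; the frame rate and timings are hypothetical). Each tracking packet is held until the next frame boundary of the video, so the spacing between packets becomes a whole number of frame periods rather than varying with processing time:

```python
import math

FPS = 25                    # assumed video rate for illustration
FRAME_PERIOD = 1.0 / FPS    # one frame lasts 0.04 s at 25 fps

def next_frame_boundary(t: float, period: float = FRAME_PERIOD) -> float:
    """Return the first frame boundary at or after time t (in seconds)."""
    return round(math.ceil(t / period) * period, 6)

# A packet produced mid-frame at t = 0.051 s is broadcast on the
# following frame boundary:
print(next_frame_boundary(0.051))  # prints 0.08
```

Because every packet lands on a boundary, the latency between packets is far more consistent than with packets broadcast at arbitrary times, which is what makes it easier for Unreal to synchronize with.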

The Ncam Data Stream Preferences tab with the "Sync To Video Signal" options enabled.


For further reading on Epic’s recommended hardware for nDisplay please refer to their In-Camera VFX Recommended Hardware help pages.

Multiple Node Hardware Configuration

The image below illustrates how a cluster of nDisplay nodes should be connected together. Note that each node in the cluster has a network connection to both the master PC and the Ncam server.

Hardware configuration of an nDisplay stage with Ncam camera tracking


Single Node Hardware Configuration

If only a single LED panel is being driven by nDisplay it is possible to simplify the hardware configuration and remove the need for an external genlock signal generator. This is made possible by genlocking (synchronizing) the single cluster (node) PC to the SDI video signal through the use of a custom time step in UE5. Please note that a small amount of software synchronization is still required (see Software Synchronization below) and, although unlikely, this approach may still suffer from synchronization issues, especially at higher framerates. This could cause “tearing” artifacts to appear across the LED wall when viewing it through the camera.

Please see the Software Synchronization chapter later in this guide for more details on how to use software synchronization.

Hardware configuration of a single nDisplay node without an external genlock signal generator


Step-by-step Guide

Before continuing it is assumed that you have read the nDisplay quick-start guide and have familiarized yourself with steps 1-4. You should know how to create a basic nDisplay project using the NDC_Basic configuration and launch it using Switchboard. The following steps in this guide will build upon it to add Ncam to your nDisplay project.

Create a New Project

As outlined in Step 1 of the nDisplay quick-start guide, create a new project from the nDisplay Template Project.

  • Select the Film / Video & Live Events category and then click Next.
  • Choose the InCameraVFX example project.
  • Name the project nDisplayNcamDemo and then click Create.
Document image


Project Set-up

Once the project has been created, begin by loading the Ncam AR Suite Lite plug-in and then restarting the Unreal editor.

Loading the Ncam AR Suite Lite plug-in requires Unreal to be restarted


Navigating the Project

The template project is pre-configured with an nDisplay configuration called nDisplay_InCameraVFX_Config that demonstrates the use of a curved stage with four LED panels. It has three main components that can be customized to change the images that are displayed on the LED wall as well as its geometry:

The components of the nDisplay config

  • nDisplayXform is the root component that defines the local coordinate space of the LED wall geometry.
  • DefaultViewPoint is a component that can be positioned to specify the viewpoint from which the scene is projected onto the LED wall geometry. Most commonly it is placed at the location where the human actors are standing relative to the LED wall and its corresponding geometry. This ensures that they are lit correctly by the content of the virtual scene being displayed on the LED panels.
  • ICVFXCamera specifies how the camera's viewpoint should be projected onto the LED wall. It references a CineCameraActor to display an Inner Frustum, a floating window that is positioned and resized on the screens to fill the area observed by the camera. This Inner Frustum ensures that regardless of where the camera is looking on the LED wall, its field of view is always filled with the content of the virtual scene.

Together these components allow the LED stage to both consistently light the subject and to also correctly display the viewpoint required by the camera. More information on this can be found within Epic’s in-camera VFX documentation.
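
As a rough illustration of why the Inner Frustum must be continuously repositioned and resized, the footprint of the camera's field of view on a flat wall can be estimated with a simple pinhole model (illustrative Python; the distance and FOV values are hypothetical, and a real curved stage involves a full projection rather than this flat-wall approximation):

```python
import math

def inner_frustum_width(distance_m: float, horizontal_fov_deg: float) -> float:
    """Approximate width on a flat LED wall covered by the camera's
    horizontal field of view, using a simple pinhole model."""
    return 2.0 * distance_m * math.tan(math.radians(horizontal_fov_deg) / 2.0)

# A camera 4 m from the wall with a 40 degree horizontal FOV covers
# roughly 2.9 m of wall; zooming in to 20 degrees shrinks that to
# roughly 1.4 m, so the inner frustum must shrink with it.
print(round(inner_frustum_width(4.0, 40.0), 2))
print(round(inner_frustum_width(4.0, 20.0), 2))
```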

Connecting LiveLink to a Camera

The ICVFXCamera component of the nDisplay config references a CineCamera which is parented underneath it. This camera, and consequently the inner frustum, is driven by a LiveLinkComponentController that receives its data from a LiveLink source. This section illustrates how to create a new Ncam LiveLink source and connect it to the camera.

Create a LiveLink Source

A LiveLink Source is required to connect to the Ncam Reality Server. It can receive data from either the Ncam SDK or the Ncam SDK Lite. This data is then interpreted and made available to entities (mostly LiveLinkComponentControllers) within Unreal as different LiveLink Subjects.

  • Start by opening the LiveLink window.
Open the LiveLink window

  • Create a new NcamLiveLink Source from the "+ Source" drop-down menu.
  • Ensure that the Ip Address is set to reference the Ncam Reality server that is streaming the Ncam SDK data.
  • The Packets option must include Camera Tracking, Optical Parameters and Clip Info as shown below.
  • Once configured, click Add.
Add and correctly configure a new NcamLiveLinkSource

  • The configured LiveLink source should look similar to the image below. Note that only the NcamLens subject is needed by nDisplay. The dot to the right of the subject should be green when successfully connected.
The new NcamSDK Source


Create a LiveLink Preset

Now that Live Link is configured, a preset needs to be saved so that each node in the nDisplay cluster can load it on start-up.

  • From the LiveLink window, click on the Presets drop-down menu and choose Save As Preset.
Create a new LiveLink Preset for the NcamSDK Source

  • Select an appropriate name for the LiveLink Preset and click Save.
Choose an appropriate name for the new LiveLink Preset

  • Set the new Live Link preset as the default for the project so that it will be loaded when a UE5 instance is launched on the nDisplay cluster. To do this, go to the Live Link plug-in settings tab which can be found in the Project Settings. Set the Default Live Link Preset to reference the newly created Live Link Preset asset.
Setting a default LiveLink Preset ensures that it is loaded on start-up
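
For reference, the default-preset setting also ends up in the project's configuration files, so it can be checked there. A sketch of what the entry may look like (the section and property names are assumed from the Live Link plug-in, the asset path NcamLiveLinkPreset is a placeholder for your own preset, and the file is typically Config/DefaultGame.ini; verify against your own project):

```ini
; Sketch only - the asset path below is a placeholder for your preset.
[/Script/LiveLink.LiveLinkSettings]
DefaultLiveLinkPreset=/Game/NcamLiveLinkPreset.NcamLiveLinkPreset
```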


Connect a camera to LiveLink

A LiveLinkComponentController needs to be added to any object in order to control it using the data provided by a LiveLink Subject. In this case we want to drive the position and frustum of the camera that is connected to the Inner Frustum of the nDisplay config.

Conveniently, within the InCameraVFX project a camera called CineCameraActor1 has already been created for this purpose, and a LiveLinkComponentController has already been added to it, as seen in the image below.

Document image


If the camera being used does not have a LiveLinkComponentController attached, add one by:

  • Selecting the camera in the World Outliner
  • Clicking the + Add button
  • Selecting Live Link Controller from the drop-down menu.
Add a new Live Link Controller if one doesn't already exist on the camera


Finally, the newly created (or pre-existing) LiveLinkComponentController needs to be configured by connecting it to the "NcamLens" LiveLink Subject.

Document image

  • First select the LiveLinkComponentController component attached to the camera in the Details panel.
  • Set the Subject Representation to NcamLens.

Once this is done, the camera should be driven by the LiveLink data and nDisplay will be correctly set up.

Software Synchronization

Synchronization is a very important consideration and critical to ensure that the images rendered by each node are displayed on the LED walls within the frame boundaries of the camera. Without any synchronization a number of visual artifacts can be encountered, ranging from apparent stuttering in the camera motion, to an offset between two LED walls, to tearing of the image when viewed through the camera. To prevent these issues each node in the cluster must be instructed to render and display its viewpoint on the LED wall before the camera captures a new frame. The preferred way to do this is to use a hardware genlock as described in the nDisplay user guide. However, an alternative is possible for simple installations that only have a single (or a few) LED walls: a Custom Time Step can be used to instruct Unreal to render only at specific intervals. The instructions in this section illustrate how to do this.
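
The Custom Time Step approach can be sketched as a small simulation (illustrative Python, not engine code; the frame rate and render costs are hypothetical). Each tick, the engine finishes rendering and then waits for the next video frame boundary before presenting, so the wall updates exactly once per video frame regardless of how quickly the node can render:

```python
def simulate_render_loop(duration_s: float, fps: int, render_cost_s: float) -> int:
    """Count frames presented when every tick waits for the next
    video frame boundary (a conceptual custom time step)."""
    period = 1.0 / fps
    t = 0.0
    frames = 0
    while t < duration_s:
        t += render_cost_s                   # time spent rendering
        t = (int(t / period) + 1) * period   # wait for the next boundary
        frames += 1
    return frames

# Whether a frame takes 5 ms or 15 ms to render, the LED wall is still
# updated exactly 25 times per second at 25 fps:
print(simulate_render_loop(1.0, 25, 0.005))
print(simulate_render_loop(1.0, 25, 0.015))
```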

Camera SDI Genlock

If an external genlock is unavailable, the preferred method of synchronizing the nDisplay nodes together is to use the SDI video from the camera instead. This requires each nDisplay node to have its own supported (AJA/BMD) SDI video card connected to the output of the camera. By using either a Blackmagic SDI Input or an AJA SDI Input custom time step, the Unreal Engine will only render a new viewpoint for the LED wall once per frame. As Ncam Reality should also have its SDK datastream synced with the video, it is guaranteed that only a single packet of data will be sent per frame. To ensure that the Ncam SDK packets are always processed once per frame, the Timed Data Monitor can be used to offset their timing.

Create a Media Profile

The easiest way to manage the use of a Custom Time Step (genlock) is with a Media Profile. If the Ncam Workflow Wizard was used to set-up LiveLink then one has already been created and the next step can be skipped.

  • In the content browser click on the + Add button, navigate to the Media sub-menu and select Media Profile.
Create a new Media Profile

  • Apply the media profile to the level as shown below.

Configure the Media Profile and Genlock (Custom Time Step)

Apply the Media Profile

  • For the Media Profile to be loaded when each nDisplay node starts up it must be added to the project settings.
Adding the Media Profile to the project settings ensures that it will be loaded on start-up

  • Once the Media Profile has been created and added to the project it needs to be configured. Open it either by double-clicking on the asset in the Content Browser or by using the icon on the toolbar as shown below.
Document image

  • Enable the option Override Project Settings under the Genlock tab and select either the AJA or Blackmagic SDI Input depending on which devices are installed in the nDisplay nodes.
Select the device to genlock Unreal to

  • Configure the Genlock by selecting the correct options for the SDI input. In this example a Blackmagic SDI genlock was used and the video format was set to 1080p at 25 fps.
Configure the SDI Genlock

  • To verify that the Unreal Editor has been successfully genlocked you can enable the FPS display on the viewport. It should report that the genlock is synchronized and the frame rate should match that of the video.
Document image


Improving Synchronization using the Timed Data Monitor

Both the Unreal Engine and the data from Ncam Reality should now be synced with the video's framerate. This is desirable, but due to latency in the network or software it cannot be guaranteed that the Ncam data will always have arrived before Unreal starts rendering. If left unresolved this can result in sync issues that cause stuttering in the camera motion or tearing in the image across multiple LED panels. To make this less likely, the Ncam SDK data can be delayed slightly to ensure that it is always processed consistently once per frame.

This is achieved using the Unreal Engine's Timed Data Monitor, which is described more fully in the chapter on Step 4 - Source Synchronization.

The image below shows how the data stream from the NcamLens subject has been offset by just over a frame to minimize both the latency and the likelihood of a sync issue. Note that the time correction value is in seconds.
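
Since the Time Correction field is entered in seconds while the offset is easier to reason about in frames, the conversion is a simple division (illustrative Python; the 1.1-frame offset is a hypothetical example, not a recommended value):

```python
def frames_to_time_correction(frames: float, fps: float) -> float:
    """Convert an offset expressed in frames to a Time Correction in seconds."""
    return frames / fps

# "Just over a frame" at 25 fps, e.g. 1.1 frames:
print(round(frames_to_time_correction(1.1, 25), 3))  # prints 0.044
```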

Document image