
Configure: nDisplay Workflow

Executive Overview

This document describes how the Ncam AR Suite Lite can be used to drive the camera viewpoint and frustum of Epic's nDisplay framework.

Prerequisites

As of 08/06/2022 Epic has not updated its In-Camera VFX project or documentation from UE4.27 to UE5; however, the principles remain the same.

This user guide has been written for Unreal Engine version 5.x and requires version 2.0.0 or newer of the Ncam AR Suite Lite plug-in. It builds upon Epic's InCameraVFX project, which is included with the Unreal Engine, to show how Ncam can be integrated into a scene that has already been configured for use with nDisplay. For this reason, and before progressing further, please be sure to first read the nDisplay Quick Start Guide (https://docs.unrealengine.com/5.0/en-US/ndisplay-quick-start-for-unreal-engine/) to become familiar with the InCameraVFX template project. Epic also provides a comprehensive guide to setting up an In-Camera VFX project (https://docs.unrealengine.com/5.0/en-US/in-camera-vfx-quick-start-for-unreal-engine/), which is worth studying for a more complete picture of how to bring all of the different elements together.

- The nDisplay Quick Start Guide can be found here: https://docs.unrealengine.com/5.0/en-US/ndisplay-quick-start-for-unreal-engine/
- The InCameraVFX project is included as part of UE5.
- The In-Camera VFX Quick Start Guide can be found here: https://docs.unrealengine.com/5.0/en-US/in-camera-vfx-quick-start-for-unreal-engine/

Hardware Setup

The hardware setup when using Ncam and nDisplay together is not too dissimilar to the standard configuration described in Epic's nDisplay Quick Start Guide. The only addition is that each node in the nDisplay cluster must be placed on the same network as the Ncam Reality server that is providing the camera tracking information (a quick way to verify this connectivity is sketched at the end of this chapter).

It is also recommended that the Ncam DataStream SDK is configured within Ncam Reality to sync with the video signal. This ensures that each new packet of tracking data is broadcast on the video's frame boundary, which makes the latency between packets less variable and easier for Unreal to synchronize with.

For further reading on Epic's recommended hardware for nDisplay, please refer to the In-Camera VFX Recommended Hardware pages (https://docs.unrealengine.com/5.0/en-US/recommended-hardware-for-in-camera-vfx-in-unreal-engine/).

Multiple Node Hardware Configuration

The image below illustrates how a cluster of nDisplay nodes should be connected together. Note that each node in the cluster has a network connection to both the master PC and the Ncam server.

Single Node Hardware Configuration

If only a single LED panel is being driven by nDisplay, it is possible to simplify the hardware configuration and remove the need for an external genlock signal generator. This is made possible by genlocking (synchronizing) the single cluster (node) PC to the SDI video signal through the use of a custom time step in UE5. Please note that a small amount of software synchronization is still required (see Software Synchronization below) and, although unlikely, this approach may still suffer from synchronization issues, especially at higher frame rates. This could cause "tearing" artefacts to appear across the LED wall when viewing it through the camera. Please see the Software Synchronization chapter later in this guide for more details.
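Before launching the cluster, it can save time to confirm that every render node can actually reach the Ncam Reality server. The plain-Python sketch below attempts a TCP connection from a node; the server address and port are placeholders (this guide does not specify which port the DataStream SDK uses), so substitute the values configured in Ncam Reality.

```python
# Run on each nDisplay node: confirm the node can reach the Ncam Reality
# server over the network. Both values below are placeholders; use the
# address and port configured for the DataStream SDK in Ncam Reality.
import socket

NCAM_SERVER_IP = "192.168.1.50"  # placeholder address
NCAM_SDK_PORT = 38860            # placeholder port

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
    sock.settimeout(2.0)
    try:
        sock.connect((NCAM_SERVER_IP, NCAM_SDK_PORT))
        print("Ncam Reality server is reachable")
    except OSError as exc:
        print("Cannot reach Ncam Reality server: {}".format(exc))
```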
Step by Step Guide

Before continuing, it is assumed that you have read the nDisplay Quick Start Guide and familiarized yourself with steps 1-4. You should know how to create a basic nDisplay project using the NDC_Basic configuration and launch it using Switchboard. The following steps build upon that knowledge to add Ncam to your nDisplay project.

Create a New Project

As outlined in step 1 of the nDisplay Quick Start Guide, create a new project from the nDisplay template:

1. Select the Film / Video & Live Events category and then click Next.
2. Choose the InCameraVFX example project.
3. Name the project nDisplayNcamDemo and then click Create Project.

Set Up

Once the project has been created, begin by first loading the Ncam AR Suite Lite plug-in and then restarting the Unreal Editor.

Navigating the Project

The template project is pre-configured with an nDisplay configuration called nDisplay_InCameraVFX_Config that demonstrates the use of a curved stage with four LED panels. It has three main components that can be customized to change the images that are displayed on the LED wall as well as its geometry:

- nDisplayXform is the root component that defines the local coordinate space of the LED wall geometry.
- DefaultViewPoint is a component that can be positioned to specify the viewpoint from which the scene is projected onto the LED wall geometry. Most commonly it is placed at the location where the human actors are standing relative to the LED wall and its corresponding geometry. This ensures that they are lit correctly by the content of the virtual scene being displayed on the LED panels.
- ICVFXCamera specifies how the camera's viewpoint should be projected onto the LED wall. It references a CineCameraActor to display an inner frustum, a floating window positioned and resized on the screens to fill the area observed by the camera. The inner frustum ensures that, regardless of where the camera is looking on the LED wall, its field of view is always filled with the content of the virtual scene.

Together these components allow the LED stage to both consistently light the subject and correctly display the viewpoint required by the camera. More information can be found within Epic's In-Camera VFX documentation.

Connecting LiveLink to a Camera

The ICVFXCamera component of the nDisplay config references a CineCamera that is parented underneath it. This camera, and subsequently the inner frustum, is driven by a LiveLinkComponentController that receives its data from a LiveLink source. This section illustrates how to create a new Ncam LiveLink source and connect it to the camera.

Create a LiveLink Source

A LiveLink source is required to connect to the Ncam Reality server. It can receive data from either the Ncam SDK or the Ncam SDK Lite. This data is then interpreted and made available to entities (mostly LiveLinkComponentControllers) within Unreal as different LiveLink subjects.

1. Start by opening the LiveLink window.
2. Create a new NcamLiveLink source from the "+ Source" drop-down menu. Ensure that the IP address is set to reference the Ncam Reality server that is streaming the Ncam SDK data. The Packets option must include Camera Tracking, Optical Parameters and Clip Info, as shown below.
3. Once configured, click Add.

The configured LiveLink source should look similar to the image below. Note that only the NcamLens subject is needed by nDisplay. The dot to the right of the subject should be green when successfully connected.

Create a LiveLink Preset

Now that LiveLink is configured, a preset needs to be saved so that each node in the nDisplay cluster can load it on start-up.

1. From the LiveLink window, click on the Presets drop-down menu and choose Save As Preset.
2. Select an appropriate name for the LiveLink preset and click Save.
3. Set the new LiveLink preset as the default for the project so that it will be loaded when a UE5 instance is launched on the nDisplay cluster. To do this, go to the LiveLink plug-in settings tab, which can be found in the Project Settings, and set the Default Live Link Preset to reference the newly created LiveLink preset asset.
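The Default Live Link Preset project setting is the supported way to load the preset on every node. If you also need to apply a preset on demand, for example from an editor utility, the LiveLink preset asset exposes ApplyToClient to scripting. A minimal editor Python sketch follows; the asset path is hypothetical, so substitute the path of the preset you saved above.

```python
# Editor Python sketch: load a saved LiveLink preset asset and apply it to
# the running LiveLink client. The asset path is hypothetical.
import unreal

PRESET_PATH = "/Game/Ncam/LLP_NcamNDisplay"  # hypothetical asset path

preset = unreal.load_asset(PRESET_PATH)
if preset and preset.apply_to_client():
    unreal.log("Applied LiveLink preset: {}".format(PRESET_PATH))
else:
    unreal.log_warning("Could not load or apply LiveLink preset: {}".format(PRESET_PATH))
```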
Connect a Camera to LiveLink

A LiveLinkComponentController needs to be added to an object in order to control it using the data provided by a LiveLink subject. In this case we want to drive the position and frustum of the camera that is connected to the inner frustum of the nDisplay config. Conveniently, within the InCameraVFX project a camera called CineCameraActor1 has already been created for this purpose, and a LiveLinkComponentController has already been added to it, as seen in the image below.

If the camera being used does not have a LiveLinkComponentController attached, add one by:

1. Selecting the camera in the World Outliner.
2. Clicking the + Add button.
3. Selecting Live Link Controller from the drop-down menu.

Finally, the newly created (or pre-existing) LiveLinkComponentController needs to be configured by connecting it to the NcamLens LiveLink subject:

1. First select the LiveLinkComponentController component attached to the camera.
2. Within the Details panel, set the Subject Representation to NcamLens.

Once this is done, the camera should be driven by the LiveLink data and nDisplay is correctly set up.
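The same connection can be made programmatically, which is convenient when rebuilding a stage map. The sketch below uses the editor's Python API; the actor label and subject name follow this guide's example, and the property names are assumptions based on the UE 5.0 Live Link controller API, so verify them against your engine version.

```python
# Editor Python sketch: point CineCameraActor1's Live Link controller at
# the NcamLens subject. Property names follow the UE 5.0 LiveLink API and
# may differ between engine versions.
import unreal

for actor in unreal.EditorLevelLibrary.get_all_level_actors():
    if actor.get_actor_label() != "CineCameraActor1":
        continue
    for controller in actor.get_components_by_class(unreal.LiveLinkComponentController):
        rep = controller.get_editor_property("subject_representation")
        rep.set_editor_property("subject", unreal.LiveLinkSubjectName("NcamLens"))
        rep.set_editor_property("role", unreal.LiveLinkCameraRole)
        controller.set_editor_property("subject_representation", rep)
        unreal.log("NcamLens subject assigned to CineCameraActor1")
```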
Software Synchronization

Synchronization is a very important consideration and is critical to ensure that the images rendered by each node are displayed on the LED walls within the frame boundaries of the camera. Without synchronization a number of visual artifacts can be encountered, ranging from apparent stuttering in the camera motion, to an offset between two LED walls, to tearing of the image when viewed through the camera. To prevent these issues, each node in the cluster must be instructed to render and display its viewpoint on the LED wall before the camera captures a new frame. The preferred way to do this is to use a hardware genlock, as described in the nDisplay user guide. However, other alternatives are possible for simple installations that only have a single (or a few) LED walls. This method uses a custom time step to instruct Unreal to render only at specific intervals, and the instructions in this section illustrate how to do this.

Camera SDI Genlock

If an external genlock is unavailable, the preferred method of synchronizing the nDisplay nodes together is to use the SDI video from the camera instead. This requires each nDisplay node to have its own supported (AJA/BMD) SDI video card connected to the output of the camera, using either a Blackmagic SDI input or an AJA SDI input.

Custom Time Step

The Unreal Engine will only render a new viewpoint for the LED wall once per frame. As Ncam Reality should also have its SDK datastream synced with the video, it is guaranteed that only a single packet of data will be sent per frame. To ensure that the Ncam SDK packets are always processed once per frame, the Timed Data Monitor can be used to offset their timing.

Create a Media Profile

The easiest way to manage the use of a custom time step (genlock) is with a media profile. If the Ncam workflow wizard was used to set up LiveLink then one has already been created and the next step can be skipped.

In the Content Browser click on the + Add button, navigate to the Media sub-menu and select Media Profile. Apply the media profile to the level as shown below.

Configure the Media Profile and Genlock (Custom Time Step)

For the media profile to be loaded when each nDisplay node starts up, it must be added to the Project Settings. Once the media profile has been created and added to the project, it needs to be configured. Start by opening it, either by double-clicking on the asset in the Content Browser or by using the icon on the toolbar, as shown below.

Under the Genlock tab, enable the option Override Project Settings and select either the AJA or Blackmagic SDI input, depending on which devices are installed in the nDisplay nodes. Configure the genlock by selecting the correct options for the SDI input. In this example a Blackmagic SDI genlock was used and the video format was set to 1080p at 25 fps.

To verify that the Unreal Editor has been successfully genlocked, enable the FPS display on the viewport. It should report that the genlock is synchronized, and the frame rate should match that of the video.

Improving Synchronization Using the Timed Data Monitor

Both the Unreal Engine and the data from Ncam Reality should now be synced with the video's frame rate. This is desirable, but due to latency in the network or software it cannot be guaranteed that Unreal will always start rendering after the Ncam data has been received. If left unresolved, this can result in sync issues that cause stuttering in the camera motion or tearing in the image across multiple LED panels. To make this less likely, the Ncam SDK data can be delayed slightly to ensure that it is always processed consistently once per frame. This is achieved using the Unreal Engine's Timed Data Monitor, which is described more fully in the Step 4 - Source Synchronization chapter. The image below shows how the data stream from the NcamLens subject has been offset by just over a frame to ensure the lowest latency and likelihood of a sync issue. Note that the Time Correction value is in seconds.
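To pick a starting Time Correction value, a rule of thumb consistent with the screenshot above is one frame period plus a small safety margin. The plain-Python helper below illustrates that arithmetic; the 25% margin is an assumption and should be tuned per stage using the Timed Data Monitor readouts.

```python
# Suggest a starting Time Correction (in seconds) for the Timed Data
# Monitor: one video frame period plus a small safety margin, so the Ncam
# packet for a frame is always buffered before Unreal evaluates it.
def time_correction_seconds(frame_rate_hz, margin_frames=0.25):
    """margin_frames is an assumed safety factor; tune it per stage."""
    frame_period = 1.0 / frame_rate_hz
    return frame_period * (1.0 + margin_frames)

print(time_correction_seconds(25.0))  # 0.05 -> 50 ms, just over one 40 ms frame
```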