Configure: VR Workflow

Step 1 - Workflow Wizard


Use the Workflow Wizard to configure a project for use with Composure or the Ncam AR Suite rendering pipelines.

5.1 Changes

Updates to the media framework in Unreal 5.1 have resulted in small changes to the Workflow Wizard described here. Please see the 5.1 Changes section for details on the modified steps.



The Ncam AR Suite Lite plug-in provides a wizard that streamlines the creation and configuration of the assets needed to support an augmented reality workflow. It sets up the Ncam LiveLink source and the input SDI video source (including an optional Composure keying pass), configures the output SDI device, and creates a media profile and LiveLink preset so that the configuration can easily be reloaded.

The Workflow Wizard should typically be used when rendering augmented reality graphics using either the Ncam AR Suite Standard plug-in or Composure.

Requirements

Before using the Workflow Wizard the following requirements must be met:

  • A supported video acquisition card must be installed. Ncam utilises Epic's media framework, so any card supported by that framework will work with the AR Suite plugins. Please ensure that the recommended drivers are also installed.
  • Ensure that the Ncam server is configured correctly and sending SDK data to the Unreal Engine instance.
  • Ensure that the video signal is being fed correctly into the UE5 server, and that, if the video acquisition card has multiple inputs and outputs available, you have identified the specific spigots being used.
  • Where possible timecode should be embedded in the video signal.
  • Progressive signals are strongly recommended. Unreal cannot guarantee the frame order when outputting an interlaced signal.
  • Make sure the AR Suite plugins are activated in the "Plugins" window.
  • The project setting to allow the alpha channel through the tonemapper must also be enabled (see the check script after this list).
Enable the "Allow through tonemapper" option in Project Settings


Configure a new workflow

The Ncam Workflow Wizard helps to streamline the creation of the Unreal assets required for the Unreal Engine to render augmented reality content. It creates and configures the Ncam tracking data and video IO assets so that the user only needs to set up the method of compositing afterwards. This is typically done either with Composure or with the pre-configured blueprints supplied as part of the Ncam AR Suite (standard version). The wizard is where the user defines the source of the Ncam tracking data as well as how they intend to feed video into and out of Unreal.

The Ncam Workflow Wizard can be found under the Window drop down menu.

The Ncam Workflow Wizard can be opened from the Window menu


This will open the Workflow Wizard window, which allows the user to define the basics of their setup.

Workflow Wizard Window


The options provided are:

  • Ncam Source Configuration - Contains all settings that relate to the connection with Ncam Reality.
    • Use SDKLite - Choose whether the data is transferred using the SDKLite or the full SDK. The lite version is faster but has limited features.
    • Ip Address - The IP address of the Ncam server to connect to in order to receive tracking data (a quick reachability check is sketched after this list).
    • Packets - This drop-down contains a user-selectable checklist of the available packets that can be received from the Ncam server. It is recommended NOT to enable all of them by default; see the Reality Packets note below.
    • UDPPortNumber - (SDKLite only) The UDP port to be used.
  • Use Timecode - Enable this option if you intend to synchronize the tracking data and input video using timecode. This method of synchronization is preferred but requires that the video has timecode embedded within it.
  • Video Configuration - Contains all settings that relate to the input video format.
    • Frame Rate - Should be set to the frame rate of the video signal.
    • Resolution - Should be set to the resolution of the video signal.
    • Standard - Should be set to the image standard of the signal. Progressive video is strongly recommended.
    • Timecode Format - Specify the type of timecode embedded in the video signal. This is only required if Use Timecode is enabled.
  • Input Pipeline Config - Allows the user to specify whether additional assets are required to key the input video. Typically the Video with Composure Keyer should be selected when the video contains greenscreen.
  • Input Configuration - Specify the type of video media device to use depending on the hardware available. Valid options are AJA Media Source or Blackmagic Media Source.
  • Output Configuration - Allows one or two media devices to be added for outputting video. The options will be either Blackmagic or AJA depending on the hardware of the Unreal server.
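
Before running the wizard it can be useful to confirm that the Ncam server is reachable from the Unreal machine. The sketch below is a plain Python example and not part of the AR Suite; the IP address and port are placeholders, and it assumes the full SDK connection is TCP-based (the SDKLite, as noted above, uses UDP and cannot be checked this way).

```python
import socket

# Placeholder values - substitute the IP address entered in the wizard and
# the port your Ncam server is configured to stream SDK data on.
NCAM_SERVER_IP = "192.168.1.100"
NCAM_SDK_PORT = 38860

try:
    # A simple TCP connect is enough to prove the server is reachable.
    with socket.create_connection((NCAM_SERVER_IP, NCAM_SDK_PORT), timeout=2.0):
        print(f"Ncam server reachable at {NCAM_SERVER_IP}:{NCAM_SDK_PORT}")
except OSError as exc:
    print(f"Could not reach the Ncam server: {exc}")
```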

Once all of the options have been configured, clicking on the Create Workflow button will create all of the necessary assets. They can then be found inside an Ncam folder in the root of the content directory.
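
Because a media profile and LiveLink preset are saved with these assets, the configuration can be re-applied later without re-running the wizard. The snippet below is a minimal sketch using the Python Editor Scripting plugin; the asset path is a hypothetical example, so substitute the name of the LiveLink preset the wizard actually created in your Ncam folder.

```python
import unreal

# Hypothetical path - replace with the preset the wizard created in /Game/Ncam.
PRESET_PATH = "/Game/Ncam/NcamLiveLinkPreset"

preset = unreal.EditorAssetLibrary.load_asset(PRESET_PATH)
if isinstance(preset, unreal.LiveLinkPreset) and preset.apply_to_client():
    unreal.log("Ncam LiveLink preset applied to the LiveLink client.")
else:
    unreal.log_warning("Failed to load or apply the Ncam LiveLink preset.")
```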

Reality Packets

The different packets available are explained below:

  • Camera Tracking - Required - The translation information of the camera
  • Optical Parameters - Required - The lens information of the camera
  • Distortion Map - Required - The lens distortion information of the camera
  • Undistortion Map - Not currently implemented
  • Clip Info - Camera clip information (slate and take name for instance)
  • Markers - Ncam Reality can track ArUco fiducial markers. Their position and orientation can be streamed to Unreal if this is enabled. See the Ncam Reality documentation for more information on this feature.

The Video with Composure Keyer Input Configuration Option

The Ncam Workflow Wizard has two options for the "Input Configuration". The "Video" option simply captures the video using the specified Input Configuration device and leaves it unchanged. The "Video with Composure Keyer" option generates a BP_MediaPlateCompElement within the scene and connects it to the input media device. This allows a key to be applied to the input video to matte out areas of blue or green screen.

Selecting "Video with Composure Keyer" allows green screen to be removed for irtual studio applications
Selecting "Video with Composure Keyer" allows green screen to be removed for irtual studio applications


If you select the Video with Composure Keyer option, the objects highlighted below are created in the Outliner.



An explanation of how the "Video with Composure Keyer" configuration works
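
To verify from a script which Composure elements were spawned in the level, the sketch below lists them via the Python Editor Scripting plugin. It assumes the Composure plugin's CompositingElement class is exposed to Python, which should be the case when the plugin is enabled.

```python
import unreal

# List the Composure elements in the current level (the wizard's
# BP_MediaPlateCompElement is one of them). Requires the Composure plugin.
actor_subsystem = unreal.get_editor_subsystem(unreal.EditorActorSubsystem)
for actor in actor_subsystem.get_all_level_actors():
    if isinstance(actor, unreal.CompositingElement):
        unreal.log(f"Composure element: {actor.get_actor_label()}")
```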