Step 8 - Configuring the Composure Stack
Once the camera has been set up, the Composure composite can be created. Start by opening the Composure window from the Virtual Production tab of the Window menu.
Right-click within it to create a new empty Composure Comp shot.
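If you prefer to script your scene setup, the same empty comp can also be created with the editor's Python API. The following is a minimal sketch, assuming the Composure and Python Editor Script plugins are enabled; the actor label is just an example.

```python
import unreal

# Minimal sketch: spawn an empty Composure comp in the current level.
# Assumes the Composure plugin (which exposes unreal.CompositingElement) and the
# Python Editor Script plugin are enabled. The label "Comp" is an arbitrary example.
comp = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CompositingElement,
    unreal.Vector(0.0, 0.0, 0.0)
)
comp.set_actor_label("Comp")
```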
This step should only be completed if the "Video" Input Configuration option was chosen when running the Workflow Wizard and therefore no Keyer is required.
Once the comp has been created, its input needs to be connected to the Media Bundle texture that was configured during Step 3 - Add Media Bundles. Do this by selecting the Comp and expanding the "Input" section within its property panel. Add a new input to the "Inputs" array by clicking the "+" button. Expand the list of inputs and set the type of the new input pass to "Texture Input". Make it reference the "T_Video_BC" asset before finally setting the pass's name to "Film".
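For scripted setups, a rough Python equivalent of the steps above is sketched below. Treat it as an assumption-laden starting point: the asset path to "T_Video_BC", the input-pass class (the UI's "Texture Input" is approximated here with a media-texture input) and the "media_source" property name may all differ in your engine and plugin versions.

```python
import unreal

# Illustrative sketch only; the class, property names and asset path are assumptions.
comp = unreal.EditorLevelLibrary.get_selected_level_actors()[0]   # the Comp, selected in the editor
video_texture = unreal.EditorAssetLibrary.load_asset("/Game/Ncam/T_Video_BC")  # assumed path

# Create an input pass named "Film" and point it at the Media Bundle texture.
film_input = unreal.new_object(unreal.MediaTextureCompositingInput, outer=comp)  # assumed class
film_input.set_editor_property("pass_name", "Film")
film_input.set_editor_property("media_source", video_texture)  # assumed property name

inputs = list(comp.get_editor_property("inputs"))
inputs.append(film_input)
comp.set_editor_property("inputs", inputs)
```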
This step should only be completed if the "Video with Composure Keyer" Input Configuration option was chosen when running the Workflow Wizard because the video requires keying.
The "Video with Composure Keyer" Input Configuration option within the Ncam Workflow Wizard generates a BP_MediaPlateCompElement within the scene that is automatically connected to the input media device. It can be used to remove green/blue screen from the input video. As it has already been configured just move it into the Comp composure stack.
A CG layer element is needed to render the scene and apply the lens distortion to it. Create it by right-clicking on the Comp and selecting "Add Layer Element" followed by "CG Layer". By default the new CG Layer Element will locate and use the CineCameraActor that is present in the scene.
To ensure that lens distortion is applied to the CG, first select the CG layer and enable the "Apply Distortion" check-box that can be found under the "LensDistortion" section.
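If you automate this step, the checkbox maps to a boolean property on the CG element. The sketch below is only a guess at the scripted equivalent; the "apply_distortion" property name is inferred from the UI label and may not match the Ncam plugin's actual variable name.

```python
import unreal

# Hedged sketch: enable lens distortion on the CG layer element via its reflected
# property. "apply_distortion" is an assumed name derived from the UI label; if it
# fails, inspect the element's properties to find the real one.
cg_layer = next(a for a in unreal.EditorLevelLibrary.get_all_level_actors()
                if "CgCapture" in a.get_actor_label())  # example label filter
cg_layer.set_editor_property("apply_distortion", True)
```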
Usually an augmented reality composite is created in one of two ways: either the CG is overlaid onto the video to create the illusion of a virtual object existing within a real scene, or the inverse is done and a keyed (masked) video element is placed into a virtual scene. Both setups are explained in the following two sections; follow the one that is appropriate for your scenario.
In this example we will construct a "Virtual Overlay" composite where the CG is overlaid onto the video.
Start by creating a new post-process material that contains two TextureSampleParameter2D nodes called "CG" and "Film". Use an "Over" node to layer the CG over the film as shown in the image below.
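For reference, the "Over" node performs standard alpha-over compositing: with foreground F (here the CG) and background B (the film), the premultiplied-alpha form is out = F + B * (1 - F.alpha), while the straight-alpha form is out = F * F.alpha + B * (1 - F.alpha). The "Virtual Studio" setup described later uses the same operation with the film as the foreground and the CG as the background.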
Add the newly created post-process material to the Comp as a new Compositing Pass, as illustrated in the image below. Set the "CG" and "Film" inputs to their respective elements.
In this example we will construct a "Virtual Studio" composite where the keyed video is overlaid onto the CG background.
Start by creating a new post-process material that contains two TextureSampleParameter2D nodes called "CG" and "Film". Use an "Over" node to layer the film over the CG as shown in the image below.
Add the newly created post-process material to the Comp as a new Compositing Pass, as illustrated in the image below. Because the compositing material pass requires the keyed video, which is output from the NcamKeyedVideoMediaPlate pass, be sure to set the "Film" input element to "NcamKeyedVideoMediaPlate".
This step will only work if an output device was added when running the Ncam Workflow Wizard.
The final step is to output the result of the Comp over the SDI. This stage requires that the Ncam Workflow Wizard was run with at least one output device specified in the list of outputs.
To connect the output of the Comp to one of those output devices, first select the Comp within the Composure window. Scroll through its properties to locate the section called "Output". Click the "+" button to add a new output to the "Outputs" list. Change the type of the output to "Compositing Media Capture Output". This type allows the Comp to write its output to an existing media capture device via a media proxy. In this case we want to connect it to one of the two possible devices created when running the Ncam Workflow Wizard. For each device there exists a proxy asset, which should be chosen as the Comp's "Capture Output" attribute. For example, "OutputProxy0" will write the output of the Comp to the first device and "OutputProxy1" to the second device, if it exists. These steps are illustrated in the image below:
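For completeness, a scripted version of this output step is sketched below. The pass class follows the type named above; the asset path for "OutputProxy0" and the "capture_output" property name are assumptions to adjust for your project.

```python
import unreal

# Hedged sketch: add a media capture output pass to the Comp and point it at the
# first output proxy created by the Ncam Workflow Wizard. The asset path and the
# "capture_output" property name are assumptions; adjust them for your project.
comp = unreal.EditorLevelLibrary.get_selected_level_actors()[0]   # the Comp, selected in the editor
proxy = unreal.EditorAssetLibrary.load_asset("/Game/Ncam/OutputProxy0")  # assumed path

sdi_output = unreal.new_object(unreal.CompositingMediaCaptureOutput, outer=comp)
sdi_output.set_editor_property("capture_output", proxy)  # assumed property name

outputs = list(comp.get_editor_property("outputs"))
outputs.append(sdi_output)
comp.set_editor_property("outputs", outputs)
```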