RenderStream - Unity Integration (2024)

RenderStream is the proprietary Disguise protocol for controlling third party render engines from Designer. This topic covers the steps to configure Unity with RenderStream.

For cluster rendering, we recommend the use of render nodes from the same media server product range, e.g. all rx series machines. Mixing of machines from different product ranges is untested and unsupported.

Unity is a third-party game engine that can be used to create 3D scenes. Unity projects can be adapted such that they create RenderStream output streams using Disguise’s Unity plugin. This plugin supports both camera and non-camera based mappings.

A Unity Pro license is required. Please visit the Unity website for more information on purchasing licenses and training on the software.

Plugins

  • In order to communicate with Designer, Unity requires the installation of a plugin on the render node.

  • The pre-packaged RenderStream plugins for Unity are available in the Disguise RenderStream-Unity GitHub repository. For the most up-to-date Unity plugin, you can compile the plugin from the source code available under Releases.

  • Place the plugin into this folder: PROJECT_ROOT/Assets/DisguiseUnityRenderStream

  • When adding a plugin to a Unity project, it is important that it is placed in the correct location and that the folder containing the plugin files is named correctly; otherwise unexpected errors may occur.

  • The available plugin uses DX11 (Direct3D 11). Unity has developed the integration further for DX12 (Direct3D 12), which they are making available.

Unity project setup

  1. Launch Unity, navigate to Projects and select New.
  2. Create a new 3D project.
  3. Name the project, set the location to: C:\Users\USERNAME\Documents\Renderstream Projects or D:\Renderstream Projects if using a system with a media drive (e.g. RX) and select Create.
  4. Open the project folder and place the plugin inside the ‘Assets’ folder.
  5. Select File followed by Build Settings:
    1. Set Architecture to Intel 64-bit.
  6. Navigate to Player Settings… and Configuration:
  7. Set Api Compatibility Level to .NET Framework.

Options

Set GameObject Channel visibility

  1. Open Project Settings > Tags and Layers, then expand Layers.
  2. Name an empty User Layer.
  3. Select any object from the scene.
  4. In the Inspector panel, assign your new layer in the Layer parameter.
  5. Select your Camera(s).
  6. Select whether or not you want the Camera(s) to see the objects in your newly defined Layer by opening the Culling Mask dropdown in the Inspector panel (a scripted equivalent is sketched below).
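
If you prefer to do this from a script, the following is a minimal sketch of the equivalent, assuming a hypothetical User Layer named "HiddenFromMain" created as described above:

  using UnityEngine;

  // Minimal sketch of the scripted equivalent of the steps above.
  // "HiddenFromMain" is a hypothetical layer name created under Tags and Layers.
  public class ChannelVisibilityExample : MonoBehaviour
  {
      public Camera targetCamera; // the camera whose channel should not see this object

      void Start()
      {
          int layer = LayerMask.NameToLayer("HiddenFromMain");
          gameObject.layer = layer;

          // Clear the layer's bit in the camera's culling mask so this camera
          // (and the RenderStream channel it drives) does not render the object.
          targetCamera.cullingMask &= ~(1 << layer);
      }
  }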

Building the Unity project

RenderStream integrates with Unity builds rather than the Unity editor/game mode.

  1. If you have multiple scenes in your project, ensure that the scene you wish to build appears in File > Build Settings.
  2. Build the project (a scripted equivalent is sketched after these steps).
  3. Save the project and then close Unity.
  4. Ensure that the build folder is copied to your RenderStream Projects folder.
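
If you want to automate the build step, a minimal editor-script sketch (placed under Assets/Editor) might look like the following; the output path is an assumption and should point at your RenderStream Projects folder:

  using System.Linq;
  using UnityEditor;

  // Editor-only sketch: builds all scenes enabled in Build Settings to a
  // standalone Windows 64-bit player that RenderStream can launch.
  public static class RenderStreamBuildExample
  {
      [MenuItem("Build/RenderStream Standalone Build")]
      public static void Build()
      {
          string[] scenes = EditorBuildSettings.scenes
              .Where(s => s.enabled)
              .Select(s => s.path)
              .ToArray();

          BuildPipeline.BuildPlayer(
              scenes,
              @"D:\Renderstream Projects\MyProject\MyProject.exe", // hypothetical output path
              BuildTarget.StandaloneWindows64,
              BuildOptions.None);
      }
  }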

Ensure the correct version of Designer is installed on all machines in the network, including render nodes.

RenderStream Layer configuration in Designer

  1. Create a new RenderStream Layer in your Designer project.
  2. Select Asset and choose your executable.
  3. Right-click the Asset to open its editor and set Source Machine to the machine with the asset (used for syncing content).
  4. Select Cluster pool and create a new cluster.
  5. Within the Cluster Pool, add the desired machines.
  6. Select Cluster assigner and create a new Cluster Assigner.
  7. Within the Cluster Assigner, select the Asset.
  8. Select New Channel Mapping.
  9. Within each Channel’s row, select the relevant mapping type, assigner and load weight.
  10. If the machines in the cluster pool do not have the content or the project has changed, press Sync.
  11. Ensure all Sync Tasks are marked completed.
  12. Press Start.
  13. Wait for Workload status to switch to Running.

Firewall settings can interfere with transport streams and/or cluster communication. You may see a Windows security dialog pop-up the first time you run the asset. Click Enable / Allow on this Windows pop-up to allow RenderStream communication to/from Designer.

If the firewall is the problem, check the outbound firewall rules on the Unity machine and the inbound firewall rules on the Disguise machine to see whether either piece of software is specifically blocked.

Unity assets can only be split across multiple nodes when using 3D Mappings (i.e. Camera Plate or Spatial). Attempting to split using a 2D Mapping will not work; all nodes will render the entire frame.

Exposed parameters

The Unity plugin allows you to expose certain options for each component of an object available within the scene. These options will be presented as parameters within the RenderStream Layer in Designer. Modifying the value of a parameter in turn changes the value of the corresponding option in Unity thus altering the options of the selected object within the scene.
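
As an illustration, here is a hedged sketch of a custom component whose public fields could be selected under the Remote Parameters component, assuming the plugin can target a custom MonoBehaviour in the same way it targets built-in components such as Light:

  using UnityEngine;

  // Hypothetical component: its public fields can be ticked in the 'Fields'
  // list of the Remote Parameters component and then appear as parameters
  // on the RenderStream layer in Designer.
  [RequireComponent(typeof(Light))]
  public class ExposableLightExample : MonoBehaviour
  {
      public Color tint = Color.white;
      public float intensity = 1.0f;

      private Light lightSource;

      void Awake()
      {
          lightSource = GetComponent<Light>();
      }

      void Update()
      {
          // Apply the remotely driven values to the Light on the same object.
          lightSource.color = tint;
          lightSource.intensity = intensity;
      }
  }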

Exposing a parameter in Unity

  1. Select an object from within the scene (e.g. any light source).
  2. Select ‘Add Component’ at the bottom of the Inspector panel.
  3. Add the ‘Remote Parameters’ component.
  4. Drag and drop the component you wish to expose (e.g. Light) into the ‘Exposed Object’ field.
  5. Expand the ‘Fields’ separator.
  6. Select all options you wish to expose (e.g. Colour and Intensity).
  7. Save all changes.
  8. Build the project.
  9. Save and close Unity.
  10. Ensure that the build folder is copied to your RenderStream Projects folder.
  11. Open the RenderStream layer in Designer and start the workload.
  12. Modify parameter value(s).


Remote Texture Parameters

The Unity plugin offers support for sharing textures remotely through the use of exposed parameters. This allows a two-way flow of video content between Designer and the Unity engine. A minimal code sketch of the texture assignment follows the steps below.

  1. [Optional] Add a Plane (or any other 3D game object) to the scene.
  2. Create a new Render Texture: Select ‘Assets’ in the Project panel. Right-click inside and select Create > Render Texture.
  3. Drag and drop the new Render Texture onto your Plane (or 3D game object of your choice) in the scene. Confirm that the new Render Texture has been added as a ‘Material’ component to your game object. Confirm that the new Render Texture has been set as a material element in the ‘Mesh Renderer’ component of your game object.
  4. Expose the Render Texture as a remote parameter: Add a ‘Remote Parameters’ component to the Plane (or 3D game object of your choice). Drag and drop the new Render Texture (Material) component into ‘Exposed Object’. Open the ‘Fields’ separator and ensure that “Main Texture” is enabled (no need to enable any other fields).
  5. Build the Unity project.
  6. Save and close Unity.
  7. Open the RenderStream layer in Designer and start the workload.
  8. Create a new layer (e.g. Video) and assign media (e.g. Ada).
  9. Move the new layer underneath the RenderStream Layer with Ctrl+Alt+down arrow.
  10. Alt+drag to arrow the newly created layer into the RenderStream Layer.
  11. Confirm arrowed input appears in the RenderStream content.
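
For reference, this is a minimal sketch of what step 3 amounts to when done from a script; the class and field names here are assumptions for illustration, not part of the Disguise plugin:

  using UnityEngine;

  // Assign a Render Texture to the object's material so the texture can carry
  // video received from Designer once the material's Main Texture is exposed.
  [RequireComponent(typeof(Renderer))]
  public class ReceivedTextureExample : MonoBehaviour
  {
      public RenderTexture receivedTexture; // the Render Texture created in Assets

      void Start()
      {
          GetComponent<Renderer>().material.mainTexture = receivedTexture;
      }
  }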


Screenshot showing how to create a Render Texture and expose it on an object.

Remote Parameters - 3D Object Transforms

You can also expose a Unity GameObject’s transform parameters; this allows you to control the object’s movements in two ways, depending on which fields you expose. When exposing a GameObject’s Transform, you will have the option to expose the following fields:

  • Transform: This allows you to control the full 3D transform (translation, rotation, scale) of the GameObject using a null object in Designer. This workflow is known as 3D Object Transform. The null object acts as a ‘proxy’, allowing you to move objects in 3D via the on-screen transform handles, or by linking it to a dynamic transform data source, e.g. a tracking device.

  • Local Rotation, Local Position, Local Scale: If you expose the local options, you gain the ability to keyframe the rotation, position and scale in Designer, allowing quick and easy manipulation of a Unity object on the Disguise timeline.

Note: If you expose all of the fields, the Local Rotation, Local Position and Local Scale will override the Transform in Designer. This will result in the null object not controlling the Unity GameObject.
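
Conceptually, driving the local fields from Designer is equivalent to a script like the following; this is a sketch only, not the plugin's actual implementation:

  using UnityEngine;

  // Sketch: what exposed Local Position / Local Rotation / Local Scale fields
  // amount to when applied to the GameObject each frame.
  public class LocalTransformDriverExample : MonoBehaviour
  {
      public Vector3 localPosition;
      public Vector3 localEulerRotation;
      public Vector3 localScale = Vector3.one;

      void Update()
      {
          transform.localPosition = localPosition;
          transform.localRotation = Quaternion.Euler(localEulerRotation);
          transform.localScale = localScale;
      }
  }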

Remote Parameters - Text Parameters

The Exposed Parameter workflow can also be used to expose live text parameters using a "3D Text" (TextMesh) object in the Unity engine. You can use the Remote Parameters workflow to expose the text input field in Designer, allowing you to edit the text in real time.
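
A minimal sketch, assuming the Remote Parameters component can expose a string field on a custom component attached to a 3D Text (TextMesh) object:

  using UnityEngine;

  // Hypothetical component: expose 'liveText' via Remote Parameters so the
  // string can be edited from the RenderStream layer in Designer in real time.
  [RequireComponent(typeof(TextMesh))]
  public class LiveTextExample : MonoBehaviour
  {
      public string liveText = "Hello from Designer";

      void Update()
      {
          GetComponent<TextMesh>().text = liveText;
      }
  }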

Scenes

Scenes in Unity can be composed of any number of game objects which are unique to that scene (by default). The Unity plugin offers two forms of multi-scene support:

Manual - this option restricts Designer’s control of scenes and instead merges all Channels and remote parameters into a single scene.

Selection - this option allows scenes to be controlled from inside Designer; Channels are merged into a single list (duplicates removed) and remote parameters are per-scene.

  1. Create a new scene, or open an existing scene.
  2. Select Resources then DisguiseRenderStreamSettings from the Project panel.
  3. Set Scene Control option accordingly.
  4. Ensure all required scenes are selected in the Build settings.
  5. Build the Unity project.
  6. Save and close Unity.
  7. Ensure that the build folder is copied to your RenderStream Projects folder.
  8. Open the RenderStream layer in Designer and start the workload.
  9. Modify the Scene parameter as part of your normal sequencing.


Time control

The Unity plugin offers Timecode support. This means that if any game object has a ‘Playable Director’ component and is animated using timeline functionality, it will be reflected in the Designer timeline.
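
Conceptually, Designer's timeline drives the Playable Director's playhead; the following is a minimal sketch of what that amounts to, where externalTime is a hypothetical stand-in for the time value received from Designer:

  using UnityEngine;
  using UnityEngine.Playables;

  // Sketch: evaluate a Playable Director at an externally supplied time,
  // which is conceptually what time control does with Designer's timeline.
  [RequireComponent(typeof(PlayableDirector))]
  public class ExternalTimeExample : MonoBehaviour
  {
      public double externalTime; // hypothetical time value driven from Designer

      void Update()
      {
          var director = GetComponent<PlayableDirector>();
          director.time = externalTime;
          director.Evaluate(); // apply the timeline state at that time
      }
  }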

Adding time control to an object in Unity

  1. Select Window followed by Sequencing followed by Timeline.
  2. Optionally, move the Timeline Window from the main panel to the bottom panel (i.e. drag and drop Timeline Window next to Console Window).
  3. Select an object from within the scene (e.g. any user placed prop).
  4. Click the Create button within the Timeline Window.
  5. Save the new Timeline.
  6. Hit the Record button on the newly created Animator for the chosen object within the Timeline.
  7. Add an initial keyframe by right-clicking on any animatable property of the object (e.g. Position) and select Add Key.
  8. Move the Playhead along the Timeline.
  9. Modify your chosen property either by using the 3D controls within the Scene or by updating the value directly from within the Inspector panel (a keyframe will be added automatically when a value is changed).
  10. Hit the Record button again to stop recording.
  11. Return the Playhead to the beginning of the Timeline and play the sequence to confirm your animation is correct.
  12. With the object still selected, select Add Component at the bottom of the Inspector panel.
  13. Add the Time Control component.

Build and test time control

  1. Build the project.
  2. Ensure that the build folder is copied to your RenderStream Projects folder.
  3. Save and close Unity.
  4. Open the RenderStream layer in Designer and start the workload.
  5. Play your timeline in Designer.

Useful Unity information

  • There is no need to attach the Disguise RenderStream script to the cameras in Unity. Cameras will auto-configure on asset launch.

  • When using the ‘Manual’ scene selection option in Unity, game objects from all built scenes will not appear in the “Default” scene. In order to merge scenes and/or dynamically load/unload them, a custom script must be used (see the sketch after this list).

  • The “Default” scene will be the first indexed scene from within the ‘Scenes In Build’ table in Unity’s build options.

  • The exposed parameters from all scenes will still show within Designer, even if game objects from all built scenes are not merged into the “Default” scene.

  • When using the ‘Selection’ option in Unity, game objects and exposed parameters are unique to each scene. There is no shared object scene similar to the “Persistent” level in Unreal Engine.

  • When launching a Unity executable for the first time, a Windows Firewall popup will appear. If the executable is not allowed through the firewall, Designer will not be able to receive the RenderStream output.

  • Game objects included as part of a Unity template may not be able to be controlled via Timecode. This is a Unity issue rather than one with the Disguise script.

  • When using the Unity High Definition Render Pipeline (HDRP), an alternative to the Universal Render Pipeline (URP):

    • Both the ‘Windows 10 SDK’ and ‘Game Development with C++’ modules must be installed as part of your Visual Studio installation.

    • The ‘Scripting Backend’ must be set to “Mono” when building the executable.

  • Since Unity Assets are built executables, Disguise will not recognise the ‘Engine’ of them; they will simply be reported as “Custom”. Disguise will not be able to report the Unity plugin version used within built executables. If an incompatible Disguise-Unity plugin combination is used, no explicit notification will be shown.
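
The kind of custom script referred to in the ‘Manual’ scene bullet above might look like this sketch; "SecondScene" is a hypothetical scene name from the Scenes In Build list:

  using UnityEngine;
  using UnityEngine.SceneManagement;

  // Sketch: additively load or unload another built scene at runtime, e.g. to
  // merge its game objects into the "Default" scene when using 'Manual' mode.
  public class ManualSceneLoaderExample : MonoBehaviour
  {
      public void LoadSecondScene()
      {
          SceneManager.LoadSceneAsync("SecondScene", LoadSceneMode.Additive);
      }

      public void UnloadSecondScene()
      {
          SceneManager.UnloadSceneAsync("SecondScene");
      }
  }

How and when such a script is triggered is up to the project; as noted above, the plugin itself does not merge or load scenes for you in ‘Manual’ mode.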
