This page details the file types and formats supported in VIVERSE Polygon Streaming.
Polygon Streaming supports the following formats:
.GLB
.glTF (zipped)
.OBJ (zipped)
It’s highly recommended to use a single .GLB file when converting your 3D models. This ensures that the file contains all the necessary data and follows the correct standards used during conversion. 3D models must be triangulated meshes with PBR materials following the metallic-roughness standard.
If you use a .glTF model, all the necessary files need to be zipped into a single .zip file. It should contain at least a .gltf file, a .bin file, and all texture images in .jpg or .png format.
If you use a .OBJ model, all the necessary files need to be zipped into a single .zip file. It should contain a .obj file, a .mtl file, and all texture images in .jpg or .png format, with the textures properly set and connected. You can also upload a single .obj file if no textures or .mtl file are needed.
Image-based lighting (or "lightmapping") can save GPU processing time by baking real-time lighting information into static images that are then overlaid on models as texture maps. Currently, a single lightmap is supported via a separate file. When uploading your model to https://stream.viverse.com/, package it in a .zip file containing your .glb or .obj and a file named either lightmap.hdr or lightmap.png. This lightmap will be automatically applied to the Streaming Model inside PlayCanvas when running the VIVERSE Create SDK, and should use the first UV set, often called UV0. There is no size limit on this lightmap, but most browsers will error on image files larger than 8K. To reiterate, only one (1) lightmap file is supported at this time.
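The packaging rules above can be checked before zipping. Below is a minimal sketch of such a pre-upload check; the helper function is purely illustrative and not part of any VIVERSE tool.

```javascript
// Illustrative pre-upload check: verify that a folder's file list matches
// the lightmap packaging rules described above before creating the .zip.
function validateLightmapPackage(files) {
    const hasModel = files.some((f) => /\.(glb|obj)$/i.test(f));
    const lightmaps = files.filter((f) => /^lightmap\.(hdr|png)$/i.test(f));
    if (!hasModel) {
        return { ok: false, reason: "missing .glb or .obj model" };
    }
    if (lightmaps.length === 0) {
        return { ok: false, reason: "missing lightmap.hdr or lightmap.png" };
    }
    if (lightmaps.length > 1) {
        return { ok: false, reason: "only one lightmap file is supported" };
    }
    return { ok: true, reason: "" };
}

// Example: validateLightmapPackage(["scene.glb", "lightmap.png"]).ok → true
```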
JPG
PNG
BMP
KTX
Diffuse/Albedo
Metallic/Roughness
Transparency/Opacity
Normals
Ambient Occlusion
Emissive
KHR_draco_mesh_compression
KHR_materials_unlit
KHR_texture_basisu
KHR_materials_specular
KHR_materials_pbrSpecularGlossiness
KHR_texture_transform
KHR_materials_ior
KHR_materials_transmission
KHR_materials_volume
KHR_materials_clearcoat
This page introduces you to the goals and technology that make up VIVERSE Polygon Streaming.
VIVERSE Polygon Streaming is an intelligent visualization core technology that overcomes modern-day 3D visualization challenges. It unleashes the potential to stream spatial content from the cloud and seamlessly share high-fidelity models and worlds on any device.
Polygon Streaming aims to provide creators, developers, designers, and enterprises with tools to build immersive, high-quality 3D content that is widely compatible with the most popular game engines and can be experienced across different platforms.
VIVERSE Polygon Streaming focuses on the next shift in computing and the future of the internet. It breaks down barriers and enables accessibility from any device (mobile, tablet, PC, and even XR headsets), allowing people to create, share experiences, and collaborate like never before.
Allow end-users to consume big 3D models rapidly by downloading only the geometry that has relevant visual impact for their specific position and bandwidth. With Polygon Streaming, you can stream large 3D models into your platform or projects without taking up local storage space, maximizing the number of polygons that can be shown in a cross-platform, multi-device way.
Only the geometry needed for the user’s current position and field of view in the scene is transferred over time.
Automatic LOD generation and texture compression. Texture compression method optimized for lightweight decompression.
Minimal back-end server requirements, only requiring HTTP requests.
Utilize the end-user's GPU for full rendering. No Cloud GPU is necessary at any step.
This page details important information for avoiding errors when working with VIVERSE Polygon Streaming.
When working with high-fidelity 3D content using Polygon Streaming, it's essential to differentiate between Preview Mode and Play Mode (also known as Runtime, Live Mode, or Launch Mode depending on the game engine). For simplicity, we will use "Play Mode" as a shared term throughout this article, although each engine may refer to it differently. Understanding these modes is crucial to ensure that developers are testing their models under realistic conditions. Below, we'll explore what to expect when using Polygon Streaming in PlayCanvas, Unity, and Unreal Engine, and provide clear instructions on how to test models effectively in each environment.
Preview Mode: This mode provides a quick visualization within the editor, allowing developers to see a basic representation of their models and adjust placements in the scene. It offers a rough approximation and may show visual artifacts. This mode does not fully utilize Polygon Streaming capabilities and should not be used for assessing the final quality or behavior of the models.
Play Mode: Known by different names such as Runtime or Launch Mode in various engines, Play Mode simulates real-world conditions by running the application as it would be experienced by end-users. Polygon Streaming is fully active in Play Mode, ensuring high-quality rendering and optimal performance. Testing in Play Mode is essential for final model evaluation.
In PlayCanvas, models utilizing Polygon Streaming are treated differently between modes:
Preview Mode: In PlayCanvas, Polygon Streaming models are not visible in Preview Mode. This mode only shows basic placeholders, making it unsuitable for evaluating the model’s quality or performance. Developers can use this mode for simple adjustments and placements, but it doesn't reflect the actual capabilities of Polygon Streaming.
Play Mode: Known as "Launch Mode" in PlayCanvas, this mode is activated by clicking the "Launch" button. Play Mode is where Polygon Streaming becomes fully operational, streaming the model from the cloud and rendering it in high fidelity. Testing models in Play Mode is crucial for an accurate assessment of both performance and quality.
In Unity, the terminology and functionality align with industry standards, using Edit Mode and Play Mode:
Edit Mode: Similar to Preview Mode, Edit Mode allows developers to place and adjust models within the Unity editor. While models can be viewed in this mode, the representation might include artifacts, and streaming optimizations are not fully applied. Edit Mode should not be relied upon for evaluating final model quality or performance.
Play Mode: Activated by pressing the "Play" button in the Unity interface, this mode initiates the full capabilities of Polygon Streaming. The models are streamed in real-time from the cloud, showcasing their high-quality rendering and optimized performance. Play Mode is essential for realistic testing and accurate visualization.
Unreal Engine follows a similar structure, with Editor Mode and Play Mode being key concepts:
Editor Mode: Referred to as Preview Mode in other contexts, this mode in Unreal Engine allows for basic visualization and placement of models. However, it does not utilize the full power of Polygon Streaming, and visual artifacts may be present. Editor Mode is useful for setup but not for final evaluation.
Play Mode: By clicking the "Play" button in Unreal Engine, developers activate Play Mode, where the game is simulated under real-world conditions. Polygon Streaming is fully engaged, streaming high-quality models from the cloud and ensuring that the performance and appearance meet expectations. Play Mode is crucial for validating the final output.
Apart from the in-editor modes like Preview and Play Modes, it's also important to understand how the "Model Preview" feature works within the console. When you upload a model to the Polygon Streaming platform, you can access a "Model Preview" option. This preview opens in a new browser window and provides a basic visualization of your model, giving you a quick way to see the model's appearance and structure.
However, like Preview Mode in development environments, the "Model Preview" is not indicative of the final streamed quality. It serves more as a preliminary check for the model's appearance rather than a comprehensive evaluation of its streaming performance or interactive capabilities. Developers should always perform thorough testing using Play Mode in their chosen game engine to fully understand how their models will behave in a real-world application.
This page guides you through the steps of uploading an asset to the Polygon Streaming platform.
After selecting your file you will be able to click Convert to convert the 3D asset into a streaming model. You can also choose different options before conversion:
Default Resolution will use the default resolution for the textures of your models, a middle ground between low and high resolution.
Low Resolution will lower the resolution of textures, making the file lighter to stream in exchange for a drop in visual quality.
High Resolution will ensure the resolution of textures is as high as possible, which will ensure the highest visual quality. However, depending on the model, especially very large models like full environments, it could make the file heavy to stream and display.
Once you choose the desired options you can press Convert and the conversion of the model will start. Once the conversion is done, the converted model will appear in the Models panel.
In the Models panel you can click the icons next to your model to either preview the model in a new browser tab, or copy the converted URL to use inside your projects via the Polygon Streaming integrations and plugins.
This document provides a guide for using the Chrome Browser Extension to integrate Polygon Streaming into a PlayCanvas project that targets publication to VIVERSE.
Start by opening your PlayCanvas project and setting it up using the VIVERSE Extension. Once the extension is set up and you’re logged in, you are ready to add special components. To add a Polygon Streaming component to the scene, add an Empty component to the scene and give it a name like "Streaming Model".
With the new component selected, click on EDIT VIVERSE EXTENSION. Choose the Media plugin type, select the PolygonStreaming module, and click the plus (+) symbol. You have now added a Polygon Streaming component to your scene.
Paste the Asset ID of your streaming model into the Polygon Streaming URL field. Once you have added the URL, the object will preview inside your scene. This is just a preview; once published, the streaming model will start actively streaming from the cloud.
That’s all you need to stream your models inside your VIVERSE scene. You can also modify the Polygon Streaming parameters by going to Viverse Scene Settings / Polygon Streaming and changing them according to your preferences. You can check the Supported Parameters list below in this documentation.
After you publish your VIVERSE World scene, the Streaming Model will automatically start to stream inside the scene.
Option 1: Chrome Browser Extension Plugin
PlayCanvas Chrome Browser Extension includes the Polygon Streaming Plugin.
Instructions for using the Polygon Streaming Plugin can be found .
Option 2: Standalone Plugin
Add polygon-streaming.js to PlayCanvas project.
Instructions for using the Standalone plugin can be found .
Option 3: HTML Scripting
Download build of PlayCanvas project to access HTML.
Instructions for using HTML Scripting can be found .
Start by logging into the Polygon Streaming console. Then you will be able to upload a 3D asset to the platform by simply dropping a file in the upload field and clicking the upload button.
This page details how to control the settings of individual VIVERSE Polygon Streaming models within your PlayCanvas project when using the VIVERSE PlayCanvas extension.
The Streamable Model Component or Polygon Streaming (VIVERSE SDK) component represents the model to be streamed inside your project. It asks for a Polygon Streaming URL, which is the address of your newly converted model. Once you paste the address into that field, the model is ready to be streamed inside your scene.
Initial Triangle Percentage is the percentage of triangles to load first in the scene, entered as a number between 0 and 1. The lower the number, the quicker the model will appear, but at a lower quality; it will then immediately start updating to a higher quality.
Cast Shadows and Receive Shadows should be checked in order for the model to cast and receive shadows.
Force Double Sided Materials should be checked if parts of the model are not showing up. This usually resolves problems with incorrect normals on the model. If you are not encountering problems with the model you should leave this setting turned off or you will double the amount of triangles that need to be rendered.
Use Alpha should usually be turned on. It can be turned off to increase performance, but transparent materials will then render opaque.
Play Animation Automatically determines whether to play the embedded animation when the scene is loaded.
Animation To Play is the embedded animation to play when the scene loads. This can be either the name of the animation or its index, i.e. 0 would be the first animation. If it's left blank, the first animation will play. You can view the names of the animations in a glTF file in the PlayCanvas Model Viewer. If you have supplied an animation state graph, the initial animation will be the one set in the default state. The default state is the one connected to START in the state graph editor.
Animation States map states in the animation state graph (see below) to the embedded animations. To the left of Array Size enter the number of state/animation mappings you want. State should be the state name in the state graph. Animation is the name or index of the embedded animation and layer is the layer in the state graph. If it's left blank it will use the base layer.
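The name-or-index resolution described above can be sketched as follows. This is an illustrative helper, not SDK code: blank plays the first embedded animation, a number is treated as an index, and anything else is matched by name.

```javascript
// Illustrative sketch of the Animation To Play resolution logic:
// blank → first animation, numeric → index, otherwise → match by name.
function resolveAnimation(embeddedNames, animationToPlay) {
    if (animationToPlay === undefined || animationToPlay === null || animationToPlay === "") {
        return embeddedNames[0]; // blank → play the first animation
    }
    const index = Number(animationToPlay);
    if (!Number.isNaN(index)) {
        return embeddedNames[index]; // number or numeric string → index
    }
    return embeddedNames.find((name) => name === animationToPlay); // → by name
}

// resolveAnimation(["Idle", "Run"], "") → "Idle"
// resolveAnimation(["Idle", "Run"], 1) → "Run"
```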
Scripting Polygon Streaming animations works in the same way as standard PlayCanvas animations, but you shouldn't try to access the anim component in the initialize method of a script component, as it's only added once the model has loaded.
Below is an example of triggering an animation using a state graph. The state graph has a running parameter and, when it's set to true, will transition to the Running state. You will need to set up the parameters, states, and transitions in the state graph.
this.entity.anim.setBoolean('running', true);
It is possible to change the animation without a state graph. The following switches to the idle state with a transition duration of 0.2 seconds. If no animation states are supplied, the embedded animation name is used as the state name.
this.entity.anim.baseLayer.transition('idle', 0.2);
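Since the anim component only exists once the streamed model has loaded, a safe pattern is to check for it each frame rather than in initialize. Below is a small guard helper illustrating this; the function and parameter names are illustrative, not part of the SDK.

```javascript
// Illustrative guard: the anim component is only added once the streamed
// model has loaded, so check for it before triggering an animation.
function tryStartRunning(entity) {
    if (!entity.anim) {
        return false; // model still streaming in; try again next frame
    }
    entity.anim.setBoolean("running", true); // 'running' must exist in the state graph
    return true;
}

// Inside a PlayCanvas script component's update():
//   if (!this.started) this.started = tryStartRunning(this.entity);
```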
Animation State Graph allows you to control the transition between different animations when triggered from code. You can create the asset in the PlayCanvas editor.
Priority Level and Quality Priority affect the same setting, but the first provides presets while the second allows you to enter a number. This setting affects how triangles are allocated to each model and is a ratio. For example, if you have one model set to 1 and another set to 2, and both models are the same distance from the camera, the second model will receive twice as many triangles as the first.
Default: 1
Higher: 1.5
Highest: 2
Custom: Enter a number in Quality Priority
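The ratio behavior can be sketched with a short calculation. This is purely illustrative arithmetic, not the actual streaming algorithm: models at equal distance share the budget in proportion to their Quality Priority values.

```javascript
// Illustrative only: split a triangle budget between models at equal
// distance from the camera, in proportion to their Quality Priority values.
function splitBudget(budget, priorities) {
    const total = priorities.reduce((sum, p) => sum + p, 0);
    return priorities.map((p) => Math.round((budget * p) / total));
}

// Two models at the same distance, priorities 1 and 2:
// splitBudget(3000000, [1, 2]) → [1000000, 2000000]
```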
Environment Asset allows you to set a custom environment map for the streaming model. It expects a cubemap, which you can create in the PlayCanvas editor. This is useful if you have not set an environment asset in the scene but still want reflections to show on the model.
This page details the process of setting up the VIVERSE Polygon Streaming plugin in Unreal Engine.
To start using the plugin, there are two main classes (Actors), found in the Plugin's folder or in the Classes menu, which are all you need to stream your models inside Unreal Engine: the Stream Controller and the Streaming Model.
You can drag and drop both Actors into your Unreal scene to start streaming your converted 3D model inside the engine.
The Stream Controller should only be added once, and it controls the streaming feeds inside your scene. The Streaming Model is an Actor to be used for every model that will stream from the cloud into your project; you only need to add the URL of your converted model into the URL field to start streaming. You can add as many Streaming Models as you want to your scene.
Once that's set, press Play and your streaming models will start streaming inside your project!
This document guides you through the three options available for integrating VIVERSE Polygon Streaming into PlayCanvas projects.
Polygon Streaming component settings are only available when using the VIVERSE extension of the PlayCanvas editor.
Option 1: Chrome Browser Extension Plugin
Used for projects targeting publication to VIVERSE. Dependent on the PlayCanvas Chrome Browser extension being installed.
Option 2: Standalone Plugin
Used for projects NOT targeting publication to VIVERSE. Dependent on creation of PlayCanvas entities in editor.
Option 3: HTML Scripting
Used for projects NOT targeting publication to VIVERSE. Dependent on manually adding HTML code to PlayCanvas project.
This document provides a guide for using HTML Scripting to integrate Polygon Streaming into a PlayCanvas project that does not target publication to VIVERSE.
For PlayCanvas projects that are not targeting publication to VIVERSE, this PlayCanvas Direct HTML Plugin is the second path for integrating Polygon Streaming into those projects. Because these projects will not be published to VIVERSE, we've created a direct HTML version of the plugin that does not utilize the PlayCanvas VIVERSE Chrome browser extension.
The script should be added to your HTML file after initializing pc.Application, but before you attach the script component to the Entity and run Application.start().
// add a camera entity for Polygon Streaming SDK to use
const camera = new pc.Entity("camera");
camera.addComponent("camera", {
farClip: 1000,
nearClip: 0.1
});
app.root.addChild(camera);
Then you need to define a new Entity, add it to the root, and define the attributes (described here). In streamController only the camera attribute is required, and in streamableModel only the path attribute is required, but all attributes are shown below for informational purposes.
// add a Polygon Streaming controller
const streamController = new pc.Entity();
streamController.addComponent("script");
streamController.script.create("streamController", {
attributes: {
camera: camera,
cameraType: "nonPlayer",
occlusionCulling: true,
occlusionGeometry: "boundingBox",
occlusionQueryFrequency: 8,
triangleBudget: 3000000,
mobileTriangleBudget: 1000000,
minimumDistance: 0.01,
distanceFactor: 1.1,
distanceType: "boundingBoxCenter",
maximumQuality: 15000,
closeUpDistance: 3,
closeUpDistanceFactor: 5,
iosMemoryLimit: 0
}
});
// add a Polygon Streaming model
let streamableModel = new pc.Entity();
streamableModel.addComponent("script");
streamableModel.script.create("streamableModel", {
attributes: {
path: "/model.xrg",
qualityPriority: 1,
useAlpha: true,
castShadows: true,
receiveShadows: true,
doubleSidedMaterials: false,
initialTrianglePercent: 0.1,
playAnimationAutomatically: true,
animation: 0
}
});
// add model to stream controller
streamController.addChild(streamableModel);
// add stream controller to the scene
app.root.addChild(streamController);
In this guide, you will find information on how to prepare your 3D models to achieve the best outcomes when converting to Polygon Streaming and streaming your models within the supported platforms and devices.
Physically based rendering (PBR) has become the standard in many 3D applications, such as Unity, Unreal, 3ds Max, and 3D web platforms. Polygon Streaming is also fully based on this standard, for more performant and realistic material and texture representation.
This approach differs from older approaches in that instead of using approximations for the way in which light interacts with a surface, a physically correct model is used. The idea is that, instead of tweaking materials to look good under specific lighting, a material can be created that will react correctly under all lighting scenarios.
If your models are created using legacy standards such as Lambert, Blinn-Phong, or Specular-Roughness, they should be converted to the PBR standard before conversion, in order to achieve a correct representation of the original model's materials and textures.
Models using legacy or custom standards can be converted; however, a correct representation of the materials and textures cannot be guaranteed. Using custom standards or custom shader elements may also cause the conversion to fail, since the system won't be able to properly translate custom elements.
Polygon Streaming maintains the origin point, position, and scale of all 3D objects. When converting entire environments with multiple objects, each object will automatically be positioned at its origin point. Make sure your objects are placed at the correct origin point: if a model sits far from the 0,0,0 origin point in the source file, that offset will carry over after conversion. The same goes for the general scale of the model; the scale set on the original model will carry over after conversion. Polygon Streaming uses meter units by default, the same as Blender, Unity, and Unreal Engine. If the model's scale follows another unit system or is set very small in the original model, the model will appear very small after conversion.
Make sure your model's faces are oriented correctly. If a face is oriented in the wrong direction (also known as flipped normals), it will appear transparent depending on the user's viewpoint. Face orientation is a design choice and therefore subjective; it's not possible for an automated system to know which direction a face should be oriented. The system follows the original model's orientation: if a face points the wrong way in the original model, it will also point the wrong way after conversion.
Z-fighting is a common issue in real-time rendering. It happens when two meshes overlap each other and the GPU is unable to determine which one to render first. It's very common when displaying NURBS-based models, such as CAD or BIM models, in a real-time rendering scenario. To ensure a proper real-time display of your streaming model, avoid creating overlapping meshes.
Sometimes a 3D model may contain data that is hidden or enclosed by another object and will never be seen by the end user. There is no problem converting a model with hidden data, and the hidden data won't be rendered, since our player plugins make use of occlusion culling. Still, it's good practice to remove any data that will be hidden, in order to make the overall model smaller, more concise, faster to convert, and lighter to stream.
It's good practice to maintain a proper mesh topology. There is no problem converting a model with bad mesh topology; however, any 3D visualization issues that come from bad topology will carry over to the streaming model as well. A concise, sensible, well-made mesh topology will always give much better results than a poor one containing odd-looking patches, holes, and malformed triangles and faces. The better the 3D model is modeled and the more sensible its topology, the better it will stream.
This page details how to control the global settings of VIVERSE Polygon Streaming within your PlayCanvas project when using the VIVERSE PlayCanvas extension.
The Stream Controller or Polygon Streaming Settings (VIVERSE SDK) manages the streaming of models and streaming parameters inside your project. It can be found under Viverse Scene Settings.
When Occlusion Culling is enabled, objects that are hidden by other objects will not be rendered. How much occlusion culling improves performance depends on the model.
Occlusion Geometry allows you to specify what is used to check whether a mesh is occluded. The Bounding Box option is faster but less accurate, while the Mesh option is slower but more accurate. Bounding Box should be sufficient in most cases.
Occlusion Query Frequency determines how many times per second meshes are checked for occlusion. A lower value will increase performance, but meshes will take longer to reappear.
The Triangle Budget is a limit on the amount of triangles that will be drawn per frame. Increasing this will lead to better visual quality, but of course also higher processing and memory utilization. It's recommended to keep Triangle Budget to at least 30% of the full amount of polygons that you are going to stream. For example, if you are going to stream a 3D model of 10 million polygons, it's recommended to use a Triangle Budget of at least 3 million.
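The 30% rule of thumb above can be expressed as a one-line calculation. The helper below is illustrative only, not part of the SDK.

```javascript
// Rule-of-thumb sketch: keep the Triangle Budget at 30% or more of the
// total polygons you plan to stream.
function recommendedTriangleBudget(totalPolygons) {
    return Math.ceil(totalPolygons * 0.3);
}

// recommendedTriangleBudget(10000000) → 3000000
```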
The Mobile Triangle Budget is the same as Triangle Budget, but it is applied when the system identifies that the user is visiting the experience on a mobile device.
With a Distance Factor above 1, nearby objects are given preference when allocating triangles. A value below 1 gives preference to objects further away, and a value of 1 is neutral.
Geometry stops being improved once it reaches the Maximum Quality. This can be used to stop far-away objects from showing more detail than necessary. A value of 0 means there is no limit on quality apart from the triangle budget.
Details of these parameters are also given in the Supported Parameters section.
Occlusion Culling
Enable Dynamic Occlusion Culling.
FALSE
FALSE
TRUE
FALSE
FALSE
FALSE
Occlusion Geometry
Mesh: Use the mesh to check if it's occluded. Bounding Box: Use the bounding box of the mesh to check if it's occluded.
Bounding Box
Bounding Box
Bounding Box
Bounding Box
Bounding Box
Bounding Box
Occlusion Query Frequency
Value is in times per second. A value of 0 means it will run on every frame.
0
8
60
5
8
12
Triangle Budget
The maximum amount of triangles that you want to be in the scene at any single point.
0
5000000
Depends on device
Mobile won't use this parameter
5000000
5000000
Mobile Triangle Budget
The triangle budget used on a mobile device. If it is set to 0 it will use the non-mobile triangle budget.
0
3000000
Depends on device
3000000
PC won't use this parameter
PC won't use this parameter
Distance Factor
Preference for nearby objects over objects further away. Values above one mean a preference for nearby objects. Values below one mean a preference for objects further away. One is neutral.
0
1.1
10
1.1
1.1
1.1
Maximum Quality
Stops improving geometry that exceeds the maximum quality. This can be used to stop far away objects from showing more detail which can be wasteful. Leaving this at 0 means there is no maximum quality.
0
15000
300000
15000
15000
0
This document provides a guide for using the Standalone Plugin to integrate Polygon Streaming into a PlayCanvas project that does not target publication to VIVERSE.
For PlayCanvas projects that are not targeting publication to VIVERSE, this PlayCanvas standalone plugin is the first option for integrating Polygon Streaming into those projects. Because these projects will not be published to VIVERSE, we've created a standalone version of the plugin that does not utilize the PlayCanvas VIVERSE Chrome browser extension.
Download the latest version of the PlayCanvas standalone plugin here. Import the polygon-streaming.js script into the Assets window of the PlayCanvas project.
A. Create a new entity and name it StreamController. This entity will control the streaming models inside the project.
B. Add the StreamController script to the StreamController entity. You can check detailed information on these parameters here.
C. Add the Camera entity from the project to the Camera field.
See here for a description of the attributes used in the Stream Controller and Streamable Model components.
This page details basic information about using VIVERSE Polygon Streaming with Unreal Engine.
Polygon Streaming models can be added to Unreal Engine-based projects using the Unreal Engine Plugin. This allows models with multiple millions of polygons to stream from the cloud into your Windows-based projects and applications.
Once the plugin’s zip file is downloaded from the Polygon Streaming Web Console, the “PolygonStreaming-Unreal-Version” folder should be added to the “Plugins” folder of your Unreal Engine project.
If the “Plugins” folder does not exist, create a new folder named “Plugins” and add the plugin’s folder inside it. Also make sure to close your project before installing the plugin.
After adding the “PolygonStreaming-Unreal-Version” folder inside the “Plugins” folder of your Unreal Engine project, double-click your project’s file (.uproject). If it asks you to rebuild the project, choose “Yes” - the project will be rebuilt, and once it opens, the Polygon Streaming Plugin will be installed.
Sometimes it may appear that nothing is happening - the Unreal Engine welcome screen may not show - but don’t worry; give it some time for building and compiling, and your project should open normally after a couple of minutes.
The plugin is only available for C++-based projects, not for Blueprint-only projects. If you want to use it in a Blueprint-based project, you first need to add C++ classes to make it a C++-enabled project.
Download the latest extension version
Change Log
2.0.2
Updated to work with xrgc5 format
2.0.1
Updated the plugin to work with Unreal 5.4.x versions.
2.0.0
Added support for KTX textures.
Added model preview to StreamingModel actor.
This page details how to control the settings of individual VIVERSE Polygon Streaming models within your Unity project.
The Streaming Model prefab represents the model to be streamed inside your project. A Streaming Model component always needs a reference to the Stream Controller component to function and a Source URL which is the URL of your converted model. Once those two are correctly set, your model will start streaming inside your project as soon as you press Play.
The Stream Controller is a reference to the Stream Controller inside your scene and needs to be set, and the Source URL is the URL of your converted model that needs to be pasted in the field.
If your project is using Light Probes, you can tick the Light Probe Usage box. If you would like to use a custom material on your streaming model, you can tick the Custom Material box, which opens a new panel that allows you to change the material.
You can also set different Quality Priority options by clicking Streaming Runtime Options Settings. This means some models can have a higher streaming priority than others; it accepts only integer numbers and works with the following logic:
You can also turn the Streaming Model Preview script on or off to check a preview of your streaming model in the scene without needing to press Play.
Stream Controller
Set the Stream Controller from the scene.
None
Source Url
Address of the Streamable Model to be streamed into the scene.
None
Normals
Calculation parameter for the normals of the model being streamed.
Original Vertex Normals
Light Probe Usage
Enable if using Light Probes in the scene.
False
Custom Material
Enable if you wish to use custom materials.
False
Quality Priority
How much to prioritize the quality of this model relative to the quality of other models in the scene. This parameter does nothing if this is the only model in the scene.
1
This document guides you through integrating Polygon Streaming to JavaScript projects.
Polygon Streaming models can be added to JavaScript projects with our JavaScript SDKs. These plugins help you stream high-polygon models from the cloud into your projects and build for the web. The table below breaks down the plugins and their usage.
This page details how to control the global settings of VIVERSE Polygon Streaming within your Unreal project.
The Stream Controller manages the streaming of models and streaming parameters inside your project. There must be only one Stream Controller per scene, since it will control all Streaming Models as one.
The Triangle Budget is a limit on the number of triangles that will be drawn per frame. Increasing it leads to better visual quality, but also higher processing and memory utilization. It's recommended to set the Triangle Budget to at least 30% of the total polygon count you are going to stream. For example, if you are going to stream a 3D model of 10 million polygons, use a Triangle Budget of at least 3 million.
If you use too low a Triangle Budget, for example 500 thousand to stream 10 million polygons, you may exhaust the budget before the higher-quality versions of the model can be shown, so only a lower-quality version is displayed in order to stay within the budget.
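As a quick sanity check, the 30% guideline above amounts to the following calculation (an illustrative helper, not part of the plugin):

```javascript
// Illustrative rule of thumb: Triangle Budget >= 30% of the total polygons streamed.
function recommendedTriangleBudget(totalPolygons, ratio = 0.3) {
  return Math.ceil(totalPolygons * ratio);
}

recommendedTriangleBudget(10_000_000); // 3,000,000 for a 10-million-polygon model
```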
The Distance Factor weights streaming priority by the distance from the camera to the object being streamed. The default value of 1.1 is close to neutral. A higher value, such as 3 or 5, favors nearby objects over objects further away. These parameters can be changed at runtime to find the sweet spot for your scene.
The Distance Type can be set to Bounding Box or Bounding Box Center. When set to Bounding Box, the distance between the camera and the object is calculated from the edges of the object's bounding box. When set to Bounding Box Center, it is calculated from the center of the object. Bounding Box is recommended for single objects, and Bounding Box Center for full environments where the user walks around inside the object.
The Close Up Distance changes which distance factor is applied when the camera gets close to a streaming object. For example, when the camera is within 3 units of the object, the Close Up Distance Factor is used; beyond 3 units, the standard Distance Factor applies. This way, the system forces aggressive streaming of data when the camera is very close to an object.
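The interaction between the two factors can be sketched as follows. The function name and parameter defaults are illustrative (taken from the defaults documented below), not plugin API:

```javascript
// Illustrative sketch: which distance factor applies at a given camera distance.
// Defaults mirror the documented parameter defaults (3, 5, 1.1).
function effectiveDistanceFactor(
  distance,
  { closeUpDistance = 3, closeUpDistanceFactor = 5, distanceFactor = 1.1 } = {}
) {
  // Within the close-up range, the stronger factor forces more streaming.
  return distance <= closeUpDistance ? closeUpDistanceFactor : distanceFactor;
}

effectiveDistanceFactor(2);  // 5   (camera within 3 units of the object)
effectiveDistanceFactor(10); // 1.1 (standard distance factor)
```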
The other parameters can be left at their defaults; see the Supported Parameters section for an explanation of each.
Polygon Streaming project settings are only available when using VIVERSE with the PlayCanvas editor.
Babylon.js
Instructions for installation and setup can be found .
Three.js
Instructions for installation and setup can be found .
Triangle Budget
The maximum amount of triangles that you want to allow in the scene at any single point.
3000000
Distance Factor
Preference for nearby objects over objects further away. Values above one mean a preference for nearby objects. Values below one mean a preference for objects further away. One is neutral.
1.1
Distance Type
Distance type from the camera to the meshes' bounding boxes.
Bounding Box
Close Up Distance
The distance where it starts using Close-up Distance Factor instead of Distance Factor. Set it to 0 to not use close-up distance factor.
3
Close Up Distance Factor
The distance factor used when close-up to an object. Should be higher than the standard Distance Factor.
5
Maximum Quality
Stops improving geometry that exceeds the maximum quality. This can be used to stop far away objects from showing more detail which can be wasteful. Leaving this at 0 means there is no maximum quality.
3000000
This page details how to control the global settings of VIVERSE Polygon Streaming within your Unity project.
The Stream Controller manages the streaming of models and streaming parameters inside your project. There must be only one Stream Controller per scene, since it will control all Streaming Models as one.
The Stream Controller needs a reference to a Camera object, which is used to calculate the distance between viewer and object. It also contains several parameters and features to help you achieve the best streaming quality and control.
The Triangle Budget is a limit on the number of triangles that will be drawn per frame. Increasing it leads to better visual quality, but also higher processing and memory utilization. It's recommended to set the Triangle Budget to at least 30% of the total polygon count you are going to stream. For example, if you are going to stream a 3D model of 10 million polygons, use a Triangle Budget of at least 3 million.
If you use too low a Triangle Budget, for example 500 thousand to stream 10 million polygons, you may exhaust the budget before the higher-quality versions of the model can be shown, so only a lower-quality version is displayed in order to stay within the budget.
The Distance Factor weights streaming priority by the distance from the camera to the object being streamed. The default value of 1.1 is close to neutral. A higher value, such as 3 or 5, favors nearby objects over objects further away. These parameters can be changed at runtime to find the sweet spot for your scene.
The Close Up Distance changes which distance factor is applied when the camera gets close to a streaming object. For example, when the camera is within 3 units of the object, the Close Up Distance Factor is used; beyond 3 units, the standard Distance Factor applies. This way, the system forces aggressive streaming of data when the camera is very close to an object.
The Distance Type can be set to Bounding Box or Bounding Box Center. When set to Bounding Box, the distance between the camera and the object is calculated from the edges of the object's bounding box. When set to Bounding Box Center, it is calculated from the center of the object. Bounding Box is recommended for single objects, and Bounding Box Center for full environments where the user walks around inside the object.
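The difference between the two modes can be sketched with some simple vector math. These helper functions are illustrative only, not plugin code:

```javascript
// Illustrative sketch of the two Distance Type modes (not SDK code).
// Bounding Box: distance from the camera to the nearest point on the box.
// Bounding Box Center: distance from the camera to the box's center.
function clamp(v, lo, hi) { return Math.min(Math.max(v, lo), hi); }

function distanceToBox(cam, min, max) {
  const dx = cam[0] - clamp(cam[0], min[0], max[0]);
  const dy = cam[1] - clamp(cam[1], min[1], max[1]);
  const dz = cam[2] - clamp(cam[2], min[2], max[2]);
  return Math.hypot(dx, dy, dz);
}

function distanceToCenter(cam, min, max) {
  const cx = (min[0] + max[0]) / 2;
  const cy = (min[1] + max[1]) / 2;
  const cz = (min[2] + max[2]) / 2;
  return Math.hypot(cam[0] - cx, cam[1] - cy, cam[2] - cz);
}

// A camera standing inside the box (the "walking inside an environment" case):
// Bounding Box distance collapses to 0, while Bounding Box Center still
// reports how far the camera is from the middle of the environment.
distanceToBox([1, 1, 1], [0, 0, 0], [10, 10, 10]);    // 0
distanceToCenter([1, 1, 1], [0, 0, 0], [10, 10, 10]); // ~6.93
```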
The other parameters can be left at their defaults; see the Supported Parameters section for an explanation of each.
This page details how to control the settings of individual VIVERSE Polygon Streaming models within your Unreal project.
The Streaming Model Actor represents the model to be streamed inside your project. A Streaming Model component always needs two things to function: a reference to the Stream Controller component, and a Source URL, which is the Asset ID URL of your converted model. Once those two are correctly set, your model will start streaming inside your project as soon as you press Play.
The URL field takes the Asset ID address of your streaming model; once the Asset ID is pasted into this field, the model can be streamed inside your scene.
You can also set different Quality Priority values under Streaming Runtime Options Settings, giving some models higher streaming priority than others. The value must be an integer and follows this logic: 0 is the highest priority, 1 is the lowest, and values of 2 or more rank above any lower number but still below 0.
The Stream Controller field is a reference to the Stream Controller inside your scene and must be set; otherwise the model won't appear or be streamed inside your scene.
The Show Preview option creates a simple preview of the model in the editor scene, so you can position and scale your model correctly without needing to run the game.
If you would like to use a custom material on your streaming model, tick the Custom Material box, which opens a new panel where you can change the model's materials.
Viewer Camera
The viewer camera to be set; the system will calculate distances based on the selected camera.
None
Triangle Budget
The maximum amount of triangles that you want to allow in the scene at any single point.
150000000
Distance Factor
Preference for nearby objects over objects further away. Values above one mean a preference for nearby objects. Values below one mean a preference for objects further away. One is neutral.
1.1
Close Up Distance
The distance where it starts using Close-up Distance Factor instead of Distance Factor. Set it to 0 to not use close-up distance factor.
3
Close Up Distance Factor
The distance factor used when close-up to an object. Should be higher than the standard Distance Factor.
5
Distance To
Distance type from the camera to the meshes' bounding boxes.
Bounding Box
Max Download Size Mb
Pause downloading more data when the current active downloads exceed this threshold.
4
Maximum Quality
Stops improving geometry that exceeds the maximum quality. This can be used to stop far away objects from showing more detail which can be wasteful. Leaving this at 0 means there is no maximum quality.
0
Occlusion Culling
Enable Dynamic Occlusion Culling.
True
Raycast Mask
Set masks for the occlusion culling raycast.
Everything
Time Slicing
Determines how many frames should be accumulated to decide whether an object should be hidden.
120
Ray Per Frame
How many rays are generated per frame.
256
Name
Description
Default value
Source URL
Asset ID address of the Streamable Model to be streamed into the scene.
None
Quality Priority
How much to prioritize the quality of this model relative to the quality of other models in the scene. This parameter does nothing if this is the only model in the scene.
1
Stream Controller
Set the Stream Controller from the scene.
None
Show Preview
Enable model preview in editor scene.
True
Custom Material
Enable if you wish to use custom materials.
False
This page details basic information about using VIVERSE Polygon Streaming with Unity.
Polygon Streaming models can be added to Unity Engine based projects with our Unity Plugin, enabling you to stream multi-million-polygon models from the cloud into your projects and build for desktop, mobile, and VR headsets.
Open the Package Manager inside the Unity Engine. The plugin should ONLY be installed via the Package Manager, not by copying it into the project folder, since it also needs to automatically install its dependencies:
Press the + button and choose Add Package from tarball… and select the downloaded .tgz file.
If everything went correctly, the package should now be visible under the Packages list in the project panel.
Download the latest extension version
Change Log
06/12/2025
2.7.4
Fixed Standard Shader rendering issue on Android devices
Small performance optimizations
02/28/2025
2.7.3
Support for updated convertor and newly converted models (2.12.6)
Small performance optimizations
01/02/2025
2.7.1
Hotfix for model updating bug
Support for updated convertor and newly converted models (2.11.1) at player level
Small performance optimizations
10/28/2024
2.5.0
Support Vertex Color
Support Multi UV Set
Support KHR_materials_emissive_strength in BRP
Occlusion culling improved performance
Distance Type set to only Bounding Box Center
Fix shader bug in Android devices
9/16/2024
2.4.2
Fix shader bug in Android devices
9/10/2024
2.4.1
Fix Preview bug
Fix KHR_texture_transform bug
Fix KHR_materials_specularGlossiness bug
7/30/2024
2.4.0
Support KHR_materials_pbrSpecularGlossiness
7/05/2024
2.3.8
Fix Texture Transform Issue
7/03/2024
2.3.7
Fix Double Sided Material Transparent Issue
Add Signature Check
6/12/2024
2.3.6
Support xrgc version 4 - read materials from info.json
Fix XrgcMaterial default value spec
5/22/2024
2.3.5
Fixed all found memory leaks.
Support Doubled Material in BRP and URP.
Handling Stream Controller destroying/disabling edge cases
Free up resources when hiding/disabling models
Hotfix light probe usage.
Quality Priority now works only with integer numbers and follows this logic: 0 = highest priority. 1 = lowest priority. 2 or more = higher priority than any lower number, but still lower priority than 0.
4/26/2024
2.3.4
Hotfix light probe usage
4/15/2024
2.3.3
Handle destroy / disable more gracefully
Free up streamingcontroller resources when hiding / disabling models
3/23/2024
2.3.2
Add Light Probe Usage in Streaming Model
Small fixes for memory and performance
2/22/2024
2.3.1
Fix Triangle Budget Bug
Remove duplicated metallic roughness and occlusion texture
Camera Required
This is the camera used in the scene. This attribute is required.
None
Camera Type
Non-player camera ('nonPlayer'): a camera that is not attached to a player, e.g. a camera that orbits an object. Player camera ('player'): a camera that is attached to a player.
Non-player camera
Occlusion Culling
Whether to enable occlusion culling. When occlusion culling is enabled objects that are hidden by other objects will not be rendered. It depends on the model whether this setting will improve performance.
False
Occlusion Geometry
Bounding Box: Use the bounding box of each mesh to check if it's occluded. This is less accurate but faster. Mesh: Use the mesh to check if it's occluded. This is more accurate but slower.
Bounding Box
Occlusion Query Frequency
How many times per second to check for occlusion. A lower number will improve performance but geometry will take longer to reappear.
8 times/second
Triangle Budget
The maximum amount of triangles that you want to be in the scene.
5000000
Mobile Triangle Budget
The triangle budget used on a mobile device. If it is set to 0 it will use the non-mobile triangle budget.
3000000
Minimum Distance
The smallest possible distance to the camera.
0.01
Distance Factor
Preference for nearby objects over objects further away. Values above one mean a preference for nearby objects. Values below one mean a preference for objects further away. One is neutral.
1.1
Maximum Quality
Stops improving geometry that exceeds the maximum quality. This can be used to stop far away objects from showing more detail than is necessary. Setting it to 0 means there is no maximum quality.
15000
Close Up Distance Factor
The distance factor used when close-up to an object. Should be higher than the standard distance factor.
5
Close Up Distance
The distance where it starts using close-up distance factor. Set it to 0 to not use close-up distance factor.
3
iOS Memory Limit
The maximum amount of memory in MB that meshes and textures can consume on iOS devices to avoid the page crashing. Use -1 for no limit or 0 to let Polygon Streaming determine a device specific limit.
0
Model URL (path) Required
The URL of the XRG file. This attribute is required.
None
Quality Priority
How much to prioritize the quality of this model relative to the quality of other models in the scene. It is a ratio.
1
Initial Triangle Percent
Percentage of the triangle budget to initialize the model with. It should be a number between 0 and 1.
0.1
Cast Shadows
Whether the model should cast shadows.
True
Receive Shadows
Whether the model should receive shadows.
True
Force Double Sided
Render the model double sided regardless of the setting in the model file.
False
Use Alpha
Whether to render semi-transparency in materials.
True
Use Embedded Collider
Determines whether the embedded collider should be used in physics.
True
Play Animation Automatically
Whether to play the embedded animation automatically
True
Animation To Play
The name or index of the embedded animation to play if an animation state graph is not provided. An index value of 0 is the first animation. If no value is supplied it will play the first animation.
None
Animation State Graph
Create an Anim State Graph in the PlayCanvas editor and drag it onto this attribute. If no asset is provided, a default state graph will be created.
None
Animation States (animationStateMappings)
Assigns animations to states in the animation state graph. Nested attributes include: state: the state in the state graph. animation: the embedded animation name or index. layer: the layer in the state graph; if left blank, the base layer will be used.
None
Environment Asset
Use a cubemap asset if you want to provide an environment map otherwise it will use the scene's environment map.
None
Hash Code
Hash code to validate streaming model.
''
This guide provides instructions for setting up an iFrame with a Polygon Streaming object to implement into your webpage, e-commerce product page, or any other type of website.
Copy the iFrame snippet below.
<iframe src="https://yourmodellink/" style="width: 100%; height: 100%;"></iframe>
Paste it right into your webpage editor. We are using Shopify as our example e-commerce platform.
Adjust the width and height to fit your web design.
Publish your website to preview.
This document provides a guide for integrating Polygon Streaming web player for Babylon.js using the NPM package directly.
You can download a sample project to get started from this link:
This will open a browser window and display the 3D model.
Import Babylon.js, glTF loader and the StreamController:
Load Ammo.js. This is only required if you want to make use of the optional embedded collider in the model. Other physics plugins are not supported as Ammo.js is the only one that supports concave colliders. The Ammo.js physics plugin uses version 1 of the physics engine so you will need to add physics impostors to your meshes rather than physics aggregates or bodies.
Instantiate the stream controller:
Add a streaming model, passing it a model URL and a TransformNode to act as a model parent:
Call the stream controller's update method in the render loop:
Now you have everything set up to stream your 3D model inside your Babylon.js application.
Paste this URL as the first parameter of streamController.addModel() method.
All options are optional.
All options are optional.
This document provides a guide for integrating Polygon Streaming web player for Three.js using the NPM package directly.
You can download a sample project to get started from this link:
It will open a browser window and display the 3D model.
Import the StreamController from the package:
Instantiate the stream controller:
Add a streaming model, passing it a model URL and a Group to act as a model parent:
Call the stream controller's update method in the animation loop:
Now you have everything set up to start streaming your 3D models inside your Three.js application.
Paste this URL as the first parameter of streamController.addModel() method.
All options are optional.
All options are optional.
This page details important release notes on convertor releases, version changelogs, bug and feature hotfixes, and compatibility dependencies.
Current version: 2.12.10
To run the sample project you first need to make sure you have installed. Then run the following in a terminal:
Upload your 3D model to the online console:
To get the model URL go to the models section of the console: and click on the three dots next to your model and select "Copy asset ID".
To run the example you first need to make sure you have installed. Then run the following in a terminal:
Upload your 3D model to the online console:
To get the model URL go to the models section of the console: and click on the three dots next to your model and select "Copy asset ID".
npm install
npm run dev
import * as BABYLON from '@babylonjs/core/Legacy/legacy';
import '@babylonjs/loaders/glTF';
import { StreamController, loadWasmModule } from '@polygon-streaming/web-player-babylonjs';
import ammoWasmJsUrl from './lib/ammo.wasm.js?url';
import ammoWasmWasmUrl from './lib/ammo.wasm.wasm?url';
import ammoJsUrl from './lib/ammo.js?url';
loadWasmModule('Ammo', ammoWasmJsUrl, ammoWasmWasmUrl, ammoJsUrl).then(ammoInstance => {
const streamController = new StreamController(camera, engine, scene, cameraTarget, {
cameraType: 'nonPlayer',
triangleBudget: 5000000,
mobileTriangleBudget: 3000000,
minimumDistance: 0.01,
distanceFactor: 1.1,
maximumQuality: 15000,
closeUpDistanceFactor: 5,
closeUpDistance: 3,
ammoInstance: ammoInstance
});
const modelParent = new BABYLON.TransformNode('Model parent', scene);
modelParent.position.set(0, 1, 0);
streamController.addModel('the URL of your model to be streamed (Asset ID)', modelParent, {
qualityPriority: 1,
initialTrianglePercent: 0.1,
castShadows: true,
receiveShadows: true,
forceDoubleSided: false,
useAlpha: true,
environmentMap: null,
hashCode: ''
});
engine.runRenderLoop(function () {
scene.render();
streamController.update();
});
camera (required)
The camera used in the scene.
engine (required)
The engine used in the scene.
scene (required)
The scene object.
cameraTarget (required)
The camera target which is a Vector3.
options (optional)
You need to provide an object of options. The available options are listed below.
cameraType
'nonPlayer': A camera that is not attached to a player e.g. a camera that orbits an object.
'player': A camera that is attached to a player
'nonPlayer'
triangleBudget
The maximum amount of triangles that you want to be in the scene at any single point.
5000000
mobileTriangleBudget
The triangle budget used on a mobile device. If it is set to 0 it will use the non-mobile triangle budget.
3000000
minimumDistance
The smallest possible distance to the camera.
0.01
distanceFactor
Preference for nearby objects over objects further away. Values above one mean a preference for nearby objects. Values below one mean a preference for objects further away. One is neutral.
1.1
maximumQuality
Stops improving geometry that exceeds the maximum quality. This can be used to stop far away objects from showing more detail which can be wasteful. Setting it to 0 means there is no maximum quality.
15000
closeUpDistanceFactor
The distance factor used when close-up to an object. Should be higher than the standard distance factor.
5
closeUpDistance
The distance where it starts using close-up distance factor. Set it to 0 to not use close-up distance factor.
3
iOSMemoryLimit
The maximum amount of memory in MB that meshes and textures can consume on iOS devices to avoid the page crashing. Use -1 for no limit.
440
URL (required)
URL of the XRG model. If it doesn't end with .xrg it will append model.xrg to the URL.
modelParent (required)
The scene object that the streaming model will be attached to.
options (optional)
You need to provide an object of options. The available options are listed below.
qualityPriority
How much to prioritize the quality of this model relative to the quality of other models in the scene. This parameter does nothing if this is the only model in the scene.
1
initialTrianglePercent
Percentage of triangle budget to initialize the model with.
0.1
castShadows
Whether the model should cast shadows.
true
receiveShadows
Whether the model should receive shadows.
true
forceDoubleSided
Render the model double sided regardless of the setting in the model file.
false
useAlpha
Whether to render semi-transparency in materials. You might turn this off to increase performance but all your materials will render opaque.
true
environmentMap
A cube map environment texture.
null
hashCode
Hash code to validate streaming model
''
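The URL rule noted for the required URL parameter (appending model.xrg when the URL does not end in .xrg) can be sketched as below. The exact SDK behavior, for example how path separators are handled, is an assumption here:

```javascript
// Illustrative sketch of the documented URL rule: if the model URL does not
// end with .xrg, "model.xrg" is appended. (Separator handling is assumed;
// the SDK's exact implementation may differ.)
function resolveModelUrl(url) {
  if (url.endsWith('.xrg')) return url;
  return url.endsWith('/') ? url + 'model.xrg' : url + '/model.xrg';
}

resolveModelUrl('https://example.com/assets/abc123');    // '.../abc123/model.xrg'
resolveModelUrl('https://example.com/assets/scene.xrg'); // unchanged
```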
npm install
npm run dev
import { StreamController } from '@polygon-streaming/web-player-threejs';
const streamController = new StreamController(camera, renderer, scene, controls.target, {
cameraType: 'nonPlayer',
triangleBudget: 5000000,
mobileTriangleBudget: 3000000,
minimumDistance: 0.01,
distanceFactor: 1.1,
maximumQuality: 15000,
closeUpDistanceFactor: 5,
closeUpDistance: 3,
iOSMemoryLimit: 440
});
const modelParent = new THREE.Group();
modelParent.position.set(0, 1, 0);
scene.add(modelParent);
streamController.addModel('the URL of your model to be streamed (Asset ID)', modelParent, {
qualityPriority: 1,
initialTrianglePercent: 0.1,
castShadows: true,
receiveShadows: true,
forceDoubleSided: false,
useAlpha: true,
environmentMap: null,
hashCode: ''
});
function animate() {
controls.update();
renderer.render(scene, camera);
streamController.update();
}
renderer.setAnimationLoop(animate);
camera (required)
The camera used in the scene.
renderer (required)
The WebGL renderer used in the scene.
scene (required)
The scene object.
cameraTarget (required)
The camera target which is a Vector3. If you are using orbit controls it would be controls.target.
options (optional)
You need to provide an object of options. The available options are listed below.
cameraType
'nonPlayer': A camera that is not attached to a player e.g. a camera that orbits an object. 'player': A camera that is attached to a player.
'nonPlayer'
triangleBudget
The maximum amount of triangles that you want to be in the scene at any single point.
5000000
mobileTriangleBudget
The triangle budget used on a mobile device. If it is set to 0 it will use the non-mobile triangle budget.
3000000
minimumDistance
The smallest possible distance to the camera.
0.01
distanceFactor
Preference for nearby objects over objects further away. Values above one mean a preference for nearby objects. Values below one mean a preference for objects further away. One is neutral.
1.1
maximumQuality
Stops improving geometry that exceeds the maximum quality. This can be used to stop far away objects from showing more detail which can be wasteful. Setting it to 0 means there is no maximum quality.
15000
closeUpDistanceFactor
The distance factor used when close-up to an object. Should be higher than the standard distance factor.
5
closeUpDistance
The distance where it starts using close-up distance factor. Set it to 0 to not use close-up distance factor.
3
iOSMemoryLimit
The maximum amount of memory in MB that meshes and textures can consume on iOS devices to avoid the page crashing. Use -1 for no limit.
440
URL (required)
URL of the XRG model. If it doesn't end with .xrg it will append model.xrg to the URL.
model parent (required)
The scene object that the streaming model will be attached to.
options (optional)
You need to provide an object of options. The available options are listed below.
qualityPriority
How much to prioritize the quality of this model relative to the quality of other models in the scene. This parameter does nothing if this is the only model in the scene.
1
initialTrianglePercent
Percentage of triangle budget to initialize the model with.
0.1
castShadows
Whether the model should cast shadows.
true
receiveShadows
Whether the model should receive shadows.
true
forceDoubleSided
Render the model double sided regardless of the setting in the model file.
false
useAlpha
Whether to render semi-transparency in materials. You might turn this off to increase performance but all your materials will render opaque.
true
environmentMap
A cube map environment texture.
null
hashCode
Hash code to validate streaming model
''
06/12/2025
2.12.10
Major Improvements (vs 2.12.6) for converted models
Up to 9% smaller converted file sizes
Up to 29% fewer draw calls
Up to 38% fewer total textures
Up to 46% fewer total materials
Up to 13% less average memory used
Updates
[Fixed] Normal map issue on some mobile and desktop devices
Overall performance improvements
Compatibility Dependencies:
Requires PlayCanvas Plugin version 2.5.2 or higher
Requires Unity Plugin version 2.7.4 or higher
02/24/2025
2.12.6
Major Improvements (vs 2.11.1) for converted models
Up to 60% fewer draw calls
Up to 33% fewer total textures
Up to 32% fewer total materials
Up to 12% less average memory used
Updates
Super Texture Atlas Merging: Texture data can now be merged into super texture atlases, minimizing the draw calls caused by many separate small textures in a 3D model and making the data more efficient to stream.
Lightmap Support: Lightmaps are now supported via a separate file. You can upload a ZIP file containing your .GLB model and a file named "lightmap.hdr" or "lightmap.png"; this lightmap will be automatically applied to the Streaming Model inside the PlayCanvas engine when running the VIVERSE Create SDK. Currently only one lightmap is supported.
Improved Update Logic: With the use of virtual bounding boxes, the update logic now calculates distances more accurately according to the actual visible mesh geometry in the scene. The camera frustum logic is also improved with a perspective-bias update, so what the user is actually seeing is prioritized during streaming.
Performance Improvements: Several performance improvements and modifications, such as Material Merging and the Super Texture Atlas. Large environment scenes will see the biggest gains, with minimized draw calls and optimized VRAM usage.
Bug fixes
Compatibility Dependencies:
Requires PlayCanvas Plugin version 2.5.1 or higher
Requires Unity Plugin version 2.7.3 or higher
01/13/2025
2.11.1
Major Improvements (vs 2.9.8) for converted models
Up to 30% fewer draw calls
Up to 30% less memory used for textures
Up to 49% smaller converted file sizes
Up to 67% less total memory used
Updates
Improved Decimation Algorithm: Automatically configures the best possible algorithm when creating LODs, with as little visual quality loss as possible.
Shared Materials Feature: The system is now able to look up the materials in use, combine compatible texture-less materials, and remove duplicates from a streaming model. This optimizes the usage of texture-less materials and minimizes the draw calls created by duplicates or by several similar materials in a model.
Removed the need for baked textures: Drastically reduces the memory needed for textures, bringing a significant gain in overall performance.
Bug fixes
Compatibility Dependencies:
Requires PlayCanvas Plugin version 2.5.0 or higher
Requires Unity Plugin version 2.7.1 or higher
10/28/2024
2.9.8
Updates
Bug fixes and minor performance updates
Compatibility Dependencies:
Requires PlayCanvas Plugin version 2.5.0 or higher
Requires Unity Plugin version 2.5.0 or higher
Requires Unreal Engine Plugin version 2.0.2 or higher
This page details the process of setting up the VIVERSE Polygon Streaming plugin in Unity.
To start using the plugin, there are two prefabs in the plugin's folder that are all you need to stream your models inside Unity Engine: the Stream Controller and the Streaming Model.
You can drag and drop both prefabs inside your Unity Scene and be ready to start streaming your converted 3D model inside the engine.
The Stream Controller should only be added once; it controls the streaming feeds inside your scene, and you only need to set the Main Camera in its Camera field. The Streaming Model is a prefab to be used for every model that will stream from the cloud into your project; you only need to add the URL of your converted model into its URL field.
Once that's set, press Play and your models will start streaming inside your project!