This page details the file types and formats supported in VIVERSE Polygon Streaming.
Polygon Streaming supports the following formats:
.GLB
.glTF (zipped)
.OBJ (zipped)
It’s highly recommended to use a single .GLB file when converting your 3D models. This ensures that the file contains all the necessary data and that it follows the correct standards used during conversion. 3D models must be a triangulated mesh with PBR materials following the metallic-roughness standard.
If you use a .glTF model, all the necessary files must be zipped into a single .zip file. It should contain at least a .gltf file, a .bin file, and all texture images in .jpg or .png format.
If you use an .OBJ model, all the necessary files must be zipped into a single .zip file. It should contain a .obj file, a .mtl file, and all texture images in .jpg or .png format, with the textures properly set and connected. You can also upload a single .obj file if no textures or .mtl file are needed.
Note that if the textures are not properly connected in your file, they will appear plain white after conversion. A single .obj file will always appear plain white, since it contains no material or texture information.
JPG
PNG
BMP
KTX
Diffuse/Albedo
Metallic/Roughness
Transparency/Opacity
Normals
Ambient Occlusion
Emissive
KHR_draco_mesh_compression
KHR_materials_unlit
KHR_texture_basisu
KHR_materials_specular
KHR_materials_pbrSpecularGlossiness
KHR_texture_transform
KHR_materials_ior
KHR_materials_transmission
KHR_materials_volume
KHR_materials_clearcoat
This page guides you through the steps of uploading an asset to the Polygon Streaming platform.
Start by logging into the Polygon Streaming CMS. Then you can upload a 3D asset to the platform by simply dropping a file into the upload field and clicking the upload button.
After selecting your file, you can click Convert to convert the 3D asset into a streaming model. You can also choose different options before conversion:
Default Resolution uses the default resolution for your model's textures, a middle ground between low and high resolution.
Low Resolution lowers the resolution of the textures, making the streamed file lighter in exchange for a drop in visual quality.
High Resolution keeps the texture resolution as high as possible, ensuring the highest visual quality. However, depending on the model, especially very large models such as full environments, this can make the file heavy to stream and display.
This option controls whether a collider mesh is generated for your model, which can be used when streaming the model inside the integrations and plugins.
Note that currently, this feature only works with the PlayCanvas engine.
No collider mesh skips automatic collision mesh generation. Basic collider mesh creates a simplified mesh that is lighter. Complex collider mesh generates a detailed collision mesh with better accuracy; however, it could be heavy to use inside your project depending on the model.
Once you have chosen the desired options, press Convert and the conversion of the model will start. When the conversion is done, the converted model will appear in the Models panel.
In the Models panel, you can click the icons next to your model to either preview the model in a new browser tab, or copy the converted URL address to use inside your projects via the Polygon Streaming integrations and plugins.
In this guide you will find information on how to prepare your 3D models to achieve the best outcomes when converting them to Polygon Streaming and streaming them within the supported platforms and devices.
Physically based rendering (PBR) has become the standard in many 3D applications, such as Unity, Unreal, 3ds Max, and 3D web platforms. Polygon Streaming is also fully based on this standard, for a more performant and realistic representation of materials and textures.
This approach differs from older approaches in that instead of using approximations for the way in which light interacts with a surface, a physically correct model is used. The idea is that, instead of tweaking materials to look good under specific lighting, a material can be created that will react correctly under all lighting scenarios.
If your models are created using legacy standards such as Lambert, Blinn-Phong, or Specular-Roughness, they should be converted to the PBR standard before conversion, in order to achieve a correct representation of the original model's materials and textures.
Models using legacy or custom standards can still be converted; however, a correct representation of the materials and textures cannot be guaranteed. Using custom standards or custom shader elements may also cause the conversion to fail, since the system won't be able to properly translate custom elements.
Polygon Streaming maintains the origin point, position, and scale of all 3D objects. When converting entire environments with multiple objects, each object is automatically positioned at its origin point, so make sure your objects are placed at the correct origin. If a model is far from the 0,0,0 origin, it will be wrong for visualization, and the wrong origin point will carry over after conversion. The same goes for the general scale of the model: the scale set on the model is preserved after conversion. Polygon Streaming uses meters by default, the same as Blender, Unity, and Unreal Engine. If the model follows another unit system or is modeled very small, it will appear very small after conversion.
Make sure your model's faces are oriented correctly. If a face is oriented in the wrong direction (also known as flipped normals), it will appear transparent depending on the user's viewpoint. Face orientation is a design choice and therefore subjective; it's not possible for an automated system to know which direction each face should be oriented. The system follows the original model's orientation and direction, so if a face points the wrong way in the original model, it will also point the wrong way after conversion.
Note that our plugins and integrations always provide a double-sided material option, which can be used to work around flipped-normal issues, since it renders both sides.
Z-fighting is a common issue in real-time rendering: it happens when two meshes overlap and the GPU is unable to decide which one to render first. It's very common when displaying NURBS-based models, such as CAD or BIM models, in a real-time rendering scenario. To ensure a proper real-time display of your streaming model, avoid creating overlapping meshes.
Sometimes a 3D model has data that is hidden or enclosed by another object and will never be seen by the end user. There is no issue in converting a model with hidden data, and the hidden data won't be rendered, since our player plugins use occlusion culling. Still, it's good practice to remove any data that will be hidden, making the overall model smaller, more concise, faster to convert, and lighter to stream.
It's also good practice to maintain a proper mesh topology. There is no issue in converting a model with bad topology; however, any 3D visualization issues that come from bad topology will carry over to the streaming model as well. A concise, sensible, well-made topology will always give much better results than a bad one containing odd patches, holes, and malformed triangles and faces. The better the model's topology, the better it will stream.
This page details important release notes on convertor releases, version changelogs, bug and feature hotfixes, and compatibility dependencies.
Current version: 2.11.1
01/13/2025
2.11.1
Major Improvements
Up to 30% less draw calls
Up to 30% less memory for textures used
Up to 49% smaller converted file sizes
Up to 67% less total memory used
Updates
Improved Decimation Algorithm: Able to automatically configure the best possible algorithm when creating LODs with as little visual quality loss as possible.
Shared Materials Feature: The system is now able to look-up materials being used, combine compatible texture-less materials and remove duplicates from a streaming model. This optimizes the usage of texture-less materials and minimizes the amount of draw calls created by duplicates or the use of several similar materials in a model.
Removed the Need for Baked Textures: Drastic reduction in the memory needed for textures, bringing an important gain in overall performance.
Bug fixes
Compatibility Dependencies
Requires PlayCanvas Plugin version 2.5.0 or higher
Requires Unity Plugin version 2.7.1 or higher
10/28/2024
2.9.8
Updates
Bug fixes and minor performance updates
Compatibility Dependencies
Requires PlayCanvas Plugin version 2.5.0 or higher
Requires Unity Plugin version 2.5.0 or higher
Requires Unreal Engine Plugin version 2.0.2 or higher
This page introduces you to the goals and technology that make up VIVERSE Polygon Streaming.
VIVERSE Polygon Streaming is an intelligent visualization core technology that overcomes modern 3D visualization challenges, enabling spatial content to be streamed from the cloud and high-fidelity models and worlds to be shared seamlessly on any device.
Polygon Streaming aims to provide tools to creators, developers, designers, and enterprises, enabling them to build immersive, high-quality 3D content that is widely compatible with the most popular game engines, allowing the content to be experienced across different platforms.
VIVERSE Polygon Streaming was created to focus on the next shift in computing and the future of the internet. It breaks down barriers and enables accessibility from any device - mobile, tablet, PC, and even XR headsets; giving people the chance to create, share experiences, and collaborate like never before.
Allow end users to consume big 3D models rapidly by downloading only the geometry that has relevant visual impact for the user at their specific position and bandwidth. With Polygon Streaming, you can stream large 3D models into your platform or projects without taking up local storage space, maximizing the number of polygons that can be shown in a cross-platform, multi-device way.
Only transfer geometry that is needed in the user’s current position and field of view in the scene over time.
Automatic LOD generation and texture compression. Texture compression method optimized for lightweight decompression.
Minimal back-end server requirements, needing only HTTP requests.
Utilizes the end user's GPU for full rendering. No cloud GPU necessary at any step.
For license & agreement, please click on this link.
This page details basic information about using VIVERSE Polygon Streaming with the web-based 3D editor and game engine, PlayCanvas.
Polygon Streaming models can be added to PlayCanvas-based projects with our PlayCanvas Plugin, letting you stream high-polygon models from the cloud into your projects and build for the web.
The Polygon Streaming Plugin comes as a zipped file containing the PolygonStreaming.js main script. To start using it inside PlayCanvas, just drag and drop this script into your project.
Once the script is imported into your project, you can use it by following the next guide.
The plugin also comes together with the VIVERSE SDK. If you are using the SDK, then after installing it, a Polygon Streaming component can be added inside the PlayCanvas engine via the VIVERSE Extension.
Download the latest extension version
Change Log
02/01/2025
2.5.0
Fixed support for latest conversion format
11/12/2024
2.4.9
Updated PS Player to support PlayCanvas Engine 2.0
Support for updated convertor and newly converted models (2.11.1)
[FIX] Unlit materials and small modifications
2.4.7
When running in an iFrame and the build version is the viewer, it stops all tasks (rendering, physics, decompression worker, sending traffic info in the service worker).
iosMemoryLimit option has been added to stream controller.
2.4.6
Three.js and Babylon.js implementations
Vertex colors for Three.js and Babylon.js
ModelOptions.onLoadedEventHandler has been renamed to onModelLoaded. Affects src/model_scripts/streamable-model.js in VIVERSE Create.
StreamController and its wrappers are no longer singletons. Affects src/model_scripts/stream-controller.js in VIVERSE Create.
2.4.4
Fixed KHR_materials_pbrSpecularGlossiness issue.
Added support for vertex colors
Added support for glTF alpha mode
Added support for KHR_materials_unlit
Added support for KHR_materials_emissive_strength
Added support for KHR_materials_sheen
Added support for KHR_materials_dispersion
Can now be installed as a NPM package
Fix KHR_materials_pbrSpecularGlossiness
2.3.7
Fix KHR_materials_pbrSpecularGlossiness
2.3.6
Support KHR_materials_pbrSpecularGlossiness
2.3.5
Support KHR_materials_specular
2.3.4
Backwards compatible with old models
2.3.3
Enable diffuseTint to apply diffuse factor
2.3.2
Revert Local Model
Add Force Double Sided Setting
2.3.1
Signature validation for xrg model
2.3.0
Support KHR extensions
2.2.0
Stream models without adding "model.xrg" to the URL field
2.1.23
No longer blocks streaming if sending traffic records fails
2.1.22
Collect traffic records via service worker
Dispatch onceOptimalRendered event
2.1.21
Fix missing materials
2.1.20
Fix shared texture release
2.1.19
Use Double Sided from xrgc
Support xrgc version 4
2.1.18
Fix service worker CORS issues in VR browser and Firefox
Use mobile triangle budgets for VR devices
2.1.17
Support xrgc version 3
Variable bit depth for mesh data
Read material doublesidedness from xrgc file
2.1.16
Fixed some nodes not updating
2.1.15
Added support for PlayCanvas realtime baked textures in PNG format
2.1.14
Error handling on XRG files with empty nodes
2.1.13
Fix memory leak in texture
2.1.12
Light Map Support
Added support for the automatically generated collider.all
2.1.11
Shared Texture Support
2.1.10
Add Occlusion Strength support in playcanvas
2.1.9
Add Initial Triangle Percentage to initialize a streaming model with a percentage of the Triangle Budget
2.1.8
Support for combined use of Metallic Roughness maps and Ambient Occlusion maps.
2.1.7
Fixed issue where local camera position would be offset if the parent of the streamable model was scaled and camera type was player.
Got node info working in Viverse World.
2.1.6
Fixed issue with multiple streamable models with the same URL, which would cause textures to be disposed of when they shouldn't have been. Now using a unique model ID instead of the URL for texture reference keys.
Fixed issue where qualityPriority wasn't being passed on from the streamable model component to the stream controller.
2.1.5
Reversed previous fix as it was affecting materials that have GLTF property alphaMode set to mask. In the XRG this comes in as opaque and any pixel with an alpha less than 0.9 is not rendered. The conclusion was that this needs to be fixed at source i.e. if the material is transparent it should have alphaMode set to blend.
Fixed an issue where the loading model wasn't being removed if there was more than one streamable model.
2.1.4
Support for normal and AO maps.
Fixed an issue with diffuse textures with an alpha channel not being visible.
2.1.3
Set FUTURE_CAMERA_POSITION_DURATION to 0
Now uses roughness value as multiplier for roughness-metalness map
Fixed an issue with occlusion culling bounding boxes appearing
This page details important considerations for avoiding errors when working with VIVERSE Polygon Streaming.
When working with high-fidelity 3D content using Polygon Streaming, it's essential to differentiate between Preview Mode and Play Mode (also known as Runtime, Live Mode, or Launch Mode depending on the game engine). For simplicity, we will use "Play Mode" as a shared term throughout this article, although each engine may refer to it differently. Understanding these modes is crucial to ensure that developers are testing their models under realistic conditions. Below, we'll explore what to expect when using Polygon Streaming in PlayCanvas, Unity, and Unreal Engine, and provide clear instructions on how to test models effectively in each environment.
Preview Mode: This mode provides a quick visualization within the editor, allowing developers to see a basic representation of their models and adjust placements in the scene. It offers a rough approximation and may show visual artifacts. This mode does not fully utilize Polygon Streaming capabilities and should not be used for assessing the final quality or behavior of the models.
Play Mode: Known by different names such as Runtime or Launch Mode in various engines, Play Mode simulates real-world conditions by running the application as it would be experienced by end-users. Polygon Streaming is fully active in Play Mode, ensuring high-quality rendering and optimal performance. Testing in Play Mode is essential for final model evaluation.
In PlayCanvas, models utilizing Polygon Streaming are treated differently between modes:
Preview Mode: In PlayCanvas, Polygon Streaming models are not visible in Preview Mode. This mode only shows basic placeholders, making it unsuitable for evaluating the model’s quality or performance. Developers can use this mode for simple adjustments and placements, but it doesn't reflect the actual capabilities of Polygon Streaming.
Play Mode: Known as "Launch Mode" in PlayCanvas, this mode is activated by clicking the "Launch" button. Play Mode is where Polygon Streaming becomes fully operational, streaming the model from the cloud and rendering it in high fidelity. Testing models in Play Mode is crucial for an accurate assessment of both performance and quality.
In Unity, the terminology and functionality align with industry standards, using Edit Mode and Play Mode:
Edit Mode: Similar to Preview Mode, Edit Mode allows developers to place and adjust models within the Unity editor. While models can be viewed in this mode, the representation might include artifacts, and streaming optimizations are not fully applied. Edit Mode should not be relied upon for evaluating final model quality or performance.
Play Mode: Activated by pressing the "Play" button in the Unity interface, this mode initiates the full capabilities of Polygon Streaming. The models are streamed in real-time from the cloud, showcasing their high-quality rendering and optimized performance. Play Mode is essential for realistic testing and accurate visualization.
Unreal Engine follows a similar structure, with Editor Mode and Play Mode being key concepts:
Editor Mode: Referred to as Preview Mode in other contexts, this mode in Unreal Engine allows for basic visualization and placement of models. However, it does not utilize the full power of Polygon Streaming, and visual artifacts may be present. Editor Mode is useful for setup but not for final evaluation.
Play Mode: By clicking the "Play" button in Unreal Engine, developers activate Play Mode, where the game is simulated under real-world conditions. Polygon Streaming is fully engaged, streaming high-quality models from the cloud and ensuring that the performance and appearance meet expectations. Play Mode is crucial for validating the final output.
Apart from the in-editor modes like Preview and Play Modes, it's also important to understand how the "Model Preview" feature works within the console. When you upload a model to the Polygon Streaming platform, you can access a "Model Preview" option. This preview opens in a new browser window and provides a basic visualization of your model, giving you a quick way to see the model's appearance and structure.
However, like Preview Mode in development environments, the "Model Preview" is not indicative of the final streamed quality. It serves more as a preliminary check for the model's appearance rather than a comprehensive evaluation of its streaming performance or interactive capabilities. Developers should always perform thorough testing using Play Mode in their chosen game engine to fully understand how their models will behave in a real-world application.
This guide provides instructions for setting up an iFrame with a Polygon Streaming object to implement into your webpage, e-commerce product page, or any other type of website.
Copy the iFrame snippet below.
Paste it right into your webpage editor. We are using Shopify as our example e-commerce platform.
Adjust the width and height to fit your web design.
Publish your website to preview.
You can add "?more=false" to your Polygon Streaming URL to disable the viewing angles & skybox options.
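As an illustration of the embed described in the steps above, a minimal iFrame might look like the following. This is a sketch only: the src URL, width, and height are placeholders — use the streaming URL copied from the Polygon Streaming CMS, and note the optional "?more=false" suffix mentioned above.

```html
<!-- Illustrative only: replace the src with your model's streaming URL
     from the Polygon Streaming CMS. Width/height are placeholders;
     adjust them to fit your web design. -->
<iframe
  src="https://YOUR-POLYGON-STREAMING-URL?more=false"
  width="800"
  height="600"
  frameborder="0"
  allowfullscreen>
</iframe>
```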
This page guides you through the three options available for using VIVERSE Polygon Streaming with PlayCanvas: with the VIVERSE PlayCanvas Extension, as standalone, or with HTML scripting.
Start by opening your PlayCanvas project and setting it up with the VIVERSE Extension. Once the extension is set up and you're logged in, you are ready to add special components. To add a Polygon Streaming component to the scene, add an Empty component to the scene and give it a name like "Streaming Model".
With the new component selected, click EDIT VIVERSE EXTENSION. Choose the Media plugin type, select the PolygonStreaming module, and click the plus (+) symbol. You have now added a Polygon Streaming component to your scene.
Paste the Asset ID of your streaming model into the Polygon Streaming URL field. Once you have added the URL to the field, the object will preview inside your scene. This is just a preview; once Published, the streaming model will start actively streaming from the cloud.
That's all you need to stream your models inside your VIVERSE scene. You can also modify the Polygon Streaming parameters by going to Viverse Scene Settings / Polygon Streaming and changing them according to your preferences. You can check the Supported Parameters list below in this documentation.
After you publish your VIVERSE World scene, the Streaming Model will automatically start streaming inside the scene.
Once you have the Polygon Streaming main script imported into your project, start by creating a new empty entity in your scene. Give it the name "StreamController" and add a Script component to the entity.
Choose the StreamController script option; this opens all the options for controlling your streaming models inside your project. You can find detailed information on these parameters in the Stream Controller section of this guide.
Create a new empty entity, this time named "Streamable Model". Add a new Script component to the entity and this time choose the "streamableModel" script option. This creates the component that streams your model inside the scene.
Paste the URL address of your streaming model into the "Path or url to model" field. Once you have added the URL to the field, the object will preview inside your scene. This is just a preview; once the scene is Launched or Published, the streaming model will start actively streaming from the cloud.
After you publish your VIVERSE World scene, the Streaming Model will automatically start streaming inside your scene.
In this section, we're going to go through how to implement the PlayCanvas Plugin directly in HTML.
Then you need to define a new Entity, add it to the root, and define the parameters (described in PlayCanvas Editor Plugin Documentation | Polygon Streaming Settings).
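A minimal sketch of this setup is shown below. It is illustrative, not a verified implementation: the script names ("streamController", "streamableModel") follow the editor workflow described earlier in this guide, while the attribute name, script file paths, and model URL are assumptions to be replaced with your own values and checked against the parameter documentation.

```html
<!-- Minimal sketch: bootstrap the PlayCanvas engine, then create the
     controller and model entities described above. -->
<canvas id="application"></canvas>
<script src="playcanvas.min.js"></script>
<script src="PolygonStreaming.js"></script>
<script>
  const canvas = document.getElementById('application');
  const app = new pc.Application(canvas);
  app.start();

  // Stream controller: added once per scene.
  const controller = new pc.Entity('StreamController');
  controller.addComponent('script');
  controller.script.create('streamController');
  app.root.addChild(controller);

  // Streamable model: one entity per streamed asset.
  const model = new pc.Entity('Streamable Model');
  model.addComponent('script');
  model.script.create('streamableModel', {
    // Attribute name and URL are placeholders; see the parameter docs.
    attributes: { path: 'https://YOUR-POLYGON-STREAMING-URL/model.xrg' }
  });
  app.root.addChild(model);
</script>
```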
This page details how to control the settings of individual VIVERSE Polygon Streaming models within your PlayCanvas project when using the VIVERSE PlayCanvas extension.
Polygon Streaming component settings are only available when using the VIVERSE extension of the PlayCanvas editor.
The Streamable Model component, or Polygon Streaming (VIVERSE SDK) component, represents the model to be streamed inside your project. It asks for a Polygon Streaming URL, which is the address of your newly converted model. Once you paste the address into that field, it's ready to be streamed inside your scene.
The Quality Priority represents the priority of streaming data between multiple components. This means that some models can have a higher streaming priority than others. It accepts only integer values.
Use Alpha should be checked if your model has transparency or uses alpha in its materials or textures.
Use MetalRoughness should be checked to use the model's metal-roughness values; if unchecked, the model will be unlit. For both options, the default is On.
Cast Shadows, Cast Lightmap Shadows and Receive Shadows should be checked in order for the model to cast and receive shadows.
Double Sided Materials should be checked in case the model was designed to use double sided materials.
Initial Triangle Percentage is the percentage of triangles to be shown when the model first appears in the scene. This means that if you wish to show the model only once it has reached its highest level, you can set it to 1.
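As a rough sketch of how this percentage relates to the scene's Triangle Budget (the helper name and the direct multiplication are our illustration, not part of the plugin API):

```javascript
// Illustrative helper, not a plugin API: the model initially loads with
// roughly this share of the scene's Triangle Budget.
function initialTriangles(triangleBudget, initialTrianglePercentage) {
  return Math.floor(triangleBudget * initialTrianglePercentage);
}

// With the default budget of 5,000,000 and the default percentage of 0.5:
console.log(initialTriangles(5000000, 0.5)); // → 2500000
```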
Environment Asset allows you to set a custom environment map for the streamed model. If no environment asset is set, the scene's own environment is used.
Polygon Streaming URL
Path of the streamable model to be streamed into the project.
./model.xrg
Quality Priority
How much to prioritize the quality of this model relative to the quality of other models in the scene. This parameter does nothing if this is the only model in the scene.
1
Use Alpha
Keep on if the model uses any alpha information or transparency.
On
Use MetalRoughness
Keep at on if the model uses PBR materials. Off in case of unlit models.
On
Cast Shadows
Keep on in case the model needs to cast shadow.
On
Cast Lightmap Shadows
Keep on in case the model needs to cast lightmap shadows.
On
Receive Shadows
Keep on in case the model needs to receive shadows.
On
Double Sided Materials
Turn on in case the model needs double sided materials.
Off
Initial LOD
Override the LOD that the model initially loads.
-1
Environment Asset
Use either a cubemap asset with a prefiltered image or the prefiltered image as a texture asset
Empty
Initial triangle percentage
Percentage of the triangle budget used to initially load the model.
0.5
Layers
Which layers to render the model on.
Empty
This page details how to control the global settings of VIVERSE Polygon Streaming within your PlayCanvas project when using the VIVERSE PlayCanvas extension.
Polygon Streaming project settings are only available when using the VIVERSE extension of the PlayCanvas editor.
The Stream Controller or Polygon Streaming Settings (VIVERSE SDK) manages the streaming of models and streaming parameters inside your project. It can be found under Viverse Scene Settings.
Camera Type lets you choose between a Player camera and a Non-player camera. If you use an avatar, it should be set to Player camera. If you use just a viewer camera, without an avatar or character attached, it should be set to Non-player camera.
Next you can customize the Occlusion Culling options; it's recommended to keep the defaults.
The Triangle Budget is a limit on the number of triangles that will be drawn per frame. Increasing it leads to better visual quality, but also higher processing and memory utilization. It's recommended to set the Triangle Budget to at least 30% of the full polygon count you are going to stream. For example, if you are going to stream a 3D model of 10 million polygons, it's recommended to use a Triangle Budget of at least 3 million.
The Mobile Triangle Budget works the same as the Triangle Budget, but is applied when the system detects that the user is visiting the experience on a mobile device.
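The 30% rule of thumb and the mobile fallback described above can be sketched as follows (helper names are illustrative, not part of the plugin API; the fallback behavior for a mobile budget of 0 is taken from the parameter descriptions in this guide):

```javascript
// Rule of thumb from the text: a Triangle Budget of at least 30% of the
// total polygons you intend to stream.
function recommendedTriangleBudget(totalPolygons) {
  return Math.ceil(totalPolygons * 0.3);
}

// A Mobile Triangle Budget of 0 falls back to the non-mobile budget.
function effectiveBudget(isMobile, triangleBudget, mobileTriangleBudget) {
  if (isMobile && mobileTriangleBudget > 0) return mobileTriangleBudget;
  return triangleBudget;
}

// A 10-million-polygon model calls for a budget of at least 3 million.
console.log(recommendedTriangleBudget(10000000)); // → 3000000
```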
Minimum Distance and Maximum Quality should be kept as default.
The Distance Factor weighs the distance between the camera and the object being streamed. The default value of 1.1 is a neutral preference. A higher value, such as 3 or 5, gives preference to nearby objects over objects further away. This parameter can be changed at runtime to find the sweet spot for your scene.
The Distance Type can be set to Bounding Box or Bounding Box Center. With Bounding Box, the distance between camera and object is calculated from the edges of the object's bounding box; with Bounding Box Center, it's calculated from the center of the object. It's recommended to use Bounding Box for single objects, and Bounding Box Center for full environments where the user walks inside the object.
The Close Up Distance changes the distance factor when the camera gets close to a streaming object. For example, when the camera is at 3 units or less from the object, the value of the Close Up Distance Factor is used; when the camera is further than 3 units away, the value of the Distance Factor is used. This way the system forces stronger streaming of data when the camera is very close to an object.
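The close-up rule described above can be sketched as follows (the helper is our illustration, not plugin API; the "0 disables close-up" behavior comes from the Close Up Distance parameter description below):

```javascript
// Illustrative sketch of the close-up rule: within closeUpDistance units
// of the object, the close-up factor applies; a closeUpDistance of 0
// disables the close-up behavior entirely.
function activeDistanceFactor(distance, opts) {
  const { distanceFactor, closeUpDistance, closeUpDistanceFactor } = opts;
  if (closeUpDistance > 0 && distance <= closeUpDistance) {
    return closeUpDistanceFactor;
  }
  return distanceFactor;
}

// Defaults from the parameter table: factor 1.1, close-up distance 3, close-up factor 5.
const opts = { distanceFactor: 1.1, closeUpDistance: 3, closeUpDistanceFactor: 5 };
console.log(activeDistanceFactor(2, opts));  // → 5
console.log(activeDistanceFactor(10, opts)); // → 1.1
```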
Details of parameters are also set on the Supported Parameters section.
Camera Type
The viewer camera to be set, the system will calculate distances based on the selected camera.
Player Camera
Occlusion Culling
Enable Dynamic Occlusion Culling.
True
Occlusion Culling Geometry
Mesh: Use the mesh to check if it's occluded. Bounding Box: Use the bounding box of the mesh to check if it's occluded.
Bounding Box
Occlusion Query Frequency
Value is in times per second. A value of 0 means it will run on every frame.
8
Triangle Budget
The maximum amount of triangles that you want to be in the scene at any single point.
5000000
Mobile Triangle Budget
The triangle budget used on a mobile device. If it is set to 0 it will use the non-mobile triangle budget.
3000000
Distance Factor
Preference for nearby objects over objects further away. Values above one mean a preference for nearby objects. Values below one mean a preference for objects further away. One is neutral.
1.1
Distance Type
Distance type from the camera to the meshes' bounding boxes.
Bounding Box
Maximum Quality
Stops improving geometry that exceeds the maximum quality. This can be used to stop far away objects from showing more detail which can be wasteful. Leaving this at 0 means there is no maximum quality.
15000
Close Up Distance
The distance at which the close-up distance factor starts being used. Set it to 0 to disable the close-up distance factor.
3
Close Up Distance Factor
The distance factor used when close-up to an object. Should be higher than the standard distance factor.
5
This page details the process of setting up the VIVERSE Polygon Streaming plugin in Unity.
To start using the plugin, there are two Prefabs in the Plugin's folder, which are all you need to stream your models inside the Unity engine: the Stream Controller and the Streaming Model.
You can drag and drop both prefabs into your Unity scene and be ready to start streaming your converted 3D model inside the engine.
The Stream Controller should only be added once; it controls the streaming feeds inside your scene, and you only need to set the Main Camera in the Camera field. The Streaming Model is a prefab to be used for every model that will stream from the cloud into your project; you only need to add the URL of your converted model into the URL field.
Once that's set, press Play and your models will start streaming inside your project!
This page details basic information about using VIVERSE Polygon Streaming with Unity.
Polygon Streaming models can be added to Unity Engine based projects with our Unity Plugin, letting you stream multi-million-polygon models from the cloud into your projects and build for desktop, mobile, and VR headsets.
Open the Package Manager inside the Unity Engine. The plugin should ONLY be installed via the Package Manager, not by copying it into the project’s folder, since it also needs to install its dependencies automatically:
Press the + button, choose Add Package from tarball…, and select the downloaded .tgz file.
If everything went correctly, the package should now be visible in the Packages list in the Project panel.
Change Log
This page details the process of setting up the VIVERSE Polygon Streaming plugin in Unreal Engine.
To start using the plugin, you only need two main classes (Actors), found in the Plugin's folder or in the Classes menu, to stream your models inside Unreal Engine: the Stream Controller and the Streaming Model.
Drag and drop both Actors into your Unreal scene and you are ready to start streaming your converted 3D model inside the engine.
The Stream Controller should be added only once; it controls the streaming feeds inside your scene. The Streaming Model is an Actor used for every model streamed from the cloud into your project; you only need to add the URL of your converted model to its URL field to start streaming. You can add as many Streaming Models as you want to your scene.
Once that's set, press Play and your models will start streaming inside your project!
This page details how to control the global settings of VIVERSE Polygon Streaming within your Unity project.
The Stream Controller manages the streaming of models and the streaming parameters inside your project. There must be only one Stream Controller per scene, since it controls all Streaming Models together.
The Stream Controller needs a reference to a Camera object, which is used to calculate the distance between the viewer and each object. It also exposes several parameters and features to help you get the best streaming quality and control.
The Triangle Budget is a limit on the number of triangles drawn per frame. Increasing it leads to better visual quality, but of course also higher processing and memory utilization. It's recommended to set the Triangle Budget to at least 30% of the total polygon count you are going to stream. For example, if you are going to stream a 3D model of 10 million polygons, it's recommended to use a Triangle Budget of at least 3 million.
If you use a Triangle Budget that is too low, for example 500 thousand to stream 10 million polygons, you may reach the budget before the model's higher quality can be shown, resulting in only a lower-quality version of the model being displayed in order to stay within the budget.
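As a quick illustration of the 30% rule of thumb above, a small helper like this (hypothetical, not part of the plugin) computes the recommended minimum budget from a model's polygon count:

```python
def recommended_triangle_budget(total_polygons, fraction=0.30):
    """Recommended minimum Triangle Budget: at least 30% of the
    total polygon count being streamed (per the rule of thumb above)."""
    return int(total_polygons * fraction)

# A 10-million-polygon model calls for a budget of at least 3 million.
print(recommended_triangle_budget(10_000_000))  # 3000000
```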
The Distance Factor weights the distance between the camera and the objects being streamed. The default value of 1.1 is close to neutral (1 is neutral). A higher value, such as 3 or 5, prefers nearby objects over objects further away. These parameters can be changed at runtime to find the sweet spot for your scene.
The Close Up Distance switches the distance factor used when the camera gets close to a streaming object. For example, with a Close Up Distance of 3, the Close Up Distance Factor is used whenever the camera is within 3 units of the object; beyond 3 units, the regular Distance Factor applies. This way the system forces strong streaming of data when the camera is very close to an object.
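The selection between the two factors can be sketched as follows; the function and parameter names are illustrative, not the plugin's API:

```python
def effective_distance_factor(camera_distance,
                              close_up_distance=3.0,
                              distance_factor=1.1,
                              close_up_distance_factor=5.0):
    """Pick the distance factor as described in the docs: within the
    Close Up Distance, use the Close Up Distance Factor; otherwise use
    the regular Distance Factor. A Close Up Distance of 0 disables
    the close-up behavior entirely."""
    if close_up_distance > 0 and camera_distance <= close_up_distance:
        return close_up_distance_factor
    return distance_factor

print(effective_distance_factor(2.0))   # 5.0 (camera is close up)
print(effective_distance_factor(10.0))  # 1.1 (normal distance factor)
```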
The Distance Type can be set to Bounding Box or Bounding Box Center. With Bounding Box, the distance between the camera and the object is measured to the edges of the object's bounding box; with Bounding Box Center, it is measured to the center of the object. Bounding Box is recommended for single objects, and Bounding Box Center for full environments where the user walks around inside the object.
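The difference between the two modes can be sketched with an axis-aligned bounding box; this is illustrative code, not the plugin's implementation:

```python
import math

def distance_to_box_center(camera, box_min, box_max):
    """Bounding Box Center mode: distance from the camera to the box center."""
    center = [(lo + hi) / 2 for lo, hi in zip(box_min, box_max)]
    return math.dist(camera, center)

def distance_to_box(camera, box_min, box_max):
    """Bounding Box mode: distance from the camera to the nearest point
    of the box (0 when the camera is inside, e.g. inside an environment)."""
    nearest = [min(max(c, lo), hi) for c, lo, hi in zip(camera, box_min, box_max)]
    return math.dist(camera, nearest)

# A camera standing inside a large environment model:
print(distance_to_box((0, 0, 0), (-10, -10, -10), (10, 10, 10)))   # 0.0
# The same camera looking at a nearby single object:
print(distance_to_box((0, 0, 0), (2, -1, -1), (10, 1, 1)))         # 2.0
print(distance_to_box_center((0, 0, 0), (2, -1, -1), (10, 1, 1)))  # 6.0
```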
The other parameters can be left at their defaults; see the Supported Parameters section for an explanation of each.
This page details how to control the settings of individual VIVERSE Polygon Streaming models within your Unity project.
The Streaming Model prefab represents the model to be streamed inside your project. A Streaming Model component always needs a reference to the Stream Controller component and a Source URL (the URL of your converted model) to function. Once those two are correctly set, your model will start streaming inside your project as soon as you press Play.
The Stream Controller is a reference to the Stream Controller inside your scene and needs to be set, and the Source URL is the URL of your converted model that needs to be pasted in the field.
If your project uses Light Probes, tick the Light Probe Usage box. If you would like to use a custom material on your streaming model, tick the Custom Material box, which opens a new panel that lets you change the material.
You can also set different Quality Priority options by clicking Streaming Runtime Options Settings. This lets some models stream with higher priority than others. The value must be an integer and works with the following logic:
You can also turn the Streaming Model Preview script on or off to preview your streaming model in the scene without entering Play mode.
This page details basic information about using VIVERSE Polygon Streaming with Unreal Engine.
Polygon Streaming models can be added to Unreal Engine based projects with the Unreal Engine Plugin, allowing models with millions of polygons to stream from the cloud into your Windows-based projects and applications.
Once the plugin’s zip file is downloaded from the Polygon Streaming Web Console, the “PolygonStreaming-Unreal-Version” folder should be added to the “Plugins” folder of your Unreal Engine project.
If the “Plugins” folder does not exist, create a new folder named “Plugins” and place the plugin’s folder inside it. Also make sure your project is closed before installing the plugin.
After adding the “PolygonStreaming-Unreal-Version” folder to the “Plugins” folder of your Unreal Engine project, double-click your project’s file (.uproject). If it asks you to re-build the project, choose “Yes”. The project will be rebuilt, and once it opens, the Polygon Streaming plugin will be installed.
Sometimes it may appear that nothing is happening and the Unreal Engine welcome screen may not show. Don’t worry; give it some time to build and compile, and your project should open normally after a couple of minutes.
The plugin is only available for C++ based projects, not for Blueprint-only projects. If you want to use it in a Blueprint based project, you will first need to add C++ classes to make the project C++ enabled.
Change Log
Viewer Camera
The viewer camera to use; the system calculates distances based on the selected camera.
None
Triangle Budget
The maximum number of triangles that you want to allow in the scene at any single point.
150000000
Distance Factor
Preference for nearby objects over objects further away. Values above one mean a preference for nearby objects. Values below one mean a preference for objects further away. One is neutral.
1.1
Close Up Distance
The distance where it starts using the Close Up Distance Factor instead of the Distance Factor. Set it to 0 to disable the close-up distance factor.
3
Close Up Distance Factor
The distance factor used when the camera is close up to an object. Should be higher than the standard Distance Factor.
5
Distance To
Distance type from the camera to the meshes' bounding boxes.
Bounding Box
Max Download Size Mb
Pause downloading more data when the current active downloads exceed this threshold.
4
Maximum Quality
Stops improving geometry that exceeds the maximum quality. This can be used to stop far away objects from showing more detail which can be wasteful. Leaving this at 0 means there is no maximum quality.
0
Occlusion Culling
Enable Dynamic Occlusion Culling.
True
Raycast Mask
Set masks for the occlusion culling raycast.
Everything
Time Slicing
Determines how many frames should be accumulated before deciding whether an object should be hidden.
120
Ray Per Frame
How many rays are generated per frame.
256
Stream Controller
Set the Stream Controller from the scene.
None
Source Url
Address of the Streamable Model to be streamed into the scene.
None
Normals
Calculation parameter for the normals of the model being streamed.
Original Vertex Normals
Light Probe Usage
Enable if using Light Probes in the scene.
False
Custom Material
Enable if you wish to use custom materials.
False
Quality Priority
How much to prioritize the quality of this model relative to the quality of other models in the scene. This parameter does nothing if this is the only model in the scene.
1
0
Highest Priority
1
Lowest Priority
2+
Higher priority than the previous number, but still lower priority than 0 (zero).
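This ordering (0 highest, then larger integers outranking smaller ones, with 1 lowest) can be sketched as a sort key; illustrative code, not the plugin's implementation:

```python
def priority_rank(quality_priority):
    """Sort key: smaller rank means served first.
    0 is the highest priority; among all other values, larger
    numbers outrank smaller ones, so 1 is the lowest."""
    if quality_priority == 0:
        return (0, 0)
    return (1, -quality_priority)

# Models ordered from highest to lowest streaming priority:
print(sorted([1, 0, 3, 2], key=priority_rank))  # [0, 3, 2, 1]
```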
01/02/2025
2.7.1
Hotfix for model updating bug
Support for the updated converter and newly converted models (2.11.1) at player level
Small performance optimizations
10/28/2024
2.5.0
Support Vertex Color
Support Multi UV Set
Support KHR_materials_emissive_strength in BRP
Improved occlusion culling performance
Distance Type set to only Bounding Box Center
Fix shader bug in Android devices
9/16/2024
2.4.2
Fix shader bug in Android devices
9/10/2024
2.4.1
Fix Preview bug
Fix KHR_texture_transform bug
Fix KHR_materials_specularGlossiness bug
7/30/2024
2.4.0
Support KHR_materials_pbrSpecularGlossiness
7/05/2024
2.3.8
Fix Texture Transform Issue
7/03/2024
2.3.7
Fix Double Sided Material Transparent Issue
Add Signature Check
6/12/2024
2.3.6
Support xrgc version 4 - read materials from info.json
Fix XrgcMaterial default value spec
5/22/2024
2.3.5
Fixed all found memory leaks.
Support Double Sided Material in BRP and URP.
Handling Stream Controller destroying/disabling edge cases
Free up resources when hiding/disabling models
Hotfix light probe usage.
Quality Priority now works only with integer numbers and follows this logic: 0 = highest priority. 1 = lowest priority. 2 or more = higher priority than the lower number, but still lower priority than 0.
4/26/2024
2.3.4
Hotfix light probe usage
4/15/2024
2.3.3
Handle destroy / disable more gracefully
Free up streamingcontroller resources when hiding / disabling models
3/23/2024
2.3.2
Add Light Probe Usage in Streaming Model
Small fixes for memory and performance
2/22/2024
2.3.1
Fix Triangle Budget Bug
Remove duplicated metallic roughness and occlusion texture
2.0.2
Updated to work with xrgc5 format
2.0.1
Updated the plugin to work with Unreal 5.4.x versions.
2.0.0
Added support for KTX textures.
Added model preview to StreamingModel actor.
This page details how to control the global settings of VIVERSE Polygon Streaming within your Unreal project.
The Stream Controller manages the streaming of models and the streaming parameters inside your project. There must be only one Stream Controller per scene, since it controls all Streaming Models together.
The Triangle Budget is a limit on the number of triangles drawn per frame. Increasing it leads to better visual quality, but of course also higher processing and memory utilization. It's recommended to set the Triangle Budget to at least 30% of the total polygon count you are going to stream. For example, if you are going to stream a 3D model of 10 million polygons, it's recommended to use a Triangle Budget of at least 3 million.
If you use a Triangle Budget that is too low, for example 500 thousand to stream 10 million polygons, you may reach the budget before the model's higher quality can be shown, resulting in only a lower-quality version of the model being displayed in order to stay within the budget.
The Distance Factor weights the distance between the camera and the objects being streamed. The default value of 1.1 is close to neutral (1 is neutral). A higher value, such as 3 or 5, prefers nearby objects over objects further away. These parameters can be changed at runtime to find the sweet spot for your scene.
The Distance Type can be set to Bounding Box or Bounding Box Center. With Bounding Box, the distance between the camera and the object is measured to the edges of the object's bounding box; with Bounding Box Center, it is measured to the center of the object. Bounding Box is recommended for single objects, and Bounding Box Center for full environments where the user walks around inside the object.
The Close Up Distance switches the distance factor used when the camera gets close to a streaming object. For example, with a Close Up Distance of 3, the Close Up Distance Factor is used whenever the camera is within 3 units of the object; beyond 3 units, the regular Distance Factor applies. This way the system forces strong streaming of data when the camera is very close to an object.
The other parameters can be left at their defaults; see the Supported Parameters section for an explanation of each.
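One of those parameters, Maximum Quality, acts as a cap where 0 means no cap; a minimal sketch of that behavior (hypothetical helper, not the plugin's code):

```python
def capped_quality(target_quality, maximum_quality):
    """Clamp the quality level an object may reach.
    Per the parameter description, a Maximum Quality of 0
    means there is no maximum at all."""
    if maximum_quality == 0:
        return target_quality
    return min(target_quality, maximum_quality)

print(capped_quality(50_000, 15_000))  # 15000 (capped)
print(capped_quality(50_000, 0))       # 50000 (uncapped)
```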
Triangle Budget
The maximum number of triangles that you want to allow in the scene at any single point.
3000000
Distance Factor
Preference for nearby objects over objects further away. Values above one mean a preference for nearby objects. Values below one mean a preference for objects further away. One is neutral.
1.1
Distance Type
Distance type from the camera to the meshes' bounding boxes.
Bounding Box
Close Up Distance
The distance where it starts using the Close Up Distance Factor instead of the Distance Factor. Set it to 0 to disable the close-up distance factor.
3
Close Up Distance Factor
The distance factor used when the camera is close up to an object. Should be higher than the standard Distance Factor.
5
Maximum Quality
Stops improving geometry that exceeds the maximum quality. This can be used to stop far away objects from showing more detail which can be wasteful. Leaving this at 0 means there is no maximum quality.
3000000
This page details how to control the settings of individual VIVERSE Polygon Streaming models within your Unreal project.
The Streaming Model Actor represents the model to be streamed inside your project. A Streaming Model component always needs a reference to the Stream Controller component to function and a Source URL which is the Asset ID URL of your converted model. Once those two are correctly set, your model will start streaming inside your project as soon as you press Play.
The URL field is the Asset ID address of your streaming model, once the Asset ID is pasted in this field, the model can be streamed inside your scene.
You can also set different Quality Priority options by clicking Streaming Runtime Options Settings. This lets some models stream with higher priority than others. The value must be an integer and works with the following logic:
The Stream Controller is a reference to the Stream Controller inside your scene and it needs to be set, otherwise the model won’t appear or be streamed inside your scene.
The Show Preview option creates a simple preview of the model in the editor scene, so you can position and scale your model correctly without running the game.
If you would like to use a custom material on your streaming model, tick the Custom Material box, which opens a new panel that lets you change the model's materials.
Name
Description
Default value
Source URL
Asset ID address of the Streamable Model to be streamed into the scene.
None
Quality Priority
How much to prioritize the quality of this model relative to the quality of other models in the scene. This parameter does nothing if this is the only model in the scene.
1
Stream Controller
Set the Stream Controller from the scene.
None
Show Preview
Enable model preview in editor scene.
True
Custom Material
Enable if you wish to use custom materials.
False
0
Highest Priority
1
Lowest Priority
2+
Higher priority than the previous number, but still lower priority than 0 (zero).