This page details important considerations for avoiding errors when working with VIVERSE Polygon Streaming.
When working with high-fidelity 3D content using Polygon Streaming, it's essential to differentiate between Preview Mode and Play Mode (also known as Runtime, Live Mode, or Launch Mode depending on the game engine). For simplicity, we will use "Play Mode" as a shared term throughout this article, although each engine may refer to it differently. Understanding these modes is crucial to ensure that developers are testing their models under realistic conditions. Below, we'll explore what to expect when using Polygon Streaming in PlayCanvas, Unity, and Unreal Engine, and provide clear instructions on how to test models effectively in each environment.
Preview Mode: This mode provides a quick visualization within the editor, allowing developers to see a basic representation of their models and adjust placements in the scene. It offers a rough approximation and may show visual artifacts. This mode does not fully utilize Polygon Streaming capabilities and should not be used for assessing the final quality or behavior of the models.
Play Mode: Known by different names such as Runtime or Launch Mode in various engines, Play Mode simulates real-world conditions by running the application as it would be experienced by end-users. Polygon Streaming is fully active in Play Mode, ensuring high-quality rendering and optimal performance. Testing in Play Mode is essential for final model evaluation.
In PlayCanvas, models utilizing Polygon Streaming are treated differently between modes:
Preview Mode: In PlayCanvas, Polygon Streaming models are not visible in Preview Mode. This mode only shows basic placeholders, making it unsuitable for evaluating the model’s quality or performance. Developers can use this mode for simple adjustments and placements, but it doesn't reflect the actual capabilities of Polygon Streaming.
Play Mode: Known as "Launch Mode" in PlayCanvas, this mode is activated by clicking the "Launch" button. Play Mode is where Polygon Streaming becomes fully operational, streaming the model from the cloud and rendering it in high fidelity. Testing models in Play Mode is crucial for an accurate assessment of both performance and quality.
In Unity, the terminology and functionality align with industry standards, using Edit Mode and Play Mode:
Edit Mode: Similar to Preview Mode, Edit Mode allows developers to place and adjust models within the Unity editor. While models can be viewed in this mode, the representation might include artifacts, and streaming optimizations are not fully applied. Edit Mode should not be relied upon for evaluating final model quality or performance.
Play Mode: Activated by pressing the "Play" button in the Unity interface, this mode initiates the full capabilities of Polygon Streaming. The models are streamed in real-time from the cloud, showcasing their high-quality rendering and optimized performance. Play Mode is essential for realistic testing and accurate visualization.
Unreal Engine follows a similar structure, with Editor Mode and Play Mode being key concepts:
Editor Mode: Referred to as Preview Mode in other contexts, this mode in Unreal Engine allows for basic visualization and placement of models. However, it does not utilize the full power of Polygon Streaming, and visual artifacts may be present. Editor Mode is useful for setup but not for final evaluation.
Play Mode: By clicking the "Play" button in Unreal Engine, developers activate Play Mode, where the game is simulated under real-world conditions. Polygon Streaming is fully engaged, streaming high-quality models from the cloud and ensuring that the performance and appearance meet expectations. Play Mode is crucial for validating the final output.
Apart from the in-editor modes like Preview and Play Modes, it's also important to understand how the "Model Preview" feature works within the console. When you upload a model to the Polygon Streaming platform, you can access a "Model Preview" option. This preview opens in a new browser window and provides a basic visualization of your model, giving you a quick way to see the model's appearance and structure.
However, like Preview Mode in development environments, the "Model Preview" is not indicative of the final streamed quality. It serves more as a preliminary check for the model's appearance rather than a comprehensive evaluation of its streaming performance or interactive capabilities. Developers should always perform thorough testing using Play Mode in their chosen game engine to fully understand how their models will behave in a real-world application.
This page details important release notes on convertor releases, version changelogs, bug and feature hotfixes, and compatibility dependencies.
Current version: 2.12.6
02/24/2025
2.12.6
Major Improvements (vs 2.11.1) for converted models
Up to 60% fewer draw calls
Up to 33% fewer total textures
Up to 32% fewer total materials
Up to 12% less average memory used
Updates
Super Texture Atlas Merging: Texture data can now be merged into super texture atlases. This minimizes draw calls caused by many separate small textures in a 3D model and makes the data more efficient to stream.
Lightmap Support: Lightmaps are now supported via a separate file. You can now upload a ZIP file containing your .GLB model and a file called “lightmap.hdr” or “lightmap.png” - this lightmap will be automatically applied to the Streaming Model inside the PlayCanvas Engine running with the VIVERSE Create SDK. Currently only one lightmap is supported.
Improved Update Logic: With the use of Virtual Bounding Boxes, the update logic is now improved to calculate distances more accurately according to the actual visible mesh geometry in the scene. The camera frustum is also improved with a perspective bias update logic, making what the user is actually seeing more important and prioritized during streaming.
Performance Improvements: Several improvements and modifications regarding performance, such as Material Merging and Super Texture Atlas. Large environment scenes will see bigger improvements with minimized draw calls usage and optimized VRAM usage.
Bug fixes
Compatibility Dependencies
Requires PlayCanvas Plugin version 2.5.1 or higher
Requires Unity Plugin version 2.7.3 or higher
01/13/2025
2.11.1
Major Improvements (vs 2.9.8) for converted models
Up to 30% fewer draw calls
Up to 30% less memory for textures used
Up to 49% smaller converted file sizes
Up to 67% less total memory used
Updates
Improved Decimation Algorithm: Automatically configures the best possible algorithm when creating LODs, with as little visual quality loss as possible.
Shared Materials Feature: The system can now look up the materials in use, combine compatible texture-less materials, and remove duplicates from a streaming model. This optimizes the usage of texture-less materials and minimizes the draw calls created by duplicates or by several similar materials in a model.
Removed the Need for Baked Textures: Drastically reduces the memory needed for textures, bringing an important gain in overall performance.
Bug fixes
Compatibility Dependencies
Requires PlayCanvas Plugin version 2.5.0 or higher
Requires Unity Plugin version 2.7.1 or higher
10/28/2024
2.9.8
Updates
Bug fixes and minor performance updates
Compatibility Dependencies
Requires PlayCanvas Plugin version 2.5.0 or higher
Requires Unity Plugin version 2.5.0 or higher
Requires Unreal Engine Plugin version 2.0.2 or higher
This guide provides instructions for setting up an iFrame with a Polygon Streaming object to implement into your webpage, e-commerce product page, or any other type of website.
Copy the iFrame snippet below.
Paste it right into your webpage editor. We are using Shopify as our example e-commerce platform.
Adjust the width and height to fit your web design.
Publish your website to preview.
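For reference, a minimal snippet of this shape can look like the following. The `src` URL below is a placeholder, not a real address; replace it with the embed link copied from your Polygon Streaming console, and adjust `width` and `height` to fit your design:

```html
<!-- Placeholder URL: substitute the embed link from your own console -->
<iframe
  src="https://stream.example.com/preview/your-model-id"
  width="800"
  height="600"
  allow="fullscreen; xr-spatial-tracking"
  style="border: 0;">
</iframe>
```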
In this guide you will find information on how to prepare your 3D models to achieve the best outcomes when converting to Polygon Streaming solutions and streaming your models within the supported platforms and devices.
This approach differs from older approaches in that instead of using approximations for the way in which light interacts with a surface, a physically correct model is used. The idea is that, instead of tweaking materials to look good under specific lighting, a material can be created that will react correctly under all lighting scenarios.
If your models were created using legacy standards such as Lambert, Blinn-Phong, or Specular-Roughness, they should be converted to a PBR (Physically Based Rendering) standard before conversion in order to achieve a correct representation of the original model's materials and textures.
Models using legacy or custom standards can still be converted; however, a correct representation of the materials and textures cannot be guaranteed. Using custom standards or custom shader elements may also cause the conversion to fail, since the system won't be able to properly translate custom elements.
Polygon Streaming maintains the origin point, position, and scale of all 3D objects. When converting entire environments with multiple objects, each object will automatically be positioned at its origin point, so make sure your objects are placed at the correct origin. If a model sits far from the 0,0,0 origin in the source file, that incorrect origin will carry over after conversion. The same goes for the general scale of the model: the scale set on the original model is preserved after conversion. Polygon Streaming uses meter units by default, the same as Blender, Unity, and Unreal Engine. If the model follows another unit system or is modeled very small, it will appear very small after conversion.
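As a quick pre-flight check before upload, you can compute how far a model's bounding-box center sits from the origin. This is a minimal sketch with an assumed data shape (min/max corners as `[x, y, z]` arrays); it is not part of any Polygon Streaming API:

```javascript
// Flag a model whose bounding-box center sits far from the 0,0,0 origin;
// an off-origin placement in the source file carries over after conversion.
function farFromOrigin(bboxMin, bboxMax, threshold = 100) {
  // Center of the bounding box, per axis.
  const center = bboxMin.map((v, i) => (v + bboxMax[i]) / 2);
  // Euclidean distance of the center from the 0,0,0 origin.
  const distance = Math.hypot(...center);
  return distance > threshold;
}
```

For example, a model centered at the origin passes, while one modeled a kilometer away from 0,0,0 is flagged and should be re-centered in the source application before conversion.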
Make sure your model's faces point in the correct direction. A face oriented the wrong way (also known as flipped normals) will appear transparent depending on the user's viewpoint. Face orientation is a design choice and therefore subjective; it's not possible for an automated system to know which way each face should point. The system follows the original model's orientation, so a face pointing the wrong direction in the original model will also point the wrong direction after conversion.
Z-fighting is a common issue in real-time rendering. It happens when two meshes overlap and the GPU is unable to determine which one to render first. It's especially common when displaying NURBS-based models, like CAD or BIM models, in a real-time rendering scenario. To get a proper real-time display of your streaming model, avoid creating overlapping meshes.
Sometimes a 3D model contains geometry that is hidden or enclosed by another object and will never be seen by the end user. Although there is no issue in converting a model with hidden data (it won't be rendered, since our player plugins use Occlusion Culling), it's good practice to remove any data that will be hidden. This makes the overall model smaller, more concise, faster to convert, and lighter to stream.
It's good practice to maintain proper mesh topology. There is no issue in converting a model with bad topology; however, any 3D visualization issues caused by bad topology will carry over to the streaming model. A concise, sensible, well-made topology will always give much better results than one with odd-looking patches, holes, or malformed triangles and faces. The better the model is built and the more sensible its topology, the better it will stream.
Physically based rendering (PBR) has recently become the standard in many 3D applications, such as , , and 3D Web platforms. Polygon Streaming is also fully based on this standard, for a more performant and realistic representation of materials and textures.
This document guides you through the three options available for integrating VIVERSE Polygon Streaming into PlayCanvas projects.
Polygon Streaming models can be added to PlayCanvas projects with the use of our PlayCanvas plugins. These plugins will help to stream high polygon models from the cloud into your projects and build for the web. The table below breaks down the plugins and their usage.
Option 1: Chrome Browser Extension Plugin
Used for projects targeting publication to VIVERSE. Dependent on the PlayCanvas Chrome Browser extension being installed.
Option 2: Standalone Plugin
Used for projects NOT targeting publication to VIVERSE. Dependent on creation of PlayCanvas entities in editor.
Option 3: HTML Scripting
Used for projects NOT targeting publication to VIVERSE. Dependent on manually adding HTML code to PlayCanvas project.
This page introduces you to the goals and technology that make up VIVERSE Polygon Streaming.
VIVERSE Polygon Streaming is an intelligent visualization core technology that overcomes modern-day 3D visualization challenges. It unleashes the potential to stream spatial content from the cloud and seamlessly shares high-fidelity models and worlds on any device.
Polygon Streaming aims to provide creators, developers, designers, and enterprises with tools to build immersive, high-quality 3D content that is widely compatible with the most popular game engines and can be experienced across different platforms.
VIVERSE Polygon Streaming focuses on the next shift in computing and the future of the internet. It breaks down barriers and enables accessibility from any device (mobile, tablet, PC, and even XR headsets), allowing people to create, share experiences, and collaborate like never before.
Allows end-users to consume large 3D models rapidly by downloading only the geometry that has relevant visual impact for their specific position and bandwidth. With Polygon Streaming, you can stream large 3D models into your platform or projects without taking up local storage space, maximizing the number of polygons that can be shown in a cross-platform, multi-device way.
Only the geometry needed for the user's current position and field of view in the scene is transferred over time.
Automatic LOD generation and texture compression. Texture compression method optimized for lightweight decompression.
Minimal back-end server requirements, only requiring HTTP requests.
Utilize the end-user's GPU for full rendering. No Cloud GPU is necessary at any step.
For license & agreement, please click on this .
This page details the file types and formats supported in VIVERSE Polygon Streaming.
Polygon Streaming supports the following formats:
.GLB
.glTF (zipped)
.OBJ (zipped)
It's highly recommended to use a single .GLB file to convert your 3D models. This ensures the file contains all the necessary data and that it follows the correct standards used during conversion. 3D models must be a triangulated mesh with PBR materials following the metallic-roughness standard.
If you use a .glTF model, all the necessary files need to be zipped into a single .zip file. It should contain at least a .gltf file, a .bin file, and all texture images in .jpg or .png format.
If you use an .OBJ model, all the necessary files need to be zipped into a single .zip file. It should contain an .obj file, a .mtl file, and all texture images in .jpg or .png format, making sure they are properly set and connected. You can also upload a single .obj file if no textures or .mtl file are needed.
JPG
PNG
BMP
KTX
Diffuse/Albedo
Metallic/Roughness
Transparency/Opacity
Normals
Ambient Occlusion
Emissive
KHR_draco_mesh_compression
KHR_materials_unlit
KHR_texture_basisu
KHR_materials_specular
KHR_materials_pbrSpecularGlossiness
KHR_texture_transform
KHR_materials_ior
KHR_materials_transmission
KHR_materials_volume
KHR_materials_clearcoat
This page details how to control the settings of individual VIVERSE Polygon Streaming models within your PlayCanvas project when using the VIVERSE PlayCanvas extension.
The Streamable Model Component or Polygon Streaming (VIVERSE SDK) component represents the model to be streamed inside your project. It asks for a Polygon Streaming URL, which is the address of your newly converted model. Once you paste the address in that field, it's ready to be streamed inside your scene.
The Quality Priority represents the priority of streaming data between multiple components, meaning some models can have a higher streaming priority than others. It accepts only integer numbers and works in the following logic: 0 is the highest priority, 1 is the lowest priority, and values of 2 or more have higher priority than lower numbers while still having lower priority than 0.
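This ordering (0 = highest, 1 = lowest, larger numbers in between, as listed in the plugin changelog) can be sketched as a small helper. The functions below are illustrative only and are not part of the plugin API:

```javascript
// Illustrative ranking of Quality Priority values (not a plugin API):
// 0 = highest priority, 1 = lowest, and 2 or more rank above lower numbers
// while still ranking below 0.
function priorityRank(priority) {
  if (priority === 0) return Infinity; // 0 always wins
  return priority;                     // 1 is lowest; 2, 3, ... rank higher
}

// Pick which of two models should receive streaming data first.
function higherPriority(a, b) {
  return priorityRank(a) >= priorityRank(b) ? a : b;
}
```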
Use Alpha should be checked if your model has transparency or uses alpha in its materials or textures.
Use MetalRoughness should be checked to use the model's metallic-roughness values; if unchecked, the model will be rendered unlit. For both options, the default is On.
Cast Shadows, Cast Lightmap Shadows and Receive Shadows should be checked in order for the model to cast and receive shadows.
Double Sided Materials should be checked in case the model was designed to use double sided materials.
Initial Triangle Percentage is the percentage of triangles to be shown when the model first appears in the scene. If you wish to show the model only once it has updated to the highest level, you can set it to 1.
Environment Asset allows you to set a custom environment map for the streamed model. If no environment asset is set, the environment of the scene itself is used.
Polygon Streaming URL
Path of the streamable model to be streamed into the project.
./model.xrg
Quality Priority
How much to prioritize the quality of this model relative to the quality of other models in the scene. This parameter does nothing if this is the only model in the scene.
1
Use Alpha
Keep on if the model uses any alpha information or transparency.
On
Use MetalRoughness
Keep at on if the model uses PBR materials. Off in case of unlit models.
On
Cast Shadows
Keep on in case the model needs to cast shadow.
On
Cast Lightmap Shadows
Keep on in case the model needs to cast lightmap shadows.
On
Receive Shadows
Keep on in case the model needs to receive shadows.
On
Double Sided Materials
Turn on in case the model needs double sided materials.
Off
Initial LOD
Override the LOD that the model initially loads.
-1
Environment Asset
Use either a cubemap asset with a prefiltered image, or the prefiltered image as a texture asset.
Empty
Initial triangle percentage
Percentage of the triangle budget used to initially load the model.
0.5
Layers
Which layers to render the model on.
Empty
This document provides a guide for using HTML Scripting to integrate Polygon Streaming into a PlayCanvas project that does not target publication to VIVERSE.
For PlayCanvas projects that are not targeting publication to VIVERSE, this PlayCanvas Direct HTML Plugin is the second path for integrating Polygon Streaming into those projects. Because these projects will not be published to VIVERSE, we've created a direct HTML version of the plugin that does not utilize the PlayCanvas VIVERSE Chrome browser extension.
This document provides a guide for using the Chrome Browser Extension to integrate Polygon Streaming into a PlayCanvas project that targets publication to VIVERSE.
Start by opening your PlayCanvas project and setting it up with the VIVERSE Extension. Once the extension is set up and you're logged in, you are ready to add special components. To add a Polygon Streaming component to the scene, add an Empty component to the scene and give it a name like "Streaming Model".
With the new component selected, click on EDIT VIVERSE EXTENSION. Choose the Media type of plugin, select the PolygonStreaming module, and click the plus (+) symbol. You have now added a Polygon Streaming component to your scene.
Paste the Asset ID of your streaming model into the Polygon Streaming URL field. Once you've added the URL, the object will preview inside your scene. This is just a preview; once published, the streaming model will start actively streaming from the cloud.
That's all you need to stream your models inside your VIVERSE scene. You can also modify the Polygon Streaming parameters by going to VIVERSE Scene Settings / Polygon Streaming and changing them according to your preferences. You can check the Supported Parameters list below in this documentation.
After you publish your VIVERSE World scene, the Streaming Model will automatically start streaming inside the scene.
This page details the process of setting up the VIVERSE Polygon Streaming plugin in Unity.
To start using the plugin, there are two Prefabs that can be found in the Plugin's folder which will be all you need to stream your models inside Unity Engine. The Stream Controller, and the Streaming Model.
You can drag and drop both prefabs inside your Unity Scene and be ready to start streaming your converted 3D model inside the engine.
The Stream Controller should only be added once; it controls the streaming feeds inside your scene, and you only need to set the Main Camera in its Camera field. The Streaming Model is a prefab to be used for every model that will stream from the cloud into your project; you only need to add the URL of your converted model into its URL field.
Once that's set, press play and your streaming models will already start streaming inside your project!
This page details basic information about using VIVERSE Polygon Streaming with Unity.
Polygon Streaming models can be added to Unity Engine based projects with our Unity Plugin, enabling multi-million polygon models to stream from the cloud into your projects and builds for desktop, mobile, and VR headsets.
Open the Package Manager inside the Unity Engine. The plugin should ONLY be installed via the Package Manager, not by copying it into the project's folder, since it also needs to automatically install dependencies:
Press the + button and choose Add Package from tarball… and select the downloaded .tgz file.
If everything went correctly, the package should now be visible under the Packages list in the project panel.
Change Log
This page guides you through the steps of uploading an asset to the Polygon Streaming platform.
After selecting your file you will be able to click Convert to convert the 3D asset into a streaming model. You can also choose different options before conversion:
Default Resolution uses the default resolution for your model's textures, a middle ground between low and high resolution.
Low Resolution lowers the resolution of textures, making the file lighter to stream in exchange for a drop in visual quality.
High Resolution ensures the resolution of textures is as high as possible, for the highest visual quality. However, depending on the model, especially very large models like full environments, it could make the file heavy to stream and display.
Once you choose the desired options you can press Convert and the conversion of the model will start. Once the conversion is done, the converted model will appear in the Models panel.
In the Models panel you can click the icons next to your model to either preview it in a new browser tab, or copy the converted URL address to use inside your projects via the Polygon Streaming integrations and plugins.
This page details how to control the global settings of VIVERSE Polygon Streaming within your Unity project.
The Stream Controller manages the streaming of models and the streaming parameters inside your project. There must be only one Stream Controller per scene, since it controls all Streaming Models as one.
The Stream Controller needs a reference to a Camera object, which the system uses to calculate the distance between viewer and object. It also contains several parameters and features to help you achieve the best streaming quality and control.
The Triangle Budget is a limit on the number of triangles that will be drawn per frame. Increasing it leads to better visual quality, but also higher processing and memory utilization. It's recommended to keep the Triangle Budget at least 30% of the full polygon count you are going to stream. For example, if you are going to stream a 3D model of 10 million polygons, it's recommended to use a Triangle Budget of at least 3 million.
If you use too low a Triangle Budget, for example 500 thousand to stream 10 million polygons, you may exhaust the budget before the model's higher quality can be shown, resulting in only a lower-quality version of the model in order to stay within the budget.
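The 30% rule of thumb above can be expressed directly. This is a hypothetical convenience helper for planning your settings, not part of the plugin:

```javascript
// Recommended minimum Triangle Budget: at least 30% of the total polygons
// you intend to stream (per the guideline above).
function recommendedTriangleBudget(totalPolygons, fraction = 0.3) {
  return Math.round(totalPolygons * fraction);
}
```

For a 10-million-polygon model, `recommendedTriangleBudget(10000000)` gives a floor of 3 million triangles, matching the example in the text.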
The Distance Factor weights the distance between the camera and the object being streamed. The default value of 1.1 is a neutral preference. A higher value, such as 3 or 5, prefers nearby objects over objects further away. These parameters can be changed at runtime to find the sweet spot for your scene.
The Close Up Distance changes the distance factor between camera and streaming object when the camera gets very close. For example, when the camera is 3 units or less from the object, the Close Up Distance Factor value is used; when the camera is further than 3 units away, the Distance Factor value is used. This way the system forces strong streaming of data when the camera is very close to an object.
The Distance Type can be set to Bounding Box or Bounding Box Center. With Bounding Box, the distance between camera and object is calculated from the edges of the object's bounding box. With Bounding Box Center, the distance is calculated from the center of the object. It's recommended to use Bounding Box for single objects, and Bounding Box Center for full environments where the user walks around inside the object.
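The switch between the two factors can be sketched as follows. The helper and option names below are assumptions for illustration; the actual plugin applies this logic internally:

```javascript
// Choose which distance factor applies, per the Close Up Distance rule:
// within closeUpDistance units of the object, the stronger close-up factor
// is used; a closeUpDistance of 0 disables the close-up behavior.
function effectiveDistanceFactor(distance, settings) {
  const {
    distanceFactor = 1.1,       // neutral default preference
    closeUpDistance = 3,        // switch-over distance in scene units
    closeUpDistanceFactor = 5,  // stronger factor used when close up
  } = settings || {};
  if (closeUpDistance > 0 && distance <= closeUpDistance) {
    return closeUpDistanceFactor;
  }
  return distanceFactor;
}
```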
The other parameters should be left in default or you can check an explanation on the Supported Parameters section.
Polygon Streaming component settings are only available when using the VIVERSE extension of the PlayCanvas editor.
Then you need to define a new Entity, add it to the root, and define the parameters (described in )
The PlayCanvas Chrome Browser Extension includes the Polygon Streaming Plugin. Instructions for downloading and installing the browser extension can be found .
Start by logging into the . Then you will be able to upload a 3D asset into the platform by simply dropping a file in the upload field and clicking the upload button.
02/28/2025
2.7.3
Support for updated convertor and newly converted models (2.12.6)
Small performance optimizations
01/02/2025
2.7.1
Hotfix for model updating bug
Support for updated convertor and newly converted models (2.11.1) at player level
Small performance optimizations
10/28/2024
2.5.0
Support Vertex Color
Support Multi UV Set
Support KHR_materials_emissive_strength in BRP
Occlusion culling improved performance
Distance Type set to only Bounding Box Center
Fix shader bug in Android devices
9/16/2024
2.4.2
Fix shader bug in Android devices
9/10/2024
2.4.1
Fix Preview bug
Fix KHR_texture_transform bug
Fix KHR_materials_specularGlossiness bug
7/30/2024
2.4.0
Support KHR_materials_pbrSpecularGlossiness
7/05/2024
2.3.8
Fix Texture Transform Issue
7/03/2024
2.3.7
Fix Double Sided Material Transparent Issue
Add Signature Check
6/12/2024
2.3.6
Support xrgc version 4 - read materials from info.json
Fix XrgcMaterial default value spec
5/22/2024
2.3.5
Fixed all found memory leaks.
Support Doubled Material in BRP and URP.
Handling Stream Controller destroying/disabling edge cases
Free up resources when hiding/disabling models
Hotfix light probe usage.
Quality Priority now works only with integer numbers and follows this logic: 0 = highest priority. 1 = lowest priority. 2 or more = higher priority than lower numbers, but still lower priority than 0.
4/26/2024
2.3.4
Hotfix light probe usage
4/15/2024
2.3.3
Handle destroy / disable more gracefully
Free up streamingcontroller resources when hiding / disabling models
3/23/2024
2.3.2
Add Light Probe Usage in Streaming Model
Small fixes for memory and performance
2/22/2024
2.3.1
Fix Triangle Budget Bug
Remove duplicated metallic roughness and occlusion texture
Viewer Camera
The viewer camera to use; the system will calculate distances based on the selected camera.
None
Triangle Budget
The maximum amount of triangles that you want to allow in the scene at any single point.
150000000
Distance Factor
Preference for nearby objects over objects further away. Values above one mean a preference for nearby objects. Values below one mean a preference for objects further away. One is neutral.
1.1
Close Up Distance
The distance where it starts using Close-up Distance Factor instead of Distance Factor. Set it to 0 to not use close-up distance factor.
3
Close Up Distance Factor
The distance factor used when close-up to an object. Should be higher than the standard Distance Factor.
5
Distance To
Distance type from camera to the meshes bounding boxes.
Bounding Box
Max Download Size Mb
Pause downloading more data when the current active downloads exceed this threshold.
4
Maximum Quality
Stops improving geometry that exceeds the maximum quality. This can be used to stop far away objects from showing more detail which can be wasteful. Leaving this at 0 means there is no maximum quality.
0
Occlusion Culling
Enable Dynamic Occlusion Culling.
True
Raycast Mask
Set masks for the occlusion culling raycast.
Everything
Time Slicing
Determine how many frames should be accumulated to decide whether an object should be hidden.
120
Ray Per Frame
How many rays are generated per frame.
256
This document guides you through integrating Polygon Streaming to JavaScript projects.
Polygon Streaming models can be added to JavaScript projects with the use of our JavaScript SDKs. These plugins will help to stream high polygon models from the cloud into your projects and build for the web. The table below breaks down the plugins and their usage.
This page details basic information about using VIVERSE Polygon Streaming with Unreal Engine.
Polygon Streaming models can be added to Unreal Engine based projects with the Unreal Engine Plugin, allowing models with millions of polygons to stream from the cloud into your Windows-based projects and applications.
Once the plugin's zip file is downloaded from the Polygon Streaming Web Console, the “PolygonStreaming-Unreal-Version” folder should be added to the “Plugins” folder of your Unreal Engine project.
If the “Plugins” folder does not exist, create a new folder named “Plugins” and add the plugin's folder inside. Also make sure to close your project before installing the plugin.
After adding the “PolygonStreaming-Unreal-Version” folder inside the “Plugins” folder of your Unreal Engine project, double-click your project's file (.uproject). If it asks you to re-build the project, choose “Yes”; the project will be rebuilt, and once it opens it will have the Polygon Streaming Plugin installed.
Sometimes it may appear like nothing is happening - the Unreal Engine welcome screen may not show, but don’t worry, just give it some time for building and compiling and your project should open normally after a couple of minutes.
The plugin is only available for C++-based projects, not Blueprint-only projects. To use it in a Blueprint-based project, first add C++ classes to convert it into a C++-enabled project.
Change Log
2.0.2
Updated to work with xrgc5 format
2.0.1
Updated the plugin to work with Unreal 5.4.x versions.
2.0.0
Added support for KTX textures.
Added model preview to StreamingModel actor.
This document provides a guide for using the Standalone Plugin to integrate Polygon Streaming into a PlayCanvas project that does not target publication to VIVERSE.
For PlayCanvas projects that are not targeting publication to VIVERSE, this PlayCanvas standalone plugin is the first option for integrating Polygon Streaming into those projects. Because these projects will not be published to VIVERSE, we've created a standalone version of the plugin that does not utilize the PlayCanvas VIVERSE Chrome browser extension.
This page details the process of setting up the VIVERSE Polygon Streaming plugin in Unreal Engine.
To start using the plugin, there are two main classes (Actors), found in the plugin's folder or in the Classes menu, which are all you need to stream your models inside Unreal Engine: the Stream Controller and the Streaming Model.
You can drag and drop both Actors inside your Unreal Scene, and be ready to start streaming your converted 3D model inside the engine.
The Stream Controller should only be added once, and it controls the streaming feeds inside your scene. The Streaming Model is an Actor to be used with every model that will stream from the cloud into your project; you only need to add the URL of your converted model into the URL field to start streaming. You can add as many Streaming Models as you want to your scene.
Once that's set, press Play and your models will start streaming inside your project!
This document provides a guide for integrating Polygon Streaming web player for Three.js using the NPM package directly.
You can download a sample project to get started from this link:
npm install
npm run dev
It will open a browser window and display the 3D model.
Import the StreamController from the package:
import { StreamController } from '@polygon-streaming/web-player-threejs';
Instantiate the stream controller:
const streamController = new StreamController(camera, renderer, scene, controls.target, {
cameraType: 'nonPlayer',
triangleBudget: 3000000,
mobileTriangleBudget: 1000000,
closeUpDistance: 0.2,
minimumDistance: 0.01,
distanceFactor: 1.1,
distanceType: 'boundingBoxCenter',
maximumQuality: 15000,
closeUpDistanceFactor: 5
});
Add a streaming model, passing it a model URL and a Group to act as a model parent:
const modelParent = new THREE.Group();
modelParent.position.set(0, 1, 0);
scene.add(modelParent);
streamController.addModel('the URL of your model to be streamed (Asset ID)', modelParent, {
    qualityPriority: 1,
    initialTrianglePercent: 0.1,
    castShadows: true,
    receiveShadows: true,
    castShadowsLightmap: true,
    forceDoubleSided: false,
    useAlpha: true,
    environmentMap: null,
    hashCode: ''
});
Call the stream controller's update method in the animation loop:
function animate() {
    controls.update();
    renderer.render(scene, camera);
    streamController.update();
}
renderer.setAnimationLoop(animate);
Now you have everything set up to start streaming your 3D models inside your Three.js application.
Paste this URL as the first parameter of the streamController.addModel() method.
camera (Required): The camera used in the scene.
renderer (Required): The WebGL renderer used in the scene.
scene (Required): The scene object.
cameraTarget (Required): The camera target which is a Vector3. If you are using orbit controls it would be controls.target.
options (Optional): All options are optional. The options are:
cameraType: 'nonPlayer' | 'player', default: 'nonPlayer'
nonPlayer: A camera that is not attached to a player e.g. a camera that orbits an object.
player: A camera that is attached to a player.
triangleBudget: number, default: 3000000. The maximum amount of triangles that you want to be in the scene at any single point.
mobileTriangleBudget: number, default: 1000000. The triangle budget used on a mobile device. If it is set to 0 it will use the non-mobile triangle budget.
minimumDistance: number, default: 0.01. The smallest possible distance to the camera.
distanceFactor: number, default: 1.1. Preference for nearby objects over objects further away. Values above one mean a preference for nearby objects. Values below one mean a preference for objects further away. One is neutral.
distanceType: 'boundingBoxCenter' | 'boundingBox', default: 'boundingBoxCenter'
boundingBoxCenter: Uses the center of the bounding box to calculate the distance to the node.
boundingBox: Uses the bounding box corners and face centers to calculate the distance to the node.
maximumQuality: number, default: 15000. Stops improving geometry that exceeds the maximum quality. This can be used to stop far away objects from showing more detail which can be wasteful. Setting it to 0 means there is no maximum quality.
closeUpDistance: number, default: 3. The distance where it starts using close-up distance factor. Set it to 0 to not use close-up distance factor.
closeUpDistanceFactor: number, default: 5. The distance factor used when close-up to an object. Should be higher than the standard distance factor.
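As a concrete illustration, these options can be gathered into a plain configuration object before constructing the controller. The specific values below are example choices for a player-attached camera, not official recommendations:

```javascript
// Illustrative options object for a player-attached camera.
// Every key is one of the documented StreamController options above;
// the specific values are example choices, not official guidance.
const playerCameraOptions = {
  cameraType: 'player',           // camera is attached to a player
  triangleBudget: 3000000,        // desktop triangle cap
  mobileTriangleBudget: 1000000,  // lower cap on mobile devices
  minimumDistance: 0.01,          // smallest possible distance to the camera
  distanceFactor: 1.1,            // near-neutral preference for nearby objects
  distanceType: 'boundingBox',    // corners and face centers, suits single objects
  maximumQuality: 15000,          // stop refining geometry beyond this quality
  closeUpDistance: 3,             // switch factors within 3 units
  closeUpDistanceFactor: 5        // stream aggressively when close up
};
```

The object would then be passed as the fifth argument: new StreamController(camera, renderer, scene, controls.target, playerCameraOptions).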
URL (Required): URL of the XRG model. If it doesn't end with .xrg it will append model.xrg to the URL.
model parent (Required): The scene object that the streaming model will be attached to.
options (Optional): All options are optional. The options are:
qualityPriority: number, default: 1. How much to prioritize the quality of this model relative to the quality of other models in the scene. This parameter does nothing if this is the only model in the scene.
initialTrianglePercent: number, default: 0.1. Percentage of triangle budget to initialize the model with.
castShadows: boolean, default: true. Whether the model should cast shadows.
receiveShadows: boolean, default: true. Whether the model should receive shadows.
castShadowsLightmap: boolean, default: true. Whether the model casts shadows when rendering lightmaps.
forceDoubleSided: boolean, default: false. Render the model double sided regardless of the setting in the model file.
environmentMap: texture, default: null. A cube map environment texture.
hashCode: string, default: ''. Hash code to validate streaming model.
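For example, when one model matters more than another, only qualityPriority needs to differ between their options objects. Assuming, per the description above, that a larger value means stronger prioritization in the web player, a hero asset and a background prop might be configured like this (the parent groups and asset URLs are placeholders):

```javascript
// Illustrative per-model options: the hero model claims more of the
// shared triangle budget than the background prop.
const heroOptions = { qualityPriority: 3, initialTrianglePercent: 0.2 };
const propOptions = { qualityPriority: 1, initialTrianglePercent: 0.05 };

// streamController.addModel('<hero asset ID URL>', heroParent, heroOptions);
// streamController.addModel('<prop asset ID URL>', propParent, propOptions);
```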
This page details how to control the settings of individual VIVERSE Polygon Streaming models within your Unreal project.
The Streaming Model Actor represents the model to be streamed inside your project. A Streaming Model component always needs a reference to the Stream Controller component to function and a Source URL which is the Asset ID URL of your converted model. Once those two are correctly set, your model will start streaming inside your project as soon as you press Play.
The URL field is the Asset ID address of your streaming model; once the Asset ID is pasted into this field, the model can be streamed inside your scene.
You can also set different Quality Priority options by clicking on Streaming Runtime Options Settings. This means that some models can have a higher streaming priority than others. It accepts only integer values and works with the following logic:
The Stream Controller is a reference to the Stream Controller inside your scene and it needs to be set, otherwise the model won’t appear or be streamed inside your scene.
The Show Preview option creates a simple preview of the model in the editor scene, so you can position and scale your model correctly without needing to run the game.
If you would like to use a custom material on your streaming model, tick the Custom Material box; this opens a new panel that allows you to change the model's materials.
This page details how to control the settings of individual VIVERSE Polygon Streaming models within your Unity project.
The Streaming Model prefab represents the model to be streamed inside your project. A Streaming Model component always needs a reference to the Stream Controller component to function and a Source URL which is the URL of your converted model. Once those two are correctly set, your model will start streaming inside your project as soon as you press Play.
The Stream Controller is a reference to the Stream Controller inside your scene and needs to be set; the Source URL is the URL of your converted model and needs to be pasted into the field.
If your project uses Light Probes, tick the Light Probe Usage box. If you would like to use a custom material on your streaming model, tick the Custom Material box; this opens a new panel that allows you to change the material.
You can also set different Quality Priority options by clicking on Streaming Runtime Options Settings. This means that some models can have a higher streaming priority than others. It accepts only integer values and works with the following logic:
You can also turn the Streaming Model Preview script on or off to check a preview of your streaming model in the scene without needing to enter Play Mode.
Option 1: Chrome Browser Extension Plugin
PlayCanvas Chrome Browser Extension includes the Polygon Streaming Plugin.
Instructions for using the Polygon Streaming Plugin can be found .
Option 2: Standalone Plugin
Add polygon-streaming.js to PlayCanvas project.
Instructions for using the Standalone plugin can be found .
Option 3: HTML Scripting
Download build of PlayCanvas project to access HTML.
Instructions for using HTML Scripting can be found .
Babylon.js
Instructions for installation and setup can be found .
Three.js
Instructions for installation and setup can be found .
Download the latest version of the PlayCanvas standalone plugin . Import the polygon-streaming.js script into the Assets window of the PlayCanvas project.
B. Add the StreamController script to the StreamController entity. You can check detailed information on these parameters .
To run the example you first need to make sure you have installed. Then run the following in a terminal:
Upload your 3D model to the online console:
To get the model URL go to the models section of the console: and click on the three dots next to your model and select "Copy asset ID".
0
Highest Priority
1
Lowest Priority
2+
Higher priority than the previous number, but still lower priority than 0 (zero).
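Read literally, this gives the ordering 0 first, then descending integers, with 1 last (for example 0 > 3 > 2 > 1). A small sketch in JavaScript, the language used by the web player examples in these docs, makes the ordering explicit; the priorityRank helper is ours, not part of the plugin:

```javascript
// Hypothetical helper: maps a Quality Priority integer to a sortable rank.
// 0 is always highest; among non-zero values, larger numbers rank higher,
// so 1 ends up lowest overall.
function priorityRank(p) {
  return p === 0 ? Infinity : p;
}

// Sort a set of priorities from highest to lowest rank.
const order = [1, 0, 3, 2].sort((a, b) => priorityRank(b) - priorityRank(a));
console.log(order); // [0, 3, 2, 1]
```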
Name
Description
Default value
Source URL
Asset ID address of the Streamable Model to be streamed into the scene.
None
Quality Priority
How much to prioritize the quality of this model relative to the quality of other models in the scene. This parameter does nothing if this is the only model in the scene.
1
Stream Controller
Set the Stream Controller from the scene.
None
Show Preview
Enable model preview in editor scene.
True
Custom Material
Enable if wish to use custom materials.
False
Stream Controller
Set the Stream Controller from the scene.
None
Source Url
Address of the Streamable Model to be streamed into the scene.
None
Normals
Calculation parameter for the normals of the model being streamed.
Original Vertex Normals
Light Probe Usage
Enable if using Light Probes in the scene.
False
Custom Material
Enable if wish to use custom materials.
False
Quality Priority
How much to prioritize the quality of this model relative to the quality of other models in the scene. This parameter does nothing if this is the only model in the scene.
1
0
Highest Priority
1
Lowest Priority
2+
Higher priority than the previous number, but still lower priority than 0 (zero).
0
Highest Priority
1
Lowest Priority
2+
Higher priority than the previous number, but still lower priority than 0 (zero).
This page details how to control the global settings of VIVERSE Polygon Streaming within your PlayCanvas project when using the VIVERSE PlayCanvas extension.
The Stream Controller or Polygon Streaming Settings (VIVERSE SDK) manages the streaming of models and streaming parameters inside your project. It can be found under Viverse Scene Settings.
The Occlusion Culling option is recommended to keep as default.
The Triangle Budget is a limit on the number of triangles drawn per frame. Increasing it leads to better visual quality, but also to higher processing and memory utilization. It's recommended to set the Triangle Budget to at least 30% of the total polygon count you are going to stream. For example, if you are going to stream a 3D model of 10 million polygons, it's recommended to use a Triangle Budget of at least 3 million.
The Mobile Triangle Budget works the same as Triangle Budget, but is applied when the system detects that the user is visiting the experience on a mobile device.
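The 30% rule of thumb above can be written as a small helper. The function name and the default 0.3 ratio are just this page's guideline expressed in code, not part of any SDK:

```javascript
// Hypothetical helper implementing the "at least 30%" guideline:
// given the total polygon count you plan to stream, return the
// minimum recommended Triangle Budget.
function recommendedTriangleBudget(totalPolygons, ratio = 0.3) {
  return Math.ceil(totalPolygons * ratio);
}

console.log(recommendedTriangleBudget(10000000)); // 3000000
```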
Maximum Quality should be kept as default.
The Distance Type can be set to Bounding Box or Bounding Box Center. If set to Bounding Box, the distance between the camera and the object is calculated from the edges of the object's bounding box; if set to Bounding Box Center, it is calculated from the center of the object. Bounding Box is recommended for single objects, and Bounding Box Center for full environments that the user walks around inside.
The Close Up Distance changes the distance factor used when the camera gets close to a streaming object. For example, with a value of 3: when the camera is within 3 units of the object, the Close Up Distance Factor value is used; when the camera is further than 3 units away, the regular Distance Factor value applies. This forces aggressive streaming of data when the camera is very close to an object.
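The switch described above amounts to picking one of two factors based on how far the camera is from the object. A minimal sketch using the default values from this page (3 units, factors 1.1 and 5); the function name is ours, not part of the plugin:

```javascript
// Hypothetical sketch of the close-up switch: within closeUpDistance
// the stronger closeUpDistanceFactor applies, otherwise the normal one.
function effectiveDistanceFactor(cameraDistance, {
  closeUpDistance = 3,
  distanceFactor = 1.1,
  closeUpDistanceFactor = 5
} = {}) {
  // A closeUpDistance of 0 disables the close-up behaviour entirely.
  if (closeUpDistance > 0 && cameraDistance <= closeUpDistance) {
    return closeUpDistanceFactor;
  }
  return distanceFactor;
}

console.log(effectiveDistanceFactor(2)); // 5
console.log(effectiveDistanceFactor(4)); // 1.1
```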
Parameter details are also listed in the Supported Parameters section.
Occlusion Culling
Enable Dynamic Occlusion Culling.
FALSE
FALSE
TRUE
FALSE
FALSE
FALSE
Occlusion Geometry
Mesh: Use the mesh to check if it's occluded. Bounding Box: Use the bounding box of the mesh to check if it's occluded.
Bounding Box
Bounding Box
Bounding Box
Bounding Box
Bounding Box
Bounding Box
Occlusion Query Frequency
Value is in times per second. A value of 0 means it will run on every frame.
0
8
60
5
8
12
Triangle Budget
The maximum amount of triangles that you want to be in the scene at any single point.
0
5000000
Depends on device
Mobile won't use this parameter
5000000
5000000
Mobile Triangle Budget
The triangle budget used on a mobile device. If it is set to 0 it will use the non-mobile triangle budget.
0
3000000
Depends on device
3000000
PC won't use this parameter
PC won't use this parameter
Distance Factor
Preference for nearby objects over objects further away. Values above one mean a preference for nearby objects. Values below one mean a preference for objects further away. One is neutral.
0
1.1
10
1.1
1.1
1.1
Maximum Quality
Stops improving geometry that exceeds the maximum quality. This can be used to stop far away objects from showing more detail which can be wasteful. Leaving this at 0 means there is no maximum quality.
0
15000
300000
15000
15000
0
This page details how to control the global settings of VIVERSE Polygon Streaming within your Unreal project.
The Stream Controller manages the streaming of models and streaming parameters inside your project. There must be only one Stream Controller per scene, since it will control all Streaming Models as one.
The Triangle Budget is a limit on the number of triangles drawn per frame. Increasing it leads to better visual quality, but also to higher processing and memory utilization. It's recommended to set the Triangle Budget to at least 30% of the total polygon count you are going to stream. For example, if you are going to stream a 3D model of 10 million polygons, it's recommended to use a Triangle Budget of at least 3 million.
If you use too low a Triangle Budget, for example 500 thousand to stream 10 million polygons, you may exhaust the budget before the model's full quality can be shown, resulting in only a lower-quality version of the model being displayed in order to stay within the budget.
The Distance Factor weights the distance between the camera and the object being streamed. The default value of 1.1 is close to neutral. A higher value, such as 3 or 5, prefers nearby objects over objects further away. These parameters can be changed at runtime to find the sweet spot for your scene.
The Distance Type can be set to Bounding Box or Bounding Box Center. If set to Bounding Box, the distance between the camera and the object is calculated from the edges of the object's bounding box; if set to Bounding Box Center, it is calculated from the center of the object. Bounding Box is recommended for single objects, and Bounding Box Center for full environments that the user walks around inside.
The Close Up Distance changes the distance factor used when the camera gets close to a streaming object. For example, with a value of 3: when the camera is within 3 units of the object, the Close Up Distance Factor value is used; when the camera is further than 3 units away, the regular Distance Factor value applies. This forces aggressive streaming of data when the camera is very close to an object.
The other parameters should be left at their defaults; explanations can be found in the Supported Parameters section.
Triangle Budget
The maximum amount of triangles that you want to allow in the scene at any single point.
3000000
Distance Factor
Preference for nearby objects over objects further away. Values above one mean a preference for nearby objects. Values below one mean a preference for objects further away. One is neutral.
1.1
Distance Type
Distance type from camera to the meshes bounding boxes.
Bounding Box
Close Up Distance
The distance where it starts using Close-up Distance Factor instead of Distance Factor. Set it to 0 to not use close-up distance factor.
3
Close Up Distance Factor
The distance factor used when close-up to an object. Should be higher than the standard Distance Factor.
5
Maximum Quality
Stops improving geometry that exceeds the maximum quality. This can be used to stop far away objects from showing more detail which can be wasteful. Leaving this at 0 means there is no maximum quality.
3000000
Polygon Streaming project settings are only available when using the VIVERSE extension of the PlayCanvas editor.
This document provides a guide for integrating Polygon Streaming web player for Babylon.js using the NPM package directly.
You can download a sample project to get started from this link:
npm install
npm run dev
This will open a browser window and display the 3D model.
Import Babylon.js, glTF loader and the StreamController:
import * as BABYLON from '@babylonjs/core/Legacy/legacy';
import '@babylonjs/loaders/glTF';
import { StreamController, loadWasmModule } from '@polygon-streaming/web-player-babylonjs';
Load Ammo.js. This is only required if you want to make use of the optional embedded collider in the model. Other physics plugins are not supported as Ammo.js is the only one that supports concave colliders. The Ammo.js physics plugin uses version 1 of the physics engine so you will need to add physics impostors to your meshes rather than physics aggregates or bodies.
import ammoWasmJsUrl from './lib/ammo.wasm.js?url';
import ammoWasmWasmUrl from './lib/ammo.wasm.wasm?url';
import ammoJsUrl from './lib/ammo.js?url';
loadWasmModule('Ammo', ammoWasmJsUrl, ammoWasmWasmUrl, ammoJsUrl).then(ammoInstance => {
Instantiate the stream controller:
const streamController = new StreamController(camera, engine, scene, cameraTarget, {
    cameraType: 'nonPlayer',
    triangleBudget: 3000000,
    mobileTriangleBudget: 1000000,
    closeUpDistance: 0.2,
    minimumDistance: 0.01,
    distanceFactor: 1.1,
    distanceType: 'boundingBoxCenter',
    maximumQuality: 15000,
    closeUpDistanceFactor: 5,
    ammoInstance: ammoInstance
});
Add a streaming model, passing it a model URL and a TransformNode to act as a model parent:
const modelParent = new BABYLON.TransformNode('Model parent', scene);
modelParent.position.set(0, 1, 0);
streamController.addModel('the URL of your model to be streamed (Asset ID)', modelParent, {
    qualityPriority: 1,
    initialTrianglePercent: 0.1,
    castShadows: true,
    receiveShadows: true,
    castShadowsLightmap: true,
    forceDoubleSided: false,
    useAlpha: true,
    environmentMap: null,
    hashCode: ''
});
Call the stream controller's update method in the render loop:
engine.runRenderLoop(function () {
scene.render();
streamController.update();
});
Now you have everything set up to stream your 3D model inside your Babylon.js application.
Paste this URL as the first parameter of the streamController.addModel() method.
camera (Required): The camera used in the scene.
engine (Required): The engine used in the scene.
scene (Required): The scene object.
cameraTarget (Required): The camera target which is a Vector3.
options (Optional): All options are optional. The options are:
cameraType: 'nonPlayer' | 'player', default: 'nonPlayer'
nonPlayer: A camera that is not attached to a player e.g. a camera that orbits an object.
player: A camera that is attached to a player.
triangleBudget: number, default: 3000000. The maximum amount of triangles that you want to be in the scene at any single point.
mobileTriangleBudget: number, default: 1000000. The triangle budget used on a mobile device. If it is set to 0 it will use the non-mobile triangle budget.
minimumDistance: number, default: 0.01. The smallest possible distance to the camera.
distanceFactor: number, default: 1.1. Preference for nearby objects over objects further away. Values above one mean a preference for nearby objects. Values below one mean a preference for objects further away. One is neutral.
distanceType: 'boundingBoxCenter' | 'boundingBox', default: 'boundingBoxCenter'
boundingBoxCenter: Uses the center of the bounding box to calculate the distance to the node.
boundingBox: Uses the bounding box corners and face centers to calculate the distance to the node.
maximumQuality: number, default: 15000. Stops improving geometry that exceeds the maximum quality. This can be used to stop far away objects from showing more detail which can be wasteful. Setting it to 0 means there is no maximum quality.
closeUpDistance: number, default: 3. The distance where it starts using close-up distance factor. Set it to 0 to not use close-up distance factor.
closeUpDistanceFactor: number, default: 5. The distance factor used when close-up to an object. Should be higher than the standard distance factor.
ammoInstance: object, default: null. The Ammo.js instance. Required if you want to make use of the embedded collider.
URL (Required): URL of the XRG model. If it doesn't end with .xrg it will append model.xrg to the URL.
model parent (Required): The scene object that the streaming model will be attached to.
options (Optional): All options are optional. The options are:
qualityPriority: number, default: 1. How much to prioritize the quality of this model relative to the quality of other models in the scene. This parameter does nothing if this is the only model in the scene.
initialTrianglePercent: number, default: 0.1. Percentage of triangle budget to initialize the model with.
castShadows: boolean, default: true. Whether the model should cast shadows.
receiveShadows: boolean, default: true. Whether the model should receive shadows.
castShadowsLightmap: boolean, default: true. Whether the model casts shadows when rendering lightmaps.
forceDoubleSided: boolean, default: false. Render the model double sided regardless of the setting in the model file.
environmentMap: CubeTexture, default: null. A cube map environment texture.
hashCode: string, default: ''. Hash code to validate streaming model.
To run the sample project you first need to make sure you have installed. Then run the following in a terminal:
Upload your 3D model to the online console:
To get the model URL go to the models section of the console: and click on the three dots next to your model and select "Copy asset ID".