This page details important information about using the VIVERSE Creator Studio to publish and manage settings for content associated with your account.
All creations published to your VIVERSE account are known as "Worlds" and can be accessed on the "My Worlds" tab of the account page or the VIVERSE Studio. Here, you can also see any creators that you are following or worlds that you have favorited.
Currently, Worlds can only be associated with, owned and managed by one VIVERSE account. The owner's account name will display under the world on the VIVERSE world discovery page.
The VIVERSE Studio is the one-stop shop for managing all content associated with your VIVERSE account and profile.
The main dashboard of the VIVERSE Studio includes metrics on how users engage with your content. Currently, this includes the number of views, unique viewers, new viewers, and likes. You may update the date range to filter metrics for a specific period.
The content page lists all of the worlds you have published on VIVERSE. Each tile displays the world's name, discoverability setting, and engagement metrics, with controls to access its settings. You may search or filter for specific categories of worlds using the toolbars.
The upload page allows you to manage and add specific content to your VIVERSE account. For each world, you can see its status, visibility setting, and an area to edit its settings.
The upload page can also be used to create and upload new worlds to VIVERSE.
The settings page can be used to manage and update your profile information which is displayed for any viewers looking at your creator page.
When publishing to VIVERSE, creations will be given a single-player Scene URL and a multi-player World URL.
The Scene URL, such as https://create.viverse.com/scenes/1234567, is designed as a preview for creators to quickly iterate on their project's visuals and mechanics without worrying about networked gameplay. The Scene URL can be shared with and accessed by anyone who has the URL; however, you will not be able to see or hear other networked avatars.
The World URL, such as https://create.viverse.com/aBCdEfG, is the fully published, multiplayer environment that may be made discoverable by other VIVERSE users on the world discovery page. The World URL's accessibility is determined by the World's settings. Here you will be able to see and hear other networked avatars who have access to the World URL.
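As a small illustration of the two formats above, a hypothetical helper (not part of any VIVERSE SDK) could tell the two apart; the URL patterns are assumptions inferred from the example URLs given here:

```javascript
// Hypothetical helper distinguishing the two URL formats described above.
// The patterns are assumptions inferred from the example URLs on this page.
function classifyViverseUrl(url) {
  if (/^https:\/\/create\.viverse\.com\/scenes\/\d+$/.test(url)) {
    return "scene"; // single-player preview URL
  }
  if (/^https:\/\/create\.viverse\.com\/[A-Za-z0-9]+$/.test(url)) {
    return "world"; // fully published, multiplayer URL
  }
  return "unknown";
}
```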
A world's settings can be edited by clicking the kebab menu next to your world on your account page.
With the world settings, you can configure your world's name, description, thumbnail, and accessibility settings. Both Genre and Access will influence how your world is categorized and displayed on the VIVERSE world discovery page.
Worlds can be set to one of three accessibility states and can be password protected to add an additional level of security.
Worlds can be assigned a genre, which impacts curation and discovery on the VIVERSE webpages.
Worlds can also be assigned a device filter, specifying which device types the world is intended and optimized for.
Worlds that are made "Public" are discoverable on the VIVERSE world discovery page. Worlds are categorized automatically by genre settings and device compatibility (corresponding to file size and optimization level), as well as manually by the VIVERSE curation team.
This page is a starting point if you are interested in learning about VIVERSE's tools for artists, creators, and developers.
VIVERSE includes a robust series of tools for artists and developers to build engaging 3D experiences. These tools are sorted into two main categories:
Integrations with development platforms, such as PlayCanvas and Unity, that allow creators to publish directly to VIVERSE.
Developer tools and SDKs, such as Polygon Streaming, which can either be used in creations published to VIVERSE or hosted elsewhere.
VIVERSE is a web-based platform for 3D content where creators can showcase their work to the VIVERSE user-base and monetize. Our goal is to help you publish from the platform that works best for you and leverage your chosen tools. Whether you are hosting a meetup, creating an engaging narrative, or building a full length multiplayer game, there are several options for you to publish your work on VIVERSE.
Don't see your platform? Join our Discord Server and let us know how you would like to be able to publish to VIVERSE.
Along with our platform integrations that allow you to publish to VIVERSE, we also have a number of standalone developer tools that can be utilized in experiences hosted on and off of VIVERSE. These tools help creators get the most out of their 3D experience and make the difficult parts of online development easier.
NOTE: VIVERSE SDKs cannot be used with projects published via the , which do not have App IDs.
VIVERSE is proud to support the work of many of the most innovative artists, developers, educators, and entrepreneurs across the world. Our Discord Server is the epicenter of our creator community and the best way to get connected with other innovative builders.
In the Discord Server, the community and VIVERSE team would love to hear your questions, bug reports, and ideas for making a more accessible platform. Simply create a new post in the #get-help channel when you land.
For issues related to your VIVERSE / HTC account, we recommend reaching out to the company-wide support channel:
Navigate to the "Content Versions" tab and select the file you would like to use.
After uploading the first build and adding iframe settings, you can continue uploading iterative builds by clicking the "Upload" section in the sidebar, then "Manage Content" next to the world in question. This will bring up the same upload and permissions screen shown in Step 3.








Create from Templates
Create from our library of world templates and add 3D elements and media directly in VIVERSE. World decoration is great for meetup hosts and beginners looking to customize their first world from a wide range of template environments.
Standalone Publishing
Developers may publish to VIVERSE using any platform that can build for WebGL/HTML5. Publish from Unity WebGL, Godot, Three.js, Babylon.js, A-Frame, and more!
See here
PlayCanvas
PlayCanvas is an open source game engine with a web-based editor. We have created an extension of the PlayCanvas editor that allows creators to publish to VIVERSE with automatic support for multiplayer VIVERSE avatars.
Polygon Streaming
Upload and embed high quality 3D assets in web-based experiences. Polygon Streaming's innovative technology makes highly-detailed, large assets accessible to billions of devices through the web browser!
Avatar & Account SDK
Utilize our avatar and virtual identity system to bring multiplayer to your 3D experience.
Leaderboard SDK
Add a leaderboard to single and multiplayer gaming experiences. Our leaderboard system allows you to keep track of player performance between sessions.
Matchmaking & Networking SDK
Add persistent data to single and multiplayer gaming experiences.
Learn how you can earn by building on VIVERSE...
We are proud to support developers and teams of all sizes and artistic backgrounds on VIVERSE. In addition to investing in the development of new content on VIVERSE through our Creator Program, we are excited to offer a number of ways for creators to earn on VIVERSE.
We are currently working to explore and release these tools to our developers. Here are some of the monetization methods that are or will be available to creators...
Channel Subscriptions
Players pay monthly to access content listed by you under a channel subscription. This may be one or multiple experiences listed under one subscription.
Closed Beta
80% Developer / 20% Platform
One-Time Purchases
Players pay once to access an individual VIVERSE experience.
Closed Beta
80% Developer / 20% Platform
Donations
Players may contribute to their favorite creators as they develop more amazing content on VIVERSE.
Coming Soon...
TBD
In-App Purchases SDK
Players may purchase unlockable content from within your VIVERSE experience.
Coming Soon...
TBD
What else is on the roadmap? In addition to the methods listed above, we are also exploring ad revenue sharing, VIVERSE platform subscriptions, and opening our pay-per-session compensation structure that is currently only available to Creator Program participants. If you would like to see the VIVERSE monetization tools go in a specific direction, join our Discord Server and let us know your thoughts in #monetization. We want to hear how you think VIVERSE can become the most developer-friendly platform for 3D creations.
Right now, our monetization tools are in closed beta until the end of 2025. If you would like to participate during this time to help us test our monetization tools, please contact [email protected] and include details about how you would like to monetize your work on VIVERSE!
*All figures referenced are shares of net revenue. In addition to this baseline, VIVERSE may offer more generous rates on a case-by-case basis or through limited time promotions. For full details about revenue sharing, check out the VIVERSE Developer Platform Agreement: https://www.viverse.com/terms-of-use
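To make the baseline split concrete, here is a quick sketch of the arithmetic. This is illustrative only; actual payouts are governed by the Developer Platform Agreement and any promotional rates:

```javascript
// Sketch of the baseline 80% developer / 20% platform split on net revenue.
// Real payouts are governed by the VIVERSE Developer Platform Agreement.
function splitNetRevenue(netRevenue, developerShare = 0.8) {
  const developer = netRevenue * developerShare;
  return { developer, platform: netRevenue - developer };
}

const payout = splitNetRevenue(1000); // $1,000 of net revenue
// payout.developer → 800, payout.platform → 200
```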







Learn how to create a custom loading screen on VIVERSE during asset pre-loading
Because web games and assets take time to transfer over the internet and load onto the player's device, user experience during loading is important. If shown a blank or broken-looking screen during this time, users may leave early, and never get to try the full experience.
While creators should optimize load time as much as possible, some loading is inevitable. As such, VIVERSE provides a default loading experience during the asset pre-load phase:
However, some creators may wish to make custom loading screens more in line with the aesthetics of their game or app. This can help sell an overall sense of quality and immediately set the tone for users.
VIVERSE creators can make full use of PlayCanvas's custom loading screen support. Just navigate to PlayCanvas Project Settings > Loading Screen and click "Create Default."
If you have the VIVERSE Chrome Extension installed, this will inject a VIVERSE logo into the loading screen template, which is required for all projects.
Other than that, loading-screen.js is yours to customize. Add your logo, a tagline in a custom font, or a cool background image, grounding your users in your experience immediately.
All assets with the "Preload" box checked in asset settings will be downloaded during the loading screen phase. Limiting pre-loaded assets can help get users to your game view faster, which will result in fewer overall "bounces" from your game (meaning users who quit during loading). So only check "Preload" if the asset is needed immediately; otherwise it will stream in behind the scenes as needed.
This document provides a guide that can be used to set up Polygon Streaming in a VIVERSE project.
Control how users see and express themselves with the .changeAvatar() method on LocalPlayer
If you don't want to use default VIVERSE avatars for your world, the PlayCanvas SDK features a simple method to swap to any .vrm avatar asset in your project.
Per the API docs, import the PlayerService into your .mjs script; PlayerService has a localPlayer property of type LocalPlayer, which in turn has a .changeAvatar() method.
import { Script, Asset } from "playcanvas";
import { PlayerService } from "../@viverse/create-sdk.mjs";

export class VvSwitchAvatars extends Script {
    static scriptName = "vvSwitchAvatars";

    /**
     * @attribute
     * @type {Asset}
     */
    vrmAsset = null;

    initialize() {
        this.playerService = new PlayerService();
        this.playerService.localPlayer.changeAvatar(this.vrmAsset);
    }
}
NOTE: this script asset must be placed in /scripts or another subfolder, since it assumes the VIVERSE SDK is one level up, located at "../@viverse/create-sdk.mjs". Alternatively, you can alter this import path as needed. For more information on how .mjs scripts and imports work, see Introduction to MJS.
Because vrmAsset is defined as an attribute of type Asset, we can then select which asset to use directly in the PlayCanvas editor once the script is added to an entity.
When we publish to VIVERSE and load the experience, the .vrm is loaded immediately. Avatar switching for the local player is possible at any point during runtime and can be triggered with UI, trigger colliders, or with any other programmatic callback. Reference Code:
This document provides a guide that can be used to set up videos and extend the functionality of videos in a VIVERSE project.
This page details the different kinds of media that can be uploaded to a world in Edit Mode and the settings you can control.
Use `XrService` to interact with virtual reality devices and controllers
WebXR experiences can run on desktop, mobile and virtual reality devices alike. Utilizing the XrService in the Create SDK, you can write custom code to manage VR controllers, locomotion settings, and XR session callbacks.
By default, the VIVERSE character controller uses teleport for locomotion, since this is the more comfortable option, in general. However, smooth locomotion (where the player glides smoothly across the floor) can be enabled on either or both controllers by setting the locomotionType of each.
Import the XrService from the Create SDK into an .mjs script. Per the API docs, this gives you access to both controllers.
This document provides a guide that can be used to set up an entity that can be picked up and thrown. Users in the world can pick up the object with "G", throw it with "T", and put it down with "H".
This page overviews the interface for controlling global settings for your VIVERSE world in PlayCanvas.






A. In the VIVERSE extension, select the Media plugin for the Select plugins dropdown.
B. Select the PolygonStreaming module and add it.
C. Add the Polygon Streaming URL. The URL should be in the following format: https://stream-stage.viverse.com/polygon_file/b9e62012-11e5-49cb-8ede-de596eec537e/aee7496b-b459-4e5d-87fe-2658955eb4f0/model.xrg
The 3D models are visible in the PlayCanvas editor when utilizing Polygon Streaming.
The 3D model streaming into the environment with the avatar.
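Since a malformed streaming URL is a common pitfall, a small hypothetical check against the format shown in step C can catch typos early. The host pattern and UUID layout below are assumptions based on the example URL above:

```javascript
// Hypothetical validator for the Polygon Streaming URL format shown above:
// https://<host>/polygon_file/<uuid>/<uuid>/model.xrg
const UUID = "[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}";
const XRG_URL = new RegExp(
  `^https://[^/]+/polygon_file/${UUID}/${UUID}/model\\.xrg$`,
  "i"
);

function isPolygonStreamingUrl(url) {
  return XRG_URL.test(url);
}
```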

A. In the VIVERSE extension, select the Media plugin for the Select plugins dropdown.
B. Select the Video module and add it, then select either "Asset" for files contained in your PlayCanvas project, or "URL" for a video to stream, such as a YouTube URL (but please note: YouTube or other embeds are run inside iframe elements, which do not render in VR). For this example, choose "Asset."
C. Add the video to the Asset field by selecting it within your project.
D. Uncheck the auto play property to prevent the video from starting when the avatar enters the environment.



Web support for Godot currently requires engine version 4.1 or higher for successful exports. Currently, Godot must be used without C#; however, based on a recent announcement, web export support for C# is expected soon.
Godot 3 web exports are technically supported but are not a focus for future support.
Select the desired platform for export. For web exporting, choose HTML5. Also make sure that the export path is set to Build/index.html. This will reduce your packaged build size and rename the exported .html to index.html so VIVERSE can easily find and run the project.
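For reference, the corresponding preset in export_presets.cfg might look roughly like the sketch below. This is an assumption based on Godot 4's preset file format (where the web platform is named "Web"; Godot 3 called it "HTML5"), and your preset index and other fields will differ:

```
[preset.0]

name="Web"
platform="Web"
export_path="Build/index.html"
```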
From here you can publish the project using the CLI tool just like any other platform.
A. To re-publish content to VIVERSE when a project is already published, type the following command with the path to the project's production build folder: viverse-cli publish <path>, then press Enter.
B. Confirm the manifest file is updated.
C. Confirm the content was published successfully.
List of Godot WebGL rendering issues: https://github.com/godotengine/godot/issues/66458
Search for and embed assets from Sketchfab. Please note that not all assets may be optimized for WebXR, and properties, such as animations, might not transfer.
Import assets associated with your account which you have uploaded to or purchased from the VIVERSE marketplace.
Upload files to your world from your computer.
Embed assets hosted on servers external to VIVERSE.
GLB/glTF 2.0 files
GLB/glTF 2.0 files
JPG, PNG, GIF, PDF, MP4, MP3, GLB/glTF, VRM files
Websites, Images, Videos, Audio files, PDFs, .m3u8, YouTube videos/live streams
Once media has been uploaded, you can control several of its properties through the asset's details:
Asset Details | Here you can add an expandable description and a clickable link to an external URL.
Transform Properties | Here you can control the asset's position, rotation, and scale.
Frame | For 2D media, such as images or external website links, you can add a colored frame to give your media extra dimensionality.
Playback Settings | For media with some form of playback, such as videos or streams, you can control audio and playback settings.
Media playback settings for MP4s can be configured via the Audio (A) toggle.
Media playback settings for MP3s can be configured via the Audio (A) toggle.
To add interactions to VRMs, click the Add button (A) under Interactions. Configure the interaction in the panel (B).


NOTE: this script asset must be placed in /scripts or another subfolder, since it assumes the VIVERSE SDK is one level up, located at "../@viverse/create-sdk.mjs". Alternatively, you can alter this import path as needed. For more information on how .mjs scripts and imports work, see Introduction to MJS.
When switching to Smooth Locomotion, this also enables flight in VR in any World where flight is enabled in World Settings. Just "click" the Smooth Locomotion joystick to enter flight mode. Once flying, pressing forward on the joystick will fly forward along the VR camera's forward axis (i.e. wherever you're looking), and vice versa backwards.
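The flight behavior described above amounts to scaling the camera's forward vector by the joystick input. The sketch below only illustrates that math; it is not VIVERSE's actual implementation, and sign conventions for joystick axes vary by runtime:

```javascript
// Illustration of the flight math described above (not VIVERSE's code):
// velocity = cameraForward * joystickY * speed, so pushing forward flies
// along wherever you're looking, and pulling back flies the reverse.
function flightVelocity(cameraForward, joystickY, speed) {
  return cameraForward.map((component) => component * joystickY * speed);
}

// Looking straight down the -Z axis, stick fully forward, speed 2 m/s:
const velocity = flightVelocity([0, 0, -1], 1, 2); // → [0, 0, -2]
```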
Checking the IXrController interface further, we can use the setModelAsset() function to set custom 3D models for our controllers, instead of the default VIVERSE models.
After defining the vrControllerAssetL and vrControllerAssetR attributes of type Asset in the above script, we then reference custom 3D controller assets in the editor, which our script instantiates at runtime in VR.
A. Create a new entity
B. Set up the collision component
C. Set up the render component
D. Set up the rigidbody and select "Dynamic" as the Type
The Polygon Streaming settings control how all assets embedded with VIVERSE's Polygon Streaming service are rendered in your world. For details on these settings, please consult this documentation:
The Quest Config settings allow you to create and add steps to Quests in your world. For details on these settings, please consult this documentation: Quests
The Post Effect settings allow you to add and control the values of post-processing effects on your avatar's camera. For more details on the specific post effects and their values, please consult this documentation: https://developer.playcanvas.com/user-manual/graphics/posteffects/


This page details the basics of Edit Mode for decorating your world.
Edit Mode is available on all worlds you own or where you have been added as a co-owner/moderator. If you are a world owner or co-owner, you will have access to the "Edit Mode" button when you join the world.
When enabling Edit Mode, your controls and interface will be changed, and your camera will be detached from your avatar so that you can more easily navigate and edit your world. When in Edit Mode, your avatar will still be visible and your audio will be audible to others in the world.
The Edit Mode interface has three sections, the performance report in the top-left corner, the media upload toolbar in the bottom of your screen, and the object list in the top right where you can access and control the settings of all media added to your world in Edit Mode.
In order to save your customizations and have others see them, you will need to select the "Close edit mode" button and exit Edit Mode.
Media can be added from several different sources through the toolbar in the bottom of the Edit Mode interface: Sketchfab, Marketplace assets, uploaded assets, and assets linked from external sources. For more information on the types of media you can add and their settings, see .
When media are added to your world, you can access and update their properties through the object list on the right-hand side of the Edit Mode interface. Furthermore, you can quickly reposition your media by clicking and dragging them through your world.
You can add five additional VIVERSE accounts as co-owners of your world. By promoting these users, you are giving them full control over Edit Mode, moderation tools, and world settings for your world. Co-owners can be removed and will not be able to delete the world on your behalf. Co-owners can be added outside of Edit Mode by going to the hamburger settings button in the top-right of your world, selecting the "Permissions" tab, and adding Co-Owners by their user id or display name.
This document provides a guide that can be used to add a seat to objects in a VIVERSE project, allowing avatars to sit down.
Add functionality that allows the avatar to sit down
When the avatar enters the SeatHintFarAwayTriggerSphere, the SeatHintFarAway icon (white dot) shows above the seat.
When the avatar enters the SeatHintTriggerSphere, the SeatHint button (sitting person icon) shows above the seat.
This page details how to publish to VIVERSE from the VR illustration tool Open Brush...
In December 2025, VIVERSE and the Icosa Foundation, Open Brush's steward, teamed up with digital artist SUTU to create a publishing pipeline for Open Brush creators to host, distribute, and monetize their work on VIVERSE. When publishing to VIVERSE, creators are able to share their work on Mobile, Desktop, and VR through the web browser, complete with networked VRM avatars and real-time chat. The publishing process is a simple and powerful way for Open Brush creators to expand their audiences and make their work more accessible!
This page houses the links for utilizing Dan Greenheck's FPS demo on PlayCanvas, which utilizes his open-source library for mesh destruction.
This project was created by Dan Greenheck, a 3D web developer and educator, who authored all projects and the custom mesh destruction library. Check out his socials to see the other awesome work he is doing...
YouTube | LinkedIn | X/Twitter | Bluesky
🎯 FPS SHOOTER DEMO —PlayCanvas Project: —VIVERSE World:
🏃🏻♂️ PLATFORMER DEMO —PlayCanvas Project: —VIVERSE World:
🪅 THREE-PINATA —NPM Package: —Source:
This document details the process of creating a world from a template.
This page details the usage of the networked component on individual entities in PlayCanvas.
The Networked plugin allows local updates to certain properties of an entity's components to be sent to other connected avatars.
By adding the Transform component, an entity's position and rotation will be networked across clients.
At this time, Transform does not network the Scale property of the entity.
By adding the Anim component, an entity's animation state will be networked across clients.
In this video, we have created an arena with a floor, four walls and a ball. The networking module has been added to the ball. After publishing and creating the world in VIVERSE, multiple players can join in the environment. The location of the ball is tracked across all player sessions.
To create the arena and ball in the video, you can follow the tutorial. The instructions below can be used to add Networking functionality to the ball or any other entity of your choosing.
This page introduces the basic information about implementing custom code with PlayCanvas in VIVERSE.
VIVERSE allows developers to use nearly any of the custom scripting interfaces provided by PlayCanvas, including but not limited to WebRequests like http & websockets, GLSL shaders, and 3rd party libraries imported directly into your project's hierarchy.
However, by default, VIVERSE's PlayCanvas SDK automatically handles avatar transform and audio networking. For customizing any features related to the VIVERSE avatar, we provide an additional API reference (see VIVERSE SDK APIs below).
The documentation for VIVERSE's SDK APIs is currently housed here: . We provide interfaces for customizing the default VIVERSE Cameras, Player, Network, and Environment, as well as the unique attributes when in an XR/immersive session in a VR headset.
These interfaces are automatically injected when you activate the VIVERSE Chrome Extension; the folder @viverse is added to your assets folder and contains all interfaces in the create-sdk.mjs file. All interfaces can be imported and manipulated in your custom scripts.
To access the APIs in the VIVERSE SDK, you must use .mjs files, which support import and export statements.
In addition to adding the create-sdk.mjs file on initialization, the VIVERSE Chrome Extension adds create-extensions-sdk.mjs, which contains all of the interfaces between the no-code tools and PlayCanvas.
Within this file, you can see that each no-code function is assigned a unique trigger string, such as 'trigger:300006' for EntitySubscribeTriggerEnter. This is useful because you can listen for and fire these exact same event names in custom scripts, connecting the VIVERSE no-code functions to your custom code.
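As a sketch of that pattern: in a real .mjs script you would call this.app.on(...) and this.app.fire(...) on the PlayCanvas application; the snippet below substitutes a minimal stand-in emitter so the idea runs anywhere. Only the trigger string 'trigger:300006' comes from the SDK file described above; everything else is illustrative:

```javascript
// Minimal stand-in for PlayCanvas's app event bus, purely for illustration.
class MiniEventBus {
  constructor() {
    this.handlers = {};
  }
  on(name, fn) {
    (this.handlers[name] ??= []).push(fn);
  }
  fire(name, ...args) {
    (this.handlers[name] ?? []).forEach((fn) => fn(...args));
  }
}

const app = new MiniEventBus();

// Listen for the same event name the no-code tools fire...
let entered = false;
app.on("trigger:300006", () => {
  entered = true;
});

// ...and custom code can fire it too, driving the no-code logic.
app.fire("trigger:300006");
```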
This document provides a guide for creating a sample app in Unity, building the app for WebGL, and deploying the app to VIVERSE.
Anyone can publish their WebGL-compatible Unity project to VIVERSE in a few simple steps. In this guide, we'll walk through the process of creating a new Unity project, making sure it is compatible with WebGL, and publishing to VIVERSE using the .
While VIVERSE is a great place for multiplayer games with networked avatars — and we have a number of services that can help you implement these features — it is not required to implement networked avatars to publish to VIVERSE.
The VIVERSE command-line tool provides simple yet powerful functionality to manage your VIVERSE content
import { Script, Asset } from "playcanvas";
import { XrService, XrTypes } from "../@viverse/create-sdk.mjs";

export class ViverseXrManager extends Script {
    static scriptName = "viverseXrManager";

    initialize() {
        this.xrService = new XrService();
        this.xrService.controllers.right.locomotionType = XrTypes.LocomotionTypes.Teleport;
        this.xrService.controllers.left.locomotionType = XrTypes.LocomotionTypes.Smooth;
        // See enum definitions here:
        // https://viveportsoftware.github.io/pc-lib/enums/XrTypes.LocomotionTypes.html
    }
}

import { Script, Asset } from "playcanvas";
import { XrService } from "../@viverse/create-sdk.mjs";

export class ViverseXrManager extends Script {
    static scriptName = "viverseXrManager";

    /**
     * @attribute
     * @type {Asset}
     */
    vrControllerAssetR = null;

    /**
     * @attribute
     * @type {Asset}
     */
    vrControllerAssetL = null;

    initialize() {
        this.xrService = new XrService();
        this.xrService.controllers.right.setModelAsset(this.vrControllerAssetR);
        this.xrService.controllers.left.setModelAsset(this.vrControllerAssetL);
    }
}













Step 1. View the Performance Manager report (B).







A. The SeatHintFarAwayTriggerSphere is automatically generated.
B. Adjust the Radius on the Collision component to modify the distance away before the SeatHintFarAway icon displays. Also ensure the height of the collider is above ground level, so the player capsule collider is sure to intersect with it.
When the avatar clicks on the SeatHint button, the avatar sits down.



Step 1. Select the Details page (A).
Step 2. Give the world a name in the Name field (B).
Step 3. Give the world a description in the Description field (C).
Step 4. Upload a thumbnail of the world in the Thumbnail (D) section.
Step 5. Assign at least one genre to the world in the Genre (E) section.
Step 6. Set the compatible device types for the world in the Devices (F) section.

A. In the VIVERSE extension, select the Networking plugin for the Select plugins dropdown.
B. Change the dropdown to Transform in the Select a module and add field. Click the plus sign.
C. Confirm the enabled checkbox is checked.


Unity Hub and Unity installed on your device.
In this tutorial, we will be using Unity 6.1; however, any WebGL-compatible version of Unity should be supported.
Navigate to Publish and select "WebGL Publish". In the pop-up, click "Build and Publish", selecting the desired folder for your build. When doing this for the first time, Unity will automatically publish to its web servers for testing. For future builds, you can disable this behavior to just build without publishing.
Login to VIVERSE platform:
Or directly pass in login credentials for CI/CD integration:
In such CI/CD environments, it's recommended to use environment variables:
After login, check your authentication status:
And to logout:
After installation and authentication, the VIVERSE CLI can be used to publish any working WebGL build to the VIVERSE platform. When publishing, you'll either access your existing projects or create a new one.
You can use the VIVERSE CLI to create a new application directly:
Or specify a name:
Alternately, you can use the VIVERSE Studio workflow to create an application ID:
Once authenticated, you can view your account's available application list:
The output will be displayed in a table format with the following columns:
ID: Application identifier
STATE: Application state
TITLE: Application name
URL: Application preview URL
Publishing content requires two inputs:
App ID — the target application to publish to (required)
Content path — the directory containing your content (optional if you're already in that directory)
Option 1: Specify content path
Option 2: From within content directory
Note: The App ID is required. You can use the viverse-cli app list command to query your existing application IDs, or view the IDs of newly created applications after using app create.
Important: After uploading content successfully, you'll need to visit the Creator Studio website to complete the review and publishing process.
Warning: The <path> parameter MUST point to your build output folder and NOT your source code folder. Publishing source code folders (containing src/, node_modules/, or development files like .tsx, .jsx, .vue, .unity, etc.) will result in non-functional content and deployment failures.
For security purposes, creators must submit their application for approval.






This document provides guides that can be used to change the spawn location of an avatar and transfer an avatar to a different location. These actions can be configured to execute with triggers.
Create An Action That Sets A New Spawn Point
This guide provides instructions for setting up the EntityCheckPoint action. In the sample app, once the avatar enters one of the green trigger areas, the area becomes the new spawn location.
In this example, a trigger area is created and when triggered, an action sets a new spawn location. Any object can be used as a trigger, as long as the object has a collision component.
A. Create a new Sphere entity.
B. The Collision component is not required for the EntityCheckPoint action itself to work; however, it is required for the EntitySubscribeTriggerEnter trigger that will be used in this example.
C. Adding a material is optional. A transparent material has been added so that the trigger area is visible in play mode.
Create An Action That Teleports An Object To A Specific Location
This guide provides instructions for setting up the TeleportAvatar action. In the sample app, once the avatar enters one of the green cylinder trigger areas, the avatar is teleported to another location.
In this example, a trigger is created and when the avatar or other objects enter the trigger area, an action teleports the avatar to a specific location. Any object can be used as a trigger, as long as the object has a collision component.
A. Create a new Sphere entity.
B. The Collision component is not required for the TeleportAvatar action to work. The Collision component is required for the EntitySubscribeTriggerEnter trigger that will be used in this example.
C. Adding a material is optional. A transparent material has been added so that the trigger area is visible in play mode.
How to use VIVERSE SDK's CameraService to manage camera settings and switch between different VIVERSE and custom cameras.
VIVERSE provides a powerful camera system alongside its player controller and input scripting that allows developers to switch between different views and control various camera behaviors. This guide will show you how to implement camera switching in your VIVERSE project.
The VIVERSE player rig creates several cameras and switches between them depending on your settings and/or world scripting: CAMERA_3RD, CAMERA_1ST, CAMERA_VR, and CAMERA_ORBITAL are all separate entities in the scene hierarchy.
The CameraService provides methods to switch between the different camera views provided by the VIVERSE avatar rig. Here's a basic .mjs script importing the CameraService and switching to first-person camera mode on init. Simply attach this script to any entity in your project.
NOTE: this script asset must be placed in /scripts or another subfolder, since it assumes the VIVERSE SDK is one level up, located at "../@viverse/create-sdk.mjs". You can alter this import path as needed.
If the VIVERSE player rig cameras don't totally meet your needs, you can switch to your own camera entity for cut scenes or preview cameras. Here's a practical example of how to override the default VIVERSE camera with a custom camera on init:
You can listen for certain events from other services that may be related to your desired camera behavior. For example, on the player:ready event, we know that all player cameras and scripting have loaded in and are ready to interact with the scene:
The CameraService provides several properties to customize camera behavior:
This document provides a guide that can be used to add a physics force to objects in VIVERSE project. The action can be configured to execute when a trigger is activated.
Create An Action That Adds Force To An Object
This guide provides instructions for setting up the EntityRigidbodyAddForceInPhysics action. In the sample app, once the avatar enters the blue trigger area, a force is applied to the sphere.
In this example, a trigger is created and when the avatar or other objects enter the trigger area, an action adds force to an object. Any object can be used as a trigger, as long as the object has a collision component. This example uses a 3D box as the trigger area.
A. Create a new 3D Box entity.
B. This Collision component is not required for the EntityRigidbodyAddForceInPhysics action to work. The Collision component is required for the EntitySubscribeTriggerEnter trigger that will be used in this example.
C. Adding a material is optional. A transparent material has been added so that the trigger area is visible in play mode.
This document provides a guide that can be used to set up images and extend the functionality of images in a VIVERSE project.
Adding Images And Extending Image Functionality
An image has been added to the scene and configured to always face the avatar.
VIVERSE Studio is a powerful tool for creators looking to share their work. By following the guidelines for uploading content, you can maximize your reach and engage with a vibrant community.
Using VIVERSE Studio offers several advantages:
User-Friendly Interface: The platform is designed to be intuitive, making it easy for users of all skill levels.
Community Engagement: Connect with other creators and receive feedback on your work.
This document provides several guides that can be used to optimize VIVERSE projects by controlling when assets are rendered.
Create An Action That Unloads An Asset From The Scene, Stops The Asset From Rendering And Retains The Asset In Memory
This guide provides instructions for setting up the EntityAssetUnload action. In the sample app, the trigger area is outlined in blue. Once the avatar enters the trigger area, the asset stops rendering in the scene. This will decrease draw calls and improve frames per second, but the asset will be retained in memory.
npm install -g @viverse/cli
viverse-cli auth login
viverse-cli auth login -e <email> -p <password>
viverse-cli auth login -e $VVS_EMAIL -p $VVS_PASSWORD
viverse-cli auth status
viverse-cli auth logout
viverse-cli app create
viverse-cli app create --name <application-name>
viverse-cli app list
viverse-cli app publish <path> --app-id <your-app-id>
viverse-cli app publish --app-id <your-app-id>

































import { Script } from "playcanvas";
import { CameraService, CameraTypes } from "../@viverse/create-sdk.mjs";
export class CameraManager extends Script {
initialize() {
this.cameraService = new CameraService();
// Switch the camera to first-person view
this.cameraService.switchPov(CameraTypes.PovTypes.FirstPerson);
// At any time, you can switch back to CameraTypes.PovTypes.ThirdPerson
// Prevent users from switching POV, which is usually done in user settings or with the keyboard shortcut "V"
this.cameraService.canSwitchPov = false;
}
}

import { Script, Entity } from "playcanvas";
import { CameraService } from "../@viverse/create-sdk.mjs";
export class OverrideViverseCamera extends Script {
/**
* @attribute
* @type {Entity}
*/
overrideCamera = null;
initialize() {
if (!this.overrideCamera) {
console.error("No override camera set!");
return;
}
this.cameraService = new CameraService();
this.cameraService.switchCamera(this.overrideCamera);
}
}

// Wait for the XR start event then get the VR camera
this.app.xr.on("start", () => {
// The VIVERSE player rig will switch to its "CAMERA_VR" automatically
// So getting reference to the activeCamera at this time will point to
// the "CAMERA_VR" entity
const vrCamera = this.cameraService.activeCamera;
// You could also search for this by name
// const vrCamera = this.app.root.findByName("CAMERA_VR");
// Then raycast from the forward vector, apply scripts, etc.
});
// Or wait for player to be ready before switching cameras
this.playerService.localPlayer.on("player:ready", () => {
this.cameraService.switchCamera(this.previewCamera);
});

// Set minimum and maximum zoom distances
this.cameraService.minZoomDistance = 1;
this.cameraService.maxZoomDistance = 10;
// Disable camera zooming
this.cameraService.canZoom = false;
// Disable camera rotation
this.cameraService.canRotate = false;
// Get current POV type
const currentPov = this.cameraService.pov; // Returns PovTypes enum value
// Get active camera entity
const activeCamera = this.cameraService.activeCamera; // Returns null | Entity

A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select EntitySubscribeTriggerEnter.
C. Add local-player to the tags to filter field.
D. Add an Action and select EntityCheckPoint.
E. Add an entity that has a position that will be used for the new spawn location. SpawnLocation1 has been added to the pick up an entity id field.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select EntitySubscribeTriggerEnter.
C. Add local-player to the tags to filter field.
D. Add an Action and select EntityTeleportAvatar.
E. Add an entity that has a position that will be used for the teleport location. TeleportLocation2 has been added to the Specify the Entity whose location you want to teleport to field.
The avatar enters the green trigger area and the trigger area becomes the new spawn location.
The avatar has fallen off the map and needs to be respawned.
When the avatar respawns, it is spawned at the new location.
The avatar enters the green trigger area.
After entering the green trigger area, the avatar is teleported to another location.
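Conceptually, both actions update simple avatar state: EntityCheckPoint changes where the avatar will respawn, while the teleport action moves the avatar immediately. The sketch below models this with a plain object; the state shape and function names are hypothetical, not the VIVERSE implementation.

```javascript
// Hypothetical model of spawn and teleport behavior (not VIVERSE internals).
const avatar = {
  position: { x: 0, y: 0, z: 0 }, // where the avatar currently is
  spawn: { x: 0, y: 0, z: 0 },    // where the avatar respawns
};

// Checkpoint action: the trigger area's location becomes the new spawn point.
function setCheckPoint(avatar, point) {
  avatar.spawn = { ...point };
}

// Teleport action: move the avatar to the target entity's location immediately.
function teleport(avatar, point) {
  avatar.position = { ...point };
}

// Respawn (e.g. after falling off the map): return to the saved spawn point.
function respawn(avatar) {
  avatar.position = { ...avatar.spawn };
}
```

Note that teleporting does not change the spawn point; only the checkpoint action does, which is why a respawn after a teleport returns the avatar to the last checkpoint.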

D. Click the Edit Viverse Extension button.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select EntitySubscribeTriggerEnter.
C. Add local-player to the tags to filter field.
D. Add an Action and select NotificationCenterPublish.
E. Create a unique name for the notification and add it to the notification name to publish field. In this example, the AddForce name is added.
Create the 3D object that the physics force will be applied to
A. Create a new 3D object.
B. Add a Collision component.
C. Add a Rigidbody component.
D. Set the Collision Type to Dynamic.
E. Click the Edit Viverse Extension button.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select NotificationCenterSubscribe.
C. The same text that was added to the notification name to publish needs to be added to the notification name to subscribe.
D. Add an Action and select EntityRigidbodyAddForceInPhysics.
E. In the X Force, Y Force and Z Force fields, add values for the amount of force to apply in each direction.
The sphere is not moving before the avatar enters the trigger area.
Once the avatar enters the trigger area, a force is applied to the sphere in the direction based on the parameters.
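The underlying physics is Newton's second law: a constant force F applied to a body of mass m over a timestep dt changes its velocity by (F/m)·dt. The sketch below is plain arithmetic for illustration, not the engine's Ammo.js solver, and all names are hypothetical.

```javascript
// Simplified model: integrate a constant force over one timestep (F = m·a).
function applyForce(body, fx, fy, fz, dt) {
  body.vx += (fx / body.mass) * dt;
  body.vy += (fy / body.mass) * dt;
  body.vz += (fz / body.mass) * dt;
}

// A sphere at rest with mass 2.
const sphere = { mass: 2, vx: 0, vy: 0, vz: 0 };

// Apply 10 units of force along X for 1 second: velocity becomes 10/2 = 5.
applyForce(sphere, 10, 0, 0, 1);
```

This is why the X Force, Y Force, and Z Force fields control both the direction and the strength of the push the object receives.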
To open VIVERSE Studio from the Viverse.com landing page
Step 1. Click on your avatar in the upper-right hand corner (A)
Step 2. Click on VIVERSE Studio (B)
To upload a project
Step 1. Click on the Content Versions tab (A)
Step 2. Click the Select File button (B)
Step 3. A window will be displayed that allows selection of a zip file of a project. After selecting the zip file, click the Upload button (C). Refer to the
When the avatar is outside the blue trigger area, the house asset is being rendered.
When the avatar enters the blue trigger area, the house asset is unloaded, but is retained in memory.
In this example, a trigger is created and when triggered, an action unloads an object. Any object can be used as a trigger, as long as the object has a collision component. This example uses a 3D box as the trigger area.
A. Create a new 3D Box entity.
B. The Collision component is not required for the EntityAssetUnload action to work. The Collision component is required for the EntitySubscribeTriggerEnter trigger that will be used in this example.
C. Adding a material is optional. A transparent material has been added so that the trigger area is visible in play mode.
D. Click the Edit Viverse Extension button.
Create An Action That Reloads An Asset Into The Scene From Memory And Causes The Asset To Be Rendered
This guide provides instructions for setting up the EntityAssetReload action. In the sample app, the trigger area is outlined in blue. Once the avatar leaves the trigger area, the asset is reloaded from memory and starts rendering in the scene.
When the avatar enters the blue trigger area, the house asset is unloaded, but is retained in memory.
When the avatar leaves the blue trigger area, the house asset is reloaded from memory.
In this example, a trigger is created and when triggered, an action reloads an object. Any object can be used as a trigger, as long as the object has a collision component. This example uses a 3D box as the trigger area.
A. Create a new 3D Box entity.
B. The Collision component is not required for the EntityAssetReload action to work. The Collision component is required for the EntitySubscribeTriggerLeave trigger that will be used in this example.
C. Adding a material is optional. A transparent material has been added so that the trigger area is visible in play mode.
D. Click the Edit Viverse Extension button.
Create An Action That Destroys An Asset, Removes It From Memory and Stops It From Rendering
This guide provides instructions for setting up the EntityDestroy action. In the sample app, the trigger area is outlined in blue. Once the avatar enters the trigger area, the asset is destroyed, which removes it from memory and stops it from rendering.
When the avatar is outside the blue trigger area, the house asset is being rendered.
When the avatar enters the blue trigger area, the house asset is destroyed and removed from memory.
In this example, a trigger is created and when triggered, an action destroys an object. Any object can be used as a trigger, as long as the object has a collision component. This example uses a 3D box as the trigger area.
A. Create a new 3D Box entity.
B. The Collision component is not required for the EntityDestroy action to work. The Collision component is required for the EntitySubscribeTriggerEnter trigger that will be used in this example.
C. Adding a material is optional. A transparent material has been added so that the trigger area is visible in play mode.
D. Click the Edit Viverse Extension button.
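The three actions differ only in what they keep: unload stops rendering but keeps the asset cached, reload resumes rendering from that cache, and destroy drops both. The sketch below models this with hypothetical state flags; it is a conceptual illustration, not the engine implementation.

```javascript
// Hypothetical asset state: rendered in the scene, and/or held in memory.
const asset = { rendered: true, inMemory: true };

// EntityAssetUnload: stop rendering, keep the asset in memory.
function unload(a) {
  a.rendered = false;
}

// EntityAssetReload: resume rendering from memory (only possible if cached).
function reload(a) {
  if (a.inMemory) a.rendered = true;
}

// EntityDestroy: stop rendering and free the memory.
function destroy(a) {
  a.rendered = false;
  a.inMemory = false;
}
```

This is the trade-off to keep in mind: unload/reload saves draw calls while allowing instant reappearance, while destroy frees memory but makes the asset unrecoverable without a full reload of the world.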



A PlayCanvas sample project that provides demonstrations of how to use the features in the VIVERSE Extension.
We created a PlayCanvas sample project that contains examples of how to use the VIVERSE Extension. Download the project and import it into a PlayCanvas account. Publish the project to VIVERSE and enter Preview mode to test the functionality. The pages listed under API Reference describe the functionality that the sample project demonstrates.
VIVERSE_PlayCanvas_Sample_1.2
Sample Project Version: 1.2
PlayCanvas Extension Version: 3.44.1
This project is separated into different stations. Each station has Tags added to help identify the VIVERSE functionality the station is demonstrating.
This document provides a guide that can be used to set up a quest system in a VIVERSE project.
Create A Quest System
This guide provides instructions for setting up the Quest system. In the sample app, the trigger area is outlined in blue. Once the avatar enters the trigger area, the Quest system starts. The first task requires the user to click on the red box. Once the user clicks on the red box, the first task is completed and the Quest system updates. The second task requires the user to click on multiple green boxes. Each green box that is clicked adds progress to completing the Quest task. Once all of the green boxes have been clicked, the task is completed and the Quest is completed. If the user clicks on a blue box during the quest, the quest system resets and needs to be triggered again in order to restart the quest.














allow-modals - Open modal dialogs
allow-popups - Open popup windows
allow-top-navigation - Navigate top window
allow-pointer-lock - Use pointer lock
allow-presentation - Start presentations
allow-downloads - Download files
allow-orientation-lock - Allow orientation lock
allow-popups-to-escape-sandbox - Allow popups to escape sandbox
allow-top-navigation-by-user-activation - Allow top navigation by user activation
camera - Allows access to device camera
gyroscope - Allows access to gyroscope sensor data
magnetometer - Allows access to magnetometer sensor data
midi - Allows access to MIDI devices
window-management - Allows multi-window management
xr-spatial-tracking - Allows access to VR/AR features






E. Add an object that will be unloaded when the avatar enters the trigger area. The House2 entity has been added to the pick up specify execution entity.
E. Add an object that will be reloaded when the avatar leaves the trigger area. The House2 entity has been added to the pick up specify execution entity.
E. Add an object that will be destroyed when the avatar enters the trigger area. The House2 entity has been added to the pick up specify execution entity.










A. Give the quest a name in the Quest name field. The text: Find the boxes! was added.
B. Give the quest a description in the Quest description field. The text: Click on the boxes that are a specific color. was added.
C. Create the first task and give the task a description in the Task description field. The text: Click on the red box. was added.
D. Set the Task type to check.
E. Create the second task and give the task a description in the Task description field. The text: Click on the green boxes. was added.
F. Set the Task type to progressBar.
G. Add the value 4 to the Progress Steps field.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select EntitySubscribeTriggerEnter.
C. Add local-player to the tags to filter field.
D. Add an Action and select Quest.
E. In the selected quest field, choose Find the boxes!
F. In the quest response field, choose startQuest.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select NotificationCenterSubscribeEntityPicking.
C. Add an Action and select Quest.
D. In the selected quest field, choose Find the boxes!
E. In the quest response field, choose completeTask.
F. In the selected task field, choose Click on the red box.
A. Create multiple 3D objects.
B. Add a Collision component to each object.
C. Add a material. Green has been added because these will be the boxes that will be clicked on to complete the second task.
D. Click the Edit Viverse Extension button.
The following steps are completed for all green boxes.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select NotificationCenterSubscribeEntityPicking.
C. Add an Action and select Quest.
D. In the selected quest field, choose Find the boxes!
E. In the quest response field, choose addTaskProgress.
F. In the selected task field, choose Click on the green boxes.
The following steps are completed for all blue boxes.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select NotificationCenterSubscribeEntityPicking.
C. Add an Action and select Quest.
D. In the selected quest field, choose Find the boxes!
E. In the quest response field, choose resetQuest.
The avatar enters the trigger area and the Quest system dialog appears.
With the Quest system started, the user clicks on the red box and the first task is completed.
When the user clicks each green box, progress is added to the second task.
Once the user clicks on the final green box, the second task is complete and the Quest system dialog disappears.
If the user clicks on a blue box before both tasks are completed, the Quest system resets and needs to be triggered again.
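The quest flow above can be modeled as simple state: a check task completes in one step, a progressBar task needs a fixed number of increments (4 here), and a reset clears everything. The class below is a hypothetical sketch of that logic, not the actual VIVERSE Quest plugin.

```javascript
// Hypothetical quest-state sketch (not the VIVERSE Quest implementation).
class QuestState {
  constructor(name, tasks) {
    this.name = name;
    this.tasks = tasks;
  }
  // Quest response "completeTask": a check-type task is done in one step.
  completeTask(description) {
    const task = this.tasks.find((t) => t.description === description);
    if (task) task.done = true;
  }
  // Quest response "addTaskProgress": a progressBar task needs `steps` clicks.
  addTaskProgress(description) {
    const task = this.tasks.find((t) => t.description === description);
    if (!task || task.done) return;
    task.progress += 1;
    if (task.progress >= task.steps) task.done = true;
  }
  // Quest response "resetQuest": clear all progress; the quest must be retriggered.
  resetQuest() {
    for (const task of this.tasks) {
      task.progress = 0;
      task.done = false;
    }
  }
  get completed() {
    return this.tasks.every((task) => task.done);
  }
}

const quest = new QuestState("Find the boxes!", [
  { description: "Click on the red box", type: "check", steps: 1, progress: 0, done: false },
  { description: "Click on the green boxes", type: "progressBar", steps: 4, progress: 0, done: false },
]);
```

In the sample app, the red box's picking trigger maps to completeTask, each green box maps to addTaskProgress, and the blue boxes map to resetQuest.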

To ensure consistent performance and platform compatibility, our system classifies content based on VRAM usage per world and provides platform-specific performance recommendations. Developers should optimize their content to target the desired range of supported devices.
The system automatically assigns a performance boundary level based on the amount of VRAM your world uses:
Worlds marked Very High will trigger a warning popup for users, indicating that the content is intended for high-end PCs only.
Each boundary level maps to a set of recommended platforms, reflecting performance expectations:
Targeting Wider Compatibility: Aim to keep VRAM usage under 500 MB to support the broadest range of devices.
Optimizing for Android or iOS: Stay within Low or Medium boundary levels to ensure mobile device compatibility.
High-End Visuals: Worlds exceeding 800 MB VRAM should be optimized for PC-only audiences.
For best results, monitor your world’s VRAM usage regularly during development and test on target platforms. Pressing 'Shift + .' will open the console to view various profiling values including VRAM.
This guide is a walkthrough for uploading and publishing a PlayCanvas project to VIVERSE Create.
This guide is a walkthrough for creating a World from a Scene.
This document provides several guides that can be used to enable and disable colliders in a VIVERSE project. These actions can be configured to execute when triggers are activated.
Create An Action That Toggles An Object's Collider On/Off
This guide provides instructions for setting up the EntityToggleCollision action. In the sample app, every time the avatar enters the green trigger area, a notification is sent to the wall and the collider on the wall is toggled on/off. The wall is green when the avatar can pass through and red when the avatar can not pass through.
In this example, a trigger is created and when the avatar enters the trigger area, an action sends a notification to toggle another object's collider on/off. Any object can be used as a trigger, as long as the object has a collision component. This example uses a 3D box as the trigger area.
A. Add a 3D object to the scene.
B. The Collision component is not required for the EntityToggleCollision action to work. The Collision component is required for the EntitySubscribeTriggerEnter trigger that will be used in this example.
C. Click the Edit Viverse Extension button.
Create An Action That Enables An Object's Collider
This guide provides instructions for setting up the EntityEnableCollision action. In the sample app, every time the avatar enters the red trigger area, a notification is sent to the wall object and the collider on the wall object is enabled. The wall is green when the avatar can pass through and red when the avatar can not pass through.
In this example, a trigger is created and when the avatar enters the trigger area, an action sends a notification to enable another object's collider. Any object can be used as a trigger, as long as the object has a collision component.
Create An Action That Disables An Object's Collider
This guide provides instructions for setting up the EntityDisableCollision action. In the sample app, every time the avatar enters the green trigger area, a notification is sent to the wall object and the collider on the wall object is disabled. The wall is green when the avatar can pass through and red when the avatar can not pass through.
In this example, a trigger is created and when the avatar enters the trigger area, an action sends a notification to disable another object's collider. Any object can be used as a trigger, as long as the object has a collision component.
This document provides a guide for creating a project in PlayCanvas that can be published to VIVERSE.






















Exceeding 2 GB VRAM: Consider splitting large scenes, using LODs, or reducing texture resolutions to avoid user warnings.
Give the World a name and click the "Create" button to publish it to the VIVERSE Create platform. The resulting URL is the public link to the World.
The World can be accessed from the VIVERSE Create site here. Once logged into VIVERSE Create, click the avatar.
Low: ≤ 500 MB
Medium: > 500 MB and ≤ 800 MB
High: > 800 MB and ≤ 2 GB
Very High: > 2 GB
Low: PC / iOS / Android / HMD (VR Headsets)
Medium: PC / Android
High: PC
Very High: PC (High-End only, with warning popup)

A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select EntitySubscribeTriggerEnter.
C. Add local-player to the tags to filter field.
D. Add an Action and select NotificationCenterPublish.
E. Create a unique notification name and add it to the notification name to publish field. In this example, the notification is called ToggleWall.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select NotificationCenterSubscribe.
C. The same text that was added to the notification name to publish needs to be added to the notification name to subscribe.
D. Add an Action and select EntityToggleCollision.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select EntitySubscribeTriggerEnter.
C. Add local-player to the tags to filter field.
E. Add an Action and select NotificationCenterPublish.
F. Create a unique notification name and add it to the notification name to publish field. In this example, the notification is called EnableWall.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select NotificationCenterSubscribe.
C. The same text that was added to the notification name to publish needs to be added to the notification name to subscribe.
D. Add an Action and select EntityEnableCollision.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select EntitySubscribeTriggerEnter.
C. Add local-player to the tags to filter field.
E. Add an Action and select NotificationCenterPublish.
F. Create a unique notification name and add it to the notification name to publish field. In this example, the notification is called DisableWall.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select NotificationCenterSubscribe.
C. The same text that was added to the notification name to publish needs to be added to the notification name to subscribe.
D. Add an Action and select EntityDisableCollision.
Before the avatar enters the green trigger area, the wall is red and the avatar can not pass through it.
Once the avatar enters the green trigger area, the collider on the wall is toggled to be disabled. The wall is green and the avatar can pass through it.
The avatar is able to pass through the wall when the wall’s collider is disabled.
Once the avatar enters the red trigger area, the EntityEnableCollision action enables the wall’s collider.
The avatar is unable to pass through the wall once the wall’s collider is enabled.
The avatar is unable to pass through the wall when the wall’s collider is enabled.
Once the avatar enters the green trigger area, the action disables the wall’s collider.
The avatar is able to pass through the wall when the wall’s collider is disabled.
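The NotificationCenterPublish/NotificationCenterSubscribe pair used in these guides is a standard publish/subscribe pattern: the trigger publishes a named notification, and any entity subscribed to the same name runs its action. The sketch below is a minimal hypothetical illustration of that pattern, not the plugin's actual code.

```javascript
// Minimal hypothetical publish/subscribe center (not the VIVERSE plugin).
class NotificationCenter {
  constructor() {
    this.handlers = new Map();
  }
  // Register a handler under a notification name (e.g. "ToggleWall").
  subscribe(name, handler) {
    if (!this.handlers.has(name)) this.handlers.set(name, []);
    this.handlers.get(name).push(handler);
  }
  // Run every handler registered under that name.
  publish(name) {
    for (const handler of this.handlers.get(name) ?? []) handler();
  }
}

const center = new NotificationCenter();

// The wall subscribes: each "ToggleWall" notification flips its collider.
let wallCollisionEnabled = true;
center.subscribe("ToggleWall", () => {
  wallCollisionEnabled = !wallCollisionEnabled;
});

// The trigger area publishes when the avatar enters.
center.publish("ToggleWall");
```

This is also why the notification name in the publish field must exactly match the name in the subscribe field: the name is the only thing connecting the two entities.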
A. Select the Plane entity and rename it Floor.
B. To increase the size of the Floor, change the Scale to (20, 1, 20).
C. To allow the players to collide with the Floor, add a Collision component.
D. To increase the size of the Collision component so that it covers the entire Floor, change Half Extents to (10, .1, 10).
E. To prevent the players from falling through the Floor, add a Rigidbody component.
F. Click the Import Ammo button.
A. Create a new folder and name it Materials.
B. Inside the Materials folder, create a new material and name it FloorMaterial.
C. Expand the Diffuse section and click the Color field.
D. For the Color, change the values to (97, 255, 104).
E. Add the FloorMaterial to the Material slot on the Floor entity.
A. Select the Box entity and rename it to Wall1.
B. To move the wall to the side, change the Position to (0, 0, 10).
C. To increase the size of the wall, change Scale to (20, 5, .1).
D. To allow the players to collide with the wall, add a Collision component.
E. To increase the size of the Collision component so that it covers the entire wall, change Half Extents to (10, 2.5, .1).
F. To prevent the players from going through the wall, add a Rigidbody component.
A. Add a 3D Sphere to the scene. Rename it to Ball.
B. Raise the Ball above the ground by changing Position to (0, 1, 0).
C. To allow the players to collide with the Ball, add a Collision component.
D. To change the collider shape to be the same shape as the Ball, change Type to Sphere.
E. To prevent players from passing through the Ball, add a Rigidbody component.
F. Since the Ball will be moving, change the Type to Dynamic.
G. Decrease the Friction to .1 so that the Ball doesn't lose as much velocity when coming in contact with other entities.
H. Increase the Restitution to 1 to maximize the bounciness of the Ball.
A. Add a new Entity to the scene. Rename the entity to SpawnPoint.
B. Add spawn-point to the Tags field.
C. To have the spawn point above the ground and close to the wall, change the Position to (0, 1, 8).
D. To change the direction of the SpawnPoint so that the player will face the ball when spawned in, change Rotation to (0, 180, 0).
Create a game with a floor and walls. Use your avatar to move the ball around.

This document provides several guides that can be used to set up event listeners for animations, control animations and control audio in a VIVERSE project.
Create An Action That Plays An Animation
This guide provides instructions for setting up the EntityPlayAnimation action. In the sample app, once the avatar enters the trigger area, the character switches to the dancing animation state. When the avatar leaves the trigger area, the character switches back to the idle animation state.
In this example, a trigger area is created and when triggered, an action plays an animation. Any object can be used as a trigger, as long as the object has a collision component.
Create A Trigger Based On When An Animation Starts
This guide provides instructions for setting up the EntitySubscribeAnimationStart trigger. In the sample app, when the avatar enters the blue trigger area, the character’s dancing animation begins to play. The dancing animation has an animation event added and the animation event is triggered whenever the animation starts to play.
In the EntityPlayAnimation example, a 3D model was added, an animation state graph was created, animation states were created and a trigger area was created to initiate the animations. In this example, an animation event is created for the start of the animation.
Create A Trigger Based On When An Animation Ends
This guide provides instructions for setting up the EntitySubscribeAnimationEnd trigger. In the sample app, when the avatar enters the blue trigger area, the character’s dancing animation begins to play. The dancing animation has an animation event added and the animation event is triggered whenever the animation stops playing.
In the EntityPlayAnimation example, a 3D model was added, an animation state graph was created, animation states were created and a trigger area was created to initiate the animations. In this example, an animation event is created for the end of the animation.
Create A Trigger Based On A Specific Time During An Animation
This guide provides instructions for setting up the EntitySubscribeAnimationEvent trigger. In the sample app, when the avatar enters the blue trigger area, the character’s dancing animation begins to play. The dancing animation has an animation event added and the animation event is triggered during the animation.
In the EntityPlayAnimation example, a 3D model was added, an animation state graph was created, animation states were created and a trigger area was created to initiate the animations. In this example, an animation event is created for during the animation.
Create An Action That Plays A Sound
This guide provides instructions for setting up the EntityPlaySound action. In the sample app, once the avatar enters the trigger area, a sound plays.
In this example, a trigger is created and when triggered, an action plays a sound. Any object can be used as a trigger, as long as the object has a collision component. This example uses a 3D box as the trigger area.
A. Create a new 3D Box entity.
B. The Collision component is not required for the EntityPlaySound action to work. The Collision component is required for the EntitySubscribeTriggerEnter trigger that will be used in this example.
C. Adding a material is optional. A transparent material has been added so that the trigger area is visible in play mode.
Create An Action That Stops Playing A Sound
This guide provides instructions for setting up the EntityStopSound action. In the sample app, once the avatar leaves the trigger area, the sound stops playing.
In this example, a trigger is created and when triggered, an action stops playing a sound. Any object can be used as a trigger, as long as the object has a collision component. This example uses a 3D box as the trigger area.
A. Create a new 3D Box entity.
B. The Collision component is not required for the EntityStopSound action to work. The Collision component is required for the EntitySubscribeTriggerLeave trigger that will be used in this example.
C. Adding a material is optional. A transparent material has been added so that the trigger area is visible in play mode.
This document provides a guide for creating a sample app in Three.js, building the app with Vite and deploying the app to VIVERSE.
In this getting-started guide, we will cover the basics of setting up a ThreeJS project and publishing to VIVERSE using the VIVERSE CLI.
This guide is a walkthrough for creating an example Three.js project.
This page outlines the usage of the custom shader components contributed by ShuShu VR and Niko Lang as part of their commission for VIVERSE.
The following specifications and guidelines will allow you to use our custom shaders in your own project. The project contains custom shaders, custom textures, materials and audio files.
You may use the Shaders, Models, Materials, Cube Maps, Seamless Textures and Equirectangular Maps; however, due to copyright, you are not permitted to use any of the other assets (such as third-party scripts, audio files, fonts, Noise Maps or Normal Maps).
Project Title: ShadeArt > Custom shaders showcase [VIVERSE VR Development Workshop -Experimental Shaders Development Space]. VIVERSE world link: Project link:


















































A. Create a new 3D Box entity.
B. The Collision component is not required for EntityPlayAnimation to work. The Collision component is required for the EntitySubscribeTriggerEnter trigger that will be used in this example.
C. Adding a material is optional. A transparent material has been added so that the trigger area is visible in play mode.
D. Click the Edit Viverse Extension button.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select EntitySubscribeTriggerEnter.
C. Add local-player to the tags to filter field.
D. Add an Action and select EntityPlayAnimation.
E. Add the name of an animation state to the animate state to play field. In this example, the animation state name is Dancing.
F. Pick the 3D model in the specify execution entity field.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select EntitySubscribeTriggerLeave.
C. Add local-player to the tags to filter field.
D. Add an Action and select EntityPlayAnimation.
E. Add the name of an animation state to the animate state to play field. In this example, the animation state name is Idle.
F. Pick the 3D model in the specify execution entity field.
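The enter/leave wiring above can be illustrated with plain JavaScript. This is a sketch of the behavior only, not the VIVERSE API: triggerEnter and triggerLeave are hypothetical stand-ins for the EntitySubscribeTriggerEnter and EntitySubscribeTriggerLeave triggers, and playAnimation stands in for the EntityPlayAnimation action.

```javascript
// Illustration of the no-code setup above, not the actual VIVERSE API.
// The state names Dancing and Idle match this guide.
const character = {
  state: 'Idle',
  playAnimation(name) { this.state = name; }, // stands in for EntityPlayAnimation
};

// Stand-in for EntitySubscribeTriggerEnter with a local-player tag filter.
function triggerEnter(tags) {
  if (tags.includes('local-player')) character.playAnimation('Dancing');
}

// Stand-in for EntitySubscribeTriggerLeave with the same filter.
function triggerLeave(tags) {
  if (tags.includes('local-player')) character.playAnimation('Idle');
}

triggerEnter(['local-player']);
console.log(character.state); // Dancing
triggerLeave(['local-player']);
console.log(character.state); // Idle
```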
A. Create an event, add value 0 to the time field and start to the name field.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select EntitySubscribeAnimationStart.
C. Add an Action and select EntityEnableById.
D. For the EntityEnableById action to work, an object needs to be added that will be enabled. In this example, the AnimationStartText object is added.
A. Create an event, add value 8 to the time field and end to the name field. The value 8 is used because the dancing animation has a duration of 8.83 seconds.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select EntitySubscribeAnimationEnd.
C. Add an Action and select EntityEnableById.
D. For the EntityEnableById action to work, an object needs to be added that will be enabled. In this example, the AnimationEndText object is added.
A. Create an event, add value 4 to the time field and create a unique name for the event. In this example, the event name animate was added to the name field.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select EntitySubscribeAnimationEvent.
C. Add the event name to the event name to subscribe field.
D. Add an Action and select EntityEnableById.
E. For the EntityEnableById action to work, an object needs to be added that will be enabled. In this example, the AnimationEventText object is added.
E. Create a unique name for the sound file and add it to the name field. In this example, Slot2 is added.
F. Add the audio file to the Asset field.
G. Click the Edit Viverse Extension button.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select EntitySubscribeTriggerEnter.
C. Add local-player to the tags to filter field.
D. Add an Action and select EntityPlaySound.
E. In the sound name to play field, add the same name that was created on the Sound component.
F. Pick the entity with the Sound component in the specify execution entity field.
E. Create a unique name for the sound file and add it to the name field. In this example, Slot2 is added.
F. Add the audio file to the Asset field.
G. Click the Edit Viverse Extension button.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select EntitySubscribeTriggerLeave.
C. Add local-player to the tags to filter field.
D. Add an Action and select EntityStopSound.
E. In the sound name to play field, add the same name that was created on the Sound component.
F. Pick the entity with the Sound component in the specify execution entity field.
The character is in the idle state when the avatar is outside of the trigger area.
The character switches to the dancing animation state when the avatar enters the trigger area.
Before the character’s dancing animation begins.
When the character’s dancing animation begins, the animation event is triggered and text is displayed to show that the animation event was triggered.
The character’s dancing animation has started.
When the character’s dancing animation ends, the animation event is triggered and text is displayed to show that the animation event was triggered.
The character’s dancing animation has started.
During the character’s dancing animation, the animation event is triggered and text is displayed to show that the animation event was triggered.
The avatar is outside of the blue trigger area and the sound does not play.
The avatar enters the blue trigger area and the sound plays.
The avatar enters the blue trigger area and the sound plays.
The avatar is outside of the blue trigger area and the sound stops playing.





A. Use the defaults during the installation, but place a checkmark in the Automatically install the necessary tools checkbox.
A. Create the index.html page inside the project folder. This can be done by creating a text file, pasting in the code and saving it as an .html file, or by using an IDE such as Visual Studio Code.
index.html
A. To create a development build of the Three.js project, run the following command in a command prompt from the Three.js project directory: npx vite.
B. Confirm the development build of the Three.js project was built successfully when Vite provides a localhost URL to test.
A. To test a development build of the Three.js project, open the browser and navigate to the URL that was printed in the previous step. In this example, the URL is http://localhost:5173. Confirm the app works as expected.
A. To create a production build of the Three.js project, run the following command in a command prompt from the Three.js project directory: npx vite build.
B. Confirm the production build of the Three.js project was built successfully by confirming the dist folder was created and populated.
macOS and Linux: On macOS or Linux, global packages are typically installed in /usr/local/lib/node_modules or a user-specific directory like ~/.npm-global.
B. Confirm that the command line tool is installed based on screen feedback.
A. To re-publish content to VIVERSE when a project is already published, type the following command with the path to the project's production build folder: viverse-cli publish <path>, then press Enter.
B. Confirm the manifest file is updated.
C. Confirm the content was published successfully.





EquiProjection (Skybox shader for Equirectangular Map)
EquiProjectionDistort (Skybox shader for Equirectangular Map, with Distortion Map and Additive Equirectangular Map)
EquiProjectionUVDistort (Projection Shader for seamless textures).
Water Shader (Amplify Water)
The standard Play Canvas Editor does not allow a preview of the custom shaders. You will have to publish the world and check the result in game.
All materials applied to the custom shaders function only as material "placeholders" and are not displayed in game. You may use any standard material for this purpose.
The textures and maps presented in the ShadeArt gallery in VIVERSE are used as examples only. You may apply any other image instead of the existing images.
A major part of the custom shaders development was to adjust the shaders to work properly with the different camera perspectives used in VIVERSE.
Here are the Camera Function settings in each of the shader scripts:
You may reuse this for your own custom shaders development.
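To see how that camera lookup behaves, the function from the shader scripts can be exercised with a mock app object. The mock's findByName and getPosition simply mimic the PlayCanvas calls the snippet relies on; this is a test harness sketch, not production code.

```javascript
// The camera fallback used in the shader scripts: try each known VIVERSE
// camera name in order and return the position of the first enabled one.
function getActiveCameraPosition(app) {
  var cameraNames = ['CAMERA_AUTO_ROTATE', 'CAMERA_1ST', 'CAMERA_OVERLAP', 'CAMERA_VR'];
  for (var i = 0; i < cameraNames.length; i++) {
    var camera = app.root.findByName(cameraNames[i]);
    if (camera && camera.enabled) {
      var pos = camera.getPosition();
      return [pos.x, pos.y, pos.z];
    }
  }
  return [0, 0, 0]; // no enabled camera found
}

// Mock app: only the first-person camera exists and is enabled.
const mockApp = {
  root: {
    findByName(name) {
      if (name !== 'CAMERA_1ST') return null;
      return { enabled: true, getPosition: () => ({ x: 1, y: 2, z: 3 }) };
    },
  },
};

console.log(getActiveCameraPosition(mockApp)); // [ 1, 2, 3 ]
```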
---ShuShu & Niko
This is a skybox shader for Equirectangular Map (2x1 texture) assigned on an inverted sphere. The shader ensures that the skybox is correctly projected onto the scene in such a way that it always appears at an infinite distance, which means it gives the illusion of an infinite environment around the player.
No matter how the camera (user) moves, the skybox does not change in size or resolution, and it stays fixed in the background. The shader does not rely on any UV maps and can be applied to any type of mesh with inverted normals/faces.
Import a sphere with inverted normals/faces into your Play Canvas project. We recommend using the sphere model-template from our shared project (see: ScenesAssets > Models > Common > ProjSphere1x1_64x32_Inv).
The sphere is 1 m in diameter. Scale it to 100 m or any other size that will fit your environment. Scaling the mesh up or down will not affect the skybox image resolution.
Place the sphere in the Hierarchy.
The Equirectangular Map is an image with a ratio of 2x1. A 4K image size is 4096x2048 pixels, while 8K is 8192x4096 pixels. In our sample project we use 8K images.
Upload an equirectangular map into your project.
Assign the equirectangular map to the shader script.
Notes : Make sure to uncheck the Mipmaps in the texture settings. You do not need to generate a Play Canvas cube map, simply link your Equirectangular Map to the shader.
If you wish to rotate the skybox image (the equirectangular map) remember, rotating the mesh itself will not affect the image rotation.
In order to rotate the image, use the Rotation parameters in the script settings.
Notes
In the immersive world, an equirectangular map will usually be displayed differently than it would in Photoshop or any image preview app.
It will be rotated 90 degrees to the right (for example, if you are using an 8192x4096 image where the center of your image is at pixel 4096, in the immersive world the center will be at the 6144-pixel position, which is 90 degrees to the right).
The skybox Rotation parameters: 0.25 rotates the image by 2048 pixels (90 degrees), 0.5 by 4096 pixels (180 degrees), 0.75 by 6144 pixels (270 degrees) and 1 by 8192 pixels (360 degrees).
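The relationship between the Rotation parameter, the rotation in degrees and the pixel offset is linear. A small helper makes this concrete (the function name is illustrative, not part of the shader script):

```javascript
// Convert a skybox Rotation parameter (0..1) into degrees and a pixel
// offset for a given equirectangular map width. Illustrative helper only.
function rotationOffset(rotation, mapWidthPx) {
  return {
    degrees: rotation * 360,
    pixels: rotation * mapWidthPx,
  };
}

// For an 8K (8192x4096) equirectangular map:
console.log(rotationOffset(0.25, 8192)); // { degrees: 90, pixels: 2048 }
console.log(rotationOffset(0.5, 8192));  // { degrees: 180, pixels: 4096 }
console.log(rotationOffset(0.75, 8192)); // { degrees: 270, pixels: 6144 }
```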
This is an upgraded version of the Equiprojection Shader mentioned above. The shader has an additive layer called "Equirectangular Map Stars" and a Distortion Map (Noise Map).
You may review the shader in our ShadeArt gallery in VIVERSE (see: skybox Super Star, Space Nebula and Galactic Abyss).
Import a sphere with inverted normals/faces into your Play Canvas project.
We recommend using the sphere model-template from our shared project (see: ScenesAssets > Models > Common > ProjSphere1x1_64x32_Inv).
The sphere is 1 m in diameter. Scale it to 100 m or any other size that will fit your environment. Scaling the mesh up or down will not affect the skybox image resolution.
Place the sphere in the Hierarchy.
The Equirectangular Map is an image with a ratio of 2x1. A 4K image size is 4096x2048 pixels, while 8K is 8192x4096 pixels. In our sample project we use 8K images.
Upload an equirectangular map into your project.
Assign the equirectangular map to the shader script in the Equirectangular Map section.
Notes : Make sure to uncheck the Mipmaps in the texture settings. You do not need to generate a Play Canvas cube map, simply link your Equirectangular Map to the shader.
This is where you assign an additive equirectangular map (see additive map example in: ScenesAssets > Textures_Equirectangular > SkyBox_Stars)
It can be an image with stars on a black background or any other graphics, such as a logo (make sure the black is set to R=0 G=0 B=0).
Assign the map in the Equirectangular Map Stars section.
Choose your preferred distortion tiling.
The noise map tiling will not affect the equirectangular tiling.
This factor will affect the speed of the distortion created by the Noise Map.
If you wish to rotate the skybox image (the equirectangular map) remember, rotating the mesh itself will not affect the image rotation.
In order to rotate the image, use the Rotation parameters in the script settings.
Notes:
In the immersive world, an equirectangular map will usually be displayed differently than it would display in Photoshop or any image preview app.
It will be rotated 90 degrees to the right (for example, if you are using an 8192x4096 image where the center of your image is at pixel 4096, in the immersive world the center will be at the 6144-pixel position, which is 90 degrees to the right).
The skybox Rotation parameters: 0.25 rotates the image by 2048 pixels (90 degrees), 0.5 by 4096 pixels (180 degrees), 0.75 by 6144 pixels (270 degrees) and 1 by 8192 pixels (360 degrees).
This factor will affect the intensity of the distortion effect.
This is a projection shader for seamless textures with a noise movement factor and a background equirectangular map. You may review the shader in our ShadeArt gallery in VIVERSE (see: Equiprojection UV distort samples in the tunnels).
The shader may be applied to any mesh with inverted normals/faces: a floor, a wall, a sphere or any other model. This shader reacts to the mesh UV. It includes a Distortion Map that allows you to generate movement of the image.
Import any model with inverted normals/faces into your Play Canvas project. Make sure the mesh has correct UV mapping.
Feel free to experiment with various UV mappings (an interesting effect can be achieved when the mesh uses Follow Active Quads mapping).
Place the mesh in the Hierarchy.
This is where you assign an equirectangular map that will serve as the background image (see additive map example in: ScenesAssets > Textures_Equirectangular > SkyBox_Stars).
Upload an equirectangular image to your project and assign it to the shader script.
Upload a seamless texture with a ratio of 1x1 (1K is 1024x1024 pixels, 2K is 2048x2048 pixels, 4K is 4096x4096 pixels).
The images presented in our shared project and in the ShadeArt gallery in VIVERSE are mostly AI generated with some Photoshop post-production. You may use your own image as long as it is a seamless texture.
Apply the texture to the shader script in the Additive Map section.
Notes : Make sure to uncheck the Mipmaps in the texture settings.
This map will affect the distortion effect.
In the examples seen in our shared project and in the ShadeArt gallery in VIVERSE, the Distortion Map is the same texture used as the Additive Map, with or without a Photoshop effect (blur and B&W filters). See distortion map examples in: ScenesAssets > TexturesTilable.
Apply the Distortion Map to the shader script in the section: Distortion Map.
Notes : Make sure to uncheck the Mipmaps in the texture settings.
Choose your preferred distortion tiling.
The noise map tiling will not affect the equirectangular tiling, yet it will react to the mesh UV mapping.
This factor will affect the speed of the distortion created by the Distortion Map. It may be set to positive or negative values (for example: 0.05 or -0.05).
This is a custom-made water shader that reacts to the mesh UV and may be applied to any mesh. You may review the shader in our ShadeArt gallery in VIVERSE (see: CyberCity). It includes a Normal Map that allows you to generate the movement of the water surface (waves).
Import any model into your Play Canvas project. Make sure the mesh has correct UV mapping, since the normal map applied to the shader will react to the UV mapping.
For a simple water surface use a circle shaped plane with Project From View (fit to bounds) UV mapping.
You may use the circle-plane model-template from our shared project (see: ScenesAssets > Models > Common > CirclePlane1x1m128). The circle plane is 1 m in diameter. Scale it to any size that will fit your environment.
Scaling the mesh up or down will affect the normal map tiling.
Place the model Template in the Hierarchy.
Just like in reality, water without a reflection will not be visible. To achieve the water effect, you need to apply a Cube Map in the Cube Map section.
Upload a Cube Map into your project and assign it to the shader script.
Creating a Cube Map from an Equirectangular Map: If you wish to create a Cube Map from the same Equirectangular Map you use in the skybox shader, you will need to generate a Cube Map from it. To do so, use the Play Canvas texture tool:
https://playcanvas.com/texture-tool
Choose your preferred distortion tiling.
This factor will affect the intensity of the distortion effect.
The Base Colour serves as a colour filter. Complete black will make the water look highly reflective.
This factor will affect the speed of the distortion effect created by the Normal Map.
This factor will influence the intensity of the reflection onto the water surface.
Set this factor to achieve more or less visible reflection.
Set this factor to achieve more or less visible water surface.
Nirvana Grove is a tranquil bamboo forest designed for maximum well-being, developed with Unity 6 for the VIVERSE platform by Thorium Labs.
Nirvana Grove is a bamboo forest developed with Unity 6. The Terrain tool was used to create the landscape; we carefully chose the vegetation, composed of bamboo, grass, ferns and flowers, the right textures, and a set of Japanese decorative elements to compose the environment.
Nirvana Grove can be experienced here:
An overview of this project can be found here: {Link TBD}
Mountainous terrain, crossed by trails and a large lake.
Lush and varied vegetation.
A soundtrack composed of three nature sound effects and a relaxing song.
Japanese decorative elements with special meanings
We used Unity's Terrain tool to build the landscape: mountainous terrain intersected by paths, with a huge lake as the backdrop.
The terrain is 1,000 by 1,000 meters and reaches 600 meters in height. Here is an overview of the terrain and the main scenery:
We used only four textures to represent every detail of the terrain, as follows:
The vegetation plays the most prominent role in the space, but even so, we paid close attention to performance and minimized the use of prefabs. The plant textures play a crucial role, since avatars can also walk among them.
The soundtrack is composed of three special nature effects and a calming song to compose the landscape and immerse the avatar.
The decoration is another important part of the space, as it makes the user feel like they are in a Japanese garden.
These are the final elements that complete the scene. They create the dramatic backdrop for an even more immersive experience.
Here are the next steps to get your space published on VIVERSE.
This document provides a guide for setting up no-code events in custom scripts.
The VIVERSE no-code events can be accessed through custom scripts. The events can be found in the root directory of your PlayCanvas project in the @viverse/create-extensions-sdk.mjs file. Here's a look at the file.
Calling custom code from script when the player clicks on an object
Let's say we want to give our players the ability to click on an object and when that object is clicked, we want to call a method from a custom script. This is a good use-case for the NotificationCenterSubscribeEntityPicking trigger. In this example, when the user clicks on the specific object, a door opens.
Calling custom code from script when the player clicks on an object and making the script more flexible
In the previous example, we created a custom script called ClickableObject.mjs. The script added functionality that rotates a door whenever a specific object is clicked. Let's say there was another scenario where we wanted a second object that the user could click on, but we wanted to fire a different event than the rotateDoor method. We could easily copy our ClickableObject.mjs script, give it a different name and change the rotateDoor call inside the script. The downside is that we would then have two scripts to maintain. As an alternative, we can remove the rotateDoor event call, add an attribute to our script for the event name and then set different event names through the PlayCanvas editor.
Calling custom code from script when a player enters a trigger area
Another common scenario that creators may face is firing an event when a player enters a trigger area. This is a good use-case for the EntitySubscribeTriggerEnter trigger. In this example, when the user enters a trigger area, a door opens.
Calling custom code from script when an action is executed
Let's say we want to give our players the ability to click on an object and when that object is clicked, we want to call a method from a custom script. This is a good use-case for the NotificationCenterSubscribeEntityPicking trigger. In this example, when the user clicks on the specific object, a door opens.
This document provides several guides that can be used to set up event listeners in a VIVERSE project. The event listeners use triggers. Triggers can be configured to perform actions once activated.
Create A Trigger Based On When An Object Enters An Area
This guide provides instructions for setting up the EntitySubscribeTriggerEnter trigger. In the sample app, the trigger areas are outlined in blue. Once the avatar enters into the blue area, an action occurs.
In this example, a trigger area is created and when the avatar enters the trigger area, an action is initiated. Any object can be used as a trigger, as long as the object has a collision component. This example uses a 3D box as the trigger area.
Create A Trigger Based On When An Object Leaves An Area
This guide provides instructions for setting up the EntitySubscribeTriggerLeave trigger. In the sample app, the trigger areas are outlined in blue. Once the avatar leaves the blue area, an action occurs.
In this example, a trigger area is created and when the avatar leaves the trigger area, an action is initiated. Any object can be used as a trigger, as long as the object has a collision component. This example uses a 3D box as the trigger area.
A. Create a new 3D Box entity.
B. Add a Collision component.
C. Adding a material is optional. A transparent material has been added so that the trigger area is visible in play mode.
D. Click the Edit Viverse Extension button.
Create A Trigger Based On When An Object Collides With Another Object
This guide provides instructions for setting up the EntitySubscribeCollisionStart trigger. In the sample app, the golden_coin is a trigger. Once the avatar collides with the golden_coin, an action occurs.
In this example, a trigger is created and when the avatar collides with the trigger, an action is initiated. Any object can be used as a trigger, as long as the object has a collision component. Because this specific example uses a 3D object that the avatar can collide with, a RigidBody component needs to be added. In this example, the object golden_coin is used, but a simple cube will suffice.
Create A Trigger Based On When An Object Stops Colliding With Another Object
This guide provides instructions for setting up the EntitySubscribeCollisionEnd trigger. In the sample app, a flattened 3D cube is a trigger. Once the avatar stops colliding with the cube, an action occurs.
In this example, a trigger is created and when the avatar or other objects collide with the trigger, an action is initiated. Any object can be used as a trigger, as long as the object has a collision component. Because this specific example uses a 3D object that the avatar can collide with, a RigidBody component needs to be added.
Create A Trigger Based On When A User Clicks An Object
This guide provides instructions for setting up the NotificationCenterSubscribeEntityPicking trigger. In the sample app, the user clicks on the buttons to show and hide the whale.
In this example, a trigger area is created on a button. When the user clicks on the trigger area, an action causes the 3D model to be disabled. Any object can be used as a trigger, as long as the object has a collision component.
An Action And Trigger Combination That Allows An Object To Send A Notification And Another Object Receives The Notification
This guide provides instructions for setting up the NotificationCenterPublish action and the NotificationCenterSubscribe trigger. In the sample app, every time the avatar enters the green trigger area, a notification is sent to the wall and the collider on the wall is toggled on/off. The wall is green when the avatar can pass through and red when the avatar can not pass through.
In this example, a trigger is created and when the avatar enters the trigger area, an action sends a notification to another object. The other object receives the notification via trigger, then initiates an action.
A. Add a 3D object to the scene.
B. The Collision component is not required for the NotificationCenterPublish action or the NotificationCenterSubscribe trigger to work. The Collision component is required for the EntitySubscribeTriggerEnter trigger that will be used in this example.
C. Click the Edit Viverse Extension button.
This document provides several guides that can be used to show and hide objects in a VIVERSE project. These actions can be configured to execute when triggers are activated.
import * as THREE from 'three';
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera( 75, window.innerWidth / window.innerHeight, 0.1, 1000 );
const renderer = new THREE.WebGLRenderer();
renderer.setSize( window.innerWidth, window.innerHeight );
renderer.setAnimationLoop( animate );
document.body.appendChild( renderer.domElement );
const geometry = new THREE.BoxGeometry( 1, 1, 1 );
const material = new THREE.MeshBasicMaterial( { color: 0x00ff00 } );
const cube = new THREE.Mesh( geometry, material );
scene.add( cube );
camera.position.z = 5;
function animate() {
cube.rotation.x += 0.01;
cube.rotation.y += 0.01;
renderer.render( scene, camera );
}
{
"scripts": {
"dev": "vite",
"build": "vite build"
},
"dependencies": {
"three": "^0.175.0"
},
"devDependencies": {
"vite": "^6.3.2"
}
}
import { defineConfig } from 'vite';
import path from 'path';
// https://vite.dev/config/
export default defineConfig({
base: './', // Use relative path as base URL
});
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="utf-8">
<title>My first three.js app</title>
<style>
body { margin: 0; }
</style>
</head>
<body>
<script type="module" src="/main.js"></script>
</body>
</html>
function getActiveCameraPosition(app) {
var cameraNames = ['CAMERA_AUTO_ROTATE', 'CAMERA_1ST', 'CAMERA_OVERLAP', 'CAMERA_VR'];
var camera = null;
for (var i = 0; i < cameraNames.length; i++) {
camera = app.root.findByName(cameraNames[i]);
if (camera && camera.enabled) {
var pos = camera.getPosition();
return [pos.x, pos.y, pos.z];
}
}
return [0, 0, 0];
}
export const TriggerTypes = {
Base: 'trigger:0',
Echo: 'trigger:1',
PCAppEventSubscribe: 'trigger:100001',
NotificationCenterSubscribe: 'trigger:200001',
NotificationCenterSubscribeEntityPicking: 'trigger:200002',
TheatreJSSubscribe: 'trigger:210001',
TheatreJSSubscribeSheetEnd: 'trigger:210002',
EntitySubscribeAnimationEvent: 'trigger:300001',
EntitySubscribeAnimationStart: 'trigger:300002',
EntitySubscribeAnimationEnd: 'trigger:300003',
EntitySubscribeCollisionStart: 'trigger:300004',
EntitySubscribeCollisionEnd: 'trigger:300005',
EntitySubscribeTriggerEnter: 'trigger:300006',
EntitySubscribeTriggerLeave: 'trigger:300007',
SitInSeat: 'trigger:300008',
SharePhoto: 'trigger:400001',
EnterAnyWorld: 'trigger:400002',
EnterMyWorld: 'trigger:400003'
}
export const ActionTypes = {
Base: 'action:0',
Echo: 'action:1',
PCAppEventPublish: 'action:100001',
NotificationCenterPublish: 'action:200001',
TheatreJSPublish: 'action:210001',
TheatreJSPlaySheet: 'action:210002',
EntityRigidbodyAddForceInPhysics: 'action:300001',
EntityPlayAnimation: 'action:300002',
EntityEnable: 'action:300003',
EntityDisable: 'action:300004',
EntityToggleEnabled: 'action:300005',
EntityFadeIn: 'action:300006',
EntityFadeOut: 'action:300007',
EntityPlaySound: 'action:300008',
EntityEnableCollision: 'action:300009',
EntityDisableCollision: 'action:300010',
EntityToggleCollision: 'action:300011',
InworldNpcWaitingSpeak: 'action:300012',
InworldNpcStopWaitingSpeak: 'action:300013',
EntityEnableByTag: 'action:300014',
EntityDisableByTag: 'action:300015',
EntityCheckPoint: 'action:300016',
EntityEnableById: 'action:300017',
EntityDisableById: 'action:300018',
EntityStopSound: 'action:3000019',
EntityAssetUnload: 'action:3000020',
EntityAssetReload: 'action:3000021',
EntityDestroy: 'action:3000022',
ParticleSystemPlay: 'action:3000023',
Quest: 'action:400001',
TeleportAvatar: 'action:400002',
AssignUserAsset: 'action:400003',
Vote: 'action:400004',
ShowToastMessage: 'action:400005',
TaskComplete: 'action:400006'
}
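When debugging, it can be handy to map a raw trigger or action id back to its exported name. The sketch below uses a small inline subset of the table so it is self-contained; in a project you would import the full TriggerTypes and ActionTypes from @viverse/create-extensions-sdk.mjs instead. The helper name is illustrative.

```javascript
// A short inline subset of the ActionTypes table above, so this example
// runs on its own; use the real exports from the SDK file in a project.
const ActionTypesSubset = {
  EntityPlayAnimation: 'action:300002',
  EntityPlaySound: 'action:300008',
  EntityEnableById: 'action:300017',
};

// Reverse lookup: find the exported name for a raw id string.
function nameForId(table, id) {
  const hit = Object.entries(table).find(([, value]) => value === id);
  return hit ? hit[0] : null;
}

console.log(nameForId(ActionTypesSubset, 'action:300008')); // EntityPlaySound
console.log(nameForId(ActionTypesSubset, 'action:999999')); // null
```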














































Fern 1
Texture
The ferns are randomly distributed across the ground.
Fern 2
Texture
The ferns are randomly distributed across the ground.
Fern 3
Texture
The ferns are randomly distributed across the ground.
Flower
Texture
Used to create highlight areas, breaking up the sequence of green areas with its white tones.
Bamboo Leaf
Texture
Used in the particle systems to simulate falling bamboo leaves, working in harmony with the wind sound effect.
Stairs and Walls
These elements are purely decorative and help guide the avatar through the scene. They have no special meaning beyond the structural one.
Soil
This is where you will find all of the terrain's plants.
Clay
Used to separate the planted area from the paving.
Pavement
The path area used by avatars for walking.
Sand
Used for the bottom of the bodies of water. This area cannot be accessed by avatars; it is only seen through the water shader.
Bamboo 1
Prefab
Used mainly in the boundary area between the soil and the pavement to create the "tunnel" effect
Bamboo 2
Prefab
The central element of the bamboo forest, covering all of the main bamboo groves
Grass
Texture
Cosmic Breaths of the Universe
Music
A low-frequency song that inspires relaxation and calm.
Birds
Sound Effect
Various birds singing.
Wind
Sound Effect
Wind blowing over the forest, gently swaying the leaves.
Water
Sound Effect
A gentle flow of water.
Torii Gate
A Torii gate is a traditional Japanese gate, most commonly found at the entrance of or within a Shinto shrine, marking the transition from the mundane to the sacred. It symbolizes the boundary between the human world and the world of the kami (spirits or deities). Passing through a torii means entering a sacred space.
Dog Statues
In Japanese gardens, dog statues, often called Komainu, symbolize protection and serve to ward off evil spirits. They are typically placed in pairs at the entrance of shrines, temples and sometimes even homes, acting as guardians. One statue has its mouth open (A-gyo), representing the beginning, while the other has its mouth closed (Un-gyo), representing the end, together symbolizing the totality of existence.
Shrine
In a Japanese garden, a small shrine (or hokora) is a miniature representation of a Shinto shrine, serving as a sacred space to venerate kami, spirits or deities associated with nature, ancestors or historical figures. These shrines are often integrated into the garden's design to connect the physical space with the spiritual realm and promote a sense of harmony and respect for nature.
Lanterns
Japanese lanterns, or tōrō, are more than mere decorative elements; they symbolize light and hope and are deeply connected to Japanese culture and spirituality. They often appear in gardens, temples and during festivals, guiding spirits and creating a serene atmosphere.
Directional Light
This is the main light source in the scene and the one that creates the whole sunset atmosphere. It is aligned with the sun in the Skybox.
Directional Light
This is the second light source in the scene and helps create additional lighting and shadows against the main light source.
Skybox
A sunset Skybox with a distant moon and stars.
Cloud System
One of my favorite items for composing a scene. It has two cloud levels, low and high, with separate settings such as shape, density, amount, speed and color.



The main ground element, filling the gaps between the bamboo, ferns, and flowers
A. Add the DoorRotator:rotateDoor text to the Event To Fire attribute text field.
B. If we added ClickableObject.mjs to a 2nd object and wanted to fire a different method, it could look like the following screenshot.
A. Create the following script and name it DisabledObject.mjs.
B. Add the DisabledObject.mjs script to the trigger area that the player will walk through. In this example, the DisabledObject.mjs script was added to the trigger area from the previous example. Be sure to disable or remove the TriggerArea.mjs script from the previous example.



A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select EntitySubscribeTriggerEnter.
C. Add local-player to the tags to filter field.
D. Add an Action and select EntityEnableById.
E. Add an object that will be enabled when the avatar enters the trigger area. The golden_coin entity has been added to the pick up specify execution entity. The golden_coin entity has also been disabled by default.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select EntitySubscribeTriggerLeave.
C. Add local-player to the tags to filter field.
D. Add an Action and select EntityDisableById.
E. Add an object that will be disabled when the avatar leaves the trigger area. The golden_coin entity has been added to the pick up specify execution entity.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select NotificationCenterSubscribeEntityPicking.
C. Add an Action and select EntityDisable. Add an object that will be disabled.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select EntitySubscribeTriggerEnter.
C. Add local-player to the tags to filter field.
D. Add an Action and select NotificationCenterPublish.
E. Create a unique notification name and add it to the notification name to publish field. In this example, the notification is called ToggleWall.
A. Add a 3D object to the scene.
B. The Collision component is not required for NotificationCenterPublish action or NotificationCenterSubscribe trigger to work. The Collision component is required for the EntityToggleCollision action that will be used in this example.
C. The Rigidbody component is not required for the NotificationCenterPublish action or NotificationCenterSubscribe trigger to work. The Rigidbody component is required for the EntityToggleCollision action that will be used in this example.
D. Click the Edit Viverse Extension button.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select NotificationCenterSubscribe.
C. The same text that was added to the notification name to publish needs to be added to the notification name to subscribe.
D. Add an Action and select EntityToggleCollision.
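The reason step C matters is that the publisher and subscriber are matched purely by name. The relationship can be sketched with a stand-in notification center (a minimal illustration only, not the VIVERSE TriggerAndAction internals):

```javascript
// Stand-in notification center showing why the publish and subscribe names
// must match exactly. Illustrative only -- not the VIVERSE implementation.
class NotificationCenter {
  constructor() { this.subscribers = {}; }
  subscribe(name, fn) { (this.subscribers[name] = this.subscribers[name] || []).push(fn); }
  publish(name) { (this.subscribers[name] || []).forEach(fn => fn()); }
}

const center = new NotificationCenter();
const wall = { collisionEnabled: true };

// The wall subscribes under the name "ToggleWall"...
center.subscribe('ToggleWall', () => { wall.collisionEnabled = !wall.collisionEnabled; });

// ...so the trigger area must publish under exactly the same name.
center.publish('ToggleWall');
console.log(wall.collisionEnabled); // false -- the wall's collider is now disabled
```

If the published name differs from the subscribed name (even by case), no subscriber is found and the action never fires.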
The avatar is outside the blue trigger area and the gold coin is not visible.
The avatar has entered the blue trigger area and the gold coin is visible.
The avatar is inside the blue trigger area and the gold coin is visible.
The avatar leaves the blue trigger area and the gold coin is no longer visible.
The avatar approaches the gold coin.
Once the avatar collides with the gold coin, the gold coin is removed.
The avatar is colliding with the red platform.
Once the avatar stops colliding with the red platform, the red platform begins to fade out.
The whale is visible.
When the user clicks on the whale, the whale is disabled.
Before the avatar enters the green trigger area, the wall is red and the avatar cannot pass through the wall.
Once the avatar enters the green trigger area, the green trigger sends a notification to the wall and the collider on the wall is disabled.

PlayCanvas is an open-source game engine that we have implemented in the frontend of VIVERSE. Creators will use the PlayCanvas Editor to publish fully-functional worlds to VIVERSE. PlayCanvas has options for both free and paid accounts. Paid accounts allow for extra storage and private projects.
Be sure to use the same email address to create your PlayCanvas account as you used for your VIVERSE account.
Navigate to PlayCanvas.com and click the Sign Up button to begin account creation.
This guide walks through adding the PlayCanvas Extension to the Chrome browser, which enables creating and publishing scenes from PlayCanvas projects to VIVERSE.
Download the latest version of the Playcanvas Extension.
At times, issues in the installation process can produce persistent bugs that are only fixed by fully resetting the VIVERSE extension.
Go back to your PlayCanvas project and delete:
a. The Extension Entity from your scene hierarchy
b. The @viverse folder from your project assets
c. The extension-image folder from your project assets
d. The extension-script folder from your project assets
8/15/2025
3.57.1
Adds support for publishing PlayCanvas projects that utilize multiple scenes, as well as an IWorldNavigationService to the Create SDK API to allow for programmatic scene switching.
Adds the ability to set nearClip and farClip camera configuration:
8/5/2025
3.55.0
Quest Celebration Event Bug: Fix the issue where quests with celebrations configured are not triggered as completed after the user finishes the task.
Sync the Trigger Fix: Fix the issue where "Sync the trigger" in Trigger and Action is not working.
7/18/2025
3.52.0
Flying in VR: switch to Smooth Locomotion using the XRService, then in any World where flying is enabled in World Settings, press down/click the Smooth Locomotion joystick to enter flight mode. Once flying, pressing forward on the joystick will fly forward along the VR camera's forward axis (i.e. wherever you're looking), and vice versa backwards.
The Enter VR button has been made more reliable.
7/7/2025
3.51.0
We created a PlayCanvas sample project based on PlayCanvas extension 3.40.7, which contains several demos that you can try on your own.
Now that you have successfully set up your PlayCanvas account and installed the VIVERSE PlayCanvas extension, see the first steps of creating a project here.
The avatar is not colliding with the gold coin.
Once the avatar collides with the gold coin, the gold coin is disabled.
In this example, a trigger is created and when triggered, an action disables an object.
A. Add a 3D object to the scene.
B. The Collision component is not required for EntityDisable action to work. The Collision component is required for the EntitySubscribeCollisionStart trigger that will be used in this example.
C. The Rigidbody component is not required for the EntityDisable action to work. The Rigidbody component is required for the EntitySubscribeCollisionStart trigger that will be used in this example.
D. Click the Edit Viverse Extension button.
Create An Action That Enables A Single Object
This guide provides instructions for setting up the EntityEnableById action. In the sample app, once the avatar enters the trigger area, the gold coin is enabled.
When the avatar is outside of the blue trigger area, the gold coin is disabled.
When the avatar enters the blue trigger area, the gold coin is enabled.
In this example, a trigger is created and when triggered, an action enables an object. Any object can be used as a trigger, as long as the object has a collision component. This example uses a 3D box as the trigger area.
A. Create a new 3D Box entity.
B. The Collision component is not required for the EntityEnableById action to work. The Collision component is required for the EntitySubscribeTriggerEnter trigger that will be used in this example.
C. Adding a material is optional. A transparent material has been added so that the trigger area is visible in play mode.
D. Click the Edit Viverse Extension button.
Create An Action That Disables A Single Object
This guide provides instructions for setting up the EntityDisableById action. In the sample app, once the avatar leaves the trigger area, the gold coin is disabled.
When the avatar is inside the blue trigger area, the gold coin is enabled.
When the avatar leaves the blue trigger area, the gold coin is disabled.
In this example, a trigger is created and when triggered, an action disables an object. Any object can be used as a trigger, as long as the object has a collision component. This example uses a 3D box as the trigger area.
A. Create a new 3D Box entity.
B. The Collision component is not required for EntityDisableById action to work. The Collision component is required for the EntitySubscribeTriggerLeave trigger that will be used in this example.
C. Adding a material is optional. A transparent material has been added so that the trigger area is visible in play mode.
Add the EntityDisableById action
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select EntitySubscribeTriggerLeave.
C. Add local-player to the tags to filter field.
D. Add an Action and select EntityDisableById.
E. Add an object that will be disabled when the avatar leaves the trigger area. The golden_coin entity has been added to the pick up specify execution entity. The golden_coin entity has also been disabled by default.
Create An Action That Enables Multiple Objects
This guide provides instructions for setting up the EntityEnableByTag action. In the sample app, once the avatar enters the trigger area, multiple balls are enabled.
When the avatar is outside the blue trigger area, the balls are disabled.
When the avatar enters the blue trigger area, the balls are enabled.
In this example, a trigger is created and when triggered, an action enables multiple objects. Any object can be used as a trigger, as long as the object has a collision component. This example uses a 3D box as the trigger area.
A. Create a new 3D Box entity.
B. The Collision component is not required for EntityEnableByTag action to work. The Collision component is required for the EntitySubscribeTriggerEnter trigger that will be used in this example.
C. Adding a material is optional. A transparent material has been added so that the trigger area is visible in play mode.
D. Click the Edit Viverse Extension button.
Create An Action That Disables Multiple Objects
This guide provides instructions for setting up the EntityDisableByTag action. In the sample app, once the avatar leaves the trigger area, multiple balls are disabled.
When the avatar enters the blue trigger area, the balls are enabled.
When the avatar leaves the blue trigger area, the balls are disabled.
In this example, a trigger is created and when triggered, an action disables multiple objects. Any object can be used as a trigger, as long as the object has a collision component. This example uses a 3D box as the trigger area.
A. Create a new 3D Box entity.
B. The Collision component is not required for EntityDisableByTag action to work. The Collision component is required for the EntitySubscribeTriggerLeave trigger that will be used in this example.
C. Adding a material is optional. A transparent material has been added so that the trigger area is visible in play mode.
Click the Edit Viverse Extension button.
Create An Action That Fades In An Object
This guide provides instructions for setting up the EntityFadeIn action. In the sample app, once the avatar starts colliding with the red platform, the red platform will fade in.
The red platform has been faded out and is invisible, but the avatar can still collide with it.
Once the avatar begins colliding with the invisible red platform, the red platform fades in and is now visible.
In this example, a trigger is created and when triggered, an action fades in the object it has been added to. Any object can be used as a trigger, as long as the object has a collision component. This example uses a 3D box as the trigger area.
A. Add a 3D object to the scene.
B. A Collision component is not required for the EntityFadeIn action to work. The Collision component is required for the EntitySubscribeCollisionStart trigger that will be used in this example.
C. A Rigidbody component is not required for the EntityFadeIn action to work. The Rigidbody component is required for the EntitySubscribeCollisionStart trigger that will be used in this example.
D. Click the Edit Viverse Extension button.
EntityFadeOut - Create An Action That Fades Out An Object
This guide provides instructions for setting up the EntityFadeOut action. In the sample app, once the avatar stops colliding with the red platform, the red platform will fade out.
The avatar is colliding with the red platform.
Once the avatar stops colliding with the red platform, the red platform begins to fade out.
In this example, a trigger is created and when triggered, an action fades out an object. Any object can be used as a trigger, as long as the object has a collision component.
A. Add a 3D object to the scene.
B. A Collision component is not required for the EntityFadeOut action to work. The Collision component is required for the EntitySubscribeCollisionEnd trigger that will be used in this example.
C. A Rigidbody component is not required for the EntityFadeOut action to work. The Rigidbody component is required for the EntitySubscribeCollisionEnd trigger that will be used in this example.
D. Click the Edit Viverse Extension button.
Create An Action That Toggles An Object's Enabled State
This guide provides instructions for setting up the EntityToggleEnabled action. In the sample app, every time the avatar enters the blue trigger area, a notification is sent to the purple box to toggle its enabled state.
The purple box is enabled/visible before the avatar enters the blue trigger area.
Once the avatar enters the blue trigger area, a notification is sent to the purple box to toggle its enabled state. The purple box is disabled.
The avatar leaves the blue trigger area.
Once the avatar enters the blue trigger area again, a notification is sent to the purple box to toggle its enabled state. The purple box is enabled again.
In this example, a trigger is created and when the avatar or other objects enter the trigger area, an action sends a notification to toggle another object's enabled state. Any object can be used as a trigger, as long as the object has a collision component.
A. Create a new 3D Box entity.
B. A Collision component is not required for the EntityToggleEnabled action to work. The Collision component is required for the EntitySubscribeTriggerEnter trigger that will be used in this example.
C. Adding a material is optional. A transparent material has been added so that the trigger area is visible in play mode.
D. Click the Edit Viverse Extension button.
A. In the VIVERSE extension, select the TriggerAndAction plugin for the Select plugins dropdown.
B. Add a Trigger and select NotificationCenterSubscribe.
C. In the notification name to subscribe field, add the same name from the notification name to publish field.
This document provides an introduction to MJS (.mjs), also known as Modular JavaScript.
Modular JavaScript is an evolved form of JavaScript in which related functionality is kept in a single file or module, and that functionality is exposed where required using import and export statements.
Here's a comparison between MJS and CommonJS. The latter requires node.js, and relies on require() statements to import other scripts/modules:
MJS, on the other hand, can use import statements like other languages.
import { Script } from 'playcanvas';
import { TriggerTypes } from '../@viverse/create-extensions-sdk.mjs';
export class ClickableObject extends Script {
/**
* Called when the script is about to run for the first time.
*/
initialize() {
const event = TriggerTypes.NotificationCenterSubscribeEntityPicking;
this.entity.on(event, () => {
this.objectClicked();
this.app.fire('DoorRotator:rotateDoor');
});
}
/**
* Called for enabled (running state) scripts on each tick.
*
* @param {number} dt - The delta time in seconds since the last frame.
*/
update(dt) {
}
objectClicked() {
console.log('Object Clicked!');
}
}

import { Script } from 'playcanvas';
export class DoorRotator extends Script {
/**
* Called when the script is about to run for the first time.
*/
initialize() {
// listen for the DoorRotate:rotateDoor event
this.app.on('DoorRotator:rotateDoor', this.rotateDoor);
}
/**
* Called for enabled (running state) scripts on each tick.
*
* @param {number} dt - The delta time in seconds since the last frame.
*/
update(dt) {
}
rotateDoor() {
console.log('Rotate Door!');
}
}

import { Script } from 'playcanvas';
import { TriggerTypes } from '../@viverse/create-extensions-sdk.mjs';
export class TriggerArea extends Script {
/**
* Called when the script is about to run for the first time.
*/
initialize() {
const event = TriggerTypes.EntitySubscribeTriggerEnter;
this.entity.on(event, () => {
this.triggerEntered();
this.app.fire('DoorRotator:rotateDoor');
});
}
/**
* Called for enabled (running state) scripts on each tick.
*
* @param {number} dt - The delta time in seconds since the last frame.
*/
update(dt) {
}
triggerEntered() {
console.log('Trigger Entered!');
}
}

import { Script } from 'playcanvas';
import { ActionTypes } from '../@viverse/create-extensions-sdk.mjs';
export class DisabledObject extends Script {
/**
* Called when the script is about to run for the first time.
*/
initialize() {
const event = ActionTypes.EntityDisable;
this.entity.on(event, () => {
this.objectDisabled();
this.app.fire('DoorRotator:rotateDoor');
});
}
/**
* Called for enabled (running state) scripts on each tick.
*
* @param {number} dt - The delta time in seconds since the last frame.
*/
update(dt) {
}
objectDisabled() {
console.log('Object Disabled!');
}
}

import { Script } from 'playcanvas';
import { TriggerTypes } from '../@viverse/create-extensions-sdk.mjs';
export class ClickableObject extends Script {
/**
* @attribute
* eventToFire
* @type { string }
* @title Event To Fire
*/
eventToFire
/**
* Called when the script is about to run for the first time.
*/
initialize() {
const event = TriggerTypes.NotificationCenterSubscribeEntityPicking;
this.entity.on(event, () => {
this.objectClicked();
this.app.fire(this.eventToFire);
});
}
/**
* Called for enabled (running state) scripts on each tick.
*
* @param {number} dt - The delta time in seconds since the last frame.
*/
update(dt) {
}
objectClicked() {
console.log('Object Clicked!');
}
}























































E. Add an object that will be enabled when the avatar enters the trigger area. The golden_coin entity has been added to the pick up specify execution entity. The golden_coin entity has also been disabled by default.
E. Create a unique Tag and add it to the enable entity with tag field. In this example, the Balls tag is added.
E. Create a unique Tag and add it to the disable entity with tag field. In this example, the Balls tag is added.
E. Create a unique name for the notification and add it to the notification name to publish field. In this example, the ToggleEnabled name is added.





























Further, MJS is now supported by PlayCanvas and the VIVERSE Create SDK, making script imports easier and more modern. Here's an example of a script called GameManager.mjs (critically, the file extension is .mjs, not just .js) created for PlayCanvas, extending its built-in Script class:
DoorRotator.mjs can be added to a PlayCanvas entity, allowing functions to be executed on that entity, such as opening a door. After adding additional code inside the rotateDoor function, this function can be called from another script to open the door.
TriggerArea.mjs - This script can be added to a trigger area entity. When another entity or the avatar enters the trigger area, the rotateDoor function is called and the door opens.
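The fire/listen pattern these two scripts use can be sketched with a minimal stand-in event bus. This is plain Node.js for illustration, not the PlayCanvas API itself:

```javascript
// Minimal stand-in for the app-wide event bus used by PlayCanvas scripts.
// Illustrative only -- PlayCanvas provides this via this.app.on / this.app.fire.
class EventBus {
  constructor() { this.handlers = {}; }
  on(name, fn) { (this.handlers[name] = this.handlers[name] || []).push(fn); }
  fire(name, ...args) { (this.handlers[name] || []).forEach(fn => fn(...args)); }
}

const app = new EventBus();
const log = [];

// DoorRotator-style listener: subscribes to the event in initialize()
app.on('DoorRotator:rotateDoor', () => log.push('Rotate Door!'));

// TriggerArea-style publisher: fires the event when the avatar enters the area
app.fire('DoorRotator:rotateDoor');

console.log(log); // [ 'Rotate Door!' ]
```

The key point is decoupling: TriggerArea never needs a direct reference to DoorRotator; it only needs to agree on the event name.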
The process for debugging MJS files is simple: add the debugger; statement to any of the methods.
With the project running in Chrome, pressing F12 brings up the Chrome DevTools window. Refresh the page with the DevTools window open and the project will pause at the debugger statement.
When testing your project in PlayCanvas Launch mode, the Chrome Dev Tools can also be opened by right-clicking on the screen and selecting Inspect.
Open up Chrome DevTools through browser menu
Keyboard shortcut for DevTools window
Breakpoints can also be set manually by clicking on a line number in the code margin. The line number is highlighted in blue when a breakpoint is set.
GameManager.mjs - GameManager script can be used to control the different game states of the project. This example script has multiple properties of different types added (including arrays), some with default values. In addition to the properties, there are multiple functions added that show how these properties can be utilized and their values are printed to the console.
Fixes most cases where the no-code extension was being lost when the PlayCanvas editor disconnects and reconnects to necessary backend services.
Also now displays a warning modal if the extension does enter a disconnected state.
XRService.start() now accepts a callback so code execution can be paused prior to successful VR entry.
6/23/2025
3.50.1
XRService now includes new locomotion options you can set per controller: smooth, teleport, and none
As a minor optimization, we now hide the VR cursor when its inputSource is lost
1st- and 3rd-person camera FOVs now settable in PlayCanvas:
5/23/2025
3.49.0
Bug fix: prevent duplicate injection of custom loading screen scripts during publishing.
Re-enables debugging in the browser by adding publishing modes: Debug mode is designed to assist in development and debugging processes, whereas Standard mode delivers an optimized, minified build intended for distribution.
4/29/2025
3.48.0
Add SDK support to functions
-Methods: start
-Methods: end
Re-enable debugging in the browser
-Align game_script.js behavior with esm.js to allow setting breakpoints
Enable custom loading screen
4/16/2025
3.46.1
Add 'Enable Flying' option in VIVERSE scene settings
Fixed turnToward method
4/9/2025
3.45.3
Add SDK support to functions
-Methods: addCheckTask
-Methods: addProgressBarTask
-Methods: fire
-Methods: getTaskById
-Methods: off
-Methods: on
-Methods: reset
-Methods: start
-Methods: addQuest
-Methods: fire
-Methods: getQuestById
-Methods: getQuestByName
-Methods: off
-Methods: on
-Methods: resetAllQuests
-Methods: fire
-Methods: off
-Methods: on
-Methods: addProgress
-Methods: fire
-Methods: off
-Methods: on
3/27/2025
3.45.0
Add SDK support to functions
-Methods: resetToViverseAvatar
-Properties: avatar
-Properties: network
-Properties: nametag
-Properties: profile
-Properties: localPlayer
-Properties: remotePlayers
-Properties: playerCount
Add enable/disable toggle of microphone and microphone permission function in “Player Config > Disable Microphone”
3/12/2025
3.44.12
Add SDK support to functions
-Properties: handedness
-Properties: inputSource
-Properties: modelEntity
-Methods: resetModelAsset
-Methods: setModelAsset
-Properties: add
-Properties: remove
-Properties: controllers
-Properties: controller:addInput
-Properties: controller:removeInput
-Properties: canRotate
-Properties: canZoom
-ILocalPlayer
-Methods: scaleAvatar
2/26/2025
3.44.7
Adding switchPov methods to change 1st & 3rd person POV through code
2/6/2025
3.43.1
Add 3 features in Trigger & Action - EntityAssetUnload, EntityAssetReload, EntityAssetDestroy
1/7/2025
3.40.7
Add post effects settings
Add scene ownership check on publish tab
12/3/2024
3.38.0
Handle response err message from PlayCanvas server job api
Fix service worker error
12/27/2024
3.36.4
Error handling when publishing the scene
10/21/2024
3.35.19
Support latest Chrome version











import { Script } from 'playcanvas';
import * as pc from 'playcanvas';
import { PlayerService } from '../@viverse/create-sdk.mjs'
export class LocalPlayerManager extends Script
{
initialize() {
this.playerService = new PlayerService();
//attach playerService for global access, reference this.app.playerServiceManager in other files
this.app.playerServiceManager = this;
//FOR ALL CUSTOMIZABLE PROPS AND METHODS, see: https://viveportsoftware.github.io/pc-lib/interfaces/ILocalPlayer.html
//enable flight
this.playerService.localPlayer.canFly = true;
//enable movement
this.playerService.localPlayer.canMove = true;
//hide avatar
this.playerService.localPlayer._entity.visibility = false;
}
update(dt)
{
}
}

import { Script } from 'playcanvas';
import * as pc from "playcanvas"
import { CameraService } from './@viverse/create-sdk.mjs'
/**
* The {@link https://api.playcanvas.com/classes/Engine.Script.html | Script} class is
* the base class for all PlayCanvas scripts. Learn more about writing scripts in the
* {@link https://developer.playcanvas.com/user-manual/scripting/ | scripting guide}.
*/
export class CameraServiceManager extends Script {
/**
* Called when the script is about to run for the first time.
*/
initialize() {
this.cameraService = new CameraService();
//FOR ALL CUSTOMIZABLE PROPS AND METHODS, see https://viveportsoftware.github.io/pc-lib/interfaces/ICameraService.html
//switch to 1st person pov
this.cameraService.switchPov(0);
//prevent pov switching
this.cameraService.canSwitchPov = false;
}
/**
* Called for enabled (running state) scripts on each tick.
*
* @param {number} dt - The delta time in seconds since the last frame.
*/
update(dt) {
}
}

// To include the File System module, use the require() method
const fs = require('fs');
// Using the File System module to read files
fs.readFile();
// To export the greet function so it can be used in other modules, use module.exports
module.exports = function greet(name) {
return `Hello, ${name}!`;
};

// To include the File System module, you no longer use the require() method.
// Use import instead.
import fs from 'fs';
// Using the File System module to read files
fs.readFile();
// To export the greet function so it can be used in other modules, you no longer
// use `module.exports`. Use `export` instead.
export const greet = (name) => {
return `Hello, ${name}!`;
};

import { Script } from 'playcanvas';
export class GameManager extends Script {
/**
* Called when the script is about to run for the first time.
*/
initialize() {
}
/**
* Called for enabled (running state) scripts on each tick.
*
* @param {number} dt - The delta time in seconds since the last frame.
*/
update(dt) {
}
}

import { Script } from 'playcanvas';
export class DoorRotator extends Script {
/**
* @attribute
* @type {number}
* @title First Number
*/
firstNumber = 10;
initialize() {
// Add the DoorRotator script to this.app
this.app.doorRotator = this;
}
rotateDoor() {
console.log('Rotate Door!');
}
}

var TriggerArea = pc.createScript('triggerArea');
TriggerArea.prototype.initialize = function() {
// Setup listening for the triggerenter event
this.entity.collision.on('triggerenter', this.onTriggerEnter, this);
};
// Handle the onTriggerEnter event
TriggerArea.prototype.onTriggerEnter = function(entity) {
// Calling function in DoorRotator.mjs
this.app.doorRotator.rotateDoor();
// Accessing property in DoorRotator.mjs
console.log("FirstNumber: " + this.app.doorRotator.firstNumber);
}

import { Script } from 'playcanvas';
export class Debug extends Script {
/**
* Called when the script is about to run for the first time.
*/
initialize() {
debugger;
console.log("Debug initialize!");
this.testFunction();
}
testFunction() {
debugger;
console.log('Test Function!');
}
}

import { Script } from 'playcanvas';
export class GameManager extends Script {
/**
* @attribute
* @type {number}
* @title First Number
*/
firstNumber
/**
* @attribute
* @type {number}
* @title Second Number
*/
secondNumber = 5;
/**
* @attribute
* @type {string}
* @title First String
*/
firstString
/**
* @attribute
* @type {string}
* @title Second String
*/
secondString = 'Default Text';
/**
* @attribute
* @type {boolean}
* @title First Boolean
*/
firstBoolean
/**
* @attribute
* @type {boolean}
* @title Second Boolean
*/
secondBoolean = false;
/**
* @attribute
* @type { pc.Entity }
* @title Entity
*/
entity
/**
* @attribute
* @type {number[]}
* @title Number Array
*/
numberArray
/**
* @attribute
* @type {string[]}
* @title String Array
*/
stringArray
/**
* @attribute
* @type {boolean[]}
* @title Boolean Array
*/
booleanArray
/**
* @attribute
* @type { pc.Entity[] }
* @title Entities
*/
entityArray
/**
* @attribute
* @type { CustomObject[] }
* @title Custom Objects Array
*/
customObjectArray
/**
* Called when the script is about to run for the first time.
*/
initialize() {
this.changeFirstNumber();
this.printValue("FirstNumber: " + this.firstNumber);
this.printValue("SecondNumber: " + this.secondNumber);
this.modifyFirstString();
this.printValue("FirstString: " + this.firstString);
this.printValue("SecondString: " + this.secondString);
this.modifyFirstBoolean();
this.printValue("FirstBoolean: " + this.firstBoolean);
this.printValue("SecondBoolean: " + this.secondBoolean);
this.printValue("Entity Name: " + this.entity.name);
this.printValue("Number Array: ");
this.printArray(this.numberArray);
this.printValue("String Array: ");
this.printArray(this.stringArray);
this.printValue("Boolean Array: ");
this.printArray(this.booleanArray);
this.printValue("Entity Array: ");
this.printEntityArray(this.entityArray);
this.printValue("CustomObjectArray: ");
this.printCustomObjectArray(this.customObjectArray);
}
/**
* Called for enabled (running state) scripts on each tick.
*
* @param {number} dt - The delta time in seconds since the last frame.
*/
update(dt) {
}
changeFirstNumber() {
this.firstNumber = 3;
}
modifyFirstString() {
this.firstString = "modified string";
}
modifyFirstBoolean() {
this.firstBoolean = true;
}
printValue(value) {
console.log(value);
}
printArray(array) {
for (var x = 0; x < array.length; x++) {
console.log(array[x]);
}
}
printEntityArray(array) {
for (var x = 0; x < array.length; x++) {
console.log(array[x].name);
}
}
printCustomObjectArray(array) {
for (var x = 0; x < array.length; x++) {
console.log(array[x].valueOf());
}
}
}
/** @interface */
class CustomObject {
/**
* @attribute
* @type { number }
* @title Number
*/
number
/**
* @attribute
* @type { pc.Entity[] }
* @title Entities Array
*/
entities
}
```



















This document provides a guide for creating a Pet Rescue replica project.
The Pet Rescue game is a treasure-hunt-style game in which the avatar is placed in a 3D environment and must search for multiple cats. The user navigates the world with their avatar and clicks on the cats to capture them. The game includes a quest system, scoreboard, timer and many other features that can be useful for creating games in VIVERSE. Read up on Pet Rescue here. Download the project and import the zip file into PlayCanvas.
Pet_Rescue_Sample_Scenes_1.0
Sample Project Version: 1.0
PlayCanvas Extension Version: 3.44.1
Pet_Rescue_Template_Project_1.0
The Pet Rescue Sample Scene Project includes 2 scenes.
Original_Scene - The original Pet Rescue, which can be found by searching in VIVERSE Worlds.
Tutorial_Scene - This scene contains a different iteration of Pet Rescue. It was created using the guides provided below.
Template_Scene - A stripped-down version of Pet Rescue. The 3D environment and colliders have been removed, while all of the core game mechanics have been left in the scene to help creators do what they do best, BUT FASTER! Read through the checklist below. The components listed as Required will need to be created and/or configured. The components marked as Optional are already added and configured in the project and only need to be modified if the creator wants to customize them.
A 3D environment needs to be added to the template project. The basic process is in the following guide, but the process can differ based on projects.
The spawn point is already added to the template project, but needs to be configured for a desired Position and Rotation. The following steps show how it was set up and configured.
A. In the Hierarchy, add a new entity.
B. The spawn point's name is arbitrary, but the spawn-point tag needs to be added.
C. Update the Position and Rotation so that the location is above the ground.
D. At this point, the game can be published to VIVERSE for testing, ensuring that the avatar can traverse the environment.
The cat groups are already added to the template project, but groups can be added or removed as needed. The following steps show how they were set up and configured.
The cat hiding positions are already added to the template project, but hiding positions can be added or removed as needed. Moving and rotating the cat hiding positions is recommended as opposed to moving and rotating the cat models.
The cat models are already added to the template project, but cat models can be added or removed as needed. The following steps show how they were set up and configured.
The Catbox is already added to the template project, but needs to be configured for a desired Position, Rotation and Scale. The following steps show how it was set up and configured.
The random waypoint assignment object is already added to the project and configured. It can be customized if cats or cat groups are added or removed. The following steps show how it was set up and configured.
The instructions board with start button object is already added to the project and configured. The following steps show how it was set up and configured.
The countdown user interface is already added to the project and configured. The following steps show how it was set up and configured.
The scoreboard user interface is already added to the project and configured. The following steps show how it was set up and configured.
The game over user interface is already added to the project and configured. The following steps show how it was set up and configured.
The GameManager object is already added to the template project and configured. The following steps show how it was set up and configured.
The quest system is already added to the template project and configured, but quest tasks can be modified, added or removed. The following steps show how it was set up and configured.
The animations are already added to the template project and configured. The following steps show how they were set up and configured.
The audio files are already added to the template project and configured, but the project can be customized to play different sounds and music. The following steps show how they were set up and configured.
The particle effects are already added to the template project and configured. The following steps show how they were set up and configured.
A. Locate the 3D model's Template file inside the folder that was generated when the 3D model was added to the Assets window.
B. Drag the Template file to the Hierarchy. In the sample project, lights were added to the model.
C. Update Position, Rotation and Scale of the model.
A. In the Hierarchy, create a new entity for the collider.
B. Add a Collision component.
C. Change the collision type to Mesh.
D. Add the Render file to the Render Asset.
E. Add a Rigidbody component.
F. Update the Position, Rotation and Scale in order to match up the collider with the model.
A. For each cat, drag the Template from the Assets window to a hiding position in the Hierarchy.
B. Select the cat in the Hierarchy.
C. Update the Scale for the cat model so the size is appropriate for the environment.
A. For each cat, add a NotificationCenterSubscribeEntityPicking trigger with throttle in ms set to 1600.
B. Add an EntityDisable action with delay in ms set to 2000.
C. Add an EntityEnableById action with the pick up specify execution entity set to the corresponding cat in the cat box. For cat 1, select cat_1_collect; for cat 2, select cat_2_collect; and so on.
A. Create a new entity called RandomWaypointAssignment and add the RandomWaypointAssignment script to it.
B. Place a checkmark on the DebugMode checkbox for testing.
C. Add the number of groups to the Array Size box. In the sample, there are 4 groups. Add each of the 4 groups to the Waypoint Group Entity boxes. Add the number of cats in each group to the Random Entity box. Add each cat from the Cats entity to the boxes below the Random Entity box.
A. Publish the project to VIVERSE to test the script. Click the Reset button to confirm the cats are being placed in the appropriate locations. Remove the checkmark placed on the DebugMode checkbox on the RandomWaypointAssignment script to prevent the dialog box from showing in-game.
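The RandomWaypointAssignment script's job amounts to pairing each cat with a random hiding position so that no two cats share one. The sketch below is a hypothetical illustration of that idea (a Fisher-Yates shuffle followed by a one-to-one pairing), not the actual RandomWaypointAssignment source:

```javascript
// Hypothetical sketch of random waypoint assignment: shuffle the
// waypoints (Fisher-Yates) and pair each cat with one, so no two cats
// share a hiding position. Not the actual RandomWaypointAssignment source.
function assignRandomWaypoints(cats, waypoints, rng = Math.random) {
    if (cats.length > waypoints.length) {
        throw new Error("Need at least as many waypoints as cats");
    }
    const shuffled = waypoints.slice();
    for (let i = shuffled.length - 1; i > 0; i--) {
        const j = Math.floor(rng() * (i + 1));
        [shuffled[i], shuffled[j]] = [shuffled[j], shuffled[i]];
    }
    return cats.map((cat, i) => ({ cat, waypoint: shuffled[i] }));
}

// Example: three cats distributed over four candidate positions in one group.
const pairs = assignRandomWaypoints(
    ["cat_1", "cat_2", "cat_3"],
    ["pos_a", "pos_b", "pos_c", "pos_d"]
);
console.log(pairs.map(p => `${p.cat} -> ${p.waypoint}`));
```

Because the shuffled list keeps every waypoint exactly once, the assignment never repeats a position, which matches the behavior tested with the DebugMode checkbox above.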
A. Under the GameInstructionsBoard entity, create a new button called StartButton. Remove the Text entity that is created under the button.
B. Update the Position, Rotation and Scale so the button fits appropriately on the board.
C. On the Button component, change Transition Mode to Sprite Change.
D. Add the 3 button sprites.
E. Add a Collision component and resize it using Half Extents.
Add the CountDown.mjs script to the project
A. Drag the CountDown.mjs script to the Assets window.
B. Select the CountDown.mjs script and click the Parse button.
A. Create a Text Element entity under the Scoreboard entity and name it CatCount.
B. Change the Position to (44, 0, 0) for X, Y and Z.
C. Update the Preset to Left Anchor & Pivot.
D. Add the SFProText-Regular.ttf font to the Font slot.
E. Change the Text field to 0/9.
F. Change Font Size and Line Height to 16.
G. Change the Color to FAF5CD.
A. Create a Text Element under the Stopwatch entity and name it Sec.
B. Update the Position to (35, 0, 0) for X, Y and Z.
C. Update the Preset to Left Anchor & Pivot.
D. Add SFProText-Bold.ttf font to the Font slot.
E. Change the Text field to 00.
F. Change Font Size and Line Height to 20.
G. Change the Color to FAF5CD.
A. Create a new Image Element entity under 2D Screen.
B. Change the Width to 240 and Height to 177.
C. Add the congrats-bg.png texture to the Texture slot.
A. Add a new Quest and name it Pet Rescue.
B. For Quest Description, add the text:
It's lunchtime, but the cats are still hiding around...Quickly find them, click them, and bring them back to their cat tree hideout.
C. Add a new Task and for the description, add the text: 9 cats, how many can you find?
D. Change the Task type to progressBar.
E. Update Progress Steps to 9.
A. For each cat, create a new entity under it. Rename it to cat1, cat2, etc.
B. Add the Sound component and change Max Distance to 50.
C. Rename Slot 1 to cat1, cat2, etc. Add the cat1.wav, cat2.wav, etc. to the Asset slot.
A. Drag Quest_Complete.mp3 to the Assets window.
B. Create a new Sound entity and name it QuestComplete.
C. Uncheck Positional and set Volume to 0.5.
D. Update the Name slot to celebrate, add the Quest_Complete.mp3 to the Asset slot, and change Volume to 0.8.
A. On the QuestComplete entity, add the NotificationCenterSubscribe trigger and set the notification name to subscribe to celebrate.
B. Add the EntityPlaySound action, set sound name to play to celebrate, and add the QuestComplete entity to the pick up specify execution entity.
Sample Project Version: 1.0
PlayCanvas Extension Version: 3.44.1















































































This project is an interactive, generative audiovisual maze experience created using PlayCanvas and Tone.js for the VIVERSE platform. It combines procedural maze generation, real-time 3D rendering, and layered generative audio. The maze acts as both a spatial and musical composition tool, where user exploration progressively reveals new musical layers.
This project was created by Enrique Garcia-Alcalá, digital artist, creative technologist, and professor of digital art and interactive media at Tecnológico de Monterrey in Mexico.
WITHIN can be experienced on VIVERSE here:
The WITHIN PlayCanvas Project can be found here: Link TBD
An overview of this project can be found here: Link TBD
- Procedural content generation
- Generative music with Tone.js
- Interactivity through object collection
- Use of template objects and tags for wall and object instantiation
Maze generation algorithms are techniques used to create connected, solvable labyrinths through procedural methods. They are widely applied in games, simulations, and generative art to construct spatial experiences that feel challenging, mysterious, or organic.
At their core, these algorithms decide how to link individual cells within a grid by removing walls between neighbors, which in turn defines the structure and rhythm of the maze.
Grid: A 2D matrix where each cell may connect to its adjacent neighbors.
Linking: The act of removing a wall between two adjacent cells to form a passage.
Bias: The directional tendency of an algorithm (e.g., favoring east or north links).
Define a grid: Create a 2D array of cells.
Select a starting point (often random).
Use a traversal algorithm to visit cells and remove walls between them.
Repeat until all cells are visited and connected.
This algorithm creates a uniform spanning tree by performing a random walk through the grid. It starts at a random cell and walks to a random neighbor. If that neighbor hasn’t been visited yet, it carves a passage. It continues until all cells have been visited and linked at least once.
Start with a random cell.
While there are unvisited cells:
Choose a random neighbor.
If that neighbor hasn’t been linked yet:
Link it.
Decrease the count of unvisited cells.
Move to the neighbor and repeat.
A fast, simple algorithm that connects each cell to either its north or east neighbor (if available). This creates a maze with a strong diagonal bias and clear, repetitive patterns.
For each cell in the grid:
Check if it has a north or east neighbor.
Randomly choose one of them (if any).
Link the current cell to that neighbor.
Each Cell represents a square in the maze grid. It knows its position (row, column), links to its neighbors (connections), and whether it has any special properties like an arch.
Constructor:
row, col: The cell’s position in the grid.
gridBuilder: A reference to the main grid manager, needed to coordinate things like placing arches.
links: A Map of other cells this cell is linked to (passages).
hasArch: A flag indicating whether an arch should be displayed in this cell.
The Grid class is responsible for managing the entire maze. It creates and organizes all cells, establishes their neighbor relationships, and runs maze generation algorithms.
In this project, some carved passages may include arches — a visual element. Arches are added with a 20% chance between two connected cells, as long as they’re not on the boundary of the maze.
When an arch is added:
The hasArch property is set to true on both cells.
The requestArchPlacement() method is called to handle the visual placement.
Below is the full code of the Cell and Grid classes for reference and analysis.
This section explains how walls and arches are placed in the maze based on the generated grid structure. The system dynamically places walls where cells are not connected and occasionally adds arches between connected cells for architectural variety.
Once the maze has been generated using either the Binary Tree or Aldous-Broder algorithm, each cell knows which of its neighboring cells it is connected to. Walls are placed around each cell where a connection (or passage) does not exist, effectively enclosing the maze except where corridors exist.
The main function to create walls is GridBuilder.prototype.createWalls. It performs the following:
Calculates the world position for each potential wall around a cell (north, south, east, west).
Checks if the current cell is linked to a neighbor in that direction.
If not linked, it places a wall prefab there.
Occasionally (5% chance), it also places a decorative statue on that wall.
The function createWall(x, y, z, rotation) instantiates a wall prefab and positions it precisely in the 3D environment, avoiding duplicates using a key system. It also adds collision components and makes the wall static.
Arches are placed between connected cells if they are not on the boundary of the maze. This is handled by requestArchPlacement(cellA, cellB), which calculates the midpoint and orientation between the two cells, then calls createArch().
createArch() does the following:
Instantiates the arch prefab and rotates it correctly.
Calculates the dimensions of the arch using its bounding box.
Positions two invisible box colliders (pillars) on either side of the arch.
Adds rigidbody and collision components for physical interaction.
Adds the arch and its colliders to the scene graph.
Floor generation is a crucial step to visually support the grid-based structure of the maze. Each grid cell receives a tile, with randomized textures (materials) and colliders for physics interaction. This process enhances visual diversity and ensures correct player interaction with the environment.
The floor tiles correspond directly to the maze's 2D grid. A tile is placed for each cell, aligned with its (row, col) coordinates. Tiles are not affected by maze connections or walls, so they create a uniform base surface across the entire area.
The function createFloorTiles() handles tile creation. Here's what it does:
Iterates over the entire grid using row and col.
Calculates the world coordinates for each cell.
Instantiates the floor prefab and positions it with a consistent orientation.
Adds a box collider and rigidbody for physical interaction.
Places the tile into the scene graph.
To break visual monotony, a list of materials is used. Each tile randomly picks one, giving the floor a more dynamic, varied look. This supports immersion without impacting gameplay.
Every tile receives a box collider sized to half the grid cell to prevent overlaps. Rigidbodies are set to 'static' so the tiles interact with physics but remain immobile.
The object spawning system is responsible for placing collectible objects throughout the maze. These objects enhance interactivity by allowing the player to explore and trigger actions upon collection. The system ensures that each object is placed in a unique, explorable cell within the maze.
Once the maze has been generated, the system identifies all explorable cells—those that have links to other cells. It then attempts to reposition preloaded objects tagged as 'Collectible' into these cells, avoiding overlap and ensuring each object is placed in a valid location.
Get all cells in the maze that are explorable (connected).
Fetch all preloaded objects with the 'Collectible' tag.
For each object:
Randomly select a cell.
Ensure no object has already been placed there.
Relocate the object to the cell.
Add a spherical collision if it doesn't already have one.
Enable trigger events for interaction (e.g., collection).
Each object is set up to detect collisions using trigger events. When a player or other entity enters the trigger volume, the onCollectibleCollected function is called, which can be customized to handle logic like scoring, sound playback, or activating new elements (e.g., music layers).
This project integrates a dynamic, generative music system using Tone.js. It responds to player interaction by progressively layering musical elements based on collectibles. The soundscape evolves with gameplay, making it immersive and reactive.
Tone.js was used to implement the generative music system in the project. It provides a powerful audio framework built on top of the Web Audio API, enabling real-time synthesis, sequencing, and audio effects. The Tone.js library was downloaded from the official CDN and uploaded to the PlayCanvas project as a script asset, then dynamically loaded at runtime to ensure compatibility and modular loading.
Tone.js is loaded dynamically using loadToneJS, and music begins with startMusic. The system ensures audio starts only after user interaction to comply with browser audio policies. Upon starting, initMusic initializes the audio routing and begins the ambient drone.
initMixer sets up mixer channels using Tone.Gain, Tone.Panner, and a global Tone.Reverb. Each instrument group (ambient, drums, lead, bass, etc.) has its own channel for balancing and spatialization.
addNewMusicLayer activates new music layers based on the name of the collectible object. This makes each collectible not just a visual or scoring element, but a contribution to the evolving musical composition.
A soft evolving pad built with PolySynth and fatsine oscillators, filtered by an LFO-modulated low-pass filter. It adds depth and calm, creating a base layer for the rest of the composition.
Includes a kick (MembraneSynth), snare (MetalSynth), hi-hat, and toms, played via a programmed rhythmic Tone.Loop.
A sawtooth polyphonic synth plays evolving patterns in eighth notes, with delay and randomness added for organic feel.
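The "evolving patterns" can be sketched outside the audio graph. The helper below is a hypothetical illustration (not taken from the project source) of generating one bar of randomized eighth notes, the kind of array a Tone.js sequence could consume:

```javascript
// Hypothetical helper: builds one bar of eighth notes from a scale,
// occasionally inserting rests (null) so the pattern "breathes".
// A sketch of the general idea, not the project's actual code.
function makeEighthNotePattern(scale, restChance = 0.25, rng = Math.random) {
    const pattern = [];
    for (let step = 0; step < 8; step++) { // 8 eighth notes = one 4/4 bar
        if (rng() < restChance) {
            pattern.push(null); // a rest at this step
        } else {
            pattern.push(scale[Math.floor(rng() * scale.length)]);
        }
    }
    return pattern;
}

// Example with a fixed rng for illustration: 0.5 never triggers a rest
// and always picks the middle scale degree, so every step is "F4".
const scale = ["C4", "Eb4", "F4", "G4", "Bb4"];
console.log(makeEighthNotePattern(scale, 0.25, () => 0.5));
```

Regenerating the pattern every few bars, as the description suggests, is what keeps the arpeggio from sounding like a fixed loop.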
A NoiseSynth filtered and compressed to create a deep rumble. It plays continuously and contributes low-end power.
A polyphonic sawtooth pad with slow attack and release, using filters and reverb to create evolving harmonies in 8-bar loops.
A bell-like MetalSynth plays sparse, randomized melodies with added delay and reverb. Timings are randomized to add surprise and texture.
Sine wave synth producing short blips, spatialized with random panning and delays. Simulates an electronic radar feel and increases tension.
A generative ambient wind element built from filtered noise. It features bandpass filtering to evoke wind-blown electronic signals, enhancing the immersive and mysterious atmosphere of the maze.
When a player collects a collectible object in the maze, it triggers a set of reactions that provide audiovisual feedback, update the UI, and potentially unlock new audio layers. This system creates a sense of progression and reward.
Collectible objects are set up with spherical colliders that use trigger events. When the player enters the collider, the onCollectibleCollected() function is called. This ensures objects can be interacted with passively without requiring clicks or physics-based collisions.
The onCollectibleCollected() function performs the following tasks:
Logs the collection event.
Marks the object as collected (avoiding duplicates).
Updates the counter displayed on screen.
Updates the object’s image in the UI (if one is defined).
Shows the full object UI panel.
Plays a sound or adds a new music layer (once per object).
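The handler's source is not reproduced in this section, so the following is a framework-agnostic sketch of the flow just described. The state shape (collected, ui, addNewMusicLayer) is an assumption for illustration, not the project's actual identifiers:

```javascript
// Hypothetical sketch of the collection flow. The state object
// (collected, total, ui, addNewMusicLayer) is an assumption standing
// in for the project's real entities and UI elements.
function onCollectibleCollected(object, state) {
    console.log(`Collected: ${object.name}`);           // log the event
    if (state.collected.has(object.name)) return false;  // avoid duplicates
    state.collected.add(object.name);                    // mark as collected
    state.ui.counter = `${state.collected.size}/${state.total}`; // update counter
    if (object.uiImage) state.ui.image = object.uiImage; // update UI image
    state.ui.panelVisible = true;                        // show the object panel
    state.addNewMusicLayer(object.name);                 // once per object
    return true;
}

// Minimal usage with stubbed state:
const gameState = {
    collected: new Set(),
    total: 4,
    ui: {},
    addNewMusicLayer: (name) => console.log(`New layer: ${name}`)
};
onCollectibleCollected({ name: "MirrorContainer" }, gameState);
```

The early return on a repeat name is what guarantees each collectible contributes its music layer only once.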
The hideUI() function is called when the player closes the UI panel. If all objects are collected, it will:
Hide the main UI.
Show an ending screen.
After a short delay, update the screen to display credits.
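That end-of-game branch can be sketched as follows. This is a hypothetical simplification (the ui object and injected schedule callback are assumptions; in the game the delay would be a timer such as setTimeout):

```javascript
// Hypothetical sketch of the hideUI() flow described above. The ui
// object and schedule callback are stand-ins for the project's real
// UI entities; schedule abstracts the "short delay" before credits.
function hideUI(ui, allCollected, schedule) {
    ui.mainVisible = false;                 // always hide the main panel
    if (!allCollected()) return;            // nothing more until all found
    ui.endingVisible = true;                // show the ending screen
    schedule(() => { ui.screen = "credits"; }, 3000); // then the credits
}

// Usage with an immediate scheduler for illustration:
const ui = { mainVisible: true };
hideUI(ui, () => true, (fn) => fn());
console.log(ui);
```

Injecting the scheduler keeps the ending logic testable without waiting on real timers.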
While the current implementation successfully combines procedural maze generation, interactive collectibles, and a layered generative music system, there are several areas where the project can evolve further:
Dynamic Difficulty Scaling: Adjust maze complexity or collectible distribution based on player behavior or time.
Mobile Optimization: Adapt controls, UI, and performance for touch devices and mobile web browsers.
Spatial Audio: Incorporate 3D audio positioning for sound layers using Tone.Listener or Web Audio API for increased immersion.
Live Modulation: Introduce real-time modulation of filters or rhythms based on player location or speed.
Visual Effects for Sound: Implement shader-based visual cues when new layers are added.
Expanded UI: Improve UI responsiveness and design consistency across devices.
Sound Journal: Let players revisit sounds/layers they’ve discovered, building a "sound memory" map.
Code Refactoring: Modularize and document all components more thoroughly (e.g., separate music logic from maze logic).
Templates Management: Streamline template usage and standardize naming for easier scalability.
Analytics Integration: Track user interaction data for iterative design and testing.
link(cell, bidirectional = true)
Links the current cell with a neighbor cell by removing the wall between them. If bidirectional is true (default), it also links the neighbor back to the current cell. A 20% chance is used to decide whether to request an arch between them, as long as both are not on the outer boundary.
isLinked(cell)
Returns true if this cell is linked to the given cell (i.e., if there’s a passage between them).
neighbors()
Returns a list of all non-null neighboring cells (north, south, east, west).
isBoundary()
Checks whether the cell lies on the outer boundary of the maze grid. Used to avoid placing arches on the outermost edges.
prepareGrid()
Creates a 2D array of cells using nested Array.from(), assigning each a row, column, and a reference to the grid builder.
configureCells()
Assigns each cell its four directional neighbors (north, south, east, west), if they exist within the grid bounds.
cellAt(row, col)
Returns the cell at the given row and column if it’s inside the grid; otherwise returns null. Used to safely access neighbors.
getExplorableCells()
Returns all cells that have at least one link (i.e., are part of the maze). Used to decide where to spawn interactive objects.
randomCell()
Picks a random cell from the grid. Used as a starting point for certain algorithms.
size()
Returns the total number of cells in the maze.
class Cell {
constructor(row, col, gridBuilder) {
this.row = row;
this.col = col;
this.gridBuilder = gridBuilder; // Store reference to GridBuilder
this.links = new Map();
this.hasArch = false;
}
link(cell, bidirectional = true) {
this.links.set(cell, true);
if (bidirectional) {
cell.link(this, false);
}
if (this.isBoundary() || cell.isBoundary()) {
return;
}
if (Math.random() < 0.2 && this.gridBuilder) {
this.hasArch = true;
cell.hasArch = true;
this.gridBuilder.requestArchPlacement(this, cell);
}
}
isLinked(cell) {
return this.links.has(cell);
}
neighbors() {
return [this.north, this.south, this.east, this.west].filter(Boolean);
}
isBoundary() {
return this.row === 0 || this.row === this.gridBuilder.rows - 1 ||
this.col === 0 || this.col === this.gridBuilder.columns - 1;
}
}
class Grid {
constructor(rows, columns, gridBuilder) {
this.rows = rows;
this.columns = columns;
this.gridBuilder = gridBuilder;
this.grid = this.prepareGrid();
this.configureCells();
}
prepareGrid() {
return Array.from({ length: this.rows }, (_, row) =>
Array.from({ length: this.columns }, (_, col) => new Cell(row, col, this.gridBuilder))
);
}
configureCells() {
for (let row = 0; row < this.rows; row++) {
for (let col = 0; col < this.columns; col++) {
const cell = this.grid[row][col];
cell.north = this.cellAt(row - 1, col);
cell.south = this.cellAt(row + 1, col);
cell.east = this.cellAt(row, col + 1);
cell.west = this.cellAt(row, col - 1);
}
}
}
cellAt(row, col) {
if (row < 0 || row >= this.rows || col < 0 || col >= this.columns) {
return null;
}
return this.grid[row][col];
}
binaryTreeMaze() {
this.grid.flat().forEach(cell => {
const neighbors = [];
if (cell.north) neighbors.push(cell.north);
if (cell.east) neighbors.push(cell.east);
const neighbor = neighbors[Math.floor(Math.random() * neighbors.length)];
if (neighbor) {
cell.link(neighbor);
}
});
}
aldousBroderMaze() {
let cell = this.randomCell();
let unvisited = this.size() - 1;
while (unvisited > 0) {
let neighbors = cell.neighbors();
let next = neighbors[Math.floor(Math.random() * neighbors.length)];
if (next.links.size === 0) {
cell.link(next);
unvisited--;
}
cell = next;
}
}
getExplorableCells() {
return this.grid.flat().filter(cell => cell.links.size > 0);
}
randomCell() {
const row = Math.floor(Math.random() * this.rows);
const col = Math.floor(Math.random() * this.columns);
return this.grid[row][col];
}
size() {
return this.rows * this.columns;
}
}
GridBuilder.prototype.createWalls = function (cell) {
const x = cell.col * this.cellSize;
const z = cell.row * this.cellSize;
const y = this.wallHeight / 2;
let wallPositions = [
{ x: x, z: z - this.cellSize / 2, rotation: 0, direction: "north" },
{ x: x, z: z + this.cellSize / 2, rotation: 0, direction: "south" },
{ x: x + this.cellSize / 2, z: z, rotation: 90, direction: "east" },
{ x: x - this.cellSize / 2, z: z, rotation: 90, direction: "west" }
];
wallPositions.forEach(wall => {
if (
(wall.direction === "north" && !cell.isLinked(cell.north)) ||
(wall.direction === "south" && !cell.isLinked(cell.south)) ||
(wall.direction === "east" && !cell.isLinked(cell.east)) ||
(wall.direction === "west" && !cell.isLinked(cell.west))
) {
this.createWall(wall.x, y, wall.z, wall.rotation);
if (Math.random() < 0.05) {
this.placeStatueOnWall(wall.x, y, wall.z, wall.rotation);
}
}
});
};
GridBuilder.prototype.createWall = function (x, y, z, rotation) {
const key = `${x}_${y}_${z}_${rotation}`;
if (this.wallPositions.has(key)) return;
let wallPrefabs = [this.wallPrefab1, this.wallPrefab2, this.wallPrefab3];
let selectedPrefab = wallPrefabs[Math.floor(Math.random() * wallPrefabs.length)];
if (!selectedPrefab || !selectedPrefab.resource) return;
const wall = selectedPrefab.resource.instantiate();
let modelEntity = wall.render ? wall : wall.findByName("Render") || wall.children.find(child => child.render);
if (!modelEntity || !modelEntity.render || !modelEntity.render.meshInstances.length) return;
let aabb = modelEntity.render.meshInstances[0].aabb;
let halfExtents = aabb.halfExtents.clone();
wall.setLocalPosition(x, y - this.wallHeight / 2, z);
wall.setLocalEulerAngles(90, rotation, 0);
[halfExtents.x, halfExtents.y, halfExtents.z] = [halfExtents.x, halfExtents.z, halfExtents.y + 2];
wall.addComponent('collision', { type: 'box', halfExtents: halfExtents });
wall.addComponent('rigidbody', { type: 'static' });
this.app.root.addChild(wall);
this.wallPositions.add(key);
};
GridBuilder.prototype.requestArchPlacement = function (cellA, cellB) {
if (!this.archPrefab || !this.archPrefab.resource) return;
if (cellA.isBoundary() || cellB.isBoundary()) return;
let midX = ((cellA.col + cellB.col) / 2) * this.cellSize;
let midZ = ((cellA.row + cellB.row) / 2) * this.cellSize;
let y = this.wallHeight / 2;
let rotation = (cellA.col !== cellB.col) ? 90 : 0;
this.createArch(midX, y, midZ, rotation);
};
GridBuilder.prototype.createArch = function (x, y, z, rotation) {
const arch = this.archPrefab.resource.instantiate();
arch.setLocalPosition(x, y - this.wallHeight / 2, z);
arch.setLocalEulerAngles(90, rotation, 0);
let modelEntity = arch.render ? arch : arch.findByName("Render") || arch.children.find(child => child.render);
if (!modelEntity || !modelEntity.render || !modelEntity.render.meshInstances) return;
let aabb = modelEntity.render.meshInstances[0].aabb;
let archHalfExtents = aabb.halfExtents.clone();
let archWidth = archHalfExtents.x * 2;
let archHeight = archHalfExtents.y * 2;
let archDepth = archHalfExtents.z * 2;
let pillarWidth = archWidth * 0.2;
let pillarHeight = archHeight;
let pillarDepth = archDepth;
let adjustedPillarWidth = pillarWidth;
let adjustedPillarDepth = pillarDepth;
if (rotation === 90 || rotation === 270) {
adjustedPillarWidth = pillarDepth / 5;
adjustedPillarDepth = pillarWidth * 5;
}
let pillarY = y - this.wallHeight / 2 + (pillarHeight / 2);
let leftPillar = new pc.Entity("LeftPillar");
let rightPillar = new pc.Entity("RightPillar");
if (rotation === 0 || rotation === 180) {
leftPillar.setLocalPosition(x - (archWidth / 2) + (pillarWidth / 2), pillarY, z);
rightPillar.setLocalPosition(x + (archWidth / 2) - (pillarWidth / 2), pillarY, z);
} else {
leftPillar.setLocalPosition(x, pillarY, z - 2 - (archWidth / 2) + (pillarWidth / 2));
rightPillar.setLocalPosition(x, pillarY, z + 2 + (archWidth / 2) - (pillarWidth / 2));
}
[leftPillar, rightPillar].forEach(pillar => {
pillar.setLocalEulerAngles(0, rotation, 0);
pillar.addComponent("collision", {
type: "box",
halfExtents: new pc.Vec3(adjustedPillarWidth / 2, pillarHeight / 2, adjustedPillarDepth / 2)
});
pillar.addComponent("rigidbody", { type: "static" });
});
arch.addComponent("rigidbody", { type: "static" });
this.app.root.addChild(arch);
this.app.root.addChild(leftPillar);
this.app.root.addChild(rightPillar);
};
GridBuilder.prototype.createFloorTiles = function () {
console.log("Generating Floor Tiles with Proper Rotation, Materials, and Colliders...");
if (!this.floorPrefab || !this.floorPrefab.resource) {
console.warn(" Floor prefab not assigned!");
return;
}
if (!this.floorMaterials || this.floorMaterials.length < 3) {
console.warn(" Not enough floor materials assigned! Need at least 3.");
return;
}
for (let row = 0; row < this.rows; row++) {
for (let col = 0; col < this.columns; col++) {
let x = col * this.cellSize;
let z = row * this.cellSize;
let y = 0;
let floorTile = this.floorPrefab.resource.instantiate();
floorTile.setLocalPosition(x, y, z);
floorTile.setLocalEulerAngles(90, 0, 0);
let modelEntity = floorTile.render ? floorTile : floorTile.findByName("Render") || floorTile.children.find(child => child.render);
if (!modelEntity || !modelEntity.render) {
console.warn("No valid render component found for floor tile.");
continue;
}
let randomMaterialIndex = Math.floor(Math.random() * this.floorMaterials.length);
let selectedMaterial = this.floorMaterials[randomMaterialIndex]?.resource;
if (selectedMaterial) {
modelEntity.render.material = selectedMaterial;
} else {
console.warn("Selected material is undefined.");
}
let colliderSize = this.cellSize / 2;
floorTile.addComponent("collision", {
type: "box",
halfExtents: new pc.Vec3(colliderSize, colliderSize, 0.05)
});
floorTile.addComponent("rigidbody", {
type: "static"
});
this.app.root.addChild(floorTile);
}
}
console.log("Floor Prefab Tiles Generated Successfully With Correct Rotation, Materials, and Colliders!");
};
GridBuilder.prototype.spawnObjects = function () {
const explorableCells = this.grid.getExplorableCells();
let placedPositions = new Set();
this.preloadedObjects = this.app.root.findByTag("Collectible");
if (this.preloadedObjects.length === 0) {
console.warn(" No preloaded collectibles found!");
return;
}
if (explorableCells.length < this.preloadedObjects.length) {
console.warn("Not enough explorable cells to relocate all collectibles.");
return;
}
for (let i = 0; i < this.preloadedObjects.length; i++) {
let object = this.preloadedObjects[i];
let objectPlaced = false;
let maxAttempts = 10;
while (!objectPlaced && maxAttempts > 0) {
maxAttempts--;
let cellIndex = Math.floor(Math.random() * explorableCells.length);
let cell = explorableCells[cellIndex];
let x = cell.col * this.cellSize;
let z = cell.row * this.cellSize;
let positionKey = `${x}_${z}`;
if (placedPositions.has(positionKey)) {
continue;
}
placedPositions.add(positionKey);
object.setLocalPosition(x, 0.6, z);
if (!object.collision) {
object.addComponent("collision", {
type: "sphere",
radius: 0.3
});
}
object.collision.on("triggerenter", (event) => this.onCollectibleCollected(object, event), this);
objectPlaced = true;
}
if (!objectPlaced) {
console.warn(`Could not find a valid position for collectible ${object.name}`);
}
}
};
GridBuilder.prototype.loadToneJS = function (callback) {
if (window.Tone) {
callback();
return;
}
// Guard against a missing asset before dereferencing it
var toneAsset = this.app.assets.find("Tone.js");
if (!toneAsset) {
console.error("Tone.js asset not found; cannot load audio engine.");
return;
}
var script = document.createElement("script");
script.src = toneAsset.getFileUrl();
script.onload = callback;
script.onerror = () => console.error("Failed to load Tone.js script.");
document.head.appendChild(script);
};
GridBuilder.prototype.startMusic = function () {
if (Tone.context.state !== "running") {
Tone.start().then(() => {
console.log("🎶 Tone.js Started");
this.initMusic();
});
} else {
this.initMusic();
}
};
GridBuilder.prototype.initMusic = function () {
console.log("Music System Initialized");
this.initMixer();
// Ambient drone
this.startAmbientDrone();
// Collectible-triggered instruments
this.instruments = [];
};
GridBuilder.prototype.addNewMusicLayer = function (name) {
//console.log(` Adding new music layer for: ${name}`);
let layer;
switch (name) {
case "MinotaurContainer":
layer = this.createDrumBeat();
break;
case "MirrorContainer":
layer = this.createGenerativeArpeggio();
break;
case "ThreadContainer":
layer = this.createPad();
break;
case "RocksContainer":
layer = this.createRumble();
break;
case "CompassContainer":
layer = this.createWindLayer();
break;
case "LanternContainer":
layer = this.createBells();
break;
default:
console.warn("No specific layer for this collectible.");
return;
}
if (layer) {
this.instruments.push(layer);
Tone.Transport.start();
}
};
GridBuilder.prototype.initMixer = function () {
// console.log("Initializing Audio Mixer");
// Master volume control
this.masterVolume = new Tone.Volume(-9).toDestination();
// Create mixer channels for each instrument group
this.channels = {
ambient: new Tone.Gain(0.09).connect(this.masterVolume),
arpeggio: new Tone.Gain(0.01).connect(this.masterVolume),
bass: new Tone.Gain(0.05).connect(this.masterVolume),
drums: new Tone.Gain(0.07).connect(this.masterVolume),
lead: new Tone.Gain(0.03).connect(this.masterVolume),
bells: new Tone.Gain(0.04).connect(this.masterVolume),
wind: new Tone.Gain(0.08).connect(this.masterVolume)
};
this.panners = {
ambient: new Tone.Panner(0).connect(this.channels.ambient),
arpeggio: new Tone.Panner(-1).connect(this.channels.arpeggio),
bass: new Tone.Panner(1).connect(this.channels.bass),
drums: new Tone.Panner(-0.4).connect(this.channels.drums),
lead: new Tone.Panner(0.7).connect(this.channels.lead),
wind: new Tone.Panner(0).connect(this.channels.wind)
};
// Global reverb for depth
this.globalReverb = new Tone.Reverb(9).connect(this.masterVolume);
//console.log("Mixer Initialized with Master Volume and Channels");
};
GridBuilder.prototype.startAmbientDrone = function () {
// Create the Drone Synth (Low Background Layer)
this.droneSynth = new Tone.PolySynth(Tone.Synth, {
oscillator: {
type: "fatsine", // Thick, warm tone
count: 3,
spread: 12
},
envelope: {
attack: 5,
decay: 10,
sustain: 0.8,
release: 15
}
});
// Apply a Low-Pass Filter
this.filter = new Tone.Filter({
frequency: 900,
type: "lowpass",
Q: 0.8
});
// Add Effects (Chorus & Reverb)
this.chorus = new Tone.Chorus(0.15, 3.5, 0.7);
this.reverb = new Tone.Reverb(12);
// Chain Effects
this.droneSynth.chain(this.filter, this.chorus, this.reverb, this.channels.ambient, this.globalReverb);
// Modulate Filter Over Time
this.lfo = new Tone.LFO(0.02, 700, 1100).start();
this.lfo.connect(this.filter.frequency);
// Play Slow-Evolving Drone Notes
let droneNotes = ["C2","Db2","C3", "Db3", "E3", "F3", "G3", "Ab3", "B3","C4","Ab4"];
let droneInterval = new Tone.Loop((time) => {
let note = droneNotes[Math.floor(Math.random() * droneNotes.length)];
let velocity = Math.random() * 0.3 + 0.6;
this.droneSynth.triggerAttackRelease(note, "14s", time, velocity);
}, "10s").start();
//console.log("Drone & Random Bell Lead Active");
// Start Tone.js Transport
Tone.Transport.start();
};
GridBuilder.prototype.createDrumBeat = function () {
this.drumSynth = new Tone.MembraneSynth({
pitchDecay: 0.1,
octaves: 2,
oscillator: { type: "sine" },
envelope: { attack: 0.01, decay: 0.3, sustain: 0.1, release: 0.5 }
});
this.snareSynth = new Tone.MetalSynth({
frequency: 220,
envelope: { attack: 0.002, decay: 0.15, release: 0.05 },
harmonicity: 3,
modulationIndex: 20,
resonance: 2000,
octaves: 1.5
}); // routed via chain() below; a direct connect here would double-route the snare
this.hatSynth = new Tone.MetalSynth({
frequency: 180,
envelope: { attack: 0.005, decay: 0.15, release: 0.05 },
harmonicity: 5,
modulationIndex: 35,
resonance: 3000,
octaves: 1
});
this.percSynth = new Tone.MembraneSynth({
pitchDecay: 0.08,
oscillator: { type: "triangle" },
envelope: { attack: 0.002, decay: 0.2, sustain: 0.05, release: 0.2 }
});
this.snareReverb = new Tone.Reverb(2);
this.hatDelay = new Tone.FeedbackDelay("16n", 0.3);
this.drumSynth.chain(this.channels.drums, this.globalReverb);
this.snareSynth.chain(this.snareReverb, this.channels.drums, this.globalReverb);
this.hatSynth.chain(this.hatDelay, this.channels.drums, this.globalReverb);
this.percSynth.chain(this.channels.drums, this.globalReverb);
let drumPattern = [
{ time: "0:0", drum: "kick" },
{ time: "0:0.5", drum: "kick" }, // Heartbeat effect
{ time: "0:1.5", drum: "snare" },
{ time: "0:2", drum: "tom" },
{ time: "0:2.75", drum: "hat" },
{ time: "0:3.5", drum: "tom" },
{ time: "1:0", drum: "kick" },
{ time: "1:0.5", drum: "kick" }, // Heartbeat effect
{ time: "1:1.75", drum: "snare" },
{ time: "1:2.5", drum: "hat" },
{ time: "1:3.25", drum: "tom" },
{ time: "2:0", drum: "kick" },
{ time: "2:0.5", drum: "kick" },
{ time: "2:1.5", drum: "snare" },
{ time: "2:2.25", drum: "tom" },
{ time: "2:3", drum: "hat" },
{ time: "3:0", drum: "kick" },
{ time: "3:0.5", drum: "kick" },
{ time: "3:1.75", drum: "snare" },
{ time: "3:2.5", drum: "tom" },
{ time: "3:3.25", drum: "hat" }
];
const loop = new Tone.Loop((time) => {
drumPattern.forEach(({ time: beatTime, drum }) => {
let t = Tone.Time(beatTime).toSeconds() + time;
switch (drum) {
case "kick":
this.drumSynth.triggerAttackRelease("C1", "8n", t);
break;
case "snare":
this.snareSynth.triggerAttackRelease("16n", t);
break;
case "hat":
this.hatSynth.triggerAttackRelease("32n", t);
break;
case "tom": // the pattern above uses "tom"; route it to the perc synth
case "perc":
this.percSynth.triggerAttackRelease("G2", "16n", t);
break;
}
});
}, "4m");
loop.start(0);
Tone.Transport.start();
};
GridBuilder.prototype.createGenerativeArpeggio = function () {
// Create a synth with a warm, slightly detuned sound
this.arpSynth = new Tone.PolySynth(Tone.Synth, {
oscillator: {
type: "sawtooth",
detune: -10
},
envelope: {
attack: 0.05,
decay: 0.3,
sustain: 0.4,
release: 0.8
}
});
// Apply a delay and reverb for depth
this.arpDelay = new Tone.FeedbackDelay("8n", 0.4);
this.arpSynth.chain(this.arpDelay, this.channels.arpeggio, this.globalReverb);
// Define arpeggio note pattern
let arpeggioNotes = ["E3", "G3", "A3", "B3", "D4", "E4", "G4", "A4"];
let index = 0;
const arpeggioLoop = new Tone.Loop((time) => {
let note = arpeggioNotes[index % arpeggioNotes.length];
let velocity = Math.random() * 0.3 + 0.7; // Subtle dynamics
this.arpSynth.triggerAttackRelease(note, "8n", time, velocity);
// Introduce slight timing variations for organic feel
index++;
if (Math.random() < 0.3) {
index += 1;
}
}, "8n"); // Eighth-note rhythmic structure
arpeggioLoop.start(0);
Tone.Transport.start();
//console.log("🔊 Generative Arpeggio Started");
};
GridBuilder.prototype.createRumble = function () {
this.rumbleSynth = new Tone.NoiseSynth({
noise: { type: "brown" },
envelope: {
attack: 1,
decay: 8,
sustain: 0.9,
release: 10
}
});
this.rumbleFilter = new Tone.Filter({
type: "lowpass",
frequency: 400,
Q: 1
});
this.rumbleCompressor = new Tone.Compressor(-30, 10);
this.rumbleReverb = new Tone.Reverb(8);
this.rumbleGain = new Tone.Gain(1.5);
this.rumbleSynth.chain(this.rumbleFilter, this.rumbleCompressor, this.rumbleReverb, this.rumbleGain, this.channels.bass, this.globalReverb);
this.rumbleSynth.triggerAttack();
Tone.Transport.start();
};
GridBuilder.prototype.createPad = function () {
this.padSynth = new Tone.PolySynth(Tone.Synth, {
oscillator: {
type: "sawtooth", // Rich harmonic content
detune: -5 // Slight detuning for warmth
},
envelope: {
attack: 4,
decay: 5,
sustain: 0.7,
release: 8
}
});
// Apply a gentle filter and spatial effects
this.padFilter = new Tone.Filter({
type: "lowpass",
frequency: 1200,
Q: 1
});
this.padReverb = new Tone.Reverb(15);
this.padChorus = new Tone.Chorus(0.3, 2, 0.5);
this.padLFO = new Tone.LFO("0.1hz", 1000, 1500).start();
this.padLFO.connect(this.padFilter.frequency);
// Chain everything together
this.padSynth.chain(this.padFilter, this.padChorus, this.padReverb, this.channels.lead, this.globalReverb);
// Define a slow-moving chord progression
let padChords = [
["C3", "E3", "G3", "B3"],
["D3", "F3", "A3", "C4"],
["E3", "G3", "B3", "D4"],
["F3", "A3", "C4", "E4"]
];
let index = 0;
const padLoop = new Tone.Loop((time) => {
let chord = padChords[index % padChords.length];
//console.log(`Playing pad chord: ${chord} at ${time}`);
this.padSynth.triggerAttackRelease(chord, "8m", time, 0.6);
index++;
}, "8m"); // Slow, evolving changes
padLoop.start(0);
Tone.Transport.start();
};
GridBuilder.prototype.createBells = function(){
this.bellSynth = new Tone.MetalSynth({
frequency: 300,
envelope: { attack: 0.01, decay: 2, release: 1 },
harmonicity: 8, // Creates a bell-like tone
resonance: 700,
modulationIndex: 10,
volume: -12
});
// Add Reverb and Delay to Bells
this.bellReverb = new Tone.Reverb(8);
this.bellDelay = new Tone.FeedbackDelay("4n", 0.5);
this.bellSynth.chain(this.bellReverb, this.bellDelay, this.channels.bells, this.globalReverb);
// Function to Randomly Play Bells
let bellNotes = ["C5", "E5", "G5", "B5", "D6"];
let playBell = () => {
let note = bellNotes[Math.floor(Math.random() * bellNotes.length)];
let velocity = Math.random() * 0.4 + 0.6;
this.bellSynth.triggerAttackRelease(note, "2s", "+0.1", velocity);
// **Randomize next bell trigger time (between 15 and 25 seconds)**
let nextTime = Math.random() * 10 + 15;
setTimeout(playBell, nextTime * 1000);
};
// Start the First Bell
setTimeout(playBell, (Math.random() * 5 + 10) * 1000); // Initial bell after 10-15 seconds
Tone.Transport.start();
};
GridBuilder.prototype.createWindLayer = function () {
// Radar-like beeps
this.radarBeep = new Tone.Synth({
oscillator: {
type: "sine"
},
envelope: {
attack: 0.01,
decay: 0.2,
sustain: 0,
release: 0.1
}
});
this.beepFilter = new Tone.Filter({
type: "bandpass",
frequency: 1000,
Q: 6
});
// Randomized panning for each beep
this.beepPanner = new Tone.Panner(0);
this.beepDelay = new Tone.FeedbackDelay("8n", 0.3);
this.radarBeep.chain(this.beepFilter, this.beepPanner, this.beepDelay, this.channels.wind, this.globalReverb);
// Schedule radar beeps with random panning
const beepLoop = new Tone.Loop((time) => {
let pitch = ["E5", "G#5", "B5", "C6", "D#6"][Math.floor(Math.random() * 5)];
let delay = Math.random() * 4 + 2; // Randomized spacing between beeps
let panValue = Math.random() * 2 - 1; // Random panning between -1 and 1
this.beepPanner.pan.rampTo(panValue, 0.2); // Smoothly adjust panning
this.radarBeep.triggerAttackRelease(pitch, "16n", time + delay);
}, "4m");
beepLoop.start(0);
Tone.Transport.start();
};
GridBuilder.prototype.onCollectibleCollected = function (object, event) {
console.log(`Collected: ${object.name}`);
if (object.collected) return;
object.collected = true;
this.collectedObjects++;
console.log(` Objects Collected: ${this.collectedObjects}/${this.totalObjects}`);
if (this.counterUIElements.counter && this.counterUIElements.counter.element) {
this.counterUIElements.counter.element.text = `${this.collectedObjects}/${this.totalObjects}`;
}
if (this.uiElements.image) {
let imageName = this.collectibleImages[object.name] || "ItemDescription.png";
let imageAsset = this.app.assets.find(imageName);
if (imageAsset) {
this.uiElements.image.element.texture = imageAsset.resource;
this.uiElements.image.enabled = true;
this.uiElements.image.element._dirty = true;
} else {
console.warn(`Image asset ${imageName} not found.`);
}
}
if (this.uiElements.objectView) {
this.uiElements.objectView.enabled = true;
}
if (!object.soundPlayed) {
this.addNewMusicLayer(object.name);
object.soundPlayed = true;
}
};
GridBuilder.prototype.hideUI = function () {
if (this.uiElements.objectView) {
this.uiElements.objectView.enabled = false;
}
if (this.collectedObjects >= this.totalObjects) {
if (this.endingUIElements.ending) {
// Guard against the VIVERSE UI root not being present in the DOM
let viverseUI = document.querySelector("#world-root");
if (viverseUI) {
viverseUI.style.display = "none";
}
this.endingUIElements.ending.enabled = true;
setTimeout(() => {
let creditsImage = this.app.assets.find("Credits.png");
if (creditsImage && this.endingUIElements.endingImage && this.endingUIElements.endingImage.element) {
this.endingUIElements.endingImage.element.texture = creditsImage.resource;
this.endingUIElements.endingImage.element._dirty = true;
console.log("Changed EndingView to credits.");
}
}, 5000);
}
}
};