JavaScript Components


Sets up the 8thwall pipeline and retrieves tracking events to place an object at the location of the tracked AR camera / mobile device.

Use this for SLAM tracking based on 8thwall.

Make sure to enable 8thwall in “Project Settings” > “AR”. See also the AR Getting Started Guide.

camera (Enum): Choose front/back camera.


Click/hover/move/button target for cursor.

To trigger code when clicking, hovering, unhovering, moving cursor, pressing cursor button or releasing cursor button, use .addClickFunction(f), .addHoverFunction(f), .addUnHoverFunction(f), .addMoveFunction(f), .addDownFunction(f) and .addUpFunction(f) respectively with any function f() {}.

To call members on a different component, you can set up a cursor target like so:

start: function() {
  let target = this.object.addComponent('cursor-target');
  target.addClickFunction(this.onClick.bind(this));
},
onClick: function() {
  console.log(this.object.name, "was clicked!");
},


callback = function(object, cursorComponent) {};


Requires:
  • a collision component attached to the same object.
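The setup above could be sketched like this; the collider shape, extents, and group values are assumptions for illustration, not engine defaults:

```javascript
// Sketch: add the required collision component, then the cursor-target.
// Collider parameters here are assumptions, not defaults.
function setupCursorTarget(obj) {
  obj.addComponent('collision', {
    collider: WL.Collider.Sphere,
    extents: [0.1, 0, 0],
    group: 1 << 0, // must match the cursor's collisionGroup
  });
  const target = obj.addComponent('cursor-target');
  target.addClickFunction((object, cursorComponent) => {
    console.log(object.name, 'was clicked!');
  });
  return target;
}
```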

See Animation Example.


3D cursor for desktop/mobile/VR.

Implements a ray-casting cursor into the scene. To react to clicking/hover/unhover/cursor down/cursor up/move use a cursor-target.

For VR, the ray is cast in direction of this.object.getForward(). For desktop and mobile, the forward vector is inverse-projected to account for where on screen the user clicked.

.globalTarget can be used to call callbacks for all objects, even those that do not have a cursor target attached, but match the collision group.

See Animation Example.

collisionGroup (Int): Collision group for the ray cast. Only objects in this group will be affected by this cursor.
cursorRayObject (Object, optional): Object that visualizes the cursor's ray.
cursorRayScalingAxis (Enum): Axis along which to scale the cursorRayObject.
cursorObject (Object, optional): Object that visualizes the cursor's hit location.
handedness (Enum): Handedness for VR cursors to accept trigger events only from the respective controller.
rayCastMode (Enum): Mode for raycasting: whether to use PhysX or simple collision components.
styleCursor (Bool): Whether to set the CSS style of the mouse cursor on desktop.


Prints some limited debug information about the object.

Information consists of: this object's name, the name of the object given in the obj parameter, and this object's world translation, world transform, and local transform.

Mainly used by engine developers for debug purposes or as example code.

obj (Object): A second object to print the name of.


Retrieve device orientation from a mobile device and set the object’s orientation accordingly.

Useful for magic window experiences.



Enables interaction with cursor-targets through collision overlaps, e.g. on the tip of a finger on a tracked hand.


Requires:
  • A collision component (usually a sphere with 0.05 radius) on the same object.


Applies fixed foveation once a WebXR session is started.

Fixed foveation reduces shading cost at the periphery by rendering at lower resolutions at the edges of the user's vision.

fixedFoveation (Float): Amount to apply, from 0 (none) to 1 (full).


Easy hand tracking through the WebXR "Hand Input" API.

Allows displaying hands either as sphere-joints or skinned mesh.

To react to grabbing, use this.isGrabbing(). For other gestures, refer to this.joints - an array of WL.Object and use the joint indices listed in the WebXR Hand Input specification.
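The joint order is fixed by the WebXR Hand Input specification, so indices can be looked up once. A small excerpt of that 25-joint list (indices assumed from the spec's ordering):

```javascript
// Joint indices per the WebXR Hand Input specification's joint order.
const XRHandJoint = {
  'wrist': 0,
  'thumb-tip': 4,
  'index-finger-tip': 9,
  'middle-finger-tip': 14,
  'ring-finger-tip': 19,
  'pinky-finger-tip': 24,
};

// e.g. inside a component with this.joints populated:
// const indexTip = this.joints[XRHandJoint['index-finger-tip']];
```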

It is often desired to use either hand tracking or controllers, not both. This component provides deactivateChildrenWithoutPose to hide the hand tracking visualization if no pose is available and controllerToDeactivate for disabling another object once a hand tracking pose is available. Outside of XR sessions, neither hand tracking nor controllers are activated or deactivated, to play well with the vr-mode-active-switch component.


Note:
  • To use hand tracking, enable “joint tracking” in chrome://flags on Oculus Browser for Oculus Quest/Oculus Quest 2.

See Hand Tracking Example.

handedness (Enum): Handedness determining whether to receive tracking input from the right or left hand.
jointMesh (Mesh, optional): Mesh to use to visualize joints.
jointMaterial (Material): Material to use for display. Applied to either the spawned skinned mesh or the joint spheres.
handSkin (Skin, optional): Skin to apply tracked joint poses to. If not present, joint spheres will be used for display instead.
deactivateChildrenWithoutPose (Bool): Deactivate children if no pose was tracked.
controllerToDeactivate (Object): Controller object to activate, including children, if no pose is available.


Sets up a WebXR Device API “Hit Test” and places the object at the hit location.


Requires:
  • 'hit-test' specified in the required or optional features on the AR button in your HTML file. See Wastepaperbin AR as an example.
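For reference, such a session request could look like the sketch below; `navigator.xr.requestSession` is the real WebXR Device API entry point, but the exact feature list depends on your app:

```javascript
// Sketch: requesting an AR session with the 'hit-test' feature enabled.
function requestARSessionWithHitTest() {
  return navigator.xr.requestSession('immersive-ar', {
    requiredFeatures: ['local', 'hit-test'],
  });
}
```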


(Spatial) audio listener based on Howler.js.

Retrieves the location and orientation of the object and passes it to Howler.pos() and Howler.orientation().

spatial (Bool): Whether audio should be spatialized/positional.


(Spatial) audio source based on Howler.js.

Creates a Howler audio source, plays an audio file on it and updates its position.

Optimizes the position update to only occur if the difference from the last position is larger than half a centimeter. To force updates (e.g. if the sound source is very close to the listener), use .updatePosition().
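That half-centimeter check can be sketched as a pure distance test; the helper name below is hypothetical, not part of the component's API:

```javascript
// Only push an update to Howler if we moved more than 0.005 m.
// Compares squared distances to avoid a square root per frame.
function positionNeedsUpdate(lastPos, newPos) {
  const dx = newPos[0] - lastPos[0];
  const dy = newPos[1] - lastPos[1];
  const dz = newPos[2] - lastPos[2];
  return dx * dx + dy * dy + dz * dz > 0.005 * 0.005;
}
```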

spatial (Bool): Whether audio should be spatialized/positional.
loop (Bool): Whether to loop the sound.
autoplay (Bool): Whether to start playing automatically.
src (String): URL to a sound file to play.


Downloads an image from URL and applies it as diffuseTexture or flatTexture to an attached mesh component. Only “Phong Opaque Textured” and “Flat Opaque Textured” materials are supported.

Warning: This component will soon be changed to be consistent with video-texture and modify a material rather than a mesh. To make sure your code keeps working in future versions of the engine, please use material rather than meshIndex.

url (String): URL to download the image from.
meshIndex (Int): 0-based mesh component index on this object (e.g. 1 for the second mesh). Deprecated: please use material instead.
material (Material): Material to apply the image texture to (if null, tries to apply to the mesh with meshIndex).


Dynamically load and map input profiles for XR controllers.

handedness (Enum): The index representing the handedness of the controller (0 for left, 1 for right).
defaultBasePath (String): The URL of the base path where XR input profiles are stored.
customBasePath (String): An optional folder path for loading custom XR input profiles. You can put your custom profile model and profile.json in a folder inside static and reference it here. If not empty, this overrides defaultBasePath.
defaultController (Object3D): The default 3D controller model used when a custom model fails to load. Falls back to the first child object.
trackedHand (Object3D): The object which has the HandTracking component added to it. Falls back to its sibling named HandLeft or HandRight, depending on the handedness.
mapToDefaultController (Boolean): If true, the input profile will be mapped to the default controller, and no dynamic 3D model of the controller will be loaded.
addVrModeSwitch (Boolean): If true, adds a VR mode switch component to the loaded controller model.
  • onModelLoaded: Emitter that triggers when the model has loaded.
  • toFilter: A set of components to filter during default controller component retrieval.


Controls the camera through mouse movement.

Efficiently implemented to affect object orientation only when the mouse moves.

sensitity (Float): Mouse look sensitivity.
requireMouseDown (Bool): Require a mouse button to be pressed to control the view. Otherwise the view will always follow mouse movement.
mouseButtonIndex (Int): If requireMouseDown is enabled, the mouse button which should be held down to control the view.
pointerLockOnClick (Bool): Enables pointer lock on the “mousedown” event on WL.canvas.


Set player height for a Y-offset above the ground for the 'local' and 'viewer' WebXR reference spaces.



Sets the target framerate.

Updates the target framerate to the supported framerate closest to the given framerate.

The target framerate is used for the device’s VR compositor as an indication of how often to refresh the screen with new images. This means the app will be asked to produce frames in more regular intervals, potentially spending less time on frames that are likely to be dropped.

For apps with heavy load, setting a well-matching target framerate can improve the app's rendering stability and reduce stutter.

Likewise, the target framerate can be used to enable 120Hz refresh rates on Oculus Quest 2 on simpler apps.
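The “closest supported framerate” selection can be sketched as a pure helper; the function name is hypothetical, while `supportedFrameRates` and `updateTargetFrameRate` are the real WebXR Device API members such a component would use:

```javascript
// Pick the supported rate closest to the requested one.
function closestFrameRate(supported, target) {
  return supported.reduce((best, rate) =>
    Math.abs(rate - target) < Math.abs(best - target) ? rate : best);
}

// In an active WebXR session:
// const rates = session.supportedFrameRates; // e.g. [72, 80, 90, 120]
// if (rates) session.updateTargetFrameRate(closestFrameRate(rates, 120));
```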



Teleport VR locomotion.

See Teleport Example.

teleportIndicatorMeshObject (Object): Object that will be placed as an indication of where the player will teleport to.
camRoot (Object): Root of the player; the object that will be positioned on teleportation.
cam (Object): Non-VR camera for use outside of VR.
eyeLeft (Object): Left eye for use in VR.
eyeRight (Object): Right eye for use in VR.
handedness (Enum): Handedness for VR cursors to accept trigger events only from the respective controller.
floorGroup (Int): Collision group of valid “floor” objects that can be teleported on.
thumbstickActivationThreshhold (Float): How far the thumbstick needs to be pushed for the teleport target indicator to show up.
thumbstickDeactivationThreshhold (Float): How far the thumbstick needs to be released to execute the teleport.
indicatorYOffset (Float): Offset to apply to the indicator object, e.g. to avoid Z-fighting with the floor.
rayCastMode (Enum): Mode for raycasting: whether to use PhysX or simple collision components.
maxDistance (Float): Max distance for the PhysX raycast.


Inverse kinematics for two-joint chains (e.g. knees or elbows).

root (Object): Root bone; never moves.
middle (Object): Bone attached to the root.
end (Object): Bone attached to the middle.
target (Object): Target the joints should reach for.
helper (Object): Helper object used to determine the joint rotation axis.


Downloads a video from URL and applies it as diffuseTexture or flatTexture on given material.

Video textures need to be updated regularly whenever a new frame is available. This component handles the detection of a new frame and updates the texture to reflect the video’s current frame. Only “Phong Opaque Textured” and “Flat Opaque Textured” materials are supported.

The video can be accessed through the component:

let videoTexture = this.object.getComponent('video-texture');

See Video Example.

url (String): URL to download the video from.
material (Material): Material to apply the video texture to.
loop (Bool): Whether to loop the video.
autoplay (Bool): Whether to automatically start playing the video.
muted (Bool): Whether to mute sound.


Allows switching all other components on an object to active/inactive depending on whether a VR/AR session is active.

Useful for hiding controllers until the user enters VR for example.

activateComponents (Enum): When components should be active: in VR or when not in VR.
affectChildren (Bool): Whether child objects' components should be affected.


Component for loading and handling VRM 1.0 models.

Posing of the model should be done exclusively by rotating the bones. These can be accessed using the .bones property and follow the VRM bone naming. Note that not all VRM models will have all possible bones. The rest pose (T-pose) is captured in the .restPose property. Resetting a bone to its rest pose can be done as follows:

vrmComponent.bones[vrmBoneName].rotationLocal = vrmComponent.restPose[vrmBoneName];
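Extending this to a full-body reset could look like the sketch below; it assumes the .bones and .restPose maps described above, and the helper name is hypothetical:

```javascript
// Sketch: reset every tracked bone to its rest pose.
// Bone names follow the VRM naming; not all models have all bones.
function resetToRestPose(vrmComponent) {
  for (const name of Object.keys(vrmComponent.bones)) {
    if (vrmComponent.bones[name] && vrmComponent.restPose[name]) {
      vrmComponent.bones[name].rotationLocal = vrmComponent.restPose[name];
    }
  }
}
```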

Moving the model through the world should be done by moving the object this component is attached to. In other words, by moving the root of the VRM model. The bones and any descendant objects should not be used to move the VRM model.

The core extension VRMC_vrm as well as the VRMC_springBone and VRMC_node_constraint extensions are supported.


Limitations:
  • No support for VRMC_materials_mtoon
  • Expressions aren’t supported
  • Expression-based lookAt isn’t supported
  • Mesh annotation mode auto is not supported (first-person mode)
src (String): URL to a VRM file to load.
lookAtTarget (Object): Object the VRM is looking at.


Basic movement with W/A/S/D keys.

speed (Float): Movement speed in m/s.
headObject (Object): Object whose orientation is used to determine the forward direction.
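The per-frame update such a component performs can be sketched as a pure helper; the function name, key-state shape, and the assumption that forward is -Z in the head's local space are illustrative, not the component's actual API:

```javascript
// Compute a movement delta from W/A/S/D key state, speed (m/s) and dt (s).
// The result is in head-local space; the component would transform it by
// the headObject's orientation before applying it.
function wasdDelta(keys, speed, dt) {
  let x = 0, z = 0;
  if (keys.w) z -= 1; // forward along -Z
  if (keys.s) z += 1;
  if (keys.a) x -= 1;
  if (keys.d) x += 1;
  const len = Math.hypot(x, z);
  if (len === 0) return [0, 0, 0];
  const k = (speed * dt) / len; // normalize so diagonals aren't faster
  return [x * k, 0, z * k];
}
```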