JavaScript Components

8thwall-camera 

Sets up the 8thwall pipeline and retrieves tracking events to place an object at the location of the tracked AR camera / mobile device.

Use this for SLAM tracking based on 8thwall.

Make sure to enable 8thwall in “Project Settings” > “AR”. See also the AR Getting Started Guide.

| Param | Type | Description |
| --- | --- | --- |
| camera | Enum | Choose front/back camera |

cursor-target 

Click/hover/move/button target for cursor.

To trigger code when clicking, hovering, unhovering, moving the cursor, or pressing or releasing the cursor button, use .addClickFunction(f), .addHoverFunction(f), .addUnHoverFunction(f), .addMoveFunction(f), .addDownFunction(f) and .addUpFunction(f) respectively, with any function f() {}.

To call members on a different component, you can set up a cursor target like so:

```js
start: function() {
  let target = this.object.addComponent('cursor-target');
  target.addClickFunction(this.onClick.bind(this));
},
onClick: function() {
  console.log(this.object.name, "was clicked!");
}
```

Functions:

```js
callback = function(object, cursorComponent) {};

addHoverFunction(callback);
removeHoverFunction(callback);

addUnHoverFunction(callback);
removeUnHoverFunction(callback);

addClickFunction(callback);
removeClickFunction(callback);

addMoveFunction(callback);
removeMoveFunction(callback);

addDownFunction(callback);
removeDownFunction(callback);

addUpFunction(callback);
removeUpFunction(callback);
```

Requirements:

  • A collision component attached to the same object.

See Animation Example.

cursor 

3D cursor for desktop/mobile/VR.

Implements a ray-casting cursor into the scene. To react to clicking/hover/unhover/cursor down/cursor up/move, use a cursor-target.

For VR, the ray is cast in direction of this.object.getForward(). For desktop and mobile, the forward vector is inverse-projected to account for where on screen the user clicked.

.globalTarget can be used to call callbacks for all objects, even those that do not have a cursor target attached, but match the collision group.
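A minimal sketch of registering a global click callback, assuming a cursor component is attached to the same object:

```js
start: function() {
  /* Assumes the 'cursor' component is attached to this same object */
  let cursor = this.object.getComponent('cursor');
  /* Called for clicks on any object matching the collision group,
   * even objects without a cursor-target component */
  cursor.globalTarget.addClickFunction(function(object, cursorComponent) {
    console.log(object.name, "was clicked");
  });
}
```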

See Animation Example.

| Param | Type | Description |
| --- | --- | --- |
| collisionGroup | Int | Collision group for the ray cast. Only objects in this group will be affected by this cursor. |
| cursorRayObject | Object | (optional) Object that visualizes the cursor’s ray. |
| cursorRayScalingAxis | Enum | Axis along which to scale the cursorRayObject. |
| cursorObject | Object | (optional) Object that visualizes the cursor’s hit location. |
| handedness | Enum | Handedness for VR cursors to accept trigger events only from the respective controller. |
| rayCastMode | Enum | Mode for raycasting, whether to use PhysX or simple collision components |
| styleCursor | Bool | Whether to set the CSS style of the mouse cursor on desktop |

debug-object 

Prints some limited debug information about the object.

Information consists of: this object’s name, the name of the object given in the obj parameter, the object’s world translation, world transform, and local transform.

Mainly used by engine developers for debug purposes or as example code.

| Param | Type | Description |
| --- | --- | --- |
| obj | Object | A second object to print the name of |

device-orientation-look 

Retrieve device orientation from a mobile device and set the object’s orientation accordingly.

Useful for magic window experiences.

finger-cursor 

0.8.5+

Enables interaction with cursor-targets through collision overlaps, e.g. on the tip of a finger on a tracked hand.

Requirements:

  • A collision component (usually a sphere with 0.05 radius) on the same object

fixed-foveation 

Applies fixed foveation once a WebXR session is started.

Fixed foveation reduces shading cost at the periphery by rendering at lower resolutions at the edges of the user’s vision.
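For reference, a minimal sketch of the underlying WebXR call, assuming an active session (xrSession here is a placeholder) rendering through an XRWebGLLayer:

```js
/* Set fixed foveation on the session's WebGL layer:
 * 0 disables foveation, 1 applies the maximum amount */
const layer = xrSession.renderState.baseLayer;
if (layer && layer.fixedFoveation !== undefined) {
  layer.fixedFoveation = 0.5;
}
```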

| Param | Type | Description |
| --- | --- | --- |
| fixedFoveation | Float | Amount to apply from 0 (none) to 1 (full) |

hand-tracking 

Easy hand tracking through the WebXR Device API “Hand Input” module.

Allows displaying hands either as sphere-joints or skinned mesh.

To react to grabbing, use this.isGrabbing(). For other gestures, refer to this.joints, an array of WL.Object, and use the joint indices listed in the WebXR Hand Input specification.
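A minimal sketch of reacting to grabbing, assuming a hand-tracking component is attached to the same object:

```js
update: function(dt) {
  /* Assumes a 'hand-tracking' component on this same object */
  let handTracking = this.object.getComponent('hand-tracking');
  if (handTracking.isGrabbing()) {
    console.log("Hand is grabbing");
  }
}
```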

It is often desired to use either hand tracking or controllers, not both. This component provides deactivateChildrenWithoutPose to hide the hand tracking visualization while no pose is available, and controllerToDeactivate to disable another object once a hand tracking pose is available. Outside of XR sessions, neither hand tracking nor controllers are activated or deactivated, to play well with the vr-mode-active-switch component.

Requirements:

  • To use hand-tracking, enable “joint tracking” in chrome://flags on Oculus Browser for Oculus Quest/Oculus Quest 2.

See Hand Tracking Example.

| Param | Type | Description |
| --- | --- | --- |
| handedness | Enum | Handedness determining whether to receive tracking input from the right or left hand |
| jointMesh | Mesh | (optional) Mesh to use to visualize joints |
| jointMaterial | Material | Material to use for display. Applied to either the spawned skinned mesh or the joint spheres. |
| handSkin | Skin | (optional) Skin to apply tracked joint poses to. If not present, joint spheres will be used for display instead. |
| deactivateChildrenWithoutPose | Bool | Deactivate children if no pose was tracked |
| controllerToDeactivate | Object | Controller object to activate, including children, if no pose is available |

hit-test-location 

Sets up a WebXR Device API “Hit Test” and places the object to the hit location.

Requirements:

  • Specify 'hit-test' in the required or optional features on the AR button in your HTML file, as sketched below. See Wastepaperbin AR as an example.
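A minimal sketch of requesting an AR session with hit testing enabled; in a typical project, this call is made by the AR button code in the HTML file:

```js
navigator.xr.requestSession('immersive-ar', {
  requiredFeatures: ['local', 'hit-test'],
}).then(function(session) {
  /* Hand the session over to the engine's render loop */
});
```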

howler-audio-listener 

(Spatial) audio listener based on Howler.js.

Retrieves the location and orientation of the object and passes them to Howler.pos() and Howler.orientation().
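Roughly, the per-frame update amounts to the following sketch, assuming howler.js is loaded globally:

```js
update: function(dt) {
  /* World-space position and forward vector of this object */
  let pos = this.object.getTranslationWorld([]);
  let fwd = this.object.getForward([]);
  Howler.pos(pos[0], pos[1], pos[2]);
  Howler.orientation(fwd[0], fwd[1], fwd[2]);
}
```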

| Param | Type | Description |
| --- | --- | --- |
| spatial | Bool | Whether audio should be spatialized/positional. |

howler-audio-source 

(Spatial) audio source based on Howler.js.

Creates a Howler audio source, plays an audio file on it and updates its position.

Optimizes the position update to only run if the difference from the last position is larger than half a centimeter. To force updates (e.g. if the sound source is very close to the listener), use .updatePosition().
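A forced update could look like this, assuming a howler-audio-source component on the same object:

```js
/* Force a position update, e.g. while the sound source is very
 * close to the listener and even small movements are audible */
let source = this.object.getComponent('howler-audio-source');
source.updatePosition();
```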

| Param | Type | Description |
| --- | --- | --- |
| volume | Float | Volume |
| spatial | Bool | Whether audio should be spatialized/positional |
| loop | Bool | Whether to loop the sound |
| autoplay | Bool | Whether to start playing automatically |
| src | String | URL to a sound file to play |

image-texture 

Downloads an image from URL and applies it as diffuseTexture or flatTexture to an attached mesh component. Only “Phong Opaque Textured” and “Flat Opaque Textured” materials are supported.

Warning: This component will soon be changed to be consistent with video-texture and change a material rather than mesh. To make sure your code keeps working in future versions of the engine, please use material rather than meshIndex.

| Param | Type | Description |
| --- | --- | --- |
| url | String | URL to download the image from |
| meshIndex | Int | 0-based mesh component index on this object (e.g. 1 for “second mesh”). Deprecated: please use material instead. |
| material | Material | Material to apply the image texture to (if null, tries to apply to the mesh with meshIndex) |

mouse-look 

Controls the camera through mouse movement.

Efficiently implemented to affect object orientation only when the mouse moves.

| Param | Type | Description |
| --- | --- | --- |
| sensitity | Float | Mouse look sensitivity |
| requireMouseDown | Bool | Require a mouse button to be pressed to control the view. Otherwise the view will always follow mouse movement. |
| mouseButtonIndex | Int | If requireMouseDown is enabled, the mouse button which should be held down to control the view |
| pointerLockOnClick | Bool | Enables pointer lock on the “mousedown” event on WL.canvas |

player-height 

Sets the player height via a Y-offset above the ground for the 'local' and 'viewer' WebXR.refSpace.

| Param | Type | Description |
| --- | --- | --- |
| height | Float | |

target-framerate 

Sets the target framerate.

Updates the target framerate to the supported framerate closest to the given value.

The target framerate is used by the device’s VR compositor as an indication of how often to refresh the screen with new images. This means the app will be asked to produce frames at more regular intervals, potentially spending less time on frames that are likely to be dropped.

For apps with heavy load, setting a well-matching target framerate can improve the app’s rendering stability and reduce stutter.

Likewise, the target framerate can be used to enable 120Hz refresh rates on Oculus Quest 2 on simpler apps.
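Conceptually, the component’s behavior corresponds to the following WebXR calls (a sketch assuming an active session, referenced here by the placeholder xrSession):

```js
/* Pick the supported frame rate closest to the desired one and apply it */
const desired = 120;
const rates = xrSession.supportedFrameRates;
if (rates && rates.length) {
  const closest = rates.reduce((a, b) =>
    Math.abs(b - desired) < Math.abs(a - desired) ? b : a);
  xrSession.updateTargetFrameRate(closest);
}
```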

| Param | Type | Description |
| --- | --- | --- |
| framerate | Float | |

teleport 

Teleport VR locomotion.

See Teleport Example.

| Param | Type | Description |
| --- | --- | --- |
| teleportIndicatorMeshObject | Object | Object that will be placed as indication for where the player will teleport to. |
| camRoot | Object | Root of the player, the object that will be positioned on teleportation. |
| cam | Object | Non-VR camera for use outside of VR |
| eyeLeft | Object | Left eye for use in VR |
| eyeRight | Object | Right eye for use in VR |
| handedness | Enum | Handedness for VR cursors to accept trigger events only from the respective controller. |
| floorGroup | Int | Collision group of valid “floor” objects that can be teleported on |
| thumbstickActivationThreshhold | Float | How far the thumbstick needs to be pushed for the teleport target indicator to show up |
| thumbstickDeactivationThreshhold | Float | How far the thumbstick needs to be released to execute the teleport |
| indicatorYOffset | Float | Offset to apply to the indicator object, e.g. to avoid Z-fighting with the floor |
| rayCastMode | Enum | Mode for raycasting, whether to use PhysX or simple collision components |
| maxDistance | Float | Max distance for the PhysX raycast |

two-joint-ik-solver 

Inverse kinematics for two-joint chains (e.g. knees or elbows).

| Param | Type | Description |
| --- | --- | --- |
| root | Object | Root bone, never moves |
| middle | Object | Bone attached to the root |
| end | Object | Bone attached to the middle |
| target | Object | Target the joints should reach for |
| helper | Object | Helper object used to determine the joint rotation axis |

video-texture 

Downloads a video from URL and applies it as diffuseTexture or flatTexture on given material.

Video textures need to be updated regularly whenever a new frame is available. This component handles the detection of a new frame and updates the texture to reflect the video’s current frame. Only “Phong Opaque Textured” and “Flat Opaque Textured” materials are supported.

The video can be accessed through this.video:

```js
let videoTexture = this.object.getComponent('video-texture');
videoTexture.video.play();
videoTexture.video.pause();
```

See Video Example.

| Param | Type | Description |
| --- | --- | --- |
| url | String | URL to download the video from |
| material | Material | Material to apply the video texture to |
| loop | Bool | Whether to loop the video |
| autoplay | Bool | Whether to automatically start playing the video |
| muted | Bool | Whether to mute sound |

vr-mode-active-switch 

Allows switching all other components on an object to active/inactive depending on whether a VR/AR session is active.

Useful, for example, for hiding controllers until the user enters VR.

| Param | Type | Description |
| --- | --- | --- |
| activateComponents | Enum | When components should be active: in VR or when not in VR |
| affectChildren | Bool | Whether child objects’ components should be affected |

vrm 

Component for loading and handling VRM 1.0 models.

Posing of the model should be done exclusively by rotating the bones. These can be accessed using the .bones property and follow the VRM bone naming. Note that not all VRM models will have all possible bones. The rest pose (T-pose) is captured in the .restPose property. Resetting a bone to its rest pose can be done as follows:

```js
vrmComponent.bones[vrmBoneName].rotationLocal = vrmComponent.restPose[vrmBoneName];
```

Moving the model through the world should be done by moving the object this component is attached to. In other words, by moving the root of the VRM model. The bones and any descendant objects should not be used to move the VRM model.
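For instance, a sketch of moving the whole model, assuming vrmComponent references this component:

```js
/* Move the whole VRM model by translating the object the
 * component is attached to, i.e. the root of the model */
vrmComponent.object.translate([0.0, 0.0, -1.0]);
```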

The core extension VRMC_vrm as well as the VRMC_springBone and VRMC_node_constraint extensions are supported.

Limitations:

  • No support for VRMC_material_mtoon
  • Expressions aren’t supported
  • Expression based lookAt isn’t supported
  • Mesh annotation mode auto is not supported (first person mode)

| Param | Type | Description |
| --- | --- | --- |
| src | String | URL to a VRM file to load |
| lookAtTarget | Object | Object the VRM is looking at |

wasd-controls 

Basic movement with W/A/S/D keys.

| Param | Type | Description |
| --- | --- | --- |
| speed | Float | Movement speed in m/s. |
| headObject | Object | Object whose orientation is used to determine the forward direction |