JavaScript Components

8thwall-camera 

8thwall camera component.

Deprecated: Use the components in https://github.com/WonderlandEngine/wonderland-ar-tracking instead.

| Param | Type | Description |
| --- | --- | --- |
| deprecated | bool | |

anchor 

Sets the location of the object to the location of an XRAnchor

Create anchors using the Anchor.create() static function.

Example for use with cursor:

cursorTarget.onClick.add((object, cursor, originalEvent) => {
    /* Only events in XR will have a frame attached */
    if(!originalEvent.frame) return;
    Anchor.create(anchorObject, {uuid: id, persist: true}, originalEvent.frame);
});
| Param | Type | Description |
| --- | --- | --- |
| persist | bool | |
| uuid | string | Unique identifier to load a persistent anchor from, or empty/null if unknown |

audio-listener 

Represents a Wonderland audio listener component. Updates the position and orientation of a WebAudio listener instance.

Note: Only one listener should be active at a time.

audio-source 

Represents an audio source in the Wonderland Engine, allowing playback of audio files.
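
Below is a minimal playback sketch. It assumes the component exposes play() and stop() methods; check the audio API reference of your engine version for the exact interface.

/* Grab the audio-source component on this object and control playback
 * (play()/stop() are assumed method names). */
const source = this.object.getComponent('audio-source');
source.play();   /* start playback of the configured audio file */
/* ... later ... */
source.stop();   /* stop playback */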

cursor 

3D cursor for desktop/mobile/VR.

Implements a ray-casting cursor into the scene. To react to clicking/hover/unhover/cursor down/cursor up/move, use a cursor-target.

For VR, the ray is cast in the direction of this.object.getForward(). For desktop and mobile, the forward vector is inverse-projected to account for where on screen the user clicked.

.globalTarget can be used to call callbacks for all objects, even those that do not have a cursor target attached, but match the collision group.

.hitTestTarget can be used to register callbacks for WebXR hit test results.

See Animation Example.
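
A small usage sketch for .globalTarget, assuming it exposes the same emitters as cursor-target (onClick, onHover, ...) and that this script sits on the same object as the cursor component:

/* React to clicks on any object in the cursor's collision group,
 * even objects without a cursor-target component. */
const cursor = this.object.getComponent('cursor');
cursor.globalTarget.onClick.add((object, cursorComponent) => {
    console.log('Clicked:', object.name);
});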

| Param | Type | Description |
| --- | --- | --- |
| collisionGroup | int | Collision group for the ray cast. Only objects in this group will be affected by this cursor. |
| cursorObject | object | (optional) Object that visualizes the cursor’s hit location. |
| cursorRayObject | object | (optional) Object that visualizes the cursor’s ray. |
| cursorRayScalingAxis | enum | Axis along which to scale the cursorRayObject. |
| handedness | enum | Handedness for VR cursors to accept trigger events only from the respective controller. |
| maxDistance | float | Maximum distance for the cursor’s ray cast. |
| rayCastMode | enum | Mode for raycasting, whether to use PhysX or simple collision components. |
| styleCursor | bool | Whether to set the CSS style of the mouse cursor on desktop. |
| useWebXRHitTest | bool | Use WebXR hit-test if available. Attaches a hit-test-location component to the cursorObject, which will be used by the cursor to send events to the hitTestTarget with HitTestResult. |

cursor-target 

Click/hover/move/button target for cursor.

To trigger code when clicking, hovering, unhovering, moving the cursor, pressing a cursor button, or releasing a cursor button, use .addClickFunction(f), .addHoverFunction(f), .addUnHoverFunction(f), .addMoveFunction(f), .addDownFunction(f) and .addUpFunction(f), respectively, with any function f() {}.

To call members on a different component, you can set up a cursor target like so:

start: function() {
  let target = this.object.addComponent('cursor-target');
  target.onClick.add(this.onClick.bind(this));
},
onClick: function() {
  console.log(this.object.name, "was clicked!");
}

Functions:

const target = this.object.getComponent(CursorTarget);
const callback = function(object, cursorComponent) {};

target.onHover.add(callback);
target.onHover.remove(callback);

target.onUnHover.add(callback);
target.onUnHover.remove(callback);

target.onClick.add(callback);
target.onClick.remove(callback);

target.onMove.add(callback);
target.onMove.remove(callback);

target.onDown.add(callback);
target.onDown.remove(callback);

target.onUp.add(callback);
target.onUp.remove(callback);

Requirements:

  • A collision component attached to the same object.

See Animation Example.

debug-object 

Prints some limited debug information about the object.

The information consists of: this object’s name, the name of the object given in the obj parameter, and this object’s world translation, world transform, and local transform.

Mainly used by engine developers for debug purposes or as example code.

| Param | Type | Description |
| --- | --- | --- |
| obj | object | A second object to print the name of |

device-orientation-look 

Retrieve device orientation from a mobile device and set the object’s orientation accordingly.

Useful for magic window experiences.

finger-cursor 

0.8.5+

Enables interaction with cursor-targets through collision overlaps, e.g. on the tip of a finger on a tracked hand.

Requirements:

  • A collision component (usually a sphere with 0.05 radius) on the same object
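
A minimal setup sketch using the standard collision component and the Collider enum from @wonderlandengine/api; fingerTipObject and the collision group value are placeholders for your own setup:

import {Collider} from '@wonderlandengine/api';

/* Sphere collision used for overlap tests against cursor-target colliders. */
fingerTipObject.addComponent('collision', {
    collider: Collider.Sphere,
    extents: [0.05, 0.05, 0.05],
    group: 1 << 1, /* match the collision group of your cursor targets */
});
fingerTipObject.addComponent('finger-cursor');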

fixed-foveation 

Applies fixed foveation once a WebXR session is started.

Fixed foveation reduces shading cost at the periphery by rendering at lower resolutions at the edges of the user’s vision.

| Param | Type | Description |
| --- | --- | --- |
| fixedFoveation | float | Amount to apply from 0 (none) to 1 (full) |
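
Conceptually, this corresponds to setting fixedFoveation on the session’s WebGL layer. A manual sketch with plain WebXR (not the component’s exact implementation; onXRSessionStart is the engine’s session-start emitter):

/* Apply fixed foveation to the XRWebGLLayer once a session starts. */
this.engine.onXRSessionStart.add((session) => {
    const baseLayer = session.renderState.baseLayer;
    if (baseLayer && baseLayer.fixedFoveation !== undefined) {
        baseLayer.fixedFoveation = 0.5; /* 0 = none, 1 = full */
    }
});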

hand-tracking 

Easy hand tracking through the WebXR Device API “Hand Input” API.

Allows displaying hands either as sphere-joints or skinned mesh.

To react to grabbing, use this.isGrabbing(). For other gestures, refer to this.joints, an array of Object3D, and use the joint indices listed in the WebXR Hand Input specification.
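
A usage sketch from another component, assuming handObject holds the hand-tracking component, index 9 addresses the index-finger-tip joint in the WebXR joint order, and getPositionWorld() is the 1.0 API position getter:

/* Query grab state and a joint transform. */
const handTracking = handObject.getComponent('hand-tracking');
if (handTracking.isGrabbing()) {
    const indexTip = handTracking.joints[9]; /* joint index per the WebXR Hand Input spec */
    console.log('Grabbing near', indexTip.getPositionWorld());
}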

It is often desired to use either hand tracking or controllers, not both. This component provides deactivateChildrenWithoutPose to hide the hand tracking visualization if no pose is available, and controllerToDeactivate for disabling another object once a hand tracking pose is available. Outside of XR sessions, neither hand tracking nor controllers are enabled or disabled, to play well with the vr-mode-active-switch component.

Requirements:

  • To use hand-tracking, enable “joint tracking” in chrome://flags on Oculus Browser for Oculus Quest/Oculus Quest 2.

See Hand Tracking Example.

| Param | Type | Description |
| --- | --- | --- |
| controllerToDeactivate | object | Controller objects to activate, including children, if no pose is available |
| deactivateChildrenWithoutPose | bool | Deactivate children if no pose was tracked |
| handSkin | skin | (optional) Skin to apply tracked joint poses to. If not present, joint spheres will be used for display instead. |
| handedness | enum | Handedness determining whether to receive tracking input from the right or left hand |
| jointMaterial | material | Material to use for display. Applied to either the spawned skinned mesh or the joint spheres. |
| jointMesh | mesh | (optional) Mesh to use to visualize joints |

hit-test-location 

Sets up a WebXR Device API “Hit Test” and places the object to the hit location.

Requirements:

  • Specify 'hit-test' in the required or optional features on the AR button in your html file. See Wastepaperbin AR as an example.
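
If you request the session yourself instead of using the generated AR button, the plain WebXR equivalent is to ask for the feature at session creation (a sketch; run inside an async function):

/* Request an AR session with hit testing enabled. */
const session = await navigator.xr.requestSession('immersive-ar', {
    requiredFeatures: ['local', 'hit-test'],
});
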
| Param | Type | Description |
| --- | --- | --- |
| scaleObject | bool | For maintaining backwards compatibility: Whether to scale the object to 0 and back. Deprecated: Use onHitLost and onHitFound instead. |

image-texture 

Downloads an image from a URL and applies it as diffuseTexture or flatTexture to the given material.

Materials from the following shaders are supported:

  • “Phong Opaque Textured”
  • “Flat Opaque Textured”
  • “Background”
  • “Physical Opaque Textured”
  • “Foliage”
| Param | Type | Description |
| --- | --- | --- |
| material | material | Material to apply the image texture to |
| textureProperty | string | Name of the texture property to set |
| url | string | URL to download the image from |
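
A usage sketch; the URL is a placeholder and this.targetMaterial is a hypothetical property holding a material that uses one of the supported shaders:

/* Download an image and apply it to the material's texture property. */
this.object.addComponent('image-texture', {
    url: 'https://example.com/image.jpg', /* placeholder URL */
    material: this.targetMaterial,        /* hypothetical material reference */
});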

input-profile 

Dynamically load and map input profiles for XR controllers.

| Param | Type | Description |
| --- | --- | --- |
| addVrModeSwitch | bool | If true, adds a VR mode switch component to the loaded controller model. |
| customBasePath | string | An optional folder path for loading custom XR input profiles. |
| defaultBasePath | string | The base path where XR input profiles are stored. |
| defaultController | object | The default 3D controller model used when a custom model fails to load. |
| handedness | enum | The index representing the handedness of the controller (0 for left, 1 for right). |
| mapToDefaultController | bool | If true, the input profile will be mapped to the default controller and no dynamic 3D controller model will be loaded. |
| trackedHand | object | The object which has a HandTracking component added to it. |

mouse-look 

Controls the camera orientation through mouse movement.

Efficiently implemented to affect object orientation only when the mouse moves.

| Param | Type | Description |
| --- | --- | --- |
| mouseButtonIndex | int | If “moveOnClick” is enabled, mouse button which should be held down to control the view |
| pointerLockOnClick | bool | Enables pointer lock on “mousedown” event on canvas |
| requireMouseDown | bool | Require a mouse button to be pressed to control the view. Otherwise the view will always follow mouse movement |
| sensitity | float | Mouse look sensitivity |

orbital-camera 

The OrbitalCamera component allows the user to orbit around a target point, which is the position of the object itself, at the specified distance.

Remarks: The component works using mouse or touch. Therefore, it does not work in VR.

| Param | Type | Description |
| --- | --- | --- |
| damping | float | |
| maxElevation | float | |
| maxZoom | float | |
| minElevation | float | |
| minZoom | float | |
| mouseButtonIndex | int | |
| radial | float | |
| xSensitivity | float | |
| ySensitivity | float | |
| zoomSensitivity | float | |

plane-detection 

Generate meshes and collisions for XRPlanes using WebXR Device API - Plane Detection.

| Param | Type | Description |
| --- | --- | --- |
| collisionMask | int | Collision mask to assign to newly created collision components or a negative value if collision components should not be created. |
| planeMaterial | material | Material to assign to created plane meshes or null if meshes should not be created. |

player-height 

Set player height for a Y-offset above the ground for ‘local’ and ‘viewer’ reference spaces.

| Param | Type | Description |
| --- | --- | --- |
| height | float | |

target-framerate 

Sets the target framerate.

Updates the target framerate to the closest supported target framerate to the given framerate.

The target framerate is used by the device’s VR compositor as an indication of how often to refresh the screen with new images. This means the app will be asked to produce frames at more regular intervals, potentially spending less time on frames that are likely to be dropped.

For apps with heavy load, setting a well-matching target framerate can improve the app’s rendering stability and reduce stutter.

Likewise, the target framerate can be used to enable 120Hz refresh rates on Oculus Quest 2 in simpler apps.
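
The underlying WebXR calls look roughly like the sketch below (not the component’s exact code); supportedFrameRates and updateTargetFrameRate are standard XRSession members, but not available on every device:

/* Pick the supported frame rate closest to the requested one and apply it. */
function applyTargetFramerate(session, framerate) {
    const rates = session.supportedFrameRates;
    if (!rates || !session.updateTargetFrameRate) return; /* not supported on this device */
    const closest = rates.reduce((a, b) =>
        Math.abs(b - framerate) < Math.abs(a - framerate) ? b : a);
    session.updateTargetFrameRate(closest);
}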

| Param | Type | Description |
| --- | --- | --- |
| framerate | float | |

teleport 

Teleport VR locomotion.

See Teleport Example.

| Param | Type | Description |
| --- | --- | --- |
| cam | ? | Non-VR camera for use outside of VR |
| camRoot | ? | Root of the player, the object that will be positioned on teleportation |
| eyeLeft | ? | Left eye for use in VR |
| eyeRight | ? | Right eye for use in VR |
| floorGroup | ? | Collision group of valid “floor” objects that can be teleported on |
| handedness | ? | Handedness for VR cursors to accept trigger events only from the respective controller |
| indicatorYOffset | ? | Offset to apply to the indicator object, e.g. to avoid Z-fighting with the floor |
| maxDistance | ? | Max distance for the PhysX raycast |
| rayCastMode | ? | Mode for raycasting, whether to use PhysX or simple collision components |
| teleportIndicatorMeshObject | ? | Object that will be placed as an indication of where the player will teleport to |
| thumbstickActivationThreshhold | ? | How far the thumbstick needs to be pushed to have the teleport target indicator show up |
| thumbstickDeactivationThreshhold | ? | How far the thumbstick needs to be released to execute the teleport |

trail 

Dynamic mesh-based trail.

This component keeps track of the world position of the object it’s added to. At a fixed interval, the world position is stored as the start and end points of the trail segments.

The trail tapers off along its length. UV texture coordinates are set up such that the U-axis covers the width of the trail and the V-axis covers the length of the trail. This allows the trail’s appearance to be defined using a texture.
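
A setup sketch; this.trailMaterial is a hypothetical property holding a textured material:

/* Attach a textured, tapering trail to this object. */
this.object.addComponent('trail', {
    material: this.trailMaterial,
    segments: 50,
    interval: 0.05, /* seconds between recorded points */
    width: 0.2,
    taper: true,
});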

| Param | Type | Description |
| --- | --- | --- |
| interval | float | The time interval before recording a new point |
| material | material | The material to apply to the trail mesh |
| resetThreshold | float | The maximum delta time in seconds, above which the trail resets. This prevents the trail from jumping around when updates happen infrequently (e.g. when the tab doesn’t have focus). |
| segments | int | The number of segments in the trail mesh |
| taper | bool | Whether or not the trail should taper off |
| width | float | The width of the trail (in world space) |

two-joint-ik-solver 

Inverse kinematics for two-joint chains (e.g. knees or elbows)

| Param | Type | Description |
| --- | --- | --- |
| copyTargetRotation | ? | Flag for copying rotation from target to end |
| end | ? | Bone attached to the middle |
| helper | ? | Helper object to use to determine the joint rotation axis |
| middle | ? | Bone attached to the root |
| root | ? | Root bone, never moves |
| target | ? | Target the joints should reach for |

video-texture 

Downloads a video from a URL and applies it as diffuseTexture or flatTexture on the given material.

Video textures need to be updated regularly whenever a new frame is available. This component handles the detection of a new frame and updates the texture to reflect the video’s current frame.

Materials from the following shaders are supported:

  • “Phong Opaque Textured”
  • “Flat Opaque Textured”
  • “Background”
  • “Physical Opaque Textured”
  • “Foliage”

The video can be accessed through this.video:

let videoTexture = this.object.getComponent('video-texture');
videoTexture.video.play();
videoTexture.video.pause();

See Video Example.

| Param | Type | Description |
| --- | --- | --- |
| autoplay | bool | Whether to automatically start playing the video |
| loop | bool | Whether to loop the video |
| material | material | Material to apply the video texture to |
| muted | bool | Whether to mute sound |
| textureProperty | string | Name of the texture property to set |
| url | string | URL to download the video from |

vr-mode-active-switch 

Allows switching all other components on an object to active/inactive depending on whether a VR/AR session is active.

Useful for hiding controllers until the user enters VR for example.

| Param | Type | Description |
| --- | --- | --- |
| activateComponents | ? | When components should be active: in VR or when not in VR |
| affectChildren | ? | Whether child objects’ components should be affected |

vrm 

Component for loading and handling VRM 1.0 models.

Posing of the model should be done exclusively by rotating the bones. These can be accessed using the .bones property and follow the VRM bone naming. Note that not all VRM models will have all possible bones. The rest pose (T-pose) is captured in the .restPose property. Resetting a bone to its rest pose can be done as follows:

vrmComponent.bones[vrmBoneName].rotationLocal = vrmComponent.restPose[vrmBoneName];

Moving the model through the world should be done by moving the object this component is attached to. In other words, by moving the root of the VRM model. The bones and any descendant objects should not be used to move the VRM model.
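
For example, a minimal sketch of repositioning the whole model (setPositionWorld is the 1.0 API world-position setter; the target position is arbitrary):

/* Move the whole VRM by moving the object the component is attached to. */
const rootObject = vrmComponent.object; /* object the vrm component is attached to */
rootObject.setPositionWorld([0, 0, -2]); /* arbitrary world-space position */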

The core extension VRMC_vrm as well as the VRMC_springBone and VRMC_node_constraint extensions are supported.

Limitations:

  • No support for VRMC_material_mtoon
  • Expressions aren’t supported
  • Expression based lookAt isn’t supported
  • Mesh annotation mode auto is not supported (first person mode)
| Param | Type | Description |
| --- | --- | --- |
| lookAtTarget | ? | Object the VRM is looking at |
| src | ? | URL to a VRM file to load |

wasd-controls 

Basic movement with W/A/S/D keys.

| Param | Type | Description |
| --- | --- | --- |
| headObject | object | Object of which the orientation is used to determine forward direction |
| lockY | bool | Flag for only moving the object on the global x & z planes |
| speed | float | Movement speed in m/s |