Introduction to WebXR Development
AR and VR on the web are amazing technologies. They let you build and publish your ideas without the hassle of going through curated stores, while sharing them as easily as you share a website.
VR on the web has been around since the first release of WebVR 1.0 in Chrome and Firefox in 2016. This experiment has since been replaced by the WebXR Device API, which supports not only Virtual Reality but also Augmented Reality (available in Chrome for Android since November 2020). Now that this new technology has reached a point stable enough for developers to rely on, how do you get started?
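The very first step in any WebXR app is checking whether the browser exposes the API at all. A minimal sketch of such a feature check is below; the `any` cast is only there so it compiles without separate WebXR type definitions (such as @types/webxr).

```ts
// Feature-detect the WebXR Device API before showing an
// "Enter VR" / "Enter AR" button.
async function detectXRSupport(): Promise<{vr: boolean; ar: boolean}> {
  const xr = (navigator as any).xr;
  if (!xr) return {vr: false, ar: false};
  const [vr, ar] = await Promise.all([
    xr.isSessionSupported('immersive-vr'),
    xr.isSessionSupported('immersive-ar'),
  ]);
  return {vr, ar};
}

detectXRSupport().then(({vr, ar}) =>
    console.log(`VR supported: ${vr}, AR supported: ${ar}`));
```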
Frameworks
Read our last blog post for some guidance on how to choose a framework.
3 Web-Specific Challenges
Web and native XR development differ on three major points:
Loading Time
Waiting for a game to load is not acceptable on the web. Every second the user has to wait increases the drop-off rate.
Keeping your game assets small, using compression and generally ensuring fast load times is key to keeping your users happy.
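To know where optimization pays off most, it helps to measure what actually takes long to download. A small, framework-agnostic sketch using the browser's Resource Timing API:

```ts
// List each loaded asset with its download time and transferred size.
const entries =
    performance.getEntriesByType('resource') as PerformanceResourceTiming[];
for (const entry of entries) {
  console.log(
      `${entry.name}: ${entry.duration.toFixed(0)} ms, ` +
      `${(entry.transferSize / 1024).toFixed(1)} KiB transferred`);
}
```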
Performance
The web runs sandboxed in the browser, meaning it is isolated from the device it runs on so that it cannot do any harm.
The sandbox comes with a performance overhead, as everything the website wants to do has to be double-checked. Combined with the overhead JavaScript brings over a native application, this makes VR and AR even more of a performance challenge.
Scalable Quality
If you release a native VR application, you usually target a few very specific devices. Since you know their performance characteristics, you can adapt each build per platform.
The beauty of the web is that it is so radically cross-platform that any device could be running your WebXR app. That means you need to consider anything from a toaster to a desktop PC and draw the line somewhere.
In VR, the range of devices also comes with a range of input methods: from simple gaze controls (Google Cardboard) to 3-dof and 6-dof controllers, hand tracking (Quest 2), and eye tracking (Quest Pro, Apple Vision Pro).
You will need to decide the lowest common denominator that should be able to run your application.
Solutions
It is important to face these web-specific challenges from the design stage on: they influence your choice of art style and interaction design, and can make or break your application.
Load Faster
In general, the smaller your files, the faster your game will download. On the web, it is best practice to load only the initially required assets and lazy-load further assets at runtime, as in the sketch below.
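Here is a hedged sketch of such lazy loading, assuming a three.js based project (the post itself is framework-agnostic and the URL is a placeholder): further assets, and even the glTF loader code itself, are only fetched once the scene is already running.

```ts
import * as THREE from 'three';

// Load a glTF model on demand, after the initial scene is up.
async function lazyLoadModel(url: string, scene: THREE.Scene): Promise<void> {
  // The dynamic import keeps the loader out of the initial bundle.
  const {GLTFLoader} =
      await import('three/examples/jsm/loaders/GLTFLoader.js');
  const gltf = await new GLTFLoader().loadAsync(url);
  scene.add(gltf.scene);
}

// Example: pull in the second level once the player is in the menu.
// lazyLoadModel('/assets/level2.glb', scene);
```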
Text-based formats usually take longer to parse, so consider using binary asset formats instead. Image formats like WebP and Basis Universal can improve on traditional PNG and JPEG images in both download size and GPU memory use.
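As a sketch of how Basis Universal textures can be used, again assuming three.js (the transcoder path and texture URL are placeholders): the .ktx2 file is transcoded on the client to whatever compressed GPU format the device supports.

```ts
import * as THREE from 'three';
import {KTX2Loader} from 'three/examples/jsm/loaders/KTX2Loader.js';

const renderer = new THREE.WebGLRenderer();
const ktx2Loader = new KTX2Loader()
    .setTranscoderPath('/basis/')  // folder with the Basis transcoder files
    .detectSupport(renderer);      // pick a GPU format this device can use

ktx2Loader.loadAsync('/assets/diffuse.ktx2').then((texture) => {
  // Assign to a material, e.g. material.map = texture;
  console.log('Loaded compressed texture', texture);
});
```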
Performance
Make sure you keep an eye on performance and optimize according to your scope. Keep assets small and “spend” resolution on the important ones, based on their distance from the user.
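One way to spend resolution based on distance is a level-of-detail object. A sketch, again assuming three.js: the LOD object swaps between versions of the same mesh depending on how far the camera is from it.

```ts
import * as THREE from 'three';

const scene = new THREE.Scene();
const material = new THREE.MeshStandardMaterial();

// Same object at three detail levels; geometry resolution drops with distance.
const lod = new THREE.LOD();
lod.addLevel(new THREE.Mesh(new THREE.SphereGeometry(1, 64, 32), material), 0);  // close up
lod.addLevel(new THREE.Mesh(new THREE.SphereGeometry(1, 24, 12), material), 10); // mid range
lod.addLevel(new THREE.Mesh(new THREE.SphereGeometry(1, 8, 4), material), 30);   // far away
scene.add(lod);
```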
WebGL extensions like OCULUS_multiview2 can help with GPU rendering performance. Others, like the upcoming WEBGL_multi_draw, will help with CPU performance.
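These extensions are not available everywhere, so check for them at startup and only enable the corresponding render paths when they are present. A short sketch:

```ts
// Query a WebGL 2 context for the extensions mentioned above.
const gl = document.createElement('canvas').getContext('webgl2');
if (gl) {
  // OCULUS_multiview2 is exposed by the Meta Quest browser;
  // OVR_multiview2 is the standardized name.
  const multiview =
      gl.getExtension('OCULUS_multiview2') ?? gl.getExtension('OVR_multiview2');
  const multiDraw = gl.getExtension('WEBGL_multi_draw');
  console.log('multiview:', multiview !== null, 'multi_draw:', multiDraw !== null);
}
```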
Most frameworks have a performance guide or optimization tips in their documentation. Check those to see what you can do right from the start, especially if you are tackling a larger project.
Scaling
For better scalability, you can load assets depending on the headset's input type or the user agent string. Gaze-based systems are usually lower-end than 6-dof devices or those that support hand tracking.
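A hedged sketch of that idea: derive a quality tier from the user agent string and use it to pick which assets to load. The tier names and URL scheme are made up for illustration; once a session has started you could also inspect its input sources (for example, whether hand tracking is present) to refine the choice.

```ts
type QualityTier = 'low' | 'medium' | 'high';

// Rough heuristic: standalone headsets get medium assets, phones low, PCs high.
function tierFromUserAgent(ua: string = navigator.userAgent): QualityTier {
  if (/OculusBrowser|Quest/i.test(ua)) return 'medium';
  if (/Mobile|Android|iPhone/i.test(ua)) return 'low';
  return 'high';
}

const assetUrl = (name: string) => `/assets/${tierFromUserAgent()}/${name}`;
```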
Wonderland Engine
The Wonderland Engine aims to take care of these optimization tasks for you. Our goal is to allow you to focus on your app’s features without having to worry about the technical challenges of getting it to run smoothly.
Wonderland Engine has recently reached version 1.0. We’d love to welcome you to our Discord Community!