Spatial tracking

Spatial Tracking allows you to scan a whole room, facility, or rigid object to create a 3D representation of the environment, which is uploaded and processed within Onirix Studio. There you can configure the AR content that will be anchored to the real world. Start creating awesome Web AR experiences with Spatial Tracking and its visual positioning system.

Spatial AR Beta: January 2024


The beta of Spatial for Web AR will be released in January 2024 with the v.2.58.0 release. To access it, a beta access request mechanism will be provided within our Onirix Studio platform.

Beta access will be time-limited and controlled by our customer success team. You will need to fill in a form to be able to test this new functionality.

Compatibilities for beta testing

  • To generate new spaces it is necessary to use our construction app, Onirix Constructor, which is only available for iOS at the moment.
  • To visualise the final result, i.e. the location and tracking of the contents:
    • iOS: from version v.2.62.1 (8 April 2024), iOS devices with ARKit compatibility (iPhone 6s onwards) are supported. The operating system must be updated to version 16.4 or higher.

Spatial AR: steps for creation

The steps for creating this kind of content are:

  1. Construction: Scan the environment with our Onirix Constructor app (currently only available on demand). More info about the construction process here.
  2. Processing: Upload the captured space to Onirix Studio, where our system will process the visual information to create a 3D virtual reference of the real world.
  3. Configuration: Access your Onirix account to include all the AR-anchored content and element interactions you want to be available in your experience.
  4. Visualization: First, a visual localization process detects and recognizes the environment around the user; then tracking starts, keeping all the AR content perfectly placed in the real world (see the sketch after this list).
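
For intuition on step 4, here is a minimal sketch in plain TypeScript (this is not the Onirix API, and all names are illustrative): once relocalization estimates where the scanned space sits in the live AR session, content authored in the space's coordinate frame can be re-expressed in the session's world frame and then kept in place by tracking.

```typescript
// Minimal sketch, not the Onirix API: only the geometric idea behind
// "localize first, then track" from step 4 above.

type Mat4 = number[];                  // column-major 4x4 transform
type Vec3 = [number, number, number];  // x, y, z position

// Transform a point by a column-major 4x4 matrix (assumes w = 1).
function transformPoint(m: Mat4, p: Vec3): Vec3 {
  return [
    m[0] * p[0] + m[4] * p[1] + m[8] * p[2] + m[12],
    m[1] * p[0] + m[5] * p[1] + m[9] * p[2] + m[13],
    m[2] * p[0] + m[6] * p[1] + m[10] * p[2] + m[14],
  ];
}

// spaceToWorld: pose of the scanned space inside the AR session's world frame,
// as estimated by the visual positioning step (hypothetical input here).
// anchorInSpace: position of an AR element as authored in Onirix Studio,
// expressed in the scanned space's own coordinates.
function placeAnchor(spaceToWorld: Mat4, anchorInSpace: Vec3): Vec3 {
  return transformPoint(spaceToWorld, anchorInSpace);
}

// Example: the space is recognized 2 meters in front of the user,
// with no rotation relative to the session's world frame.
const spaceToWorld: Mat4 = [
  1, 0, 0, 0,
  0, 1, 0, 0,
  0, 0, 1, 0,
  0, 0, -2, 1,
];
console.log(placeAnchor(spaceToWorld, [0.5, 1.0, 0.0])); // [0.5, 1.0, -2]
```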

1. How to create a new scene

The first step to create a new spatial-tracking scene is to scan the environment with the Constructor app. You only need to walk around the environment you want to capture, as if you were recording a video.

Try to use regions with rich texture when creating your scans. For example, a plain door won't work well if there are other doors with the same shape in the same environment, as the system won't be able to distinguish one door from another. More tips about scanning here.

This will generate a textured 3D model that visually represents the real environment, like this example:

(Example capture: boiler room scene)

Here are some tips to make a good scene capture:

  • Scan your environment slowly around the area you want to capture, from a distance of about 1 meter.
  • Moving (walking) around the area is required; just rotating or looking around from the same spot won't capture the environment correctly.
  • Avoid hard illumination changes. Entering a dark room or capturing the sky on a sunny day can break the scan.
  • Avoid textureless regions like empty walls: they don't contribute information to the point cloud, as there is no way to estimate depth from textureless images.

A good internet connection is required for scene capturing


2. Configure your scene

The next step is to add visual content to the scene from Onirix Studio. You can add virtual elements and add interaction through events.
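
As an illustration only (the event name and message shape below are hypothetical placeholders, not documented Onirix identifiers): if the published experience were embedded in an iframe and forwarded its scene events to the host page via the standard window.postMessage mechanism, the host page could react to element interactions like this.

```typescript
// Hypothetical sketch: listen for interaction events forwarded by an embedded
// experience. The "element-click" type and payload fields are placeholders.

interface ExperienceMessage {
  type: string;          // e.g. "element-click" (placeholder name)
  elementName?: string;  // name of the AR element the user tapped
}

window.addEventListener("message", (event: MessageEvent) => {
  const data = event.data as ExperienceMessage;
  if (data && data.type === "element-click") {
    console.log(`User interacted with: ${data.elementName}`);
  }
});
```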

3. View your experience

Last, but not least, you will be able to access the experience like any other kind of experience in Onirix: using the associated QR code, or the web link, to open it in the browser. These options are included in our Sharing options feature.
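
If you want to surface that link yourself, for example on a printed sign next to the captured space, one option outside of Onirix is to render the sharing URL as a QR code with the open-source qrcode npm package. The URL below is a placeholder for the link that Studio gives you under Sharing options.

```typescript
import QRCode from "qrcode";

// Placeholder: replace with the actual sharing URL from Onirix Studio.
const experienceUrl = "https://example.com/your-onirix-experience-link";

// Render the link as a QR code image (data URL) that can be displayed or
// printed, so visitors on site can scan it and open the experience.
QRCode.toDataURL(experienceUrl)
  .then((dataUrl: string) => {
    const img = document.createElement("img");
    img.src = dataUrl;
    document.body.appendChild(img);
  })
  .catch((err: Error) => console.error("Could not generate QR code", err));
```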

You will have to be in the same environment where the capture was taken in order to localize yourself within it. After that, the AR content will appear and you will be able to play with the experience.

Results