Space Overview

Scene Hierarchy

To make things work correctly in Mona, there are some factors that must be strictly followed in order to reduce issues and increase functionality. An example hierarchy is shown below. Note that this is not what is found in the template SDK, just examples of Mona assets that you may want to use.

Let's go into each section in detail.

Space Hierarchy

Your Space scene should be at the top, and set as the Active Scene (right click → Set Active Scene if it isn't by default). All environment assets that you would like to be minted with your NFT Space should be inside the "Space" GameObject (also commonly called the Space Layer). These objects cannot be edited after the Space has been minted.

If you get an error saying there is more than one root object, there is a high chance that your hierarchy is different from the following.

Do NOT delete or rename the Space Scene or the Space GameObject. If the Space Scene or the Space GameObject (with the Space tag) are not present, it will not work in Mona.

Make sure that your space looks like the following if you collapse all the assets.

All objects need to be a child of the Space, Portals, or Artifacts GameObjects. If there is anything else at the root level, your Space will fail to build and will also fail QA.
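If you want to catch root-level strays before QA does, a small editor script can list them. This is a hypothetical sketch, not part of the Mona SDK; the class name and menu path are made up:

```csharp
// Hypothetical editor utility (not part of the Mona SDK): flags any
// loaded scene that does not have exactly one root GameObject, and any
// root whose name is not Space, Portals, or Artifacts.
using UnityEditor;
using UnityEngine;
using UnityEngine.SceneManagement;

public static class MonaHierarchyCheck
{
    [MenuItem("Tools/Check Mona Hierarchy")]
    static void Check()
    {
        for (int i = 0; i < SceneManager.sceneCount; i++)
        {
            var scene = SceneManager.GetSceneAt(i);
            GameObject[] roots = scene.GetRootGameObjects();
            if (roots.Length != 1)
                Debug.LogWarning($"Scene '{scene.name}' has {roots.Length} root objects; expected exactly one.");
            foreach (GameObject root in roots)
                if (root.name != "Space" && root.name != "Portals" && root.name != "Artifacts")
                    Debug.LogWarning($"Unexpected root object '{root.name}' in scene '{scene.name}'.");
        }
    }
}
```

Run it from the menu before building; any warning points at an object that needs reparenting.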

No Cameras

You shouldn't have a Camera in your scene when building your files for Mona, so make sure to delete them. There are some situations where you may want to use a camera to override the user's experience, but initially it is recommended to delete any Cameras in the scene.
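Cameras are easy to miss in a large hierarchy. As a sketch (assuming a standard Unity editor script; the class and menu names are hypothetical), you can find and remove them in one pass:

```csharp
// Hypothetical cleanup helper: removes every active Camera in the open
// scenes. Disabled cameras are not returned by FindObjectsOfType and
// would need to be removed by hand.
using UnityEditor;
using UnityEngine;

public static class MonaCameraCleanup
{
    [MenuItem("Tools/Delete Scene Cameras")]
    static void DeleteCameras()
    {
        foreach (Camera cam in Object.FindObjectsOfType<Camera>())
        {
            Debug.Log($"Removing camera: {cam.name}");
            // Undo support so the deletion can be reverted with Ctrl+Z.
            Undo.DestroyObjectImmediate(cam.gameObject);
        }
    }
}
```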

Player Spawn Point

The spawn point is the default location for a Player to spawn when connecting to the Space. There is a Spawnpoint in the Template scene, but if you delete it, the Spawnpoint prefab can be found in the _MonaAssets folder in the Project window.

It is best to keep the Spawnpoint in the same location in the hierarchy as the image above. If it is anywhere else, there is a higher chance the spawn location will default to 0,0,0.

The Spawnpoint Avatar gizmo needs to be blue, otherwise it will fail QA. Avatar Gizmos will turn red when they are within another collider, or not above a collider to land on.

It is recommended to keep all spawnpoints a little above the ground to make sure there are no spawning issues. This applies to all Portal, Artifact, and Canvas spawnpoints as well.
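The gizmo colour condition above (not overlapping another collider, and with a collider below to land on) can be expressed as a physics query. A minimal sketch, with a hypothetical helper name and assuming default Unity physics:

```csharp
using UnityEngine;

public static class SpawnpointCheck
{
    // Mirrors the "blue gizmo" rule: the spawnpoint must not be inside
    // another collider, and there must be ground below it to land on.
    public static bool IsValid(Transform spawn, float maxDrop = 50f)
    {
        // Rough overlap test; the gizmo turns red when this hits.
        if (Physics.CheckSphere(spawn.position, 0.3f))
            return false;

        // Something solid below, within maxDrop metres, to land on.
        return Physics.Raycast(spawn.position, Vector3.down, maxDrop);
    }
}
```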

If gizmos are turned off (via the Gizmos button at the top right of the Scene window), you can still check a Spawnpoint's orientation by selecting it with the Position tool and setting Pivot and Local mode (at the top left of the Unity window). On the default gizmo, the green arrow points up and the blue arrow points forward.

If there is no Spawnpoint in the scene or it is setup incorrectly, the Player will spawn at the 0,0,0 location in the Space.


Portals

Portals allow for easy navigation between Spaces inside of Mona.

These function as hyperlinks to other minted Spaces within the Monaverse, and can be used to form larger interconnected worlds. Portals can look like anything you want, although the most common representations are doors or portal-like gateways.

Portals are optional, and require a specific hierarchy to function correctly. For more information on Portals, you can go to Portals.

If you delete all Portals, make sure to keep the Portals GameObject under the Portals Scene, otherwise it will fail QA.


Artifacts

The Artifacts Scene/Layer holds all the objects that can be modified or edited in some way after the Space has been minted.

(Default) Artifacts allow users to open external websites from within the Space. Perhaps you want to create a virtual store with a 3D representation of a product in your Space, that links to the online store product page. Another use might be some furniture that you want to add to the space, but want the Owner to be able to remove it at a later point in time.

Artifacts are completely optional. If used, they require a specific hierarchy to function correctly. For more information on Artifacts, you can go to Artifacts.

If you delete all Artifacts and Canvases, make sure to keep the Artifacts GameObject under the Artifacts Scene, otherwise it will fail QA.


Canvases

Canvases are also considered Artifacts, in that they can be modified or removed after the Space has been minted.

Canvases allow a builder/owner to display external media such as images in the Space. Currently .jpg, .png, and .gif formats are supported. Livestreaming is also possible, with video support coming soon. These media URL links can be updated after the Space has been minted.

As Canvases can also link to external sites, they might be used to showcase art in a Mona gallery that links to the marketplace page of the NFT.

For more details on how to set up and use Canvases, you can go to Canvases.


Creating Assets

Most assets are made in an application that can export .fbx files. These include:

  • Blender

  • Maya

  • 3D Studio Max

  • Revit (will usually need optimisation)

  • Rhino (will usually need optimisation)

  • Gravity Sketch

  • and many others...

Assets need to be kept relatively simple, as they are rendered in a web browser. The larger the Space, the simpler the assets will need to be. So you will need to keep a number of things in mind when creating your Space, such as:

  • Polycount

  • Number of materials

  • Size of textures

  • Limited real time lights (use lightmapping when possible)

  • many, many more...

If you are designing in the hopes of supporting a standalone VR headset or mobile (both of which are not supported just yet), then this will be extremely important to keep in mind and will directly shape your designs. In short, your target device will decide how you design your assets.

Importing assets into Unity

There are a number of ways to import your objects into Unity.

  1. Save your assets into the Project folder directly.

  2. Right mouse click on the target folder in the Project window and select Import new Asset...

  3. Drag and drop your assets from Explorer into the target folder in the Project Window.

It is generally recommended to use .fbx to import 3D assets into Unity. .obj and .blend files are supported, but .fbx is simplest and has fewer issues in the importing process.

Setting up your imported assets

There are a couple of optional things you can do when importing your objects into Unity to save time later.

1. If you plan on using lightmapping (which we highly recommend) and want Unity to unwrap the object's second UV map for you (so you don't have to do it yourself in another application), click on the model in the Project window, select Model at the top of the Inspector window, and turn on Generate Lightmap UVs (towards the bottom). This is a pretty common step to forget, so it's good to get in the habit early.

2. If you already have Materials created in Unity, you can make your model use them by default by clicking on the model in the Project window, selecting Materials at the top of the Inspector window, and dragging and dropping your materials into the correct slots for your object. These can still be changed in the scene, but they become the default materials for your object, which saves a lot of time when you are updating many copies of the same object in the scene.
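If you import many models, step 1 can be automated with an AssetPostprocessor so the setting is never forgotten. A sketch, assuming a standard Unity editor script (the class name is hypothetical; generateSecondaryUV is the scripting equivalent of the Generate Lightmap UVs checkbox):

```csharp
using UnityEditor;

// Hypothetical import hook: runs before every model import in the
// project and turns on lightmap UV generation automatically.
public class MonaModelPostprocessor : AssetPostprocessor
{
    void OnPreprocessModel()
    {
        var importer = (ModelImporter)assetImporter;
        // Equivalent to ticking Generate Lightmap UVs in the Inspector.
        importer.generateSecondaryUV = true;
    }
}
```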

More information on creating your Space in Unity (such as lightmaps, animation, and reflection probes) can be found in the documentation to the left under Unity Development.


Colliders

Colliders stop the player avatar from falling through the floor or walking through walls and objects. There are several collider types you can use:

  • Box collider (used for most situations, light and quick on the processor)

  • Sphere collider

  • Capsule collider

  • Mesh Collider

As tempting as it is to use the Mesh Collider on all of your objects, this is not recommended, as it adds considerable load to the Space. If you must use a Mesh Collider, make a very simplified version of your object and use that in the Mesh Collider component.
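In practice, that means authoring a low-poly proxy mesh alongside the visual mesh and assigning the proxy to the Mesh Collider. A sketch of the idea, with hypothetical helper names:

```csharp
using UnityEngine;

public static class ColliderSetup
{
    // Attaches a Mesh Collider that uses a simplified proxy mesh
    // instead of the detailed render mesh.
    public static MeshCollider AddProxyCollider(GameObject visual, Mesh proxyMesh)
    {
        var collider = visual.AddComponent<MeshCollider>();
        collider.sharedMesh = proxyMesh; // low-poly version, not the render mesh
        return collider;
    }
}
```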

For more information, you can go to Colliders.


Realtime lighting

As noted previously, realtime lighting is quite processor intensive, so it is recommended to keep to 1-2 realtime lights unless you understand how to optimise heavily. Realtime lights allow dynamic objects (such as the player avatar or animated objects) to have their shadows update in real time.

A Directional Light is included in the Template scene. To add a new light, you can add a Directional Light, Point Light, or Spotlight to the scene by:

  • Right click on the target location in the Hierarchy window, go down to Light, and select which light you would like

  • Use the GameObject menu, go down to Light, and select the light you would like. This method places the light at the bottom of the Hierarchy, however.

Lightmapping / Lightbaking

The best way to get good lighting is by using Lightmapping, which 'bakes' the light information into the scene for a much smaller cost to processing. You can find more information about how to do Lightmapping using the Unity Lightmapper here.

Realtime lights can be turned into Baked lights easily using the Mode pulldown on the light itself.
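If a scene has accumulated many realtime lights, the same change can be made in bulk from an editor script. A sketch (class and menu names are hypothetical; lightmapBakeType is the editor-only scripting equivalent of the Mode pulldown):

```csharp
using UnityEditor;
using UnityEngine;

public static class MonaLightBaker
{
    [MenuItem("Tools/Set All Lights To Baked")]
    static void SetAllBaked()
    {
        foreach (Light light in Object.FindObjectsOfType<Light>())
        {
            Undo.RecordObject(light, "Set light to Baked");
            // Same effect as choosing Mode -> Baked on the Light component.
            light.lightmapBakeType = LightmapBakeType.Baked;
        }
    }
}
```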

For more information on Lightmapping in Unity, you can go to Lightmaps (Unity Lightmapper)

Reflection Probes

Reflection Probes are used to fake reflections in a Space. As Mona can use Physically Based Rendering (PBR) for materials, you can get amazing, true-to-life results with reflection probes for materials ranging from shiny metal through to buffed glass.

For more information on how to create Reflection Probes, you can go to Reflection Probes.

Light Probes

Light Probes are a tool that allows dynamic objects to be lit more accurately from baked information such as lightmaps. All your fantastic baked lighting can be applied, in a way, to dynamic objects like the player Avatar so that they blend in with the environment better.

For more information on how to create Light Probes, you can go to Light Probes.


Animation

Animation can be done within 3D asset creation tools like Blender and Maya, or within Unity itself. There are pros and cons to each approach, so use whichever is more comfortable for you.

For more information on animating in Unity you can use the tutorial Animation in Unity, or if you want to import animation from external applications, go to Importing Animation.


Interactivity

It is possible to create interaction using Visual Scripting or the Reactor tool. Visual Scripting allows you to create logic using nodes, very similar to coding, whereas Reactor gives you control over the Animator component using events like walking into a collider or interacting with an asset.

You can find more information on Visual Scripting at Visual Scripting.

And you can find more on Reactor at Reactor Overview.

In Summary...

On this page we covered the basics of the following:

  • Adding a player Spawnpoint

  • Adding Portals (optional)

  • Adding any Artifacts, including the Default Artifact or Canvas Artifact (optional)

  • Importing and arranging all your assets

  • Adding Colliders

  • Setting up the lighting, both realtime and lightmapped/baked

  • Adding Reflection Probes

  • Adding Light Probes

  • Adding Animation

  • Adding interactive elements with Reactor

As noted, many of these elements have more detailed pages. Let's get into those, starting off with Portals!
