Author: Johann Bernhardt, Mehmet Gül
Supervisor: Prof. Gudrun Klinker
Advisor: Adnane Jadid
Submission Date: 29.02.2020

Abstract

“aMAZEing Reality” is an AR (Augmented Reality) technology demonstrator built on a VR (Virtual Reality) device for a mostly location-independent application, and shall serve future projects by lowering the entry barrier into the technology. aMAZEing Reality was developed in the scope of the “Augmented Reality Lab Course” held by Adnane Jadid at TUM and is designed to fulfill the project criteria of implementing tracking, visualization and game logic. In this project a VR headset with integrated tracking is combined with external cameras to provide stereo video see-through, with the goal of making it usable for AR applications. This included the registration of the cameras to the VR headset's tracking system. In terms of game logic and visualization, a simple maze was provided from which the player has to escape. This maze was procedurally generated, contained secrets to find and is configurable by the user so that it uses the available space appropriately. The following is a report on the project's development with all its setbacks and successes.

Defining the Project

The authors' interests are heavily focused on algorithms and less on visualization/UI, so a project was designed around those interests. There was also interest in using GPS and outdoor tracking. Following this realization it was decided to build a maze, which is particularly easy to visualize, out of which a player would have to escape. This maze would be located on a field outdoors, to provide enough space for a non-trivial maze, and would be tracked by a camera, an IMU and GPS. A maze is also a large object, which makes it possible to judge the quality of the tracking at least subjectively, since deviations in rotation in particular are magnified the larger an object is. This way it would provide easy feedback regarding the quality of the tracking. The maze was also intended to be generated procedurally, to keep the game interesting and able to adapt to any given area.

The lab course defines three goals of which each project has to fulfill at least two. Those goals are:

  • Tracking usage/adapting of tracking systems with a set minimal work
  • Visualization using Unity3D to display tracking results and/or a game
  • (Game) Logic implementing game features

Implementing a procedurally generated maze is intended to satisfy the Game Logic goal, while constructing an outdoor tracker for a large-scale AR system aims at the Tracking goal.

Implementing a Maze Generator

To select a suitable algorithm we conducted a literature search, in which we found the following master's thesis by M. Foltin: https://is.muni.cz/th/143508/fi_m/thesis.pdf .

In his thesis, Foltin summarises the different properties that mazes consist of and gives a good overview of different algorithms and their outcomes.

Combining different properties can lead to a wide variety of different and interesting mazes.

Considering our use case, we decided that a maze with the following properties should yield the best results.

We decided to go with a two-dimensional maze, since this is the most common type of maze and anything else would be infeasible in an augmented reality environment; in a VR environment, by contrast, a three-dimensional maze could add depth and complexity to the experience. While a non-Euclidean maze would be interesting to experience, a Euclidean maze makes more sense in an augmented reality world and feels more natural and real, given that reality is Euclidean.

Another big factor to consider was which kind of tessellation to use. Tessellation describes what shapes the space is divided into. A maze in the shape of a ring could be very interesting, but building it could prove troublesome. A deformed maze could lead to a more special experience, but could be very unintuitive and difficult to traverse. We decided to go with a standard orthogonal maze, which uses rectangular walls with horizontal and vertical paths.

To manage the difficulty of the maze, the most important attribute is its routing. The two most sensible options for this project are a “perfect” maze or a “braided” maze. A perfect maze has to follow three rules: no loops, no isolated cells, and exactly one path between any two cells. A braided maze, on the contrary, has to follow only one rule: no dead ends. We decided to go with a combination of the two by keeping some dead ends and adding some loops to the maze. By adjusting this ratio, the difficulty can be tuned to the needs of the player.

To create such a maze we decided to go with a recursive backtracker, which creates perfect mazes. To add loops, we then remove a wall at the end of randomly chosen dead ends, according to the ratio we aim for.
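The two steps above can be sketched as follows. This is a minimal illustrative Python version, not the Unity implementation; names like `generate_maze` and the adjacency-map representation of open passages are our own choices here:

```python
import random

def generate_maze(w, h, braid_ratio=0.3, seed=None):
    """Recursive backtracker (iterative form) producing a perfect maze,
    then 'braided' by opening a wall at a fraction of the dead ends."""
    rng = random.Random(seed)
    # passages[cell] = set of neighbouring cells reachable from cell
    passages = {(x, y): set() for x in range(w) for y in range(h)}

    def neighbours(c):
        x, y = c
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if 0 <= x + dx < w and 0 <= y + dy < h:
                yield (x + dx, y + dy)

    # depth-first carving with an explicit stack (avoids recursion limits)
    start = (0, 0)
    visited, stack = {start}, [start]
    while stack:
        cell = stack[-1]
        unvisited = [n for n in neighbours(cell) if n not in visited]
        if unvisited:
            nxt = rng.choice(unvisited)
            passages[cell].add(nxt)
            passages[nxt].add(cell)
            visited.add(nxt)
            stack.append(nxt)
        else:
            stack.pop()

    # braiding: open a wall at a fraction of the dead ends to create loops
    dead_ends = [c for c, p in passages.items() if len(p) == 1]
    rng.shuffle(dead_ends)
    for cell in dead_ends[:int(len(dead_ends) * braid_ratio)]:
        walls = [n for n in neighbours(cell) if n not in passages[cell]]
        if walls:
            n = rng.choice(walls)
            passages[cell].add(n)
            passages[n].add(cell)
    return passages
```

With `braid_ratio=0` the result is a perfect maze (a spanning tree over the cells); raising the ratio towards 1 trades dead ends for loops, which is the difficulty knob described above.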

To give the player the option to see the path leading out of the maze, we also implemented a maze solving algorithm. For mazes with no loops, a depth-first search would have worked perfectly fine, as there is only one path to the exit. But if there are loops in the maze, depth-first search does not necessarily return the shortest path to the exit, just some path. Using breadth-first search instead guarantees that the solver shows the shortest path.
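A breadth-first solver over such a maze is short; this sketch assumes the maze is stored as an adjacency map of open passages between cells (the representation is an assumption for illustration):

```python
from collections import deque

def solve_maze(passages, start, goal):
    """Breadth-first search over the passage graph; returns the shortest
    cell path from start to goal, or None if the goal is unreachable."""
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # reconstruct the path by walking the parent pointers back
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        for n in passages[cell]:
            if n not in parent:
                parent[n] = cell
                queue.append(n)
    return None
```

Because BFS expands cells in order of distance from the start, the first time the exit is reached the reconstructed path is guaranteed to be among the shortest, even when loops offer multiple routes.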

Screenshots of the Maze:

[Screenshot: Maze with no loops]

[Screenshot: Maze with no dead ends]

[Screenshot: Maze with shortest solution visible]

Tracking with GPS: First Problems

The chair for augmented reality has been developing a platform called “ubitrack” for quite some time now. This platform heavily simplifies tracking for games by separating it out and providing a surprisingly high level of abstraction using scene graphs [source]. Although ubitrack can provide camera tracking and fusion with velocity/acceleration data with just a few clicks, it does not have a GPS module yet. This required us to write a module that could provide GPS data in the correct format and interface.

Luckily, modules were already available for InertialSense's μIMU and the VectorNav, devices which are able to provide GPS data, although the existing modules did not take advantage of it. Therefore, all three modules were updated to the current build system and then one was selected: the InertialSense μIMU, on recommendation of our supervisor, for its proven data quality and for being available. Its ubitrack module was then extended to also support GPS data forwarding.

Although this could be accomplished, it revealed a problem for the next step: coordinate systems. GPS devices report positions as longitude, latitude and height, which are coordinates in an ellipsoidal coordinate system and cannot easily be converted to and from the cartesian coordinate systems used by game engines. Further research into the selected device revealed that it could also provide cartesian coordinates, so this feature was added to the module. Unfortunately, implementing a GPS module for ubitrack took far more time than expected, so it was put on hold until all other basics of the project were completed, since tracking would also work without GPS.
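To illustrate the coordinate-system mismatch: converting geodetic coordinates to a cartesian frame is possible via the standard WGS84 geodetic-to-ECEF formula. This is a generic sketch of that textbook conversion, not the module's actual code or the device's internal method:

```python
import math

# WGS84 ellipsoid constants
WGS84_A = 6378137.0                    # semi-major axis [m]
WGS84_F = 1.0 / 298.257223563          # flattening
WGS84_E2 = WGS84_F * (2.0 - WGS84_F)   # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, height_m):
    """Convert latitude/longitude/height to Earth-Centered Earth-Fixed
    cartesian coordinates (metres)."""
    lat = math.radians(lat_deg)
    lon = math.radians(lon_deg)
    # prime vertical radius of curvature at this latitude
    n = WGS84_A / math.sqrt(1.0 - WGS84_E2 * math.sin(lat) ** 2)
    x = (n + height_m) * math.cos(lat) * math.cos(lon)
    y = (n + height_m) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - WGS84_E2) + height_m) * math.sin(lat)
    return x, y, z
```

Note that ECEF is still a global, earth-centred frame; a game engine would additionally need a local tangent-plane origin, which is one reason letting the device output cartesian coordinates directly was attractive.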

[Screenshot: Trackman with InertialSense μIMU]

Source: https://inertialsense.com/imu-sensors/#IMU-sensors

Selecting a Head Mounted Display (HMD)

Given the project definition, the HMD has to operate in large outdoor areas. While this seems like no big deal as long as it is not raining, it proved problematic for the Microsoft HoloLens, since sunlight drastically lowers its contrast. This effect exists because the HoloLens projects onto a transparent plane and therefore has to overpower the incoming light with its projection, and sunlight is orders of magnitude more powerful than artificial or passive indoor lighting. While the HoloLens is attractive as a native AR solution, its limited field of view (FOV) and low contrast in direct sunlight severely limited its usability. Therefore, we chose to look for another option. Fortunately, the games lab provides many opportunities to test various HMDs to find the most fitting one for the project.

Another HMD to consider was the HTC Vive. Since its screen is not a transparent overlay but is encapsulated inside a rather light-tight case, it does not suffer from strong backlighting by the sun like the HoloLens does. But the HTC Vive requires static base stations for its Outside-In tracking and is unusable in standalone outdoor operation, because the device ceases operation when out of range of the tracking stations.

The last HMD considered was the Oculus Rift S. Like the HTC Vive, it does not suffer from the contrast issues of the HoloLens. In contrast to the HTC Vive it uses Inside-Out tracking and therefore has no hardware requirements beyond the device itself. This enables the Oculus Rift S to track in arbitrarily large areas, which is exactly what this project requires. Since the Oculus Rift S is a VR HMD, it was not developed with any see-through in mind; the cameras used for tracking are even blocked from outside access due to privacy concerns. To achieve see-through anyway, a ZED Mini was used and the video stream of each of its cameras was overlaid onto the corresponding eye.

Quote: 

  • Like the white LED on Quest, the blue LED on Rift S indicates when the headset’s cameras are active; this is a hardware function which can’t be circumvented with software.
  • If a hacker gains root access to Quest or the Rift S host system, it would be possible to access the cameras on the headsets (similar to a camera on a compromised smartphone or PC).
  • Third-party developers cannot access the headsets’ cameras in any way.


https://www.roadtovr.com/oculus-quest-camera-privacy-rift-s-facebook/

Setting up the Oculus Rift S for an AR application

Several steps need to be followed before the Oculus Rift S and the ZED Mini can work together and be recognized by a computer, and there can be compatibility problems between the two. According to other people in our class, the Unity package for the ZED Mini and the Unity package for the Oculus Rift S do not work together. Therefore, using SteamVR and the SteamVR package for Unity is recommended.

To set up the Oculus Rift S, the Oculus software needs to be installed first, as SteamVR requires it. We used version 13.0.0.280.463.

Then the Oculus Rift S needs to be connected to the computer. It has two cables: a DisplayPort cable and a USB 3.0 cable.

Next, click the Device Setup button and follow the directions until the end. Once that is finished, close the Oculus software.

Now install the ZED Mini SDK. The SDK installation may prompt you to install CUDA drivers, which implies that the ZED Mini may be incompatible with graphics cards from vendors other than NVIDIA; further testing is required.

Next, install SteamVR from the Steam store. We used version 1.19.16, and version 2.2.0 of the Unity plugin. Once started, SteamVR may launch the Oculus software automatically. Keep in mind that you will need to be logged into your Steam account to change controller settings for your Unity project.

Next, download and install the ZED Mini Unity package and the SteamVR Unity package. Now, whenever the Unity project is started, SteamVR starts automatically, which in turn starts the Oculus software automatically if the Oculus Rift S is connected. Our ZED Mini Unity package was version 2.8.0.


Now, whenever you want to work on the project, connect the Oculus Rift S cables to the computer and start the Unity project from the Unity Hub. Opening the project in Unity starts all the required software automatically.

Source: https://www.oculus.com/setup/#rift-s-setup

Common problems

While working with the previously described setup, we encountered a lot of different problems.

One very common issue is Unity crashing, which started happening especially once the ZED Mini was added to the project. Unity crashing is generally not a big problem, but if the project crashes while the ZED Mini is in use, the ZED Mini can remain flagged as still in use, making it unusable. To solve this problem, simply unplug the ZED Mini and plug it back into the computer.

Another problem that can occur is the ZED Mini not being recognized as plugged in. This is probably caused by an unstable cable connection: either the cable is not plugged in correctly, or it is loose or faulty. Unplug and replug the ZED Mini and make sure the cable is neither loose nor faulty.

Sometimes the Oculus Rift S would also not be recognized as plugged in. To solve this, first close your Unity project, then SteamVR, then the Oculus software, in this order. Next, unplug and replug both Oculus Rift S cables. Now open the Oculus software and run Device Setup so that the Oculus Rift S is recognized again, then close the software and start the Unity project, which starts SteamVR automatically, which in turn starts the Oculus software automatically. Now everything should be recognized correctly.

Another issue that may arise is a lack of input recognition. SteamVR tracks controller input via the controller bindings set in the SteamVR controller bindings options. These bindings are stored in the cloud, tied to the Steam account used at the time. If there is no internet connection, the input bindings may not be available and the input will not be recognized by SteamVR. To solve this, connect to the internet and log into Steam with the account on which the bindings are stored.

Registration of the ZED Mini and Oculus Rift S

To enable registration between the ZED Mini and the Oculus Rift S we first need an object that is tracked, or can be made trackable, in both coordinate systems. This object can then be used to sample multiple simultaneous position measurements on both devices, forming a system of equations that can be solved for the transformation between the Oculus Rift S coordinate system and the ZED Mini coordinate system. Given that the Oculus controllers are already tracked in the Oculus coordinate system, they are an obvious choice. But we do not know where any point on the controller itself is in relation to the tracking results provided by the Oculus API. Calculating the constant transformation from the tracking results to a selected point on the controller is therefore the first step in the registration of this system.

To begin, we select an arbitrary point on the controller around which the controller can be pivoted, and then calculate its position with respect to the controller coordinate system. To calculate this transformation, tip calibration as described by Ziv Yaniv [1] can be used. This method exploits the fact that rotating a controller around a pivot keeps the pivot point constant both in the local controller coordinate system and in the tracking coordinate system of the Oculus Rift S. Fortunately, the solution is already implemented in ubitrack and only requires setting up the following Spatial Relationship Graph and corresponding dataflow network description:


In this solution the controller poses are fed directly from Unity into the tip calibration module, which calculates the transformation matrix and saves it to a file. The tracked pose of the controller can then be used to obtain the tip position in the Oculus coordinate system by a simple multiplication.
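The least-squares problem that pivot calibration solves can be sketched briefly. Each recorded pose (R_i, t_i) satisfies R_i · p_tip + t_i = p_world for the fixed pivot, and stacking these constraints gives an overdetermined linear system. This is a generic sketch of that algebraic formulation, not the ubitrack module's code:

```python
import numpy as np

def tip_calibration(rotations, translations):
    """Pivot calibration: given controller poses (R_i, t_i) recorded while
    pivoting around a fixed point, solve  R_i @ p_tip + t_i = p_world
    for the tip offset p_tip (controller frame) and the pivot p_world
    (tracking frame) in a least-squares sense.
    Stacked form:  [R_i | -I] @ [p_tip; p_world] = -t_i  for all i."""
    n = len(rotations)
    A = np.zeros((3 * n, 6))
    b = np.zeros(3 * n)
    for i, (R, t) in enumerate(zip(rotations, translations)):
        A[3 * i:3 * i + 3, 0:3] = R
        A[3 * i:3 * i + 3, 3:6] = -np.eye(3)
        b[3 * i:3 * i + 3] = -np.asarray(t)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3], x[3:]  # tip offset, pivot position
```

At least two sufficiently different orientations are needed for the system to have a unique solution; in practice many poses are sampled while slowly rolling the controller around its tip, and the least-squares fit averages out tracking noise.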


Game design choices

Putting a maze into an augmented reality environment opens up a lot of possibilities and can make the entire experience much more immersive. One big problem that comes with the territory, though, is that virtual restrictions do not apply to the real world: nothing stops the player from walking through a wall to rush towards the exit. To avoid cheating, we implemented a check that detects when the player is walking into a wall. If that is the case, the maze is moved along with the player's movement, so that the player stays in the same spot inside the maze.
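The idea can be sketched in 2D floor coordinates. This is a simplified illustration of the mechanism, not the Unity implementation; the `update` signature, the cell size, and the adjacency-map maze representation are assumptions made for the example:

```python
def update(player_pos, prev_pos, maze_offset, passages, cell_size=1.0):
    """Anti-cheat check: if the player's tracked head moved across a wall
    that is closed in the maze graph, shift the whole maze by the same
    displacement, so the player effectively stays in place inside it.
    Positions are 2D (x, z) floor coordinates; `maze_offset` is the
    current translation of the virtual maze in the real world."""
    def cell(p):
        # which maze cell a world position falls into, given the offset
        return (int((p[0] - maze_offset[0]) // cell_size),
                int((p[1] - maze_offset[1]) // cell_size))

    a, b = cell(prev_pos), cell(player_pos)
    if a != b and b not in passages.get(a, ()):
        # the movement crossed a closed wall: move the maze along with
        # the player, cancelling the illegal progress
        dx = player_pos[0] - prev_pos[0]
        dz = player_pos[1] - prev_pos[1]
        maze_offset = (maze_offset[0] + dx, maze_offset[1] + dz)
    return maze_offset
```

Legal movement through open passages leaves the offset untouched; only wall-crossing displacement is absorbed into the maze transform, which is what makes walking through walls pointless for the player.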

Feedback

This project was presented at the Demo Day 2020 at TUM, where we received a lot of feedback on our project.

Some of the things which the people liked were as follows:

  • The Anti-Cheat system which makes it so that one cannot walk through the wall
  • The randomly generated maze looked very cool
  • The whole experience is fun
  • Everything felt very immersive

But of course we also got some constructive criticism:

  • The graphics are not very nice looking/too low resolution
  • Maybe implement a Nightmode, for when it is dark outside
  • The ground seemed too high
  • Some slight drifting issues with the tracking
  • Some rotation issues with the maze

The rotation issue mentioned here has already been fixed, but avoiding drift is very difficult, if not impossible. Also, because this is only a demo and no one in our team had any artistic skills, we were not able to improve the graphics further.


Future work

Considering that this project was mostly a demo to showcase the technology and not a fully fledged game, we did not spend a lot of time making the game visually or aurally appealing. Improving the project in those aspects could lead to a better feeling of immersion for the player. Quality-of-life features, like a minimap or a more visually pleasing UI, were also not fully implemented because of the lack of time. These too could improve the enjoyment and immersion for the player.


Conclusion

In conclusion, this project was very successful in creating an immersive experience. While creating it we also gathered a lot of experience in using existing augmented reality technologies to build such an experience. We documented everything we learned on this wiki page.


Video of game-play:

Kick-off presentation:

Final Demo day presentation slide:
