Florian Philipp
Supervisor: Prof. Gudrun Klinker
Advisor: Sven Liedtke
Submission Date: 15.06.2019


Remote guidance applications can be used to solve problems without having to be physically present at their location. This can be done by streaming the user interface (UI) of the device on which the problem occurred, but the possibilities of applying such guidance to physical problems, like repairing machines, are minimal. The field of remote guidance applications can benefit from new technologies in video recording and streaming, as well as from new, cheaper, and more lightweight head-mounted displays (HMDs) for Augmented Reality (AR) or Virtual Reality (VR) ways of displaying information. Mixed Reality offers the possibility of visualizing objects through a 3D scanning mechanism, enabling an omnidirectional view of the virtual scan and the ability to manipulate it.

This thesis focuses on enhancing remote guidance systems through Mixed Reality technologies. To this end, in a lab experiment, a Rubik's Cube is visualized with the help of the Ubitrack library, which tracks the cube's faces.

On the one side, an AR application is implemented, with which the Rubik's Cube is scanned. The scan is then transmitted to a remote NodeJS web server with an integrated MongoDB database, from which the other application requests this data. The server also stores instructions, which are presented in the AR application through different UI methods. These UI methods are the focus of the final evaluation, which investigates their impact on spatial understanding. On the other side, a VR application is implemented that visualizes a virtual version of the real Rubik's Cube. Sides of the virtual cube can be partially rotated, and each rotation is transmitted to the web server. Additionally, the VR application provides a UI displaying a solving manual and a live video stream of the AR view.
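The cube state exchanged via the web server can be modeled as 54 facelet colors, nine per face, and the instructor's partial rotations as small update messages. The following JavaScript sketch illustrates one possible representation; the message fields and helper names are illustrative assumptions, not the thesis' actual protocol, and cycling the adjacent edge strips of neighboring faces is omitted for brevity:

```javascript
// One face is stored as nine stickers in row-major order:
// 0 1 2
// 3 4 5
// 6 7 8
function rotateFaceCW(stickers) {
  // After a clockwise quarter turn, the sticker at (row, col)
  // comes from the old position (2 - col, row).
  const rotated = new Array(9);
  for (let r = 0; r < 3; r++) {
    for (let c = 0; c < 3; c++) {
      rotated[r * 3 + c] = stickers[(2 - c) * 3 + r];
    }
  }
  return rotated;
}

// A rotation update as the VR application might send it to the server
// (field names are hypothetical):
const rotationMessage = { type: "rotation", face: "U", quarterTurns: 1 };

// Apply the update to a cube state mapping face names to sticker arrays.
function applyRotation(cube, msg) {
  let face = cube[msg.face];
  for (let i = 0; i < msg.quarterTurns; i++) face = rotateFaceCW(face);
  return { ...cube, [msg.face]: face };
}
```

Keeping the message this small means the server only has to relay and persist compact JSON documents rather than geometry.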

The VR application is intended to be used by an instructor while a client uses the AR application. Together, they form an intuitive, hands-free remote guidance application.

Mixed Reality Remote Guidance Application with Web Server and Augmented Video Streaming

Instructor View (VR)

The goal of the instructor-side view is to provide a virtual view of the problem object through a virtual reality device. In the lab experiment for this thesis, a virtual Rubik's Cube can be inspected with an Oculus Rift headset. The placement of the cube's faces is provided by the scanning results of the AR application described below. The virtual cube can be manipulated intuitively with the Oculus Rift controllers without requiring any special prior knowledge.

The final version provides a solving manual with laser-gaze navigation on the left. The virtual Rubik's Cube is placed at the center of the starting view. Its face coloration is updated as soon as the scanned version of the real cube is stored on the web server.


A new grabbed-rotation script, inheriting from the Oculus grabber script, was developed to allow partially rotating a side with one hand while holding the whole object with the other hand.
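The key step of such a grabbed rotation is quantizing the hand's continuous rotation to quarter turns, since only 90° moves are valid on a Rubik's Cube. A small sketch of that snapping logic, written in JavaScript for illustration (the actual Unity script is C#, and snapping on grip release is an assumption about its behavior):

```javascript
// Snap a continuous rotation angle (in degrees) to the nearest quarter turn.
function snapToQuarterTurn(angleDeg) {
  return Math.round(angleDeg / 90) * 90;
}

// Number of signed quarter turns the snapped angle corresponds to,
// e.g. for sending a compact rotation update to the server.
function quarterTurns(angleDeg) {
  return snapToQuarterTurn(angleDeg) / 90;
}
```

Snapping only on release lets the instructor see the side follow the hand continuously while still producing discrete, valid moves.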

On the right side, a live video stream of the augmented camera view of the AR application is shown.

Client View (AR)

The goal of the client-side view is to provide an Augmented Reality interface for receiving assistance through a remote guidance system. To later evaluate the knowledge transfer achieved through different techniques, three UI methods are implemented. In the lab experiment for this thesis, a physical Rubik's Cube is first scanned. The scan is then transmitted through a web server to the instructor application. Solving instructions are transmitted back through the same web server to this application and visualized through one of the UIs. The aim of the project is to create a platform for evaluating the quality of knowledge transfer. The Meta 2 headset was chosen as the AR device for this lab experiment.

Tracking: Marker Tracking with Ubitrack

For each of the 54 markers, a corresponding entry in the SRG (spatial relationship graph) for Ubitrack was created. Since this quickly becomes hard to read, a structured, more readable variant was additionally created for the thesis.
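Since the SRG needs one entry per facelet marker, the repetitive parts can be generated rather than written by hand. A sketch of such an enumeration (the naming scheme `Marker-U-0` is a hypothetical example, not the identifiers used in the thesis):

```javascript
const FACES = ["U", "D", "F", "B", "L", "R"];

// Enumerate one entry per facelet: 6 faces x 9 facelets = 54 markers.
function markerEntries() {
  const entries = [];
  for (const face of FACES) {
    for (let facelet = 0; facelet < 9; facelet++) {
      entries.push({ name: `Marker-${face}-${facelet}`, face, facelet });
    }
  }
  return entries;
}
```

Each generated entry can then be templated into the corresponding SRG XML fragment.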

Furthermore, the corresponding marker was placed on each face of the cube. For the evaluation, five identical Rubik's Cubes were prepared this way.

Note that when using the Ubitrack print template for LaTeX, the printed marker size is measured including the white borders, while the size entered in the SRG is measured without them.

Solving UIs

Picture-Based UI | Augmented Virtual Cube-Based UI | Augmented Virtual Arrow-Based UI


To find the UI design that best enhances spatial understanding of the steps for solving the Rubik's Cube, a user study was conducted.

During the evaluation, the firmware of the Meta 2 started causing problems. This mostly affected the Augmented Virtual Arrow-Based UI, since it was the only one using live depth data for placement.

For the evaluation, the INTUI questionnaire as well as the User Experience Questionnaire (UEQ) were used.

INTUI Results | UEQ Results

According to both questionnaires, the Augmented Virtual Cube-Based UI worked best.

Since the INTUI test only comes with an SPSS analysis file, an Excel analysis file is provided here.

Slides Kick-Off

Slides Final
