Author: | Anja Sturm |
---|---|
Supervisor: | Prof. Gudrun Klinker |
Advisor: | Linda Rudolph, MSc |
Submission Date: | 15.12.2020 |
Abstract
The recent advent of Software Development Kits (SDKs) for easy Augmented Reality (AR) development promises a fast incorporation of AR systems into Industry 4.0. Since the underlying mathematical concepts of these AR systems are kept secret, insufficient tracking accuracy could severely limit their promising applicability in Industry 4.0. This paper studies the combined user perception and tracking accuracy effects of Google’s ARCore SDK in an industrial context, using a proven localization reference system to obtain ground-truth data. We introduce a generally applicable approach to spatially register Google’s ARCore system with a reference system, which allows computing the user perception and ARCore tracking errors separately. We show that for the usage task of supporting workers drilling holes into a wall by displaying the exact 3D position of the desired hole with a virtual cross, ARCore’s tracking is far from useful: the virtual cross was projected up to 33 cm away from the plane of the wall. Due to these high tracking fluctuations, the perception error could not be estimated independently.
Example Code
https://gitlab.lrz.de/IN-FAR/Thesis-Projects/ba_ws20_sturm_anja_perception_measurements
Bachelor’s Thesis Task
Literature Review
- fundamentals: hand-held AR, sensors, etc.
- tracking as source of inaccuracies
- user perception as source of inaccuracies
App
- the user should be able to place a virtual cross in spatial relation to a marker
Note: Yes, but not in 3D (that would be a new source of error). I would suggest three text boxes for entering the x, y, z position of the cross (and an "enter" button).
- the user should be able to scan the marker to get an anchor point
- the cross should be projected onto its predefined position depending on the previously scanned marker
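The marker-relative placement described above can be sketched as follows. This is a minimal illustration, not the thesis implementation (which uses ARCore); the function and variable names are hypothetical:

```python
import numpy as np

def cross_world_position(T_world_marker, offset_xyz):
    """Map the (x, y, z) offset entered in the text boxes from marker
    coordinates into world coordinates via the 4x4 marker pose."""
    p = np.append(np.asarray(offset_xyz, dtype=float), 1.0)  # homogeneous point
    return (T_world_marker @ p)[:3]

# Example: marker 2 m in front of the world origin, no rotation.
T_world_marker = np.eye(4)
T_world_marker[:3, 3] = [0.0, 0.0, 2.0]

# Cross entered 0.5 m along the marker's x-axis.
cross_pos = cross_world_position(T_world_marker, [0.5, 0.0, 0.0])
```

Once the marker is scanned and anchored, the cross's world position follows from the (re-estimated) marker pose, which is why tracking drift of the anchor directly displaces the cross.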
User Study (4-12 participants)
Goal: a user should be able to mark the cross on a wall with a maximum deviation of 10 cm from its virtual position
- vary the distance of the marker and the virtual cross (< 6m, 6m is insufficient; see paper)
- for each distance variation, analyze the influence of supporting features on the tracking accuracy
- Depending on the marker-virtual cross distance, how many supporting features are required to achieve a deviation <= 10 cm (if possible)?
- analyze to what extent the perception of the same virtual cross deviates among different users
Results/Implementation/Project Description
Conclusion
In this thesis, we investigated the combined user perception and tracking accuracy effects of an ARCore AR application that supports workers drilling holes into a wall by displaying the exact 3D position of the desired hole with a virtual cross. In order to get ground-truth data, the AR application was empirically evaluated with two HTC Vive trackers as a proven reference system.
Problem Description
The goal of this thesis is to investigate the combined user perception and tracking accuracy effects of an ARCore AR application that supports workers drilling holes into a wall by displaying the exact 3D position of the desired hole with a virtual cross. We therefore compare the visual error, the perceptual error, and the tracking error as the Euclidean distance between the position at which the virtual cross is intended to be (intended position), the position at which the user perceives the virtual cross (perceived position), and the position to which the AR application actually augments it (augmented position); see figure 1.
Figure 1: visual, perceptual and tracking error
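Under these definitions, the three errors are plain Euclidean distances between the corresponding points. A minimal sketch; the coordinate values below are invented for illustration and are not measurements from the study:

```python
import math

def euclidean(a, b):
    """Euclidean distance between two 3D points (in metres)."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

# Invented example coordinates in a wall-aligned frame (metres):
intended  = (0.00, 1.20, 0.00)  # where the cross should appear
augmented = (0.05, 1.18, 0.33)  # where the AR application rendered it
perceived = (0.06, 1.17, 0.31)  # where the user tipped with the tip tool

tracking_error   = euclidean(intended, augmented)   # intended  -> augmented
perceptual_error = euclidean(augmented, perceived)  # augmented -> perceived
visual_error     = euclidean(intended, perceived)   # intended  -> perceived
```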
In order to get ground-truth data, we used two HTC Vive trackers as a proven reference system to empirically evaluate the AR application. The first one was attached to the smartphone’s back to obtain the smartphone pose (see figure 2); the second one was attached to a tip tool (see figure 3) that enables the user to communicate the perceived cross position by tipping the cross with it.
Figure 2: Tiptool with attached Vive Tracker
Figure 3: Smartphone with attached Vive Tracker
The Vive trackers were then spatially registered with the ARCore tracking system via the spatial relationship graph depicted in figure 5.
Figure 5: Spatial Relationship Graph
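Conceptually, this registration amounts to chaining rigid 4x4 transforms along the edges of the graph so that ARCore points can be expressed in the Vive (ground-truth) frame. A minimal sketch with hypothetical frame names; the actual graph and calibration steps are those described in the thesis:

```python
import numpy as np

def transform_point(T, p):
    """Apply a 4x4 rigid transform to a 3D point."""
    return (T @ np.append(np.asarray(p, dtype=float), 1.0))[:3]

def register(T_vive_tracker, T_tracker_phone, T_phone_arcore, p_arcore):
    """Chain transforms along the spatial relationship graph to express
    a point from the ARCore frame in the Vive ground-truth frame."""
    T_vive_arcore = T_vive_tracker @ T_tracker_phone @ T_phone_arcore
    return transform_point(T_vive_arcore, p_arcore)
```

In practice, the tracker-to-phone transform must be calibrated once (it is fixed by the mount), while the Vive-to-tracker pose is streamed live by the reference system.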
Based on this graph, the evaluation pipeline first calculated the position at which the AR application augmented the virtual cross and the position at which the user perceived it. Finally, the tracking error of the AR application was calculated as the Euclidean distance between the intended and the augmented position, and the perceptual error of the user’s visual system as the Euclidean distance between the augmented and the perceived position of the virtual cross. Figure 6 gives an overview of the evaluation pipeline.
Figure 6: Evaluation Pipeline
Results
We discovered that the tracking accuracy of the AR application implemented in this thesis is insufficient to fulfill the intended usage task. The virtual cross was projected up to 33 cm in front of the wall into the air and was therefore unable to indicate where to drill a hole into the wall. Furthermore, these tracking fluctuations led to a dependency between the tracking error and the perceptual error, since the users followed the usage task and therefore marked the position where they perceived the cross in the plane of the wall. Since the spatial registration scheme was constructed for independent tracking and perception errors, the independent perceptual error could not be determined within this paper.
Overall, the evaluation scheme introduced in this paper had some weaknesses that are discussed in chapter 8. For AR to fully enter the field of Industry 4.0, it may be of great importance to continue the research of this paper and to further investigate the overall visual placement accuracy capabilities of handheld augmented reality applications in industrial settings.