Supervisor: Prof. Gudrun Klinker
In recent years, Augmented Reality has made substantial progress and reached millions of consumers through their smartphones. Thanks to advances in mobile computing power, frameworks such as ARCore and ARKit provide markerless motion tracking of the device's position and orientation in the real world. Building on these frameworks, many AR apps and games have been developed that overlay the real world with virtual content. However, the interaction between the virtual and the real world in such apps is still quite limited: virtual objects can be placed on flat surfaces around the user, but there is no further understanding of the other real objects present.
This thesis aims to develop a system that provides a deeper understanding of the surrounding real world. The presented system requires no input beyond the point cloud of visual features that the AR frameworks already use to track the device's position. From this point cloud, the system creates 3D mesh geometry representing real-world objects; these meshes can then be used for additional interaction between the physical and the virtual world. A sample game was implemented to showcase the results of the proposed system, using the meshes as core gameplay elements.
The application, implemented with Unreal Engine and ARCore, is able to generate approximate meshes for objects on a tabletop.
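The core idea of turning a sparse feature point cloud into per-object geometry can be illustrated with a much-simplified sketch. The snippet below is not the thesis's actual reconstruction pipeline; it is a hypothetical two-step illustration in Python: feature points are grouped into connected clusters via a voxel grid, and each cluster is then replaced by an axis-aligned bounding-box mesh as a crude stand-in for the approximate object mesh. The function names and the cell size are illustrative assumptions.

```python
from collections import defaultdict, deque

def voxel_clusters(points, cell=0.05):
    """Group sparse 3D feature points into clusters (illustrative only,
    not the thesis's segmentation). Points are snapped to a voxel grid;
    occupied voxels that touch (26-neighbourhood) are flood-filled into
    one cluster."""
    grid = defaultdict(list)
    for p in points:
        grid[tuple(int(c // cell) for c in p)].append(p)
    seen, clusters = set(), []
    for start in grid:
        if start in seen:
            continue
        seen.add(start)
        queue, members = deque([start]), []
        while queue:
            v = queue.popleft()
            members.extend(grid[v])
            # Visit the 26 neighbouring voxels of v.
            for dx in (-1, 0, 1):
                for dy in (-1, 0, 1):
                    for dz in (-1, 0, 1):
                        n = (v[0] + dx, v[1] + dy, v[2] + dz)
                        if n in grid and n not in seen:
                            seen.add(n)
                            queue.append(n)
        clusters.append(members)
    return clusters

def bounding_box_mesh(cluster):
    """Return (vertices, triangles) of the cluster's axis-aligned bounding
    box: 8 vertices and 12 triangles covering the six faces (triangle
    winding is not normalised in this sketch)."""
    xs, ys, zs = zip(*cluster)
    lo = (min(xs), min(ys), min(zs))
    hi = (max(xs), max(ys), max(zs))
    # Vertex index encodes the corner: 4*ix + 2*iy + iz.
    verts = [(x, y, z) for x in (lo[0], hi[0])
                       for y in (lo[1], hi[1])
                       for z in (lo[2], hi[2])]
    tris = [(0, 2, 6), (0, 6, 4),  # z = lo face
            (1, 5, 7), (1, 7, 3),  # z = hi face
            (0, 4, 5), (0, 5, 1),  # y = lo face
            (2, 3, 7), (2, 7, 6),  # y = hi face
            (0, 1, 3), (0, 3, 2),  # x = lo face
            (4, 6, 7), (4, 7, 5)]  # x = hi face
    return verts, tris
```

A real pipeline would feed such per-object vertex and triangle lists into the engine's runtime mesh facilities (in Unreal Engine, for instance, a procedural mesh component) so the generated geometry can participate in collision and gameplay.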
An example scene setup that can be used with the application.
Front view of the generated meshes.
Back view of the generated meshes.
Video showing the full reconstruction process and the sample game using the generated meshes:
- It is possible to generate 3D geometry from ARCore tracking data that approximates the real world
- Such geometry can be used in a game to enable more interaction between the real and the virtual world
- The proposed system needs to be refined before it can be used in consumer applications
- The quality of the generated meshes is highly dependent on environmental conditions