Maximilian Wandinger, B.Sc.
Prof. Gudrun Klinker
Dipl.-Inf. Univ. David A. Plecher, M.A.
Museums are increasingly using technology within their exhibitions to enhance visitor attendance and experience, and VR visualizations of cultural heritage are particularly interesting for them because they allow new ways for visitors to experience and interact with exhibits. This thesis gives an overview of topics discussed in research that could help to improve the visitor experience within museums and focuses on the advantages and disadvantages of existing VR technologies regarding their application within a museum. Moreover, it covers the development of a new hybrid framework, called SaMaXVR, whose goal is to combine the presented benefits of desktop and mobile VR systems by wirelessly streaming VR content rendered on a powerful gaming PC to a smartphone inserted into low-budget VR goggles. Since the user experience, and therefore the applicability of this framework, highly depends on the overall latency noticeable by its user, different approaches and configurations that were tested are compared and discussed with respect to their benchmarks. Using the best configuration, a round trip time (RTT) of only 27.36 ms was achieved. Finally, the thesis presents the development of a VR application that uses SaMaXVR as a solution for museums to visualize highly detailed 3D scans of the statues formerly located within the western pediment of the Temple of Zeus at ancient Olympia in their reconstructed former environment. Combining SaMaXVR with Kinect 2 head tracking compensates for the major drawback that neither SaMaXVR nor mobile VR offers positional tracking on its own, making an experience similar to desktop VR systems possible. Because SaMaXVR renders all VR content on a powerful PC, it not only enables the visualization of very highly detailed 3D scans on smartphones but also achieves a very high graphics quality that would normally not be possible in mobile VR.
As many museums nowadays seem to struggle to maintain their audiences, they are looking for new possibilities and technologies to improve the experience and attendance of their visitors. Common approaches include improving their online presence, creating smartphone apps and using popular emerging technologies such as augmented and virtual reality to spark the interest of visitors or to create new and interesting ways for visitors to experience and interact with exhibits. Especially the use of virtual reality seems very promising, as it allows visitors to experience cultural heritage in many ways that would not be feasible in the real world due to monetary or spatial limitations or to the fragility of exhibits. Museums thus have the chance to extend their limited physical space with virtually unlimited virtual space, allowing them not only to exhibit more artifacts but also to create exhibitions that could be bigger than the whole museum itself. This even makes it possible to visualize cultural heritage in its original context by reconstructing its entire former environment as it could have looked in the past. To visualize real artifacts in VR as accurately as possible, 3D scanning is particularly interesting, as it can create almost perfect virtual representations of the real artifacts. Nevertheless, visualizing highly detailed 3D scans in high quality in VR can still be quite difficult, as their large amount of geometry places heavy computational demands on the underlying rendering hardware. Furthermore, common desktop VR systems like the Oculus Rift or the HTC Vive as well as mobile VR setups have their own advantages and disadvantages regarding maintenance, usability, security, computational power and graphics quality, which makes their application in a museum quite challenging.
The goal of this thesis is therefore to investigate whether it is possible to combine the benefits of desktop and mobile VR regarding their application within a museum by creating a system capable of visualizing highly detailed 3D models of cultural heritage in their original context, wirelessly streaming PC-generated VR content to a smartphone inserted into VR goggles.
SaMaXVR is a framework for streaming PC-generated VR content, usable in game engines like Unity3D and developed to enable high-quality VR experiences even on smartphones with limited rendering capabilities. It can be seen as a low-budget alternative to desktop VR and an improvement over mobile VR in terms of graphics quality and the visualization of computationally intensive VR content. Because all content is rendered on a powerful PC, SaMaXVR can use the same amount of computational power as typical desktop VR systems, removing the restriction of mobile VR in terms of graphics quality. Unlike Furion, no VR content has to be split into foreground and background content, which can be quite problematic when visualizing complex objects directly in front of the user. At the same time, it provides the already presented benefits of mobile VR concerning maintenance, cleanability, usability and safety, as it can be used along with BYOD policies with no need for expensive hardware to be accessible to users. Compared to desktop VR HMDs, cheap and easy-to-clean VR goggles can be used, which additionally lowers the risk of theft and financial damage. Without the need for many different devices and wires, even the setup of SaMaXVR is much simpler, which makes this system easier to maintain and much less sensitive to accidental damage.
[ Figure: a VR-capable gaming PC [1] and Google Daydream View mobile VR goggles [2] ]
SaMaXVR consists of two components: code that can easily be integrated as a native plugin into existing Unity3D applications and games developed to run on a PC, and an app that is launched on a smartphone. The PC-side code is responsible for updating the virtual camera according to the current head orientation, acquiring the final rendered images and sending them as a compressed image stream to the connected smartphone. The app running on the smartphone is responsible for sending the currently calculated head orientation to the PC, decompressing the image stream and showing the images in full-screen mode. To ensure a smooth high-quality VR experience at 60 fps, the overall goal during development was to keep the round trip time as short as possible while still achieving a high resolution and good image quality.
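The per-frame timing constraint this pipeline has to meet can be sketched as a simple latency budget. The following is an illustrative model only; all stage timings are assumed placeholder values, not measurements from the thesis:

```python
# Illustrative sketch of a streamed-VR round-trip budget; the stage
# timings below are assumed placeholders, not numbers from the thesis.
FRAME_BUDGET_MS = 1000.0 / 60.0  # ~16.67 ms per frame at 60 fps

def round_trip_ms(pose_upload, render, encode, transmit, decode, display):
    """Sum the latency contributions of one streamed VR frame."""
    return pose_upload + render + encode + transmit + decode + display

# Hypothetical breakdown: pose upload, PC render, video encode,
# WiFi transmit, smartphone decode, display scan-out.
rtt = round_trip_ms(pose_upload=1.0, render=8.0, encode=4.0,
                    transmit=6.0, decode=5.0, display=3.0)
print(f"RTT: {rtt:.2f} ms ({rtt / FRAME_BUDGET_MS:.1f} frame budgets)")
```

The point of the sketch is that the sum of all stages, not any single one, determines the motion-to-photon delay the user perceives, which is why compression settings and transport were tuned together.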
To demonstrate the ability of SaMaXVR to visualize highly detailed 3D models of cultural heritage in VR on today's smartphones, an application was developed that enables its users to study the 3D-scanned castings of the antique statues formerly located within the west pediment of the Temple of Zeus at Olympia in their former environment. To this end, the temple and the other surrounding buildings were reconstructed according to historical descriptions and pictures of reconstructions published by museums and universities. The figure below shows the reconstructed Temple of Zeus with the 3D-scanned statues in its west pediment.
As the buildings no longer exist today and only the foundations are visible at the excavation site, this application should help its users travel back in time and perceive this place as it could have looked in the times of ancient Greece. To enable a closer look at the statues, the west pediment and the statues are also visualized in a separate area on the ground. By additionally tracking the user's head with a Kinect 2, the user is able to move around in front of the statues and view them from different angles.
To solve the issue of not having enough trackable physical space to cover the whole scene, an interactive teleportation system was implemented that helps the user navigate through the virtual environment by "jumping" to other areas, where they can again move freely thanks to the head tracking.
The 3D models of the statues were obtained from the "Museum für Abgüsse Klassischer Bildwerke München", which created them by 3D scanning its castings of the original antique statues formerly located within the west pediment of the Temple of Zeus at Olympia. In this application, not the whole group of figures is visualized, as not all models of the castings were obtained from the museum; Apollo and some other figures are missing because no 3D scans of them are available. The scans were delivered in the ".ply" format, so they first had to be converted to ".fbx" before they could be imported into the Unity3D project. This was done using the free and open-source 3D software Blender.
Depending on the figure, the file sizes at the highest available resolution ranged from several hundred MB up to nearly 2 GB. While importing the figures below 1 GB into Blender was possible, the import of the scan of nearly 2 GB caused the PC to "freeze" after several hours, as 32 GB of RAM were not enough and it ran out of memory. Even though all other models could be exported to ".fbx", importing them into Unity3D caused the engine to crash, as its importer was not able to deal with that much geometry. In the highest available resolution, the models each had a polygon count of over 20 million. Fortunately, the scans were also delivered at a lower resolution, which could be imported into both Blender and Unity3D. At this scanning resolution, the models still had a polygon count of about 150,000 to 650,000 and a vertex count of 280,000 to 1.5 million, depending on the statue. Although Unity3D was able to import the models, it reported errors because it still could not generate the UV layouts required for accurate lighting. The solution to this issue was to manually unwrap the individual statues and generate the UV layouts in Blender, which were then imported into Unity3D as well.
As their overall vertex count of over 8 million still lies far above the maximum recommended vertex count of 600,000 (300,000 per eye) renderable with a Google Pixel 2 as a reference model [ 3 ], SaMaXVR is used to render these models on a powerful gaming PC rather than directly on a mobile device. To still visualize them in VR on mobile devices, only the rendered images had to be streamed to and shown on the mobile device. This approach was also used in several related projects that tried to visualize complex geometry on less powerful devices.
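As a rough illustration of why on-device rendering is ruled out, the scans' vertex count can be compared against the cited mobile guideline. The counts below are taken from the text; the helper function itself is hypothetical:

```python
# Hypothetical budget check; the per-eye limit and scan vertex count
# are the figures cited in the text [3], the helper is illustrative.
MAX_VERTICES_PER_EYE = 300_000
MOBILE_BUDGET = 2 * MAX_VERTICES_PER_EYE  # stereo VR renders each eye

def fits_mobile_budget(total_vertices: int) -> bool:
    """True if a scene's vertex count stays within the mobile guideline."""
    return total_vertices <= MOBILE_BUDGET

statue_scan_vertices = 8_000_000  # combined vertex count of the 3D scans
print(fits_mobile_budget(statue_scan_vertices))
print(f"over budget by {statue_scan_vertices / MOBILE_BUDGET:.1f}x")
```

With the scans exceeding the budget by more than an order of magnitude, streaming pre-rendered images is the only way to keep the full geometric detail on a smartphone.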
Another issue was the rendering of the high-resolution models of the statues in combination with the model of the temple, as viewing both at the same time caused the FPS to drop noticeably. This was solved by reducing the polygon count of the statue models to about one tenth and lighting them with normal maps baked from the high-resolution variants, as shown in the pictures above. Comparing both variants next to each other at runtime revealed almost no visual differences, so the variant with the reduced polygon count was used to visualize the statues located in the pediment above the colonnade of the temple as well as the statues located directly within the walkable area of the user.
As realistic and visually appealing graphics are important to make the user feel present in the virtual world, and all rendering takes place on the PC side anyway, realistic lighting of the statues and the buildings, including global illumination and volumetric light, was possible. Several post-processing effects were also applied to fine-tune the final look of the scene.
As computational power is crucial for visualizing highly detailed 3D models of cultural heritage, remote rendering is a common solution when trying to display these models on less powerful devices. It could be shown that such an approach is suitable for public cultural institutions such as museums, as it closes the gap between mobile and desktop VR and thus combines the benefits of mobile VR with those of desktop VR in terms of maintenance, usability, safety, rendering performance and graphics quality.
Moreover, it could be shown that even though WiFi based on the 802.11 standard is quite limited in bandwidth and transmission speed, very low latencies could be reached when streaming high-resolution PC-rendered VR content to smartphones. Benchmarks showed that with the fastest configuration (NVENC H.265), an average round trip time of only 27.36 ms was reached, which can further be mitigated by predicting the user's head orientation.
To demonstrate the potential of SaMaXVR for visualizing highly detailed 3D models of cultural heritage in VR on smartphones, an application was created that enables its users to experience 3D scans of real exhibits, obtained from the "Museum für Abgüsse Klassischer Bildwerke München", in their former environment. The scanned exhibits are castings of the original statues of the west pediment of the Temple of Zeus at the site of ancient Olympia. To view them in their original context, the whole site including the Temple of Zeus was reconstructed, allowing different representations of the scanned statues: users can view the statues either at close distance or within the west pediment on top of the reconstructed temple. To let users walk around and view the models from different perspectives, a Kinect 2 was used in addition to SaMaXVR to track the user's head position. Combining SaMaXVR with Kinect 2 head tracking compensates for the major drawback that neither SaMaXVR nor mobile VR offers positional tracking on its own, making an experience similar to desktop VR systems possible. Because SaMaXVR renders all VR content on a powerful PC, it not only enables the visualization of very highly detailed 3D scans on smartphones but also achieves a very high graphics quality that would normally not be possible in mobile VR.
[ 1 ] https://www.mifcom.de/media/catalog/product/i/m/img_1865_2.jpg
[ 2 ] http://cdn1.knowyourmobile.com/sites/knowyourmobilecom/files/styles/gallery_wide/public/2016/10/google-daydream-view.jpg?itok=lyqKU6oE
[ 3 ] https://developers.google.com/vr/develop/best-practices/perf-best-practices