Individual Reflection - Jannik Schwarz-Wolf

Responsibilities:
- Educational AR Popup: Writing the blog post
- Geographical Portal AR: Writing the blog post and code overview
- VR Escape Room: Labyrinth puzzle and XROrigin camera setup
- XR Project: Entire Race part and Dart game

Educational AR popup
For the first project we were tasked with creating a marker-based AR application. AR stands for Augmented Reality, which, simply put, mixes computer-generated objects with real-world objects, most often viewed through a phone. Marker-based means the application uses a known image to anchor 3D models in the scene; the most famous AR app, Pokemon Go, is by contrast markerless. We created an educational popup application with animals using the Vuforia SDK in Unity. Vuforia gives the application access to advanced computer-vision functionality: it detects images by extracting natural features from the camera feed and comparing them against a known target resource. Once detected, Vuforia keeps tracking the image, so you can move around it while the engine displays the associated 3D model.

After this we simply had to add animation to the animals once they were instantiated by Vuforia; once instantiated, the models are ordinary GameObjects in Unity and can be manipulated. To set up Vuforia in Unity you only have to add the SDK, add an AR Camera to the scene, and open the Vuforia configuration from the Inspector. There you add the Vuforia Developer basic key in the App License Key field, after which you can add Vuforia Engine features such as Image Target, which takes an uploaded image as its image-target GameObject. Personally, I believe we utilized the core functionality of Vuforia in this project, but with more time we could have built more integrated functionality with the GameObjects in Unity.
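The "animate on detection" step can be sketched as a small script on the Image Target. This is a minimal sketch assuming Vuforia 10's `ObserverBehaviour` API; the `animalAnimator` field and the "Walk" trigger are hypothetical names from our own Animator setup, not part of Vuforia.

```csharp
using UnityEngine;
using Vuforia;

// Attach to the Image Target: plays the animal's animation
// once Vuforia reports the marker as tracked.
public class AnimalActivator : MonoBehaviour
{
    [SerializeField] private Animator animalAnimator; // our animal's Animator

    void Awake()
    {
        var observer = GetComponent<ObserverBehaviour>();
        if (observer != null)
            observer.OnTargetStatusChanged += OnStatusChanged;
    }

    void OnStatusChanged(ObserverBehaviour behaviour, TargetStatus status)
    {
        // TRACKED and EXTENDED_TRACKED both mean the marker is being followed.
        bool visible = status.Status == Status.TRACKED ||
                       status.Status == Status.EXTENDED_TRACKED;
        if (visible)
            animalAnimator.SetTrigger("Walk"); // hypothetical trigger name
    }
}
```

Because the instantiated model is just a GameObject, the same callback could equally scale, rotate, or swap materials on it.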

Geographical Portal for Educational Purposes
The second AR project was to be a markerless AR application, meaning one that can recognize targets that were not provided up front. Here we used ARCore with Unity, which gives an abundance of functionality; the fundamentals of ARCore are motion tracking, environmental understanding, and light estimation. First we created a virtual camera in the form of an AR Session Origin with an AR Camera, which represents the virtual view that matches the physical camera (your phone). When the phone moves through the world, ARCore combines visual data from the device's camera with the IMU (inertial measurement unit, which measures force and movement) to estimate the virtual position over time. We ended up only using motion tracking, since we used Unity to create isolated 3D worlds with 3D models inside them; ARCore would then move the virtual camera, which could open and look through the doors. This gave a cool dimensional-door feeling, but in the end nothing more. Personally, I believe we could have used environmental understanding and its plane finding, which detects surfaces by spotting feature points that lie on a common horizontal plane. With this we could have placed the dimensional doors on desks and allowed the user to resize them, giving the application small portals and more depth.
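The plane-finding idea we did not ship could be sketched with Unity's AR Foundation wrapper around ARCore: raycast from a screen touch against detected planes and spawn the portal door where it hits. This is a sketch, not our actual code; `portalPrefab` is a hypothetical door prefab, and it assumes an `ARPlaneManager` and `ARRaycastManager` sit on the AR Session Origin.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Tap a detected plane to place the dimensional door there.
public class PortalPlacer : MonoBehaviour
{
    [SerializeField] private GameObject portalPrefab;      // hypothetical door prefab
    [SerializeField] private ARRaycastManager raycastManager;
    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Hit-test the touch against ARCore's detected planes.
        if (raycastManager.Raycast(touch.position, hits, TrackableType.PlaneWithinPolygon))
        {
            Pose pose = hits[0].pose; // closest plane hit
            Instantiate(portalPrefab, pose.position, pose.rotation);
        }
    }
}
```

Scaling the spawned door afterwards would give the resizable-portal effect described above.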

The VR Escape Room
This project revolved around creating a VR application using an Oculus headset. Where AR is a combination of the virtual and the real world, VR is a fully virtual world rendered by the headset. Oculus uses 6-DoF inside-out tracking, meaning it tracks the three rotational degrees of freedom (yaw, pitch, and roll) as well as the three translational ones (elevation, strafing, and surging). "Inside-out" means the cameras are mounted on the headset itself and track the world from the user's viewpoint, rather than external sensors tracking the headset. The headset displays the world via a 90 Hz HMD display with distorting optics. The optics are needed because the displays sit extremely close to the eyes, which would otherwise limit the FoV (field of view); with the human eye having a FoV of roughly 220°, the image has to be "rounded" through lenses to give a more realistic viewing experience. To interact with the 3D world, the Oculus comes with two hand-held controllers that are tracked by the cameras on the headset. For this project we decided to try to utilize the full VR experience while still staying connected to the real world, and therefore settled on a co-op puzzle game: one player inside the VR world and one outside giving instructions. I created the labyrinth, and my first goal was to fully utilize the 6-DoF tracking. I did this by placing the needed tools around the shop at all levels: on the ground, at chest height, and even above the player's head. After this I made the grid in the garage and added box-collider panels to each grid cell, which, combined with the collider on the XR Origin (the player), let me listen for whether the player stepped on the right panel. Finally, I added the XR interactable component to the objects the player had to be able to pick up.
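The panel-listening mechanic can be sketched with Unity trigger colliders. This is a minimal sketch under stated assumptions: each panel's Box Collider is marked "Is Trigger", the XR Origin's collider is tagged "Player", and `LabyrinthManager`, `panelIndex`, and `pathLength` are illustrative names, not our exact implementation.

```csharp
using UnityEngine;

// One grid panel in the garage labyrinth.
public class LabyrinthPanel : MonoBehaviour
{
    [SerializeField] private int panelIndex;            // position in the solution path
    [SerializeField] private LabyrinthManager manager;  // hypothetical manager, below

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Player"))
            manager.PlayerSteppedOn(panelIndex);
    }
}

// Tracks progress through the panel sequence.
public class LabyrinthManager : MonoBehaviour
{
    [SerializeField] private int pathLength = 8;  // panels in the solution
    private int nextExpected = 0;

    public void PlayerSteppedOn(int index)
    {
        if (index == nextExpected)
        {
            nextExpected++;
            if (nextExpected == pathLength)
                Debug.Log("Labyrinth solved!"); // unlock the next puzzle here
        }
        else
        {
            nextExpected = 0; // wrong panel: restart the sequence
        }
    }
}
```

Using triggers rather than solid colliders means the player walks over the panels without being physically blocked by them.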

XR race project
The initial idea was to create an arcade, but after the Coherence networking failed and I figured out that OVR Grabbable does not convey physics, we had to re-plan. The reason for using OVR instead of an XR Origin is that I wanted a virtual representation of hands, which the XR Origin does not come with as standard; you must create them yourself, which turned out to be quite difficult. I therefore dropped the darts game and instead focused on creating a race game. Never having worked with wheel colliders, it took me some time to get the car physics working. Furthermore, because OVR Grabbable requires a Rigidbody (physical weight), the steering wheel could not simply be added, as the rigidbodies of the car and the steering wheel collided. I did try to detach the steering wheel from the car object and bind it via constraints, but that sadly failed too. In the end I simply listened for controller input, i.e. the thumbstick and buttons, inside the script. Finally, because the OVR Camera starts the scene in whatever position you are in in the real world, I also had to add an interactable menu that lets the player readjust the camera, and I added 12 tracks that can be selected via another menu.
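The thumbstick-driven fallback can be sketched as a script combining `OVRInput` with Unity's `WheelCollider`. This is a sketch, not the project's exact code: the torque and steering values are illustrative, and it assumes rear-wheel drive with front-wheel steering.

```csharp
using UnityEngine;

// Drives the car from the left thumbstick after the physical
// steering wheel approach failed.
public class ThumbstickCar : MonoBehaviour
{
    [SerializeField] private WheelCollider frontLeft, frontRight;
    [SerializeField] private WheelCollider rearLeft, rearRight;
    [SerializeField] private float maxTorque = 400f;     // illustrative value
    [SerializeField] private float maxSteerAngle = 30f;  // illustrative value

    void FixedUpdate()
    {
        // Left controller thumbstick: y drives, x steers.
        Vector2 stick = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);

        rearLeft.motorTorque = stick.y * maxTorque;
        rearRight.motorTorque = stick.y * maxTorque;
        frontLeft.steerAngle = stick.x * maxSteerAngle;
        frontRight.steerAngle = stick.x * maxSteerAngle;
    }
}
```

Reading input in `FixedUpdate` keeps the torque in step with the physics simulation, which matters for wheel colliders.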
