The VR escape room game project - part 2

We decided that the VR Escape Room game will consist of four puzzles that the players have to solve to escape from the room. The puzzles require the player wearing the VR headset to walk around, inspect the room and interact with objects, while "mission control" has to figure out the correct solutions by reading the manual. Since there are four puzzles, each person is responsible for the development of one puzzle. We agreed on some puzzle ideas, but these are subject to change depending on asset availability and the difficulty of implementation. Each person is also responsible for making the manual for their own puzzle.


The players have to escape from this garage
Puzzle 1:

A random number of buttons (two to four) appears in front of the player in VR. The players have to work out which is the correct button to push based on the placement of tools on the tool rack and the number of buttons.

Puzzle 2: 

The player in VR interacts with an object that has an inner circular track with symbols and an outer track with numbers. The player describes the symbols and "mission control" has to determine which number to assign to each of them. The tracks can be spun to dial in the correct combination of numbers and symbols.

Puzzle 3:

This is a sound-based environmental puzzle. The room's floor is divided into squares that correspond to number-letter pairs. The player in VR has to listen to a sound and determine where it is coming from. Once they have found the correct square, they report it to "mission control", who then has to figure out the correct code.

Puzzle 4:

A maze where the players first have to decide, based on clues, which blueprint of the maze is the correct one. The player in VR cannot see the walls of the maze but has to move through it to reach a certain tile, so "mission control" has to guide them through the maze without walking through walls.

We are each at a different stage of development, as the puzzles require very different setups, assets and logic.


XR Origin:
The XROrigin class that we use in the scene is a shared dependency of two packages: AR Foundation and the XR Interaction Toolkit. The XR Origin represents the center of world space: it transforms objects and trackable features into their final position, orientation and scale, and it specifies an Origin, a Camera Floor Offset Object and a Camera. The XR Origin GameObject in our scene also contains the Main Camera and the Left and Right Hand Controllers.

XR Interaction Manager: 
The Interaction Manager acts as a middleman between Interactors (the controllers) and Interactables (the objects that can be picked up). For Interactors and Interactables to be able to communicate, the loaded scenes must contain at least one Interaction Manager.

We experienced some problems when trying to set up the headsets for development. Even though we turned on developer mode on the headset and followed every step in multiple guides, Unity would not recognise the device properly and marked it as unauthorized. We later figured out that we also needed to pair the headset with the Oculus mobile application, where there was another developer mode toggle to turn on (which did not appear in the desktop application). Some of us also had to fiddle around with the configuration of the ADB server on our computers to force the USB debugging connection to be re-established. After all this, Unity finally recognised the device as authorized and we could start developing on it.
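For reference, forcing the ADB reconnection looked roughly like the following. This is a sketch of the standard adb workflow, not our exact terminal session; the reported device state depends on whether the "Allow USB debugging" prompt has been accepted in the headset.

```shell
# Restart the local ADB server so the headset re-negotiates USB debugging.
adb kill-server
adb start-server

# List connected devices. An entry marked "unauthorized" means the
# USB debugging prompt on the headset has not been accepted yet.
adb devices

# After accepting the "Allow USB debugging" dialog inside the headset,
# the same command should list the device with the state "device".
adb devices
```

Once the device shows up as "device" rather than "unauthorized", Unity's Build Settings should list it as a valid run target.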

 Author: Anett Barát








