Individual Reflection - Anett Bianka Barát
My main contributions:
- Marker-based AR project: Model and picture acquisition
- Markerless AR project: Model acquisition, portal prefab, script and shader, fixes after everything was put together
- VR project: Button puzzle, merging levels with the others and making sure the levels work correctly
- XR project: Minigolf game
Marker-based AR project
As this was my first ever AR project, everything was
very new and confusing at first. It was hard to come up with ideas for which
image-recognition software would genuinely be useful. We settled on a children's
book with pictures of animals, which shows children a 3D model of the animal
that they are reading about. This idea was a perfect fit for AR: the
user points the camera at an image, the software recognises it, and a digital
model is shown through the camera in the real world. These
images provide an anchor for the digital content that is shown.
Vuforia provided all of this through a fairly
easy-to-use SDK: we just had to find distinctive pictures and pair them with 3D
models, and Vuforia would try to recognise when the camera was pointed at one of
the pictures and show the corresponding model. The emphasis is on try, because
as it turned out, sometimes it worked perfectly and sometimes it did not
work at all. We even had a problem with some Unity packages, which prevented
some of us (me, for example) from starting the AR experience. This may have
been caused by team members using different Unity versions, so after this
project, we all insisted on using the same version to avoid further problems.
Markerless AR project
In this
project, we moved on from markers to a markerless experience. This gives
more freedom in tracking the environment and in specifying where and how objects
should appear. We used ARCore, which provides various tools, such as
environmental understanding, which enables devices to recognise horizontal and
vertical surfaces and planes. Additionally, it
offers motion tracking, which enables phones to understand and keep track of
their position relative to the environment.
I made the
portals, which work by setting a value in the shader of the wall and the 3D
models of the buildings, based on the position of the camera. There is a plane
on the door, which always shows the "inside" of the portal, so the
user can see what's inside the portal before entering it. At this point, both
the walls and the 3D model are invisible, except for the part that can be seen
through the door's plane. When the user goes through the door, the shader is
changed so that the walls and the 3D model become completely visible. The initial
portal worked perfectly, but after the others added their 3D models and
walls, I had to step in and fix them so that the portal effect worked correctly,
since a lot of materials, shaders, and settings had to be applied.
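The camera-side check driving this transition can be sketched roughly as follows. This is a minimal Unity C# sketch under my own assumptions, not the actual project code: the shader property name _DisplayMode and the component layout are illustrative.

```csharp
using UnityEngine;

// Illustrative sketch: flips a shader property on the portal's walls and
// building model depending on which side of the door plane the camera is on.
public class PortalSwitch : MonoBehaviour
{
    public Transform door;               // the plane covering the doorway
    public Renderer[] portalRenderers;   // walls + the 3D building model

    // Hypothetical shader property: 0 = visible only through the door
    // plane, 1 = fully visible.
    static readonly int DisplayMode = Shader.PropertyToID("_DisplayMode");

    void Update()
    {
        // Sign of the dot product tells us which side of the door the
        // camera is on: negative means the user has stepped through.
        Vector3 toCamera = Camera.main.transform.position - door.position;
        bool inside = Vector3.Dot(door.forward, toCamera) < 0f;

        foreach (var r in portalRenderers)
            r.material.SetFloat(DisplayMode, inside ? 1f : 0f);
    }
}
```

In practice the "visible only through the door" state is usually achieved with a stencil test: the door plane writes a stencil value, and the hidden geometry only renders where that value is set.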
To be honest, I
don't think we made good use of ARCore's environmental understanding. The
end result of this project is a set of portals that, most of the time, just
hang in mid-air, and tracking is lost at times as well, making it hard to
actually walk to all the portals and see what's inside. If I were to do it
again, I would use anchors to prevent the portals from "floating away" from
the user.
VR project
This was our first VR project, and we wanted to create
a really immersive game, so after brainstorming for a while, we decided on an
escape room with four different puzzles to complete. For this, we used Oculus
Quest headsets, which use inside-out tracking: the headset tracks itself with
its onboard cameras instead of relying on external (lighthouse) base stations.
The Quest headsets also offer 6DoF (six degrees of freedom), which translates
the player's movements in the real world into VR.
I made the puzzle in which a random number of buttons (between 2 and 4)
appears on a desk, while the VR player also sees various tools on
the wall. The player with the manual has to determine which button to push,
based on the number of buttons and the tools on the wall. This is driven by
a random number generator, which decides which scenario (from 1 to 12) to initialize.
Once the scenario is initialized, the correct number of buttons and tools are
shown and the correct button to push is saved. From there, the script simply
checks whether the pushed button is the correct one. If not, it starts the entire
process again and initializes a new scenario.
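In outline, the puzzle script does something like the following. This is a simplified Unity C# sketch, not the real project code: the scenario-to-buttons mapping is shown as a placeholder formula, and the method names are my own.

```csharp
using UnityEngine;

// Illustrative sketch of the button puzzle's control flow.
public class ButtonPuzzle : MonoBehaviour
{
    int correctButton;   // index of the button that must be pushed

    void Start() => InitScenario();

    void InitScenario()
    {
        // Pick one of the 12 predefined scenarios (upper bound is exclusive).
        int scenario = Random.Range(1, 13);

        // Each scenario fixes how many buttons (2-4) appear, which tools
        // hang on the wall, and which button is correct. The lookup below
        // is a placeholder; the real mapping lived in the script.
        int buttonCount = 2 + (scenario - 1) % 3;
        correctButton = scenario % buttonCount;

        // ... spawn buttonCount buttons and the matching wall tools here ...
    }

    // Called by a button's interaction handler with that button's index.
    public void OnButtonPushed(int index)
    {
        if (index == correctButton)
            Debug.Log("Puzzle solved!");
        else
            InitScenario();   // wrong button: reset with a fresh scenario
    }
}
```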
XR project
In the fourth assignment, which started out as a VR
arcade project, I made the minigolf game. Originally, each of us would have
made a game to play in the arcade, but this did not work out as expected in the
end. As for my part, I learned a lot while working on the minigolf game,
starting with modelling the hole and the flag for the course, which was
something I tried for the first time.
I also fiddled around a lot with hand grab positions
and teleportation via the OVR rig. It was important to get the hand position on
the golf club as close to realistic as possible, so it would feel natural to hold
the club and hit the ball with it. This was very hard to get right, but with
enough tweaking of the fingers' positions and rotations, I think it
turned out well.
My game required a lot of moving around to get through
the golf course, so it was my responsibility to implement
teleportation. This was a hard task at first, because no matter how closely I
tried to follow the Oculus example scene, teleportation would not work together
with the hand grab. Either the grab worked or the teleportation did,
but for a while I couldn't get both working at once. Eventually I figured out
that the structure of the OVR rig's game objects was incorrect.
Once I fixed that, everything worked correctly.
As for the minigolf game itself, I found some free obstacle
course assets and used them to make the golf courses a bit harder to complete,
and I also modified some of the scripts, such as the rotation script, to better
fit our needs. I also had problems getting the physics of the golf ball and club
right. With Kristóf's help, we managed to improve it by using physics materials
and lowering the fixed timestep in Unity (so collision detection is more reliable),
but some of the modifications unfortunately got lost while merging, so the
interaction between the ball and the club wasn't very good in the end
product.
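The two tweaks mentioned above amount to only a few settings. The sketch below shows the general idea in Unity C#; the specific values are illustrative, not necessarily the ones we used.

```csharp
using UnityEngine;

// Illustrative sketch: the two physics tweaks applied to the golf ball.
public class GolfPhysicsSetup : MonoBehaviour
{
    void Awake()
    {
        // A smaller fixed timestep means more physics steps per second, so
        // fast club-ball collisions are less likely to be missed.
        // (Unity's default is 0.02, i.e. 50 physics updates per second.)
        Time.fixedDeltaTime = 0.005f;

        // A physics material controls how the ball bounces and slides.
        var ballMaterial = new PhysicMaterial("GolfBall")
        {
            bounciness = 0.6f,      // example values only
            dynamicFriction = 0.2f,
            staticFriction = 0.2f,
        };
        GetComponent<SphereCollider>().material = ballMaterial;
    }
}
```

In a real project the physics material would more likely be created once as an asset in the editor and assigned to the ball's collider there, rather than constructed in code.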