Individual reflections - Christian
My main contributions for the four projects are as follows:
- Marker-based AR project: Brainstorming, Unity implementation, animation setup.
- Marker-less AR project: Brainstorming, model acquisition, portal doors & animation, interaction script.
- VR app: Brainstorming, codebreaking implementation (my own level), keypad interaction.
- XR app: Brainstorming, pool level (ended up not working), general support role afterwards.
In the following sections I will reflect on the projects and the knowledge I have gained throughout the course.
The marker-based project was a strange one. Being our first project, it was
naturally limited in scope, but it was also plagued by technical issues with
the SDK we were using. Vuforia created a lot of problems, especially for me,
as I was in charge of the main Unity implementation.
Technical issues aside, the project was about AR, which means augmenting the
user's experience by having virtual objects coexist with our world. It does
this with marker-based tracking: the app detects a known real-world image as a
reference so it knows where to display a virtual object. The SDK is given a
set of target images, detects distinctive feature points in them, and matches
those points against the live video feed. From the matches it estimates the
camera's position and orientation and figures out how to display the object.
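The pose math itself stays hidden inside Vuforia, but the last step can be sketched with a toy pinhole-camera projection (the numbers below are made up purely for illustration, not taken from Vuforia): once matching has recovered the camera's rotation R and translation t relative to the marker, a virtual point X anchored to the marker lands at pixel p = K[R|t]X.

```python
import numpy as np

# Toy pinhole-camera projection -- Vuforia hides all of this, and the
# numbers here are invented purely for illustration.

K = np.array([[800.0,   0.0, 320.0],   # focal lengths and principal
              [  0.0, 800.0, 240.0],   # point of a 640x480 camera
              [  0.0,   0.0,   1.0]])

R = np.eye(3)                          # camera facing the marker head-on
t = np.array([0.0, 0.0, 2.0])          # marker 2 m in front of the camera

X = np.array([0.0, 0.0, 0.0])          # virtual object sits on the marker

cam = R @ X + t                        # marker frame -> camera frame
p = K @ cam                            # camera frame -> image plane
u, v = p[0] / p[2], p[1] / p[2]        # perspective divide -> pixels
print(u, v)                            # object lands at the image centre
```

With the camera looking straight at the marker, the object projects to the centre of the image, which matches the intuition of the marker "carrying" the virtual object.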
All of this was being done 'under the hood' by the Vuforia SDK. When it
worked, it worked well, but most of the time it was a mess of errors that we
had no idea how to fix.
To conclude on this project, I liked our idea about educating children with
animals in augmented reality, but the limited knowledge of our team and the
technical issues with Vuforia made the project very janky.
For the marker-less project we had much greater freedom in choosing what to
work with. This is mainly because marker-based tracking ties virtual objects
to specific real-world images; marker-less tracking removes that limitation,
and I think that shows in our project. By letting the application 'recognize'
objects in the real world and work out their tracking details, we gained
enormous freedom.
We decided to make a kind of portal-based geography-education experience,
where the user enters doors/portals to real-world places and can experience
them in AR. We were mostly limited by the assets available to us for free.
This was achieved using ARCore in Unity. By compositing the virtual world with
the real world, we achieve augmented reality. We do this with the help of our
phones, since they carry fantastic cameras and plenty of sensors that ARCore
uses to estimate the virtual world's placement relative to the real world. In
short, there is a lot of math going on under the hood.
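As a small illustration of the kind of math involved (the names and numbers here are my own, not ARCore's actual API), this is a toy version of the per-frame bookkeeping: the sensors give an estimated device pose in world space, and a world-space anchor, such as one of our portal doors, has to be re-expressed in the device's own frame for rendering.

```python
import numpy as np

# Toy world-to-device transform -- an illustration of the idea, not
# ARCore's real API. All names and numbers are invented.

def yaw_matrix(theta):
    """Rotation about the vertical (y) axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[ c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

anchor_world = np.array([0.0, 0.0, -3.0])   # portal placed 3 m ahead

# Estimated device pose this frame: moved 1 m forward and rotated
# 90 degrees about the vertical axis.
device_pos = np.array([0.0, 0.0, -1.0])
device_rot = yaw_matrix(np.pi / 2)

# World frame -> device frame: undo the device's own pose.
anchor_local = device_rot.T @ (anchor_world - device_pos)
print(np.round(anchor_local, 3))   # portal now lies along local x
```

After the device moves and turns, the portal that was straight ahead ends up off to the side in the device's local frame, which is exactly what makes the virtual content appear fixed in the real world.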
My main contributions were brainstorming, finding models, and implementing the
interaction for the doors/portals.
To conclude on this project, it was a much better one than the last. Moving
away from Vuforia to ARCore and other Unity SDKs was fantastic, and I think we
had some pretty creative ideas that showed in the project.
The VR project was, in my opinion, our best one. We had more time to develop
it, and I think our idea was really fun. We decided to make a very small game
heavily inspired by Keep Talking and Nobody Explodes: a bunch of small puzzles
scattered around for the player to solve in VR. This would take advantage of
VR's main selling points, namely immersion and interaction.
The user is fully in a virtual world, interacting only with virtual objects,
and by doing this we lose a lot of the jankiness that AR inherently brings.
The headset tracks its own position with either inside-out or outside-in
tracking, depending on the headset being used. Our project mainly targeted the
Oculus Quest, which uses inside-out tracking, meaning it does not rely on
external sensors: everything sits on the headset itself, and the tracked pose
is communicated back to the game engine (Unity), where the virtual world is
rendered as normal, with the headset having a virtual presence inside the
engine.
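The final step of that loop can be sketched in a few lines (an illustration only, not Unity's actual code): the headset reports its pose as a world-space transform, and the engine renders the scene from the inverse of that transform.

```python
import numpy as np

# Sketch of how a tracked headset pose becomes a view transform.
# Names and numbers are invented for illustration.

def pose_matrix(R, p):
    """4x4 world-from-headset transform from rotation R and position p."""
    M = np.eye(4)
    M[:3, :3] = R
    M[:3, 3] = p
    return M

R = np.eye(3)                          # headset level, facing forward
p = np.array([0.0, 1.7, 0.0])          # standing height of about 1.7 m

world_from_headset = pose_matrix(R, p)
view = np.linalg.inv(world_from_headset)   # headset-from-world

# A point at eye height, 2 m in front of the user, should sit 2 m
# ahead at height 0 in view space -- right where the camera expects.
point = np.array([0.0, 1.7, -2.0, 1.0])
print(view @ point)
```

Because the view transform is just the inverse of the headset's tracked pose, moving the headset in the real world moves the camera identically in the virtual one, which is what makes the tracking feel seamless.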
My puzzle was a codebreaking puzzle where the user describes symbols on a
keypad to another person holding a manual, who then tells the codebreaker the
order in which to press the keypad buttons. This made good use of VR's
immersion and interaction.
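The real puzzle lives in a Unity script, but its core logic is simple enough to sketch here (the class and symbol names are invented for this sketch): track progress through the required press order and reset on any wrong press.

```python
# Stripped-down sketch of the codebreaking puzzle's logic. The real
# implementation is a Unity script; these names are invented here.

class Keypad:
    def __init__(self, solution):
        self.solution = list(solution)   # required press order of symbols
        self.progress = 0

    def press(self, symbol):
        """Returns 'solved', 'ok', or 'reset'."""
        if symbol == self.solution[self.progress]:
            self.progress += 1
            if self.progress == len(self.solution):
                return "solved"
            return "ok"
        self.progress = 0                # wrong press: start over
        return "reset"

pad = Keypad(["psi", "omega", "xi", "lambda"])
print(pad.press("psi"))     # ok
print(pad.press("xi"))      # reset -- wrong order
print(pad.press("psi"))     # ok
print(pad.press("omega"))   # ok
print(pad.press("xi"))      # ok
print(pad.press("lambda"))  # solved
```

The interesting part in VR is not this logic but the communication around it: the player at the keypad can only describe what the symbols look like, which is where the immersion comes from.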
Unity’s XR developer tools made implementing our systems much easier, as VR is
extremely complicated. For the core VR functionality we used Unity's XR Rig,
which gave us what we needed: a camera and controller-based interaction.
All in all, I think this project was our best one, and that VR is a very
interesting platform to develop for. I think that shows pretty well in our
project.
The final project, the XR project, was a long one. We had four weeks to create
a much more ambitious project than all the previous ones. We knew almost
immediately that we wanted to work in VR again, since the group found it the
most fun. This time we wanted to develop a lot more, since we already had
experience from the previous project.
We decided to make a bar-arcade (barcade) game, where the user could play
multiple smaller minigames in one location. Our initial plan was, I am pretty
sure, to have darts, minigolf, and pool. It ended up very different, however.
Our approach to the project was fairly flawed in hindsight, as we each went off
in our own directions and did not work together as much as we should have. This
is mostly our own fault, but it also speaks to the complexity of developing for
VR. We spent a lot of time learning the same things individually, which
resulted in different versions of the same basic functionality. One example
was the XR rig with its interaction setup: we all had different versions
depending on how we had each learned it, which led to weird errors and
problems.
Our initial ideas also changed quite a bit. We did not do darts but made a
go-kart-type game instead, the pool game did not work well and was eventually
scrapped, and we spent a lot of time trying to get multiplayer to work. The
final project ended up with only minigolf and go-karts, but those did work
decently well. My contributions were mostly in the pool level, but it was too
problematic and could not be fixed in time. Afterwards I mostly helped the
others with their various parts and tried to get everything to work together.
All in all, the project ended up being just okay. What worked was pretty good,
but it was nowhere near what we wanted the end product to be.
Overall, I have learned a lot from the course. I now understand the
differences between AR and VR, how they work, and what makes each of them a
unique experience. These are all skills and knowledge that can be useful in
the future.