[ Introduction to Virtual Reality Coursework #1&2 ]
GDS VR LAB
2022
Concepts Quote
"Making physical a tangible part of VR space."
Location
GDS VR Lab
Ben Pimlott Building,
Goldsmiths, University of London
Features
-
360 photo&video
-
Location-based VR (LBVR)
-
Multiplayer
-
Embodied Full-body Avatar
Collaboration
Device
-
VR Lab v1.0 - Group Work, built with Unity 2020.3.30
-
VR Lab v2.0 - Individual Work, built with Unity 2021.3.0
-
Social VR
-
Cartoon Style
-
Natural and Intuitive Interaction supported by Hand Tracking
-
VR supported
Quest with Oculus Air Link (Full version).
Standalone Oculus Quest
-
PC supported
-
Mobile supported
Award
-
Immerse UK Awards 2023: Winner of Best Overall Experience Category
Hi!
Welcome to this multiplayer experience, set in a real-life environment fitted to our Goldsmiths VR Lab. It uses collaborative interactions to help freshers get to know each other quickly. With intuitive hand-tracking interaction, I am trying to make the experience feel as physical as possible.
Ju Zhang
Inspiration
Turn our real-life VR Lab into a virtual one? Why not?
Concept & Feature #1: Online 360 Photo Tour
An early-stage user journey designed for remote single players who cannot visit the lab in person
Concept & Feature #2: VR Medium and LBVR Environment
1:1 Modelling Process to Fit the Virtual 3D Lab to the Physical One
VR LAB 1.0
My Group Contributions
_________
Unity 2020.3.30
VR LAB 2.0
My Individual Iterations
_________
Unity 2021.3.0
Week 1-2 {
Initial Location-based
Concept
|
|
|
↓
MoodBoard
|
|
|
↓
User Journey
|
|
|
↓
360 Image and Video Assets
|
|
|
↓
3D Scan Lab Model
|
|
|
↓
Test Oculus Interaction SDK Functions and Sample Scene
(Hand Tracking)
|
|
|
↓
Full-body Arm and Leg IK Improvement
|
|
|
↓
Apply Hand Tracking to Normcore Full-body Avatar's Hand and Finger Rigs
|
|
|
↓
Android adb Debug
|
|
|
↓
} Week 10-11
Week 2-5 {
Sync Anchor Setting and Implement Location-based VR Function
|
|
|
↓
Avatar Creation
|
|
|
↓
Modelling the 1:1 VR Lab
|
|
|
↓
Toon Shader in Blender and Integrate with Unity
|
|
|
↓
Optimising Model Mesh
|
|
|
↓
Test That the Model Syncs Correctly in the .apk Build
|
|
|
↓
Redo the Whole Project in Unity as the Interaction Method Changed
(From Controller to Hand Tracking)
|
|
|
↓
Change XR Interaction Toolkit to Oculus Integration (OVR)
|
|
|
↓
Redesign and Increase VR Lab Spatial Interactions from 6 to 10 Social Activities/Games
|
|
|
↓
Move and Redesign All Game UIs as Diegetic PC UIs
|
|
|
↓
Replace Realtime Lighting with Baked Lightmaps (Mixed Lighting) and Add Shadows and Post-processing
} Week 11-12
Week 5-8 {
Integrate .vrm and .glb Avatars to Work in Unity
|
|
|
↓
Full-body Multiplayer Avatar Implementation Using FinalIK
|
|
|
↓
Implement Avatar Height Adjustment Function
[Multiplayer]
|
|
|
↓
Investigate Voice Chat and Oculus Lipsync
[Local]
|
|
|
↓
Implement full-body Avatar with facial movement
[Multiplayer]
|
|
|
↓
Keyboard/Joystick XR Locomotion for Testing, Using the XR Interaction Toolkit
|
|
|
↓
Move the Toon Shader Implementation from Blender to Unity
|
|
|
↓
Cross-platform Support
VR, PC, Mobile
|
|
|
↓
* Hand Input Interactions *
(Multiplayer Synchronization)
①
Hand Gesture Recognition and Locomotion
②
Hand Tracking Grab Interaction
③
Hand Poke Physical Button
*Multiplayer Optimisation*
①
Draw and Guess Game in Lab 1.0
↓
Rewrote the Multiplayer Data Synchronization Script Logic in Lab 2.0
} Week 12-13
Week 9-10 {
|
|
|
↓
URP Setting and Materials Upgrade
|
|
|
↓
Physical Playtest with More Than 5 People
|
|
|
↓
Two-player Info-gap Train Game Implementation
[Multiplayer]
|
|
|
↓
Zoning Interactable Spaces
|
|
|
↓
Debug Avatar Change Script
[Multiplayer]
|
|
|
↓
Go Through Video Editing
________
②
Dart Game in Lab 1.0
↓
Rewrote Multiplayer Data Synchronization and Realtime Ownership Script Logic in Lab 2.0
③
Change Avatar in Lab 1.0
↓
Upgraded the Avatar Selection Implementation in Lab 2.0 with more efficient code, such as RealtimeDictionary from the Normcore API (see the sketch after this timeline)
*Additional Games/Functions*
Including a graffiti function and a rock-paper-scissors interaction, with functions synced across multiplayer.
Please see more details about the additional functions in the social activities part.
________
} Week 13-14
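As a simplified sketch of the avatar selection sync mentioned above (the project used RealtimeDictionary; this reduced version syncs a single avatar index), the Normcore pattern looks roughly like this. The class and property names are illustrative, and Normcore's model compiler generates the avatarIndex accessor and avatarIndexDidChange event from the [RealtimeModel] definition, so this file alone is not the project's actual script:

```csharp
using Normal.Realtime;
using Normal.Realtime.Serialization;

// Reduced sketch: one synced avatar index per player object.
// Normcore's model compiler generates the `avatarIndex` accessor and the
// `avatarIndexDidChange` event from this [RealtimeModel] definition.
[RealtimeModel]
public partial class AvatarSelectionModel
{
    // (property id 1, reliable, fire a change event)
    [RealtimeProperty(1, true, true)]
    private int _avatarIndex;
}

public class AvatarSelectionSync : RealtimeComponent<AvatarSelectionModel>
{
    // Called by the local player's UI when they pick a new avatar.
    public void SelectAvatar(int index)
    {
        model.avatarIndex = index;   // replicated to every client by Normcore
    }

    protected override void OnRealtimeModelReplaced(AvatarSelectionModel previousModel,
                                                    AvatarSelectionModel currentModel)
    {
        if (previousModel != null)
            previousModel.avatarIndexDidChange -= AvatarIndexDidChange;

        if (currentModel != null)
        {
            currentModel.avatarIndexDidChange += AvatarIndexDidChange;
            ApplyAvatar(currentModel.avatarIndex);   // show the current selection on join
        }
    }

    private void AvatarIndexDidChange(AvatarSelectionModel model, int value) => ApplyAvatar(value);

    private void ApplyAvatar(int index)
    {
        // Swap the visible avatar prefab/mesh here (project-specific).
    }
}
```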
Justify the Use of Mediums/
Between the real and the virtual: a fitted VR/AR space and embodiment.
Avatar Interaction/
Intuitive hand tracking interaction in VR environment
The Utilisation of Space/
A functional division of the different physical spaces, iterating on the functional use of each area (presented as intuitive PC games) in Lab v2.0
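A minimal sketch of how such zoning could be wired in Unity: each activity area gets a trigger collider, and the game station inside it only becomes active while the local player stands in that area. The tag and component names are illustrative assumptions, not the project's actual scripts.

```csharp
using UnityEngine;

// Sketch of a zoned activity area: a trigger collider around the physical
// space enables the diegetic game station only while the local player is inside.
public class ActivityZone : MonoBehaviour
{
    [SerializeField] private GameObject stationUI;   // diegetic PC screen for this zone

    private void Awake() => stationUI.SetActive(false);

    private void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("LocalPlayer"))
            stationUI.SetActive(true);    // enable this area's game/UI
    }

    private void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("LocalPlayer"))
            stationUI.SetActive(false);
    }
}
```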
Virtual & Realistic Room Mapping/
Physical lab room matching (with negligible measurement/alignment error) enables tactile-feedback interaction, achieved through an origin anchor point method.
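A minimal sketch of what the origin anchor point method can look like in Unity: the tracking space is rotated and translated so that the headset's position on the physical floor marker coincides with the corresponding anchor in the 1:1 lab model. The field names and the recenter trigger are illustrative assumptions, not the project's actual scripts.

```csharp
using UnityEngine;

// Align the virtual lab to the physical room from one measured anchor point.
public class OriginAnchorAlignment : MonoBehaviour
{
    [SerializeField] private Transform trackingSpace;   // XR rig / camera offset root
    [SerializeField] private Transform virtualAnchor;   // marked point in the 1:1 lab model
    [SerializeField] private Transform headset;         // main camera (HMD)

    // Call this while the player stands on the physical floor marker,
    // facing the agreed direction.
    public void Recenter()
    {
        // Yaw difference between where the headset faces and where the anchor faces.
        float yawOffset = virtualAnchor.eulerAngles.y - headset.eulerAngles.y;
        trackingSpace.RotateAround(headset.position, Vector3.up, yawOffset);

        // Translate so the headset's ground position coincides with the anchor.
        Vector3 offset = virtualAnchor.position - headset.position;
        offset.y = 0f;   // keep the real floor height
        trackingSpace.position += offset;
    }
}
```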
Avatar Representation/
Selectable cartoon full-body avatars are used for multiplayer co-presence. The synced location keeps users aware of their body and position in the physical environment.
Full-Body Avatar (Embodiment)/
I implemented the full-body avatar function using FinalIK in VR Lab v1.0.
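A minimal sketch of the FinalIK wiring, assuming the VRIK solver drives the avatar from head and hand targets; the actual project combines this with the Normcore avatar prefab and hand-tracking targets, so the class below is illustrative only.

```csharp
using UnityEngine;
using RootMotion.FinalIK;   // FinalIK (paid asset) namespace

// Drive a full-body avatar from the headset and hand targets with VRIK.
public class FullBodyAvatarRig : MonoBehaviour
{
    [SerializeField] private VRIK vrik;          // VRIK component on the avatar
    [SerializeField] private Transform head;     // HMD-driven target
    [SerializeField] private Transform leftHand; // controller or tracked-hand target
    [SerializeField] private Transform rightHand;

    private void Start()
    {
        vrik.solver.spine.headTarget = head;
        vrik.solver.leftArm.target   = leftHand;
        vrik.solver.rightArm.target  = rightHand;
        // Legs are left to VRIK's procedural locomotion so the avatar
        // steps naturally as the player walks around the physical lab.
    }
}
```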
Walk in Place Locomotion/
In the onsite mode, players move by walking on their own feet.
Hand Tracking Interaction/
To be more in line with the LBVR concept, I added hand tracking to the full-body interaction in v2.0 by exploring the new Oculus Interaction SDK released in February 2022. The interactions are mainly poking physical buttons, complex hand grabs, gesture recognition and hand UI rays.
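As a small illustration of the hand-tracking input layer, the sketch below treats an index-finger pinch from the Oculus Integration OVRHand component as a select/grab signal; the threshold value and the callbacks are illustrative assumptions rather than the project's actual interaction scripts.

```csharp
using UnityEngine;

// Gesture-style input on top of Oculus Integration (OVR) hand tracking:
// an index-finger pinch is treated as a "select"/grab signal.
public class PinchSelect : MonoBehaviour
{
    [SerializeField] private OVRHand hand;           // e.g. the right OVRHandPrefab
    [SerializeField] private float pinchThreshold = 0.8f;

    private bool wasPinching;

    private void Update()
    {
        if (!hand.IsTracked) return;

        float strength = hand.GetFingerPinchStrength(OVRHand.HandFinger.Index);
        bool isPinching = strength > pinchThreshold;

        if (isPinching && !wasPinching)
            OnPinchStart();   // e.g. begin a hand grab or press a UI element
        else if (!isPinching && wasPinching)
            OnPinchEnd();

        wasPinching = isPinching;
    }

    private void OnPinchStart() { Debug.Log("Pinch started"); }
    private void OnPinchEnd()   { Debug.Log("Pinch released"); }
}
```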
Voice Chat and Voice-Controlled Facial Expression/
Voice chat is provided by the Normcore plugin.
The Oculus Lipsync SDK was combined with the VRM plugin to make viseme and eye expressions work, but I encountered issues with the multiplayer sync.
In the final version, I wrote a custom script that drives the lip movement and eye micro-expressions from voice volume.
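A minimal sketch of that custom fallback: drive a single mouth-open blendshape (and, in the same way, eye micro-expressions) from microphone volume. In the real project the volume would come from the voice-chat component rather than a second microphone capture; the blendshape index, gain and smoothing values here are illustrative assumptions.

```csharp
using UnityEngine;

// Drive a "mouth open" blendshape from local microphone volume (RMS).
public class VolumeDrivenMouth : MonoBehaviour
{
    [SerializeField] private SkinnedMeshRenderer face;
    [SerializeField] private int mouthOpenBlendShapeIndex = 0;
    [SerializeField] private float gain = 400f;      // maps RMS volume to a 0-100 weight
    [SerializeField] private float smoothing = 12f;

    private AudioClip micClip;
    private float currentWeight;
    private const int SampleWindow = 256;

    private void Start()
    {
        // Loop-record the default microphone (the voice-chat plugin would
        // normally own the mic and expose its volume instead).
        micClip = Microphone.Start(null, true, 1, 16000);
    }

    private void Update()
    {
        float target = Mathf.Clamp(GetMicVolume() * gain, 0f, 100f);
        currentWeight = Mathf.Lerp(currentWeight, target, Time.deltaTime * smoothing);
        face.SetBlendShapeWeight(mouthOpenBlendShapeIndex, currentWeight);
    }

    private float GetMicVolume()
    {
        int pos = Microphone.GetPosition(null) - SampleWindow;
        if (pos < 0) return 0f;

        float[] samples = new float[SampleWindow];
        micClip.GetData(samples, pos);

        float sum = 0f;
        foreach (float s in samples) sum += s * s;
        return Mathf.Sqrt(sum / SampleWindow);   // RMS volume of the latest window
    }
}
```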
Concept & Feature #3: VR Illusion and Natural Interactions
Natural Locomotion and Hand Tracking by Latest Oculus Interaction SDK
Concept & Feature #4: Multiplayer Social Activities
The big technical challenge of redoing the whole project
Design 2.0
-
Redesign all games from VR Lab v1.0 based on a fuller understanding of Normcore multiplayer functions.
-
Redesign the way the train game is presented
-
Redesign all UI and add curved UI
Implementation 2.0
-
All interactions in the project were redone as the hand tracking was upgraded
-
Debugged some issues carried over from VR Lab 1.0.
-
Technically, I rewrote or optimised all scripts in VR Lab v2.0
-
Changed realtime lighting to baked lighting
Note: due to time constraints, not all games implement the multiplayer feature; I hope to finish them later when I have time.
Met the initial goal
In general, I believe I exceeded my expectations.
Technically, I realised my initial goal of improving my modelling skills by building the Lab model in Blender, and I picked up simple blendshapes and a toon shader while creating the avatar models.
Regarding Unity and programming, I didn't think I could implement a full-body avatar in a multiplayer environment at first, nor did I think I could do location-based VR. Moreover, I didn't think I could integrate hand tracking into a full-body avatar! I am very glad about what I learned from all these implementations: the VR technology and an understanding of VR as a medium. The Normcore API was a very helpful and interesting choice to work with. I learned a lot of things I didn't expect to learn at first.
The most challenging parts
The multiplayer function was the most challenging part of the group project. For the individual work, the most challenging parts were integrating the full-body avatar with hand tracking and the latest Oculus Interaction SDK, along with optimising for the standalone Oculus Quest.
Multiplayer avatars
We used Normcore to implement the multiplayer function; Haoyang focused more on this in VR Lab 1.0. I explored it only at the very end of our group work and learned more during the individual work. The ownership logic is quite easy, but Normcore's real-time sync logic makes even simple single-player functions challenging to implement, since we have to consider the sync from the server's view and from every player's view; for example, people need to see each other's synced facial expressions and synced avatar selection changes.
Although it was very hard to understand at first, once I knew more about the logic behind the scripts it became interesting and inspiring. I spent more than two weeks trying the documentation's example scenes and functions. In the end, the multiplayer sync scripts changed my thinking: I don't need to know a unique number for each player; knowing which player is the local one is enough to implement any multiplayer function I'd like. This lesson extended my programming skills.
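As a minimal sketch of the ownership side of this, the pattern looks roughly like the following: before the local player moves a shared object (the darts, for example), the client requests ownership of its RealtimeTransform so the movement is accepted and replicated. RealtimeTransform and RequestOwnership() come from Normcore; the grab hook and class name are illustrative assumptions, not the project's actual scripts.

```csharp
using Normal.Realtime;
using UnityEngine;

// Take ownership of a Normcore-synced object before the local player moves it.
[RequireComponent(typeof(RealtimeTransform))]
public class OwnedOnGrab : MonoBehaviour
{
    private RealtimeTransform realtimeTransform;

    private void Awake()
    {
        realtimeTransform = GetComponent<RealtimeTransform>();
    }

    // Called by the local hand-grab interaction when this object is picked up.
    public void OnLocalGrab()
    {
        // Only the owner's client can update a RealtimeTransform,
        // so take ownership before moving the object.
        realtimeTransform.RequestOwnership();
    }
}
```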
Full-body self-avatar with hand tracking interaction
This was a very difficult function from every point of view. First of all, it was cumbersome because all input interactions had to be changed to hand tracking. However, I believe intuitive hand tracking is perfect for a location-based project in which players can physically interact with tables, switches and walls in the VR room. This is also part of my understanding of VR as a medium: it makes this an immersive experience with physical tactile feedback that no other platform or interaction method can offer.
I had many other reflections and challenges during this project, and I solved or gave up on many functions, from modelling to the avatar, from the location anchor to real-time interactions. Thanks go to the plugins, which made more functions feasible in a short time, but which also introduced conflicts and bugs that I spent a long time solving and learned from.