
[ Introduction to Virtual Reality Coursework #1&2 ]

GDS VR LAB

2022

Concept Quote

"Making the physical a tangible part of VR space."

Location

GDS VR Lab

Ben Pimlott Building,

Goldsmiths, University of London

Features

  • 360° photo & video

  • Location-based VR (LBVR)

  • Multiplayer

  • Embodied Full-body Avatar

Collaboration

Device

oculusvr.png
  • VR Lab v1.0 - group work, built with Unity 2020.3.30

  • VR Lab v2.0 - individual work, built with Unity 2021.3.0

  • Social VR

  • Cartoon Style

  • Natural and Intuitive Interaction supported by Hand Tracking

  • VR supported: Oculus Quest with Oculus Air Link (full version), or standalone Oculus Quest

  • PC supported

  • Mobile supported

Award

  • Immerse UK Awards 2023: Winner of Best Overall Experience Category

ipad.png

Hi!

Welcome to this multiplayer experience set in a real-life location fitted to our Goldsmiths VR Lab. It uses collaborative interactions to help freshers get to know each other quickly, plus intuitive hand-tracking interaction; I'm trying to make it feel as physical as possible.

​Ju Zhang

Inspiration

Turn our real VR Lab life into a virtual one? Why not?

Concept & Feature #1: Online 360 Photo Tour

An early-stage user journey designed for online single players who can't visit the lab in person

Concept & Feature #2: VR Medium and LBVR Environment

1:1 Modelling Process to Fit the Virtual 3D Lab to the Physical One

Journey
v11.png

VR LAB 1.0

My Group Contributions

_________
 

Unity 2020.3.30
 

VR LAB 2.0

My Individual Iterations

_________

Unity 2021.3.0
 

Week 1-2 {

  • Initial Location-based Concept
  • Mood Board
  • User Journey
  • 360 Image and Video Assets
  • 3D Scan Lab Model
  • Test Oculus Interaction SDK Functions and Sample Scene (Hand Tracking)
  • Full-body Arm and Leg IK Improvement
  • Apply Hand Tracking to Normcore Full-body Avatar's Hand and Finger Rigs
  • Android adb Debug

bug.png

} Week 10-11

v12.png

Week 2-5 {

  • Sync Anchor Setting and Implement the Location-based VR Function
  • Avatar Creation
  • Modelling the 1:1 VR Lab
  • Toon Shader in Blender and Integration with Unity
  • Optimising the Model Mesh
  • Test that the Model Syncs Correctly in the .apk Build
  • Redo the Whole Project in Unity as the Interaction Method Changed (from Controllers to Hand Tracking)
  • Switch from the XR Interaction Toolkit to Oculus Integration (OVR)
  • Redesign and Expand the VR Lab Spatial Interactions from 6 to 10 Social Activities/Games
  • Move and Redesign All Game UI as Diegetic PC UI
  • Upgrade Realtime Lighting to Baked Lightmaps (Mixed Lighting) and Add Shadows and Post-processing

} Week 11-12

v13.png

Week 5-8 {

  • Integrate .vrm and .glb Avatars to Work in Unity
  • Full-body Multiplayer Avatar Implementation using FinalIK
  • Implement an Adjust-Avatar-Height Function [Multiplayer] (see the height-matching sketch after this timeline block)
  • Investigate Voice Chat and Oculus Lipsync [Local]
  • Implement the Full-body Avatar with Facial Movement [Multiplayer]
  • Keyboard/Joystick XR Locomotion for Testing, using the XR Interaction Toolkit
  • Move the Toon Shader Implementation from Blender to Unity
  • Cross-platform Support: VR, PC, Mobile

* Hand Input Interactions * (Multiplayer Synchronization)

  • Hand Gesture Recognition and Locomotion
  • Hand Tracking Grab Interaction
  • Hand Poke Physical Button

* Multiplayer Optimization *

  • Draw and Guess Game in Lab 1.0: rewrote the multiplayer data synchronization script logic in Lab 2.0

} Week 12-13
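As a rough illustration of the avatar height adjustment step above, one common approach (sketched here with hypothetical field names, not the project's actual script) is to scale the avatar so its eye height matches the local player's headset height:

```csharp
using UnityEngine;

// Hypothetical sketch of an avatar height adjustment: scale the full-body avatar so
// its eye height matches the local player's headset height. Field names are
// placeholders; the project's actual implementation may differ.
public class AvatarHeightAdjuster : MonoBehaviour
{
    [SerializeField] Transform avatarRoot;          // root of the spawned avatar
    [SerializeField] Transform headset;             // local HMD transform (child of the XR rig)
    [SerializeField] float avatarEyeHeight = 1.6f;  // eye height of the unscaled model, in metres

    public void MatchPlayerHeight()
    {
        // Headset height above the play-area floor approximates the player's eye height.
        float playerEyeHeight = headset.localPosition.y;

        // Ignore obviously invalid tracking values (e.g. headset left on the floor).
        if (playerEyeHeight > 0.5f)
            avatarRoot.localScale = Vector3.one * (playerEyeHeight / avatarEyeHeight);
    }
}
```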

v14.png

Week 9-10 {

  • URP Settings and Materials Upgrade
  • Physical Playtest with More Than 5 People
  • Two-player Info-gap Train Game Implementation [Multiplayer]
  • Zoning the Interactable Spaces
  • Debug the Avatar Change Script [Multiplayer]
  • Video Editing

________

  • Dart Game in Lab 1.0: rewrote the multiplayer data synchronization and realtime ownership script logic in Lab 2.0 (see the ownership sketch after this timeline block)

  • Change Avatar in Lab 1.0: upgraded the avatar selection implementation with more efficient code, such as RealtimeDictionary from the Normcore API, in Lab 2.0

* Additional Games/Functions *

Including a graffiti function and a rock-paper-scissors interaction, with their functions synced in multiplayer. Please see more details about the additional functions in the special social activities part.

________

} Week 13-14
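The realtime ownership logic mentioned above follows Normcore's documented pattern: whoever grabs a shared object (such as the dart) requests ownership of its RealtimeTransform so that their simulation drives it for everyone else. A minimal sketch, not the project's actual script:

```csharp
using UnityEngine;
using Normal.Realtime;

// Minimal ownership sketch using Normcore's RealtimeTransform API: the local player
// requests ownership of the object when they grab it, so their movement/physics is
// authoritative for all other clients. OnLocalGrab is a placeholder hook to be called
// by the grab interaction.
[RequireComponent(typeof(RealtimeTransform))]
public class GrabOwnership : MonoBehaviour
{
    private RealtimeTransform _realtimeTransform;

    private void Awake()
    {
        _realtimeTransform = GetComponent<RealtimeTransform>();
    }

    // Call this when the local player picks the object up.
    public void OnLocalGrab()
    {
        _realtimeTransform.RequestOwnership();
    }
}
```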


Justifying the Use of the Medium /

Between the real and the virtual: a fitted VR/AR space and embodiment.


Avatar Interaction/

Intuitive hand-tracking interaction in the VR environment


The Utilisation of Space/

A functional division of the different physical spaces, iterating on the functional use of the different areas in Lab v2.0 (shown as intuitive PC games)

ipad.png

Virtual & Real Room Mapping/

The physical lab room is matched to the virtual one (with negligible measurement/matching error) using an origin anchor point method, enabling tactile-feedback interaction
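To give a sense of the origin anchor point idea, here is a minimal sketch (with assumed transform names, not the project's actual code) that snaps the 1:1 lab model onto a tracked physical reference point, keeping only a yaw rotation so the floor stays level:

```csharp
using UnityEngine;

// Minimal sketch: align the virtual lab model to the physical room by snapping a
// virtual "origin anchor" onto a tracked physical reference point, e.g. a controller
// or tracked hand held at a marked corner of the real lab. All names are assumptions.
public class OriginAnchorAligner : MonoBehaviour
{
    [SerializeField] Transform virtualAnchor;   // empty placed at the matching corner of the 1:1 lab model
    [SerializeField] Transform physicalAnchor;  // tracked pose of the real-world reference point
    [SerializeField] Transform labRoot;         // root of the virtual lab model (parent of virtualAnchor)

    public void Align()
    {
        // Rotate only around the yaw axis so the virtual floor stays level.
        float yawOffset = physicalAnchor.eulerAngles.y - virtualAnchor.eulerAngles.y;
        labRoot.RotateAround(virtualAnchor.position, Vector3.up, yawOffset);

        // Then translate so the two anchor points coincide.
        labRoot.position += physicalAnchor.position - virtualAnchor.position;
    }
}
```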


Avatar Representation/

Selectable cartoon full-body avatars provide co-presence for multiple players. The synced location lets users stay aware of their own body and position in the physical environment

Model

Full-Body Avatar (Embodiment)/

I implemented the full-body avatar function using FinalIK in VR Lab v1.0.
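As a rough sketch of how FinalIK's VRIK solver can drive such an avatar (the target names here are assumptions; the actual project wires them from the Oculus rig and Normcore avatar prefab):

```csharp
using UnityEngine;
using RootMotion.FinalIK; // FinalIK plugin

// Minimal sketch of a full-body VR avatar driven by FinalIK's VRIK solver.
// The head/hand targets are assumed to follow the HMD and tracked hands.
public class FullBodyAvatarSetup : MonoBehaviour
{
    [SerializeField] VRIK vrik;                 // VRIK component on the avatar
    [SerializeField] Transform headTarget;      // follows the HMD
    [SerializeField] Transform leftHandTarget;  // follows the left tracked hand
    [SerializeField] Transform rightHandTarget; // follows the right tracked hand

    void Start()
    {
        vrik.AutoDetectReferences();                  // map the humanoid bones automatically
        vrik.solver.spine.headTarget = headTarget;    // head follows the headset
        vrik.solver.leftArm.target = leftHandTarget;  // arms follow the tracked hands
        vrik.solver.rightArm.target = rightHandTarget;
        vrik.solver.locomotion.weight = 1f;           // enable procedural foot placement
    }
}
```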


Hand Gesture Locomotion/

A few hand gestures, instead of the joystick locomotion method, control the forward and backward movement of the character
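A simplified sketch of this idea using the Oculus Integration package's OVRHand pinch detection (the exact gestures and wiring used in the project may differ):

```csharp
using UnityEngine;

// Illustrative sketch only: use OVRHand pinch gestures (Oculus Integration package)
// to move the character forward or backward instead of a joystick.
public class PinchLocomotion : MonoBehaviour
{
    [SerializeField] OVRHand rightHand;             // tracked hand driving the gesture
    [SerializeField] CharacterController character; // character to move
    [SerializeField] Transform head;                // HMD; defines "forward"
    [SerializeField] float speed = 1.2f;            // metres per second

    void Update()
    {
        if (rightHand == null || !rightHand.IsTracked) return;

        // Index pinch = move forward, middle pinch = move backward (assumed mapping).
        float direction = 0f;
        if (rightHand.GetFingerIsPinching(OVRHand.HandFinger.Index)) direction = 1f;
        else if (rightHand.GetFingerIsPinching(OVRHand.HandFinger.Middle)) direction = -1f;

        if (direction != 0f)
        {
            // Move along the headset's forward direction, flattened onto the floor plane.
            Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
            character.SimpleMove(forward * direction * speed);
        }
    }
}
```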

Walk-in-Place Locomotion/

In the on-site mode, users can walk around on their own feet


mtransparent.png
ftransparent.png

Hand Tracking Interaction/

To stay in line with the LBVR concept, I added hand tracking to the full-body interaction in v2.0 by exploring the new Oculus Interaction SDK released in February 2022. The main interactions are poking physical buttons, complex hand grabs, gesture recognition and hand UI rays.
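The poke interaction in the project comes from the Oculus Interaction SDK; as a simplified stand-in for the same idea, a physical button can also be modelled as a trigger volume that fires when a fingertip collider on the tracked hand enters it:

```csharp
using UnityEngine;
using UnityEngine.Events;

// Simplified stand-in for the "poke a physical button" idea (the real project uses
// the Oculus Interaction SDK's poke interaction). The button has a trigger collider;
// the tracked hand carries a small fingertip collider with a kinematic Rigidbody and
// an assumed "Fingertip" tag.
public class PokeButton : MonoBehaviour
{
    [SerializeField] string fingertipTag = "Fingertip"; // tag assumed on the fingertip collider
    [SerializeField] UnityEvent onPressed;              // hook up game logic in the Inspector

    void OnTriggerEnter(Collider other)
    {
        // Fire once when the fingertip enters the button's trigger volume.
        if (other.CompareTag(fingertipTag))
            onPressed.Invoke();
    }
}
```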

Leg IK/

In VR Lab v1.0, procedural leg animation was implemented simply by adjusting parameters in the FinalIK plugin. In the v2.0 individual project, the IK animation was upgraded to FinalIK's leg animation function, combined with motion-capture animation.

Voice Chat and Voice Controlled Facial Expression/

Voice chat is provided by the Normcore plugin.

The Oculus Lipsync SDK was combined with the VRM plugin to get viseme and eye expressions working, but I ran into issues with the multiplayer sync.

In the final version, I wrote a custom script that drives the lip movement and eye micro-expressions from the voice volume.
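A minimal sketch of that voice-volume approach (blendshape index and audio source are assumptions; the actual custom script differs): sample the avatar's voice audio each frame and map the average amplitude to a mouth blendshape.

```csharp
using UnityEngine;

// Hypothetical sketch: drive a "mouth open" blendshape from the volume of the
// avatar's voice AudioSource (e.g. the one the voice chat plays through).
// Eye micro-expressions can be layered on in the same way with extra blendshapes.
[RequireComponent(typeof(AudioSource))]
public class VoiceDrivenMouth : MonoBehaviour
{
    [SerializeField] SkinnedMeshRenderer face;
    [SerializeField] int mouthOpenBlendShapeIndex = 0; // assumed index of the mouth-open blendshape
    [SerializeField] float sensitivity = 400f;         // maps raw amplitude to a 0-100 weight
    [SerializeField] float smoothing = 10f;            // higher = snappier mouth movement

    private AudioSource _voice;
    private readonly float[] _samples = new float[256];
    private float _weight;

    private void Awake() { _voice = GetComponent<AudioSource>(); }

    private void Update()
    {
        // Average absolute amplitude of the most recent output block approximates the voice volume.
        _voice.GetOutputData(_samples, 0);
        float volume = 0f;
        foreach (float s in _samples) volume += Mathf.Abs(s);
        volume /= _samples.Length;

        // Smoothly map the volume to a 0-100 blendshape weight.
        float target = Mathf.Clamp(volume * sensitivity, 0f, 100f);
        _weight = Mathf.Lerp(_weight, target, Time.deltaTime * smoothing);
        face.SetBlendShapeWeight(mouthOpenBlendShapeIndex, _weight);
    }
}
```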

Concept & Feature #3: VR Illusion and Natural Interactions

Natural Locomotion and Hand Tracking with the Latest Oculus Interaction SDK

Concept & Feature #4: Multiplayer Social Activities

The big technical challenge of redoing the whole project

Activities
Avatar

Design 2.0

  • Redesign all the games from VR Lab v1.0 based on a better understanding of Normcore's multiplayer functions

  • Redesign the way the train game is presented

  • Redesign all UI and add curved UI

Implementation 2.0

  • Redo all interactions in the project as the input is upgraded to hand tracking

  • Debug some issues from VR Lab 1.0

  • Rewrite or optimize all scripts in VR Lab v2.0

  • Change realtime lighting to baked lighting

template.png
train.png
rock-paper-scissors.png
selfie.png
dart.png
graffiti.png
karaoke.png
brush.png
chess.png
after-party.png
color-wheel.png
1841783.png

Vectors all from 

Note: due to time constraints, not all games have the multiplayer feature implemented; I hope to finish it later when I have time

Cross-platform Application Support

Maybe the so-called Metaverse? :p

pc.png
vr-glasses.png
device.png

Vectors all from 

1841783.png

Implemented here with an XR rig character controller; could be improved with a walking animation

See the XR Avatars and Voice Chat samples in the Normcore Unity package

See the AR spectator concept in the Normcore sample Unity package

Cross-platform

Met the initial goal

In general, I believe I exceeded my expectations.


Technically, I realised my initial goal of improving my modelling skills by building the lab model in Blender, and I also learned simple blendshapes and toon shading from creating the avatar model.

 

Regarding Unity and programming, I didn't think I could implement a full-body avatar in a multiplayer environment at first, nor did I think I could do location-based VR. Moreover, I didn't think I could integrate hand tracking into a full-body avatar! I am very glad about what I have learned from all these implementations: the VR technology and a deeper understanding of VR as a medium. The Normcore API was a very helpful and interesting choice to work with. I learned a lot of things I didn't expect to learn at first.

The most challenging parts

The multiplayer functionality was the most challenging part of the group project, while the full-body avatar integrated with hand tracking and the latest Oculus Interaction SDK, along with optimization for the standalone Oculus Quest, was the most challenging part of the individual work.


Multiplayer avatars
We used Normcore to implement the multiplayer functionality; Haoyang focused more on this in VR Lab 1.0. I explored it at the very end of our group work and learned more during the individual work. The ownership logic is quite easy, but Normcore's real-time sync logic makes even simple single-player functions challenging to implement, since we have to consider synchronization from the server's view and every player's view; for example, players need to see each other's synced facial expressions and synced avatar selection changes.

Although it was very hard to understand at first, once I knew more about the logic behind the scripts it became interesting and inspiring. I spent more than two weeks trying the documentation's example scenes and functions. Finally, the multiplayer sync scripts changed my thinking: I don't need to know a unique number for every player; knowing which instance is the local one is enough to implement any multiplayer function I'd like. This lesson extended my programming skills.
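For readers unfamiliar with Normcore, the sync logic discussed here follows the plugin's documented model/component pattern; a minimal sketch (not the project's actual script) that syncs an avatar selection index to every player looks roughly like this:

```csharp
using Normal.Realtime;

// Sketch of Normcore's documented sync pattern: a RealtimeModel holds the shared
// state, and a RealtimeComponent applies changes locally whenever any client (or
// the server datastore) updates it. "AvatarSelectionSync"/"avatarIndex" are
// illustrative names only.
[RealtimeModel]
public partial class AvatarSelectionModel
{
    // (propertyID, reliable, createChangeEvent): Normcore's model compiler generates
    // the public avatarIndex property and the avatarIndexDidChange event from this field.
    [RealtimeProperty(1, true, true)]
    private int _avatarIndex;
}

public class AvatarSelectionSync : RealtimeComponent<AvatarSelectionModel>
{
    protected override void OnRealtimeModelReplaced(AvatarSelectionModel previousModel, AvatarSelectionModel currentModel)
    {
        if (previousModel != null)
            previousModel.avatarIndexDidChange -= AvatarIndexDidChange;

        if (currentModel != null)
        {
            // Freshly created model: initialise the shared state once.
            if (currentModel.isFreshModel)
                currentModel.avatarIndex = 0;

            // Apply the current state locally, then listen for remote changes.
            ApplyAvatar(currentModel.avatarIndex);
            currentModel.avatarIndexDidChange += AvatarIndexDidChange;
        }
    }

    // Called by the local UI when this player picks a new avatar; Normcore
    // propagates the change to every other client automatically.
    public void SelectAvatar(int index) { model.avatarIndex = index; }

    private void AvatarIndexDidChange(AvatarSelectionModel model, int value) { ApplyAvatar(value); }

    private void ApplyAvatar(int index)
    {
        // Swap the visible avatar mesh/prefab here.
    }
}
```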


Full body self-avatar with hand tracking interaction
This was a very difficult feature from every point of view. First of all, it was cumbersome because all input interactions had to be changed to hand tracking. However, I believe intuitive hand tracking is perfect for a location-based project where players can physically interact with the tables, switches and walls of the VR room. This is also part of my understanding of VR as a medium: it makes this a uniquely immersive VR experience with physical tactile feedback, which no other platform or interaction method can offer.

I had many other reflections and challenges during this project, and I solved or abandoned many features, from modelling to the avatar, from the location anchor to real-time interactions. Thanks go to the plugins, which made more features feasible in a short time, but which also caused conflicts and bugs that I spent a long time solving and learning from.

Reflection on My Work

The Goal and Challenges

Project Management

The timeline and task hub used in both versions

manage1.png
manage2.png