Virtual/Augmented Reality Portfolio


OpenGL Flocking Simulation and Fractals

Computer Graphics final OpenGL project: an ocean floor scene with colored starfish, flocking “fish”, and a fractal coral plant in the center. The plant’s fractal patterns were generated using a Lindenmayer system (L-system). The plant fractals are also textured, and branch animation is driven by a sine function.
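
As a rough sketch of the generation step, here is a minimal L-system string rewriter, written in C# for consistency with the Unity projects below (the original project is in C); the symbol meanings and the single rule are a classic bushy-plant grammar, not the project’s actual one:

    using System.Collections.Generic;
    using System.Text;

    static class LSystem
    {
        // Illustrative rule set: 'F' = draw branch, '+'/'-' = turn, '['/']' = push/pop state.
        static readonly Dictionary<char, string> Rules = new Dictionary<char, string>
        {
            { 'F', "FF+[+F-F-F]-[-F+F+F]" }
        };

        // Expand the axiom n times; the result is interpreted turtle-style
        // to emit the branch geometry.
        public static string Expand(string axiom, int n)
        {
            string current = axiom;
            for (int i = 0; i < n; i++)
            {
                var next = new StringBuilder();
                foreach (char c in current)
                    next.Append(Rules.TryGetValue(c, out string r) ? r : c.ToString());
                current = next.ToString();
            }
            return current;
        }
    }

When the expanded string is interpreted, each joint’s angle can be offset by sin(time) times a small amplitude to produce the branch sway.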


Programming Languages and Frameworks Used

C, OpenGL




NASA Spacesuit User Interface Technologies for Students (NASA S.U.I.T.S.) Design Challenge

The NASA S.U.I.T.S. Design Challenge is a mission-driven project in which university student teams design and create spacesuit informatics using an augmented reality (AR) Microsoft HoloLens platform. Working with CU’s Bioastronautics team, I helped develop a modular Unity codebase enabling current and future students to easily modify and build new procedures or user interfaces.

Motivation

This HoloLens application aids astronauts in performing tasks by presenting spacesuit status and instructional procedures via a voice command interface. Vuforia image tracking is also integrated to track and highlight tools during different steps of a procedure, for further aid under these stressful and intense operating conditions. The user interface was purposely designed to be simple and minimalistic to prevent unnecessary distractions. The designs were further improved during a one-week test of the application at NASA Johnson Space Center.
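
A minimal sketch of keyword-driven voice commands on the HoloLens, using Unity’s Windows-only KeywordRecognizer; the phrases and handlers here are illustrative placeholders, not the actual S.U.I.T.S. command set:

    using System.Collections.Generic;
    using UnityEngine;
    using UnityEngine.Windows.Speech;

    public class ProcedureVoiceCommands : MonoBehaviour
    {
        KeywordRecognizer recognizer;
        readonly Dictionary<string, System.Action> commands = new Dictionary<string, System.Action>();

        void Start()
        {
            // Placeholder phrases; each maps to a procedure/UI action.
            commands.Add("next step", () => Debug.Log("advance procedure"));
            commands.Add("show suit status", () => Debug.Log("toggle status panel"));

            recognizer = new KeywordRecognizer(new List<string>(commands.Keys).ToArray());
            recognizer.OnPhraseRecognized += args => commands[args.text].Invoke();
            recognizer.Start();
        }

        void OnDestroy()
        {
            if (recognizer != null) recognizer.Dispose();
        }
    }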


API/SDKs Used

Vuforia, Unity, HoloLens




HoloTouch
Augmented visualizations on 3D printed models

An open source Unity toolkit that sets up a 3D printed model (.obj) to be augmented with custom information using the HoloLens. The 3D printed model is tracked with Vuforia via four image targets.
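
A minimal sketch of the anchoring idea, assuming the four Vuforia image-target transforms are assigned in the Inspector; the names and the centroid-only registration are illustrative simplifications:

    using UnityEngine;

    public class MultiTargetAnchor : MonoBehaviour
    {
        public Transform[] imageTargets = new Transform[4];  // driven by Vuforia tracking
        public Transform augmentedModel;                     // overlay for the 3D print

        void Update()
        {
            // Anchor the augmentation at the centroid of the tracked targets;
            // orientation could additionally be derived from two targets' offset.
            Vector3 centroid = Vector3.zero;
            foreach (var t in imageTargets) centroid += t.position;
            centroid /= imageTargets.Length;
            augmentedModel.position = centroid;
        }
    }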


Motivation

In some situations, the visually impaired use tactile tools for learning topology and mathematics. These tools can look very different from the actual data due to the limited precision of how the learning tool was created (in this case, the 3D printer). HoloTouch’s simple toolkit allows researchers to create a HoloLens experience that augments information on top of these 3D printed tactile objects, thereby enabling collaboration between sighted and visually impaired participants.


API/SDKs Used

Vuforia, Unity, HoloLens


Take Aways

  • Multiple tracking targets to augment a 3D printed model
  • Simple Unity toolkit interface enabling customization of the models overlaid on the 3D printed model


Possible Next Steps

  • Use of the HoloLens voice interface for toggling which information is displayed



Follow Me

Winner of 2nd Place Best Windows Mixed Reality Hack at the USC Creating Reality Hackathon. An augmented reality tool that saves lives: a fire drill training application.


Motivation

Traditional fire escape drills are critical for saving lives, but have several inherent problems. They are highly disruptive to class or work schedules, are time-consuming and costly to organize, and safety information retention is abysmal. Participants go through the motions passively, and aren’t engaged in a way that ingrains the procedures into their memory.

Our implementation is cheaper and less disruptive than mass fire drills. The app enhances learning retention by using gamification, multimodal teaching, comprehension acknowledgement, and repetition. For fun, the app is stylized for kids, but a full build could easily swap in age-appropriate styles.

We have implemented 3 skill tests and a single exit path from an assembly space to outdoor safety.


API/SDKs Used

HoloLens, Unity


Accolades

2nd Place Best Windows Mixed Reality Hack

Devpost Project Page




NBA Data Visualization Tool

A simulation tool for viewing SportVU and NBA play-by-play data, developed for a Data Mining course. The SportVU data includes the positions of all players and the basketball every 0.25 s during the game. Visualization tool features include: enclosed hull display, trail rendering, camera controls, and start/stop/forward playback.
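
A minimal sketch of the playback core, assuming each player’s positions are loaded into an array of 0.25 s samples; the data layout and constants are illustrative:

    using UnityEngine;

    public class PlayerPlayback : MonoBehaviour
    {
        public Vector3[] samples;   // one court position per 0.25 s tick
        public float speed = 1f;    // 1 = forward in real time, 0 = stop, -1 = rewind
        float t;                    // playback clock in seconds

        void Update()
        {
            // Advance the clock, clamp to the recorded range, and interpolate
            // between the two surrounding samples for smooth motion.
            t = Mathf.Clamp(t + Time.deltaTime * speed, 0f, (samples.Length - 1) * 0.25f);
            int i = Mathf.Min((int)(t / 0.25f), samples.Length - 2);
            float frac = (t - i * 0.25f) / 0.25f;
            transform.position = Vector3.Lerp(samples[i], samples[i + 1], frac);
        }
    }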


API/SDKs Used

Unity




CU-Visually-2
GIS Data Visualization

A VR platform for visualizing and manipulating GIS (Geographic Information System) data using the Mapbox Unity SDK and service. Custom layers were created and integrated into the Mapbox SDK, allowing CU Boulder GIS data to be overlaid and accessed using VIVE controllers. Models of buildings can be picked up and manipulated to view embedded information about the building. Sliders allow brushing of the data, which is automatically updated in VR.


Motivation

The prototype demonstrates a platform to navigate and manipulate customized information in geo-tagged locations including:

  • room occupancy
  • building descriptions
  • building/floor plan level information

Visualizing spatial information and physically manipulating data can be more effective for exploratory learning than 2D and WIMP (Windows Icon Menu Pointer) interfaces.


API/SDKs Used

Mapbox SDK, Zenject Framework, Unity, VIVE, VRTK


Take Aways

Interactable building objects embedded with information were structured using an Inversion of Control framework (Zenject), allowing the flexibility to customize the embedded information (sketched below).
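
A minimal sketch of the Zenject wiring, with illustrative type names: the info provider is bound behind an interface, so a database-backed source could replace the static one without touching the interactable buildings:

    using Zenject;

    public interface IBuildingInfoProvider
    {
        string GetDescription(string buildingId);
    }

    public class StaticBuildingInfo : IBuildingInfoProvider
    {
        public string GetDescription(string buildingId) => "CU Boulder building: " + buildingId;
    }

    public class BuildingInstaller : MonoInstaller
    {
        public override void InstallBindings()
        {
            // Swap StaticBuildingInfo for another provider here; consumers
            // that [Inject] IBuildingInfoProvider never change.
            Container.Bind<IBuildingInfoProvider>().To<StaticBuildingInfo>().AsSingle();
        }
    }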

Explored user interface and user experience design for visualizing the embedded information and the controls for accessing it.


Possible Next Steps

  • Integration of different sources of information
  • Further exploration of UX/UI for visualizing information in VR
  • Integration of an assistant/voice user interface for executing common commands
  • Connecting the application to query information from a database rather than static locally hosted data
  • Multi-user support for data annotation





CrossoVR

A 24-hour HackCU III hackathon (a Major League Hacking event) submission that allows non-VR players to interact and play with a player in VR.

This cross-platform browser/VR game allows browser players to spawn bombs in the virtual environment to help the VR player fend off zombies.
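
A minimal sketch of the Unity side, using a plain .NET WebSocket for brevity (the actual project used socket.io); the endpoint and message schema are illustrative, and each message is assumed to fit in one frame:

    using System;
    using System.Net.WebSockets;
    using System.Text;
    using System.Threading;
    using UnityEngine;

    public class BombSpawner : MonoBehaviour
    {
        // Illustrative payload: {"type":"spawnBomb","x":1.0,"z":2.0}
        [Serializable] class Msg { public string type; public float x; public float z; }

        public GameObject bombPrefab;

        async void Start()
        {
            var ws = new ClientWebSocket();
            await ws.ConnectAsync(new Uri("ws://localhost:3000"), CancellationToken.None);

            var buffer = new byte[1024];
            while (ws.State == WebSocketState.Open)
            {
                var result = await ws.ReceiveAsync(new ArraySegment<byte>(buffer), CancellationToken.None);
                var msg = JsonUtility.FromJson<Msg>(Encoding.UTF8.GetString(buffer, 0, result.Count));
                if (msg.type == "spawnBomb")  // drop the bomb above the requested spot
                    Instantiate(bombPrefab, new Vector3(msg.x, 10f, msg.z), Quaternion.identity);
            }
        }
    }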


Motivation

VR can be a disengaging experience, especially for those who are not in the VR experience themselves. Our motivation was to enable non-VR players to be active participants alongside players in VR.

It was the first time my teammate Roldan and I built a browser/VR multiplayer experience using NodeJS and a WebSocket framework.


API/SDKs Used

VIVE, Unity, NodeJS, socket.io, AWS, HTML




Object Weight Visualizer

A quick prototype demonstrating an interface that lets users visualize and change the properties of virtual spheres. Users assign “weight” values to the spheres, which affect their color, size, and position. The sphere objects and interface were designed using the Model-View-Controller pattern, allowing the application to update the weight and the sphere’s color, size, and position accordingly.
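
A minimal sketch of the model/view split, with illustrative names and an assumed weight-to-visual mapping:

    using UnityEngine;

    // Model: holds the weight and notifies observers when it changes.
    public class SphereModel
    {
        public float Weight { get; private set; }                  // assumed range 0..1
        public event System.Action<float> Changed;
        public void SetWeight(float w) { Weight = Mathf.Clamp01(w); Changed?.Invoke(Weight); }
    }

    // View: maps the weight onto color, size, and height.
    public class SphereView : MonoBehaviour
    {
        public void Bind(SphereModel model) => model.Changed += Render;

        void Render(float weight)
        {
            transform.localScale = Vector3.one * Mathf.Lerp(0.2f, 1f, weight);
            GetComponent<Renderer>().material.color = Color.Lerp(Color.white, Color.red, weight);
            var p = transform.position;                             // heavier spheres sit lower
            transform.position = new Vector3(p.x, Mathf.Lerp(2f, 0.5f, weight), p.z);
        }
    }

A controller (e.g. the VIVE pointer interaction) would call SetWeight, and the view updates itself through the Changed event.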


Motivation

Experimented with designing an interface for selecting and changing object properties that isn’t WIMP (Windows Icon Menu Pointer) based. The design is inspired by shape-changing interfaces like inFORM, where the interface’s controls change based on context and the state of the application.


API/SDKs Used

VRTK, VIVE, Unity




ReLive VR

A 36-hour Major League Baseball Advanced Media and NYC Media Lab hackathon submission that allows users to relive a memorable moment in baseball history from the perspective of a player on the field. Users can switch between 1st- and 3rd-person points of view while the events of the game play out in real time.


Motivation

The proposed challenge was to utilize virtual reality to deliver a new experience for baseball fans. Leveraging the capabilities of VR, we focused on bringing fans the otherwise impossible experience of reliving a past baseball game.

The VR simulation lets fans have the unique perspective of being on the field, running to home plate, or reliving a past game for which only recorded footage exists. Layered contextual information, such as recorded in-game commentary and retrospective interviews, was incorporated to enhance the virtual experience.


API/SDKs Used

VIVE, Unity


Take Aways

  • Designed and built a time-based event queue and messaging system to trigger audio and animation events for the scripted narrative experience (sketched below).
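
A minimal sketch of such a queue, with illustrative scheduling details:

    using System;
    using System.Collections.Generic;
    using UnityEngine;

    public class TimedEventQueue : MonoBehaviour
    {
        // Events keyed by trigger time in seconds; duplicate timestamps would
        // need unique keys (e.g. a tie-breaking counter).
        readonly SortedList<float, Action> events = new SortedList<float, Action>();
        float clock;

        public void Schedule(float time, Action action) => events.Add(time, action);

        void Update()
        {
            clock += Time.deltaTime;
            // Fire every event whose timestamp has passed, in order.
            while (events.Count > 0 && events.Keys[0] <= clock)
            {
                events.Values[0]();   // e.g. start crowd audio, trigger a swing animation
                events.RemoveAt(0);
            }
        }
    }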

Press

2nd Place Winner for Virtual and Augmented Reality Track at the MLB Advanced Media and NYC Media Lab Hackathon.




Voice Control Navigation

A Unity application that lets the user navigate a rocket ship through a galaxy populated with planets and space debris.

Using Amazon Alexa, the user can issue commands to:

  • Start and stop the rocket thruster to move the spaceship
  • Change views between 3 different scales of the galaxy
  • Inspect or zoom the camera view onto a planet

Amazon Alexa recognizes the user’s intent (“start thruster”, “stop thruster”) and executes the corresponding events/commands in the Unity application, as sketched below.
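
A minimal sketch of the dispatch step, with illustrative intent names; how the intent travels from the Alexa skill into Unity is omitted here:

    using System.Collections.Generic;
    using UnityEngine;

    public class IntentDispatcher : MonoBehaviour
    {
        public Rigidbody rocket;
        Dictionary<string, System.Action> handlers;

        void Start()
        {
            handlers = new Dictionary<string, System.Action>
            {
                { "StartThrusterIntent", () => rocket.AddForce(rocket.transform.up * 500f) },
                { "StopThrusterIntent",  () => rocket.velocity = Vector3.zero },
            };
        }

        // Called with the intent name received from the Alexa skill.
        public void OnIntent(string name)
        {
            if (handlers.TryGetValue(name, out var h)) h();
        }
    }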

API/SDKs Used

Amazon Alexa SDK and Unity Package


Press

Winner of Most Useful Hack at Tackle STEM Hackathon




Political Landscape

An MIT Reality, Virtually Hackathon submission that explores the possibility of using VR to visualize information in an immersive, three-dimensional way.

Users answer Pew Research Center questions related to political beliefs by throwing their answers into a virtual space landscape. With each answered survey question, the landscape transforms, changing atmosphere color (red = conservative, blue = liberal) and creating valleys, lakes, ditches, and mountains based on their answer. After completing the Pew survey questions, the user is slowly lifted up, giving them a perspective of a landscape that visually represents their political views.
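
A minimal sketch of the answer-to-landscape mapping, with an assumed [-1, 1] answer scale and illustrative visual responses:

    using UnityEngine;

    public class LandscapeMapper : MonoBehaviour
    {
        // answer in [-1, 1]: -1 = conservative (red), +1 = liberal (blue)
        public void ApplyAnswer(float answer)
        {
            float t = (answer + 1f) * 0.5f;
            RenderSettings.ambientLight = Color.Lerp(Color.red, Color.blue, t);
            // Terrain deformation (valleys, lakes, mountains) would adjust the
            // TerrainData heightmap samples around where the answer landed.
        }
    }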


Motivation

This VR application is an experiment to see whether virtual reality can influence individuals’ perceptions of other people’s political proclivities as well as their own. It uses an adapted version of Pew’s Ideological Consistency Scale with an additional element of personal relevance. The construction and sharing of a virtual landscape as a representation of personal belief could potentially be used to initiate dialogue about politically sensitive topics and beliefs.


API/SDKs Used

VRTK, VIVE, Unity


Hackathon Experience Blog Post

Devpost Project Page

Hackathon Submission




Ramen Shop VIVE

A fun interactive experience built by importing into Unity a ramen shop created in Maya for my Animation/3D Modelling class. Throw and break plates in the ramen shop and hear the owner yell at you. Feel the ramen shop shake as a train passes by. Turn on the television and watch clips from movies and TV shows.


Motivation

Create an experience with interactable objects (breaking plates, cloth physics, a television with changing channels). Audio playback is triggered by interactions with objects, such as the breaking of plates, using a publish/subscribe event system (sketched below).
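
A minimal sketch of the publish/subscribe hookup, with illustrative topic names; a plate’s collision handler would publish “PlateBroke” when it shatters:

    using System;
    using UnityEngine;

    // A plate's collision handler calls EventBus.Publish("PlateBroke", transform.position).
    public static class EventBus
    {
        public static event Action<string, Vector3> Published;
        public static void Publish(string topic, Vector3 at) => Published?.Invoke(topic, at);
    }

    public class PlateAudio : MonoBehaviour
    {
        public AudioClip shatterClip;

        void OnEnable()  => EventBus.Published += OnEvent;
        void OnDisable() => EventBus.Published -= OnEvent;

        void OnEvent(string topic, Vector3 at)
        {
            if (topic == "PlateBroke")
                AudioSource.PlayClipAtPoint(shatterClip, at);  // shatter sound at the plate
        }
    }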


API/SDKs Used

VRTK, VIVE, Unity


Take Aways

  • Event based triggering of audio
  • Video and audio playback




CU-Visually

A University of Colorado project demonstrating a portal that allows prospective students to learn about CU Boulder international student statistics in VR using Amazon Alexa. The experience also includes 360° videos that can be played to preview study abroad experiences in different countries.

CU-Visually Voice User Interface

Motivation

To test querying database information using a voice user interface such as Amazon Alexa. Voice user interfaces offer the advantage of natural conversation, in contrast to having to learn and remember how to use a conventional interface. More information about the advantages of voice user interfaces can be found in my blog post.


API/SDKs Used

Amazon Alexa SDK and Unity Package, Amazon SQS, MongoDB


Take Aways

  • Development of an Amazon Alexa skill
  • Integration of the Amazon Alexa SDK, Amazon SQS, AWS Lambda, Unity, and a MongoDB database (polling sketched below)
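
A minimal sketch of the Unity-side queue poll, assuming the AWS SDK for .NET is available in the project and that the Lambda drops each recognized intent into SQS as a plain-text message; the queue URL is a placeholder:

    using Amazon;
    using Amazon.SQS;
    using Amazon.SQS.Model;
    using UnityEngine;

    public class AlexaQueuePoller : MonoBehaviour
    {
        const string QueueUrl = "https://sqs.us-east-1.amazonaws.com/123456789012/alexa-intents";
        IAmazonSQS sqs;

        void Start() => sqs = new AmazonSQSClient(RegionEndpoint.USEast1);

        // Long-polls once; call repeatedly (e.g. from a coroutine or timer).
        public async void Poll()
        {
            var resp = await sqs.ReceiveMessageAsync(new ReceiveMessageRequest
            {
                QueueUrl = QueueUrl,
                WaitTimeSeconds = 10
            });
            foreach (var m in resp.Messages)
            {
                Debug.Log("Alexa intent: " + m.Body);   // dispatch to the scene here
                await sqs.DeleteMessageAsync(QueueUrl, m.ReceiptHandle);
            }
        }
    }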



Audio Reactive Objects

A short experience in which a scripted voice assistant (a talking rock) instructs the user on how to interact with audio-reactive objects that play sound effects, letting users compose their own beats. In addition to playing music, the objects react to the audio by changing color and scale.
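
A minimal sketch of the reactive behaviour, driving scale and color from the RMS loudness of the playing AudioSource (Reaktion does this far more robustly; the constants are illustrative):

    using UnityEngine;

    [RequireComponent(typeof(AudioSource))]
    public class AudioReactive : MonoBehaviour
    {
        readonly float[] samples = new float[256];
        Renderer rend;

        void Start() => rend = GetComponent<Renderer>();

        void Update()
        {
            // Sample the current audio output and compute RMS loudness.
            GetComponent<AudioSource>().GetOutputData(samples, 0);
            float sum = 0f;
            foreach (var s in samples) sum += s * s;
            float level = Mathf.Sqrt(sum / samples.Length);

            // Map loudness to scale and color.
            transform.localScale = Vector3.one * (1f + level * 4f);
            rend.material.color = Color.Lerp(Color.gray, Color.cyan, level * 8f);
        }
    }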


Motivation

Mainly to test the use of scaling and color for representing a virtual assistant’s speech and expression, and also to build a scripted virtual assistant narrative for a tutorial sequence. SteamVR does a really good job of this.


API/SDKs Used

Keijiro Reaktion Package, VRTK, Unity


Take Aways

  • Modified Keijiro’s Reaktion package to process WAV file inputs
  • Tested integrating objects that respond to audio, for potential avatar/assistant behaviour



UAV Virtual Flight Deck System

A proof-of-concept prototype of a virtual reality visualization and control system for an unmanned aerial vehicle (UAV). Live video, infrared images, and telemetry data from the UAV are streamed to a server and relayed into virtual reality for the operator. A virtual navigation mapping system provides the operator with a visualization of the terrain at a GPS location and a method to plan a flight path for the UAV.
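
A minimal sketch of the telemetry display, assuming each WebSocket message body is a JSON packet; the field names are illustrative, not the actual packet format:

    using UnityEngine;

    public class TelemetryDisplay : MonoBehaviour
    {
        // Illustrative packet: {"lat":40.0,"lon":-105.2,"altitude":120.5,"heading":270.0}
        [System.Serializable]
        public class Telemetry { public float lat; public float lon; public float altitude; public float heading; }

        public TextMesh hudText;   // readout on the virtual flight deck

        // Called with each WebSocket message body received from the server.
        public void OnTelemetry(string json)
        {
            var t = JsonUtility.FromJson<Telemetry>(json);
            hudText.text = $"ALT {t.altitude:F1} m  HDG {t.heading:F0}°\nGPS {t.lat:F5}, {t.lon:F5}";
        }
    }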


API/SDKs Used

VRTK, VIVE, Unity, Python Twisted Server


Take Aways

  • Flight path plotting system.
  • GPS telemetry, video feed, and infrared images streamed to Unity using a WebSocket connection.



Virtual Arctic Experience

A virtual simulation of an arctic environment with gusting snow, wind sound effects, periodic breath fog, and floating glaciers. The simulation is controlled by a Python Twisted server, which pops up a virtual survey for conducting thermal comfort experiments.
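
A minimal sketch of the server-driven survey trigger on the Unity side, with illustrative command strings:

    using UnityEngine;

    public class SurveyController : MonoBehaviour
    {
        public GameObject surveyPanel;   // world-space UI with comfort-scale questions

        // Called with each command string received from the experiment server.
        public void OnServerCommand(string command)
        {
            if (command == "show_survey") surveyPanel.SetActive(true);
            else if (command == "hide_survey") surveyPanel.SetActive(false);
        }
    }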


Motivation

Conducting a thermal perception experiment that tests and measures the effect of a virtual environment on thermal perception and comfort. Based on past research and anecdotal accounts, users’ perception of temperature can be influenced by visual cues in virtual reality, such as looking at a virtual fire. This virtual reality experiment attempts to test this in an experimental setting by placing subjects in a temperature-controlled room while they experience being in an arctic environment.


API/SDKs Used

Unity, VIVE