Open source Unity toolkit that sets up a 3D printed model (.obj) to be augmented with custom information using the HoloLens. The 3D printed model is tracked with Vuforia using four image targets.
In some situations, visually impaired learners use tactile tools for learning topology and mathematics. These tools can look very different from the actual data due to the limited precision of the process used to create them (in this case, a 3D printer). HoloTouch’s simple toolkit allows researchers to create a HoloLens experience that augments information on top of these 3D printed tactile objects, thereby enabling collaboration between sighted and visually impaired participants.
Vuforia, Unity, HoloLens
Multiple tracking targets to augment a 3D printed model. Simple Unity toolkit interface enabling customization of different models to be overlaid on top of the 3D printed model.
USC Creating Reality Hackathon: Winner of 2nd Place, Best Windows Mixed Reality Hack. An augmented reality tool that saves lives: a fire drill training application.
Traditional fire escape drills are critical for saving lives, but have several inherent problems. They are highly disruptive to class or work schedules, are time consuming and costly to organize, and safety information retention is abysmal. Participants go through the motions passively, and aren’t engaged in a way that ingrains the procedures into their memory.
Our implementation is cheaper and less disruptive than mass fire drills. The app enhances learning retention by using gamification, multimodal teaching, comprehension acknowledgement, and repetition. For fun, the app is stylized for kids, but a full build could easily swap in age-appropriate styles.
We have implemented 3 skill tests, and a single exit path from an assembly space to outdoor safety.
Simulation tool for viewing SportVU and NBA play-by-play data, developed for a data mining course. The SportVU data includes the position of every player and the basketball for every 0.25 s of the game. Visualization tool features include: enclosed hull display, trail rendering, camera controls, and start/stop/forward playback.
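The enclosed hull display can be derived each frame from the player positions in that 0.25 s snapshot. A minimal sketch of the idea in Python (the original tool is a Unity implementation; the data below is illustrative), using Andrew's monotone chain algorithm to compute the convex hull of one team's positions:

```python
def convex_hull(points):
    """Andrew's monotone chain: returns hull vertices in CCW order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:  # build lower hull left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):  # build upper hull right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

# Court positions (x, y) of five players at one 0.25 s frame
frame = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 1)]
hull = convex_hull(frame)  # interior player (2, 1) is excluded
```

Rendering the hull as a filled polygon per team, updated each frame, produces the enclosed hull display.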
VR platform for visualizing and manipulating GIS (Geographic Information System) data using the Mapbox Unity SDK and service. Custom layers were created and integrated into the Mapbox SDK, allowing CU Boulder GIS data to be overlaid and accessed using VIVE controllers. Models of buildings can be picked up and manipulated to view embedded information about the building. Sliders allow brushing of the data, which is automatically updated in VR.
The prototype demonstrates a platform for navigating and manipulating customized information in geo-tagged locations.
Visualization of spatial information and physical manipulation of data can be more effective for exploratory learning than 2D WIMP (Windows, Icons, Menus, Pointer) interfaces.
Mapbox SDK, Zenject Framework, Unity, VIVE, VRTK
Interactable building objects embedded with information were structured using an Inversion of Control framework (Zenject), allowing flexibility in customizing the embedded information.
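The actual project uses Zenject in C#; as a language-agnostic sketch of the Inversion of Control idea (Python, all names hypothetical), a container binds an info provider once, and building objects receive it through their constructors instead of constructing it themselves:

```python
class BuildingInfoProvider:
    """Supplies the metadata embedded in each interactable building."""
    def __init__(self, records):
        self._records = records

    def info_for(self, building_id):
        return self._records.get(building_id, "no data")


class InteractableBuilding:
    # The provider is injected, not constructed here, so the data
    # source can be swapped (GIS database, test fixture, ...) freely.
    def __init__(self, building_id, provider):
        self.building_id = building_id
        self._provider = provider

    def on_pick_up(self):
        return self._provider.info_for(self.building_id)


class Container:
    """Toy IoC container: register factories, resolve by key."""
    def __init__(self):
        self._bindings = {}

    def bind(self, key, factory):
        self._bindings[key] = factory

    def resolve(self, key):
        return self._bindings[key]()


container = Container()
provider = BuildingInfoProvider({"engineering": "Built 1965, 5 floors"})
container.bind("provider", lambda: provider)
container.bind("engineering",
               lambda: InteractableBuilding("engineering",
                                            container.resolve("provider")))
building = container.resolve("engineering")
```

In Zenject the container and bindings would live in an installer's `InstallBindings`, but the payoff is the same: the embedded information can be customized by changing one binding.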
Explored user interface and user experience design for visualizing the embedded information and the controls for accessing it.
24 hr HackCU III hackathon (a Major League Hacking event) submission that allows non-VR players to interact and play with a player in VR.
This cross-platform browser/VR game allows browser players to spawn bombs in the virtual environment to help the VR player fend off zombies.
VR can be a disengaging experience, especially for those not in the headset. Our motivation was to turn non-VR players into active participants alongside players in VR.
It was the first time that my teammate Roldan and I built a browser/VR multiplayer experience using NodeJS and a WebSocket framework.
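The original server was NodeJS with WebSockets; the routing logic can be modeled in-process as a simple relay (a Python sketch with hypothetical names, not the actual implementation): browser clients emit spawn events, and the server forwards them to the VR client.

```python
import json


class Relay:
    """In-process model of the server's message routing: browser
    clients emit spawn events; every VR client receives them."""
    def __init__(self):
        self.clients = {}  # client_id -> {"role": ..., "inbox": [...]}

    def connect(self, client_id, role):
        self.clients[client_id] = {"role": role, "inbox": []}

    def send(self, sender_id, payload):
        msg = json.dumps({"from": sender_id, **payload})
        # Forward browser events to every connected VR client.
        for cid, client in self.clients.items():
            if client["role"] == "vr" and cid != sender_id:
                client["inbox"].append(msg)


relay = Relay()
relay.connect("headset", "vr")
relay.connect("laptop", "browser")
relay.send("laptop", {"type": "spawn_bomb", "x": 1.5, "z": -2.0})
event = json.loads(relay.clients["headset"]["inbox"][0])
```

In the real system each inbox is a WebSocket connection and the VR client instantiates a bomb at the received coordinates.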
A quick prototype demonstrating an interface that allows users to visualize and change the properties of virtual spheres. Users assign “weight” values to the spheres, which affect their color, size, and position. The sphere objects and interface were designed using the Model-View-Controller pattern, allowing the application to update each sphere’s color, size, and position whenever its weight changes.
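A minimal sketch of that Model-View-Controller split (Python, with hypothetical property mappings; the original is a Unity implementation): the model holds the weight and notifies observers, and the view derives the visual properties from it.

```python
class SphereModel:
    """Model: holds the weight and notifies observers on change."""
    def __init__(self, weight=0.0):
        self._weight = weight
        self._observers = []

    def subscribe(self, fn):
        self._observers.append(fn)

    @property
    def weight(self):
        return self._weight

    @weight.setter
    def weight(self, value):
        self._weight = value
        for fn in self._observers:
            fn(value)


class SphereView:
    """View: derives visual properties from the weight."""
    def __init__(self, model):
        self.scale = self.height = self.redness = 0.0
        model.subscribe(self.render)
        self.render(model.weight)

    def render(self, weight):
        self.scale = 0.5 + weight        # heavier -> larger
        self.height = 2.0 - weight       # heavier -> sinks lower
        self.redness = min(weight, 1.0)  # heavier -> redder


model = SphereModel()
view = SphereView(model)
model.weight = 0.5  # controller action: the view updates automatically
```

The controller (the user's input gesture) only writes to the model; every view stays in sync without knowing about the input method.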
Experimented with designing an interface for selecting and changing object properties that isn’t WIMP (Windows, Icons, Menus, Pointer) based. The design is inspired by shape-changing interfaces like inFORM, where the interface’s controls change based on context and the state of the application.
VRTK, VIVE, Unity
36 hr Major League Baseball Advanced Media and NYC Media Lab hackathon submission that allows users to relive a memorable moment in baseball history from the perspective of a player on the field. Users can switch between first- and third-person points of view while the events of the game play out in real time.
The proposed challenge was to utilize virtual reality to deliver a new experience for baseball fans. Leveraging the capabilities of VR, we focused on bringing fans the otherwise impossible experience of reliving a past baseball game.
The VR simulation gives fans the unique perspective of being on the field, running to home plate, or reliving a past game for which only recorded footage exists. Layered contextual information, such as recorded in-game commentary and retrospective interviews, was incorporated to enhance the virtual experience.
2nd Place Winner for Virtual and Augmented Reality Track at the MLB Advanced Media and NYC Media Lab Hackathon.
Unity application that allows the user to navigate a rocket ship through a galaxy generated with planets and space debris.
Using Amazon Alexa, the user can issue voice commands to navigate the rocket ship.
Amazon Alexa SDK and Unity Package
Winner of Most Useful Hack at Tackle STEM Hackathon
MIT Reality, Virtually, Hackathon submission that explores the possibility of using VR to visualize information in an immersive, three-dimensional way.
Users answer Pew Research Center questions about political beliefs by throwing their answers into a virtual space landscape. With each answered survey question, the landscape transforms, changing atmosphere color (red = conservative, blue = liberal) and forming valleys, lakes, ditches, and mountains based on their answers. After completing the Pew survey questions, the user is slowly lifted up, giving them a perspective of a landscape that visually represents their political views.
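The red-to-blue atmosphere shift can be expressed as a linear interpolation over the user's running survey score. A minimal sketch (Python; the scale bounds and the exact mapping are hypothetical, not taken from the actual application):

```python
def atmosphere_color(score, scale_min=-10, scale_max=10):
    """Map a running ideological score to an RGB atmosphere color:
    fully red at the conservative end, fully blue at the liberal
    end, purple in between. Scale bounds are hypothetical."""
    t = (score - scale_min) / (scale_max - scale_min)  # normalize to 0..1
    t = max(0.0, min(1.0, t))                          # clamp
    return (1.0 - t, 0.0, t)                           # (red, green, blue)


conservative = atmosphere_color(-10)  # fully red
liberal = atmosphere_color(10)        # fully blue
midpoint = atmosphere_color(0)        # purple
```

Terrain features could be driven the same way, with each answer nudging a heightmap parameter.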
This VR application is an experiment to see if virtual reality can influence individuals’ perceptions of the political proclivities of other people as well as their own. It uses an adapted version of Pew’s Ideological Consistency Scale with an additional element of personal relevance. The construction and sharing of a virtual landscape as a representation of personal belief can potentially be used to initiate dialogue about politically sensitive topics and beliefs.
A fun interactive experience importing into Unity a ramen shop created in Maya for my animation/3D modeling class. Throw and break plates in the ramen shop and hear the owner yell at you. Experience the shaking of the shop as a train passes by. Turn on the television and watch clips from movies and TV shows.
Created an experience with interactable objects (breakable plates, cloth physics, a television with changing channels). Audio playback is triggered by interactions with objects, such as the breaking of plates, using a publish/subscribe event system.
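The publish/subscribe pattern decouples the interaction from its effects: the plate only announces that it broke, and any number of handlers react. A minimal sketch (Python, with hypothetical topic and clip names; the original is a Unity implementation):

```python
class EventBus:
    """Minimal publish/subscribe bus: handlers register for a topic
    and are invoked whenever that topic is published."""
    def __init__(self):
        self._handlers = {}

    def subscribe(self, topic, handler):
        self._handlers.setdefault(topic, []).append(handler)

    def publish(self, topic, **payload):
        for handler in self._handlers.get(topic, []):
            handler(**payload)


played = []


def play_clip(clip, **_):
    played.append(clip)  # stands in for an AudioSource-style play call


bus = EventBus()
bus.subscribe("plate_broken", lambda **e: play_clip("shatter.wav", **e))
bus.subscribe("plate_broken", lambda **e: play_clip("owner_yell.wav", **e))
bus.publish("plate_broken", position=(1.0, 0.2, 3.0))
```

Adding a new reaction (say, a particle burst) means subscribing one more handler; the plate's collision code never changes.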
VRTK, VIVE, Unity
University of Colorado project demonstrating a portal that allows prospective students to learn about CU Boulder international student statistics in VR using Amazon Alexa. The experience also includes 360° videos that can be played to preview study abroad experiences in different countries.
CU-Visually Voice User Interface
To test querying database information through a voice user interface such as Amazon Alexa. Voice user interfaces have the advantage of natural conversation, in contrast to interfaces that users must first learn how to operate. More on the advantages of voice user interfaces can be found in my blog post.
Amazon Alexa SDK and Unity Package, Amazon SQS, MongoDB
A short experience where a scripted voice assistant (a talking rock) instructs the user how to interact with Audio Reactive Objects that play sound effects to compose their own beats. In addition to playing music, the objects react by changing in color and scale based on the audio played.
Mainly to test the use of scaling and color to represent a virtual assistant’s speech and expression, and to build a scripted virtual assistant narrative for a tutorial sequence. SteamVR’s tutorial does this well.
Keijiro Reaktion Package, VRTK, Unity
Proof-of-concept prototype of a virtual reality visualization and control system for an unmanned aerial vehicle (UAV). Live video, infrared images, and telemetry data from the UAV are streamed to a server and relayed into virtual reality for the operator. A virtual navigation mapping system provides the operator a visualization of the terrain at a GPS location and a method to plan a flight path for the UAV.
VRTK, VIVE, Unity, Python Twisted Server
Virtual simulation of an arctic environment with gusting snow, wind sound effects, periodic breath fog, and floating glaciers. The simulation is controlled by a Python Twisted server, which pops up a virtual survey for conducting thermal comfort experiments.
Conducting a thermal perception experiment that tests and measures the effect of a virtual environment on thermal perception and comfort. Based on past research and anecdotal accounts, users’ perception of temperature can be influenced by visual cues in virtual reality, such as looking at a virtual fire. This virtual reality experiment attempts to test this in an experimental setting by placing subjects in a temperature-controlled room while they experience a virtual arctic environment.