Social Virtual Reality

Week 14: In Memoriam


In Memoriam is an open-ended, multi-user virtual reality experience that presents a future in which nature is preserved as a synthetic monument.

Two participants, after donning VR headsets, enter a surreal world of plastic trees, open water, and forest sounds from the past. Entering at two different locations in the landscape, guests are tasked to find each other while exploring an artificial forest.

What is familiar? What is strange? How will they relate their respective locations? What language will they use to describe their foreign surroundings? What clues will they leave for one another, and what will they discover along the way?

In collaboration with Carrie Wang.

What can I write about networking in virtual reality? It’s challenging and fragile but possible if you have an inspiring professor and supportive peers. A huge thanks to Igal Nassima and our classmates who encouraged and helped us along the way.


Week 4: Painting in Virtual Reality


For this week’s exercise we built a virtual reality painting app like Tilt Brush (but not as fancy) for the Oculus Rift and Touch controllers. The SteamVR interaction library makes it easy to pair buttons and triggers on the hand controllers to functions in your project’s code. In my sketch, touching my “brush” to one of the cubes allows me to paint in 3D space with that same color when I squeeze my controller’s trigger. Here’s an outline of the setup and my code.


  1. I started by adding objects to my scene’s hierarchy. Adding the SteamVR Player object to the Hierarchy menu and disabling the default main camera allows the Rift-wearing user to become the camera instead.

  2. After that I created a basic 3D plane to which I applied the SteamVR Teleport Area Script. I also added the SteamVR Teleporting object to the Hierarchy, and these items in place allow me to move around the plane using my hand controllers.

  3. Then I added a 3D Plane with its own material (added to Mesh of the Mesh Collider component) to sit just below the Teleporting plane to define that area with some color.

  4. Next I created my color menu with three cubes—Red, Green, and Blue, each with their own material (attached to the Mesh Renderer component). I parented these cubes (made them children of another object in the Hierarchy) to my player’s LeftHand (Player > SteamVRObjects > LeftHand) such that when engaged in VR, they follow the movements of my left controller. My menu travels with me—thanks to Terrick for that idea!

  5. I also created a sphere (my “brush”), which I parented to my player’s RightHand. Using the SteamVR Behaviour_Boolean (Script), I set the GrabGrip action (squeezing the controller’s trigger) to call the Draw() function of my ControllerHandler script (see below).

  6. After that I created a sphere of the same size, which I converted into a Prefab in anticipation of instantiating it multiple times as the “paint” of my brush strokes.
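As an aside, the Behaviour_Boolean hookup from step 5 can also be sketched in code rather than wired up in the Inspector. This is an illustrative sketch, not my actual setup—the class name GripToDraw is made up, and it assumes the SteamVR plugin’s default GrabGrip action is bound:

```csharp
using UnityEngine;
using Valve.VR;

// Sketch: subscribe Draw() to the GrabGrip action in code instead of the Inspector.
public class GripToDraw : MonoBehaviour {

    private ControllerHandler controllerHandler;

    void Start() {
        controllerHandler = GetComponent<ControllerHandler>();
        // Call OnGripDown whenever the right-hand grip is squeezed.
        SteamVR_Actions.default_GrabGrip.AddOnStateDownListener(OnGripDown, SteamVR_Input_Sources.RightHand);
    }

    private void OnGripDown(SteamVR_Action_Boolean fromAction, SteamVR_Input_Sources fromSource) {
        controllerHandler.Draw();
    }
}
```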

As for the code, there are three scripts:

BrushHandler is attached to my brush and sets up a public color variable called currentColor. This does not mirror the actual color of my sphere, however, which I gave its own material.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class BrushHandler : MonoBehaviour {
    public Color currentColor;
}

ColorSelector is attached to each of the color menu’s cubes and does two things: it creates a public color variable that matches the color of its cube, and it contains an OnTriggerEnter function that sets the brush’s currentColor variable to its own color.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class ColorSelector : MonoBehaviour {

    public Color myColor;

    private void OnTriggerEnter(Collider other) {
        FindObjectOfType<BrushHandler>().currentColor = myColor;
    }
}

ControllerHandler is attached to my RightHand controller and creates a public GameObject to which I added my DrawingSphere Prefab (see above). My Draw() function, which is called when I squeeze the trigger of my RightHand controller, gets the value of the currentColor variable associated with my brush, instantiates the DrawingSphere object, adds that object to a List, assigns the color to it, and then positions it at the same location in space as my hand. If the number of spheres exceeds a threshold, the oldest are removed from the list and deleted so as not to bog down the computer’s memory with thousands upon thousands of game objects.

using System.Collections;
using System.Collections.Generic;
using UnityEngine;

public class ControllerHandler : MonoBehaviour {

    public GameObject drawingObject;
    private Color newColor;
    public List<GameObject> paintBalls;

    public void Start() {
        paintBalls = new List<GameObject>();
    }

    public void Update() {
        // If too many spheres exist, remove and destroy the oldest one.
        if (paintBalls.Count > 500) {
            GameObject oldest = paintBalls[0];
            paintBalls.RemoveAt(0);
            Destroy(oldest);
        }
    }

    public void Draw() {
        newColor = FindObjectOfType<BrushHandler>().currentColor;
        GameObject drawnObject = Instantiate(drawingObject);
        paintBalls.Add(drawnObject);
        drawnObject.GetComponent<MeshRenderer>().material.color = newColor;
        drawnObject.transform.position = transform.position;
    }
}

Next Steps
To improve the user’s experience, I would update the material of the brush to match the color of the menu item it selects. And, after Igal taught us about pure functions and state machines, I’d like to try incorporating those into projects using C# and other languages.
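A minimal sketch of that first improvement, extending ColorSelector so the brush itself takes on the selected color (this assumes the brush sphere has a MeshRenderer, which it should as a standard sphere):

```csharp
using UnityEngine;

// Sketch: set both the brush's paint color and its visible material color on selection.
public class ColorSelector : MonoBehaviour {

    public Color myColor;

    private void OnTriggerEnter(Collider other) {
        BrushHandler brush = FindObjectOfType<BrushHandler>();
        brush.currentColor = myColor;
        // Also tint the brush sphere itself so the user sees the active color.
        brush.GetComponent<MeshRenderer>().material.color = myColor;
    }
}
```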

Week 3: Introduction to Unity


Here I am learning out loud about Unity for a class focused on social virtual reality. Yes, networked VR! Unity is a massive game engine that supports many platforms for multiple applications. In the past week I’ve gotten to know my way around the interface, and now I’m learning how to assemble game objects and code in simple exercises. No doubt this post is a gross simplification of the basics, but it’s a start.

Exercise 1: Falling Boxes
There are five game objects in this project, which has one scene: a cube, a plane, a main camera, a directional light, and an object that I’m calling the Game Manager. Game objects are created and displayed in the Hierarchy menu. Each object has its own set of components. Right now I’m thinking about components as qualities that you attach to an object to set it up to respond to the environment or to other objects in a variety of different ways. Components are like an object’s “adjectives” and include information about how an object looks (its material), how it responds to light, and its orientation and position in space. All of these attributes can be altered by code (written in C#), AND you can also write code to program additional variables for an object’s components.
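For example, a short script can read and change a component’s values at runtime. This is a hypothetical sketch (the class name Tint and the rotation speed are made up), just to show the pattern:

```csharp
using UnityEngine;

// Sketch: components as "adjectives" that code can read and change at runtime.
public class Tint : MonoBehaviour {

    void Start() {
        // Change how the object looks: tint its material red.
        GetComponent<MeshRenderer>().material.color = Color.red;
    }

    void Update() {
        // Change its orientation: spin slowly around the y-axis.
        transform.Rotate(0f, 20f * Time.deltaTime, 0f);
    }
}
```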

In this exercise, the plane has seven components. Beyond the default components provided with a plane 3D object (transform, mesh filter, mesh renderer, mesh collider, default material), I added a rigidbody. A rigidbody sets up the object to respond to the laws of physics; however, it will not fall due to gravity if it is marked as kinematic, which this plane is.
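That same setup could also be done from code instead of the Inspector (a sketch; the class name is made up):

```csharp
using UnityEngine;

// Sketch: give the plane physics but pin it in place by marking it kinematic.
public class KinematicPlane : MonoBehaviour {

    void Start() {
        Rigidbody rb = gameObject.AddComponent<Rigidbody>();
        rb.isKinematic = true; // participates in physics but won't fall under gravity
    }
}
```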

To the cube, I added a rigidbody marked to use gravity, an audio source, and a material. Because it responds to gravity, the cube falls upon starting the game play. The color of the box was created as a material (Project > Create > Material) and dragged into the cube’s Inspector to sit with the other components. And audio plays for the cube when the game starts.

Both the plane and the cube have collider components (think of an invisible field that completely surrounds an object), which we can exploit to detect when objects’ colliders intersect with one another. I did not do that in this exercise, but of note: without the Box Collider marked for the cube, it will simply fall through the plane.

I styled the cube in the Hierarchy and then dragged it into my Project’s Assets folder to convert it into a Prefab, which means that I can create multiple instances of it. Creating multiple cubes is handled by my BoxCreator script, which is attached to my GameManager object. When the game starts, multiple cubes (as many as I want) are created at random locations.
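I haven’t reproduced the actual script here, but a BoxCreator along these lines might look like the following sketch (the prefab field and spawn ranges are assumptions):

```csharp
using UnityEngine;

// Sketch: instantiate a number of cube prefabs at random positions on game start.
public class BoxCreator : MonoBehaviour {

    public GameObject cubePrefab;   // drag the cube Prefab here in the Inspector
    public int numberOfCubes = 10;

    void Start() {
        for (int i = 0; i < numberOfCubes; i++) {
            // Random x/z position above the plane so each cube falls on play.
            Vector3 position = new Vector3(Random.Range(-5f, 5f), 5f, Random.Range(-5f, 5f));
            Instantiate(cubePrefab, position, Quaternion.identity);
        }
    }
}
```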

Exercise 2: Space Shooter Tutorial
This mini-project, a Galaga rip-off (awww classic arcade video games), helped me understand more of Unity’s potential. Oh my, there’s so much! Even though it looks 2D, I built out items as 3D objects which respond to different types of lighting. In addition to working with light, I learned how to download pre-created materials from the Asset Store so I could focus on integrating scripts to connect everything together for a game experience. I built out the project according to the tutorial but then incorporated a timer to reinforce my understanding. Here’s a brief outline and some main takeaways:

  • The GameController game object contains a script component linked to a script of the same name that handles the basic flow of the game. Similar to P5 sketches, it has Start() and Update() functions, so I can determine what happens upon play and what loops throughout. I can also add my own functions. This script ties together the main actions that occur while the game is in play and what to reset when the timer runs out or when my player, the spaceship, is destroyed by an asteroid. The GameController game object also has an audio source with background music and variables to display text on the screen.

  • I learned that using an IEnumerator-type function is a smart way to incorporate wait times without halting the entire program with a delay. The GameController script incorporates two of these: one to handle the constant timer countdown and the other to instantiate a new asteroid every half second.

  • My Player game object (the spaceship) has a rigidbody (that does not use gravity nor is it kinematic) and a mesh collider that is a trigger. It has three script components:

    • PlayerController: constrains (or clamps) the ship’s movement around the game board, sets its speed and its tilt when moving to the left or right, and instantiates fire bolts when the mapped key is pressed. In addition to Update(), this script uses FixedUpdate(), which is called automatically just before each fixed physics step.

    • DestroyByTime: when the game starts, access to the GameController script is initialized, such that when my gameController.timeLeft variable reaches zero, my player gameObject is destroyed (disappears from the game).

    • DestroyByContact: the player gameObject is destroyed when its mesh collider collides with an asteroid. Both the player and the asteroids are marked as triggers, and I’ve used OnTriggerEnter(). When the game starts, access to the GameController script is initialized such that when this gameObject is destroyed, gameController.GameOver() is called.

  • The Asteroids are PreFabs, each with a rigidbody, a capsule collider that is a trigger, and four script components:

    • RandoRotator: gets the rigidbody component to modify its angular velocity with random values.

    • Mover: gets the rigidbody component to modify how fast the asteroid moves forward.

    • DestroyByTime: the exact same script that is attached to the Player game object (see above).

    • DestroyByContact: the exact same script that is attached to the Player game object, except for the purposes of each asteroid this script states that when its collider intersects with one of the colliders of the ship’s bolts or that of the player gameObject itself, it explodes (the asteroid explosion is instantiated). When the game starts, access to the GameController script is initialized such that when an asteroid is destroyed, gameController.AddScore() is called to increase the number of points.

  • Boundary is a game object with a box collider that is a trigger and contains a DestroyByBoundary script component that destroys asteroids when they leave the game board or field of view. Without this, asteroids would just build up in our program and, I’m guessing, eventually slow everything down. Hmmm…is this a simple way to create a particle system?
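The coroutine idea from the bullets above can be sketched like this. The names and timings here are illustrative, not the tutorial’s exact code:

```csharp
using System.Collections;
using UnityEngine;

// Sketch: a coroutine waits between spawns without freezing the rest of the game.
public class AsteroidSpawner : MonoBehaviour {

    public GameObject asteroidPrefab;
    public float spawnWait = 0.5f;

    void Start() {
        StartCoroutine(SpawnAsteroids());
    }

    IEnumerator SpawnAsteroids() {
        while (true) {
            Vector3 position = new Vector3(Random.Range(-6f, 6f), 0f, 16f);
            Instantiate(asteroidPrefab, position, Quaternion.identity);
            // Yielding here pauses only this coroutine, not the whole program.
            yield return new WaitForSeconds(spawnWait);
        }
    }
}
```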
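The clamping and tilting behavior described for PlayerController might be sketched along these lines (the boundary and speed values are assumptions, not the tutorial’s exact numbers):

```csharp
using UnityEngine;

// Sketch: move the ship with physics in FixedUpdate and clamp it to the board.
public class PlayerController : MonoBehaviour {

    public float speed = 10f;
    public float xMin = -6f, xMax = 6f, zMin = 0f, zMax = 8f;
    public float tilt = 4f;

    private Rigidbody rb;

    void Start() {
        rb = GetComponent<Rigidbody>();
    }

    void FixedUpdate() {
        float moveHorizontal = Input.GetAxis("Horizontal");
        float moveVertical = Input.GetAxis("Vertical");

        rb.velocity = new Vector3(moveHorizontal, 0f, moveVertical) * speed;

        // Mathf.Clamp keeps the ship inside the visible game board.
        rb.position = new Vector3(
            Mathf.Clamp(rb.position.x, xMin, xMax),
            0f,
            Mathf.Clamp(rb.position.z, zMin, zMax));

        // Bank the ship as it strafes left or right.
        rb.rotation = Quaternion.Euler(0f, 0f, rb.velocity.x * -tilt);
    }
}
```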
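And the DestroyByBoundary idea is tiny: when an object’s collider exits the Boundary’s trigger volume, destroy it. A sketch:

```csharp
using UnityEngine;

// Sketch: remove anything that drifts outside the boundary's trigger volume.
public class DestroyByBoundary : MonoBehaviour {

    void OnTriggerExit(Collider other) {
        Destroy(other.gameObject);
    }
}
```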