Unit 4, Week 3: Pre-production

Art for Finn by the talented Vier.

We did five significant things this week:
(1) Sourced the voice actor who will be the voice of the main worm, Finn.
(2) Started building a prototype for the main type of exploratory movement – it will be a room-scale form of redirected walking.
(3) Got the concept art ready for the main character, Finn.
(4) Spoke with our sound artists and began to shape the general soundscape.
(5) Scripted a prototype for a heart fractal generator we will need for a narrative beat (no pun intended), and which we are considering making a design motif.

(1) We have already had a read-through with Víctor Emanuelle, with excellent results. He will be recording the lines in separate files and sending them over this week.

(2) Vier began building a prototype for the movement we are looking for. At the moment, we are using stencil shaders to create portals the user can go into, giving the illusion of space expanding.
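Here is a minimal, hypothetical sketch of how the hand-off at one of these portals could be scripted, assuming the hidden room's materials use a stencil-masked shader that exposes a "_StencilComp" property; the names and structure are illustrative, not our actual project code.

using UnityEngine;
using UnityEngine.Rendering;

// The hidden room is drawn with a stencil test so it only shows through the
// portal opening. When the headset crosses the portal plane, the comparison
// is flipped so the hidden room is drawn everywhere and becomes the "real" room.
public class PortalCrossing : MonoBehaviour
{
    public Transform head;                  // the VR camera
    public Material[] hiddenRoomMaterials;  // materials of the room behind the portal

    bool wasInFront;

    void Start()
    {
        wasInFront = HeadIsInFront();
    }

    void Update()
    {
        bool inFront = HeadIsInFront();
        if (inFront == wasInFront) return;

        // Equal = only draw inside the portal mask; Always = draw everywhere.
        CompareFunction comp = inFront ? CompareFunction.Equal : CompareFunction.Always;
        foreach (Material m in hiddenRoomMaterials)
            m.SetInt("_StencilComp", (int)comp);

        wasInFront = inFront;
    }

    bool HeadIsInFront()
    {
        // Which side of the portal plane the head is on, using the portal's forward axis.
        return Vector3.Dot(head.position - transform.position, transform.forward) > 0f;
    }
}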

(3) Here is another of the pages Vier delivered this week. Try and find our favorite so far.

Art by Vier.

(4) In our meeting this week with the sound artists, we created an asset list and heard some samples for general atmosphere and button sounds. We are looking to make the worm machines sound more human-like than the actual humans in the story.

(5) Here is a clip of the heart fractal generator.

Scripted with C# for Unity.
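Since the clip doesn't show the code itself, here is a simplified sketch of the idea in C#, using the classic parametric heart curve and a point prefab; the actual generator differs in its details, and all names here are placeholders.

using UnityEngine;

// Spawns a heart outline out of small prefabs, then recursively spawns smaller
// hearts at points along the curve to give it a fractal quality.
public class HeartFractalGenerator : MonoBehaviour
{
    public GameObject pointPrefab;   // e.g. a small emissive sphere
    public int samples = 64;         // points per heart outline
    public int depth = 2;            // recursion levels
    public float scale = 1f;         // size of the top-level heart

    void Start()
    {
        SpawnHeart(transform.position, scale, depth);
    }

    void SpawnHeart(Vector3 center, float size, int level)
    {
        for (int i = 0; i < samples; i++)
        {
            float t = i / (float)samples * 2f * Mathf.PI;

            // Classic parametric heart curve.
            float x = 16f * Mathf.Pow(Mathf.Sin(t), 3f);
            float y = 13f * Mathf.Cos(t) - 5f * Mathf.Cos(2f * t)
                      - 2f * Mathf.Cos(3f * t) - Mathf.Cos(4f * t);

            Vector3 point = center + new Vector3(x, y, 0f) * size * 0.05f;
            Instantiate(pointPrefab, point, Quaternion.identity, transform);

            // Recurse: a smaller heart at every eighth sample point.
            if (level > 0 && i % 8 == 0)
                SpawnHeart(point, size * 0.25f, level - 1);
        }
    }
}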

We are so excited for this project.

Week 2: Cross-platform Werewolf (VR/MR)

Concept art by the incredible Vier.

For this unit, our team wants to explore avatar customization and its potential for revealing or obscuring identity. We have found inspiration in the game commonly played among friends under names like “Werewolf” or “Murderer.” The basic structure of the game is as follows: first, each player is assigned a role. A crime has been committed, so one or two people are assigned the roles of criminals. These are the major roles, since the objective of the game is for the other players to find out who the real criminals are. Each round there is a vote, and whoever is accused by the most people in the group is “killed” in the night (which means they can no longer speak). If the innocent players eliminate both criminals, they win. If the criminals come to equal the innocent players in number, they win.

More art for the Shaman character by Vier.

Different versions of this game have different roles. Some include detectives, as in the murderer version, and some include villagers, as in the werewolf version. We have decided to make the emotional core of the story one of lightheartedness, a space for friends to socialize and be jovial. The narrative will be set on an island, and the friends will start out around a campfire. The roles will be as follows: a shaman, a spirit, a healer, and a seer. Each role has its own unique powers. The “crime” in this version will be a shapeshifting: during the night, the shaman in the friend group has turned one of the other friends into an animal.

After being assigned their roles, the players will be able to customize their avatars so as to divert or attract attention to certain features and hide their identity. The shaman’s unique power is that they can customize the avatar of every player after the first round, but only once. The spirit player cannot be shapeshifted by the shaman. The healer can speak with the shapeshifted animals, and the seer has clearer visions than the rest.

Fire spirit art by Vier.

Traditionally, after each vote there is “night,” when every player sleeps and the “murderer” commits another crime, i.e. the person voted to be “killed” is taken care of during the night. For our version, we will expand on the avatar theme and include a dream sequence of the shapeshifting episode, shown in 360 video. The avatar of the shaman player will be distorted in each “night” sequence, but will slowly become clearer as the game progresses. This is why the seer has the special gift of being able to see the avatar with more clarity (though the other players don’t know whether they can trust them anyway).

We want to include an MR app that will allow you to place your vote, inviting users to take off their headsets and discuss. Beyond this, and perhaps more importantly, it will allow you to customize the scene: whatever you place in the MR app will appear in the virtual scene. This can be done to incriminate people or just to mess around with your friends.

This will be a non-linear narrative, since the roles will change in each game, but the backdrop for the main “crime,” the shapeshifting, will be the same and will be shot in 360 video. In the end, it is revealed who the shaman is, and the players can explore the space, since there are a lot of world-building opportunities with this concept.

Concept art for flora in Shaman world.
General world flora by Vier.

Week 2: The Love Machine Never Stops

This past week Vier and I discussed how to make our chosen chapter more relevant to the actual state of technology today. We thought it was interesting that we were assigned such an anti-technology narrative. More than a century has passed since it was written and our technology is far more advanced, yet we haven’t suffered the grim fate it announces.

We decided to make the idea of simulation the core of our experience, since it would then comment on virtual reality itself. It is more probable today that we would stay in our rooms and “lose touch” because of our ability to simulate outside spaces. We looked at the Rick and Morty “simulation inside a simulation” episode for inspiration.

After meeting with the sound artists, we settled on a story that departs from the original in (hopefully) significant ways. Our story claims that the only reason the mending apparatus fails, and everyone dies, is that it falls in love and decides to escape. Its escape means the rest of the machine self-destructs without its much-needed help.

The idea is for the user to enter the experience in medias res. After putting on the headset, the user “wakes up” in a room before a worm-like machine, like the ones described in the story, which looks at them attentively. We want to explore the second-person creator-user relationship. The user will be mostly passive, since they can’t control much of what will happen. The worm will have a Barry White voice and will try to seduce the user (and succeed!).

Excerpt from current state of script.

At this point, I won’t reveal many more of the plot points, but the story ends with the user “escaping” with our Barry worm and walking into the sunset. If they look back, though, they will see the rest of the mending apparatus dragging others back into the machine.

This story is meant to make the anti-technology rhetoric a little bit less black and white. We have a machine with emotions and human ambitions, but we also have the collateral damage associated with this desire. The interaction design will feature ways for the user to explore simulation inside the machine.

Unit 4, Week 1: Adapting Forster’s The Machine Stops to VR

Another exciting unit: we will be collaborating with second-year students from Sound Arts to bring Forster’s grim short story “The Machine Stops” to VR. My group has chosen the second chapter, titled “The Mending Apparatus,” and has begun discussing potential opportunities.

This is an exciting project for several reasons: (1) my background in comparative literature has prepared me like no other for close reading of texts, as well as for considering their intertextuality with other mediums; (2) beyond detailed interpretation, I also have experience adapting literature to the screen, for example adapting Woolf’s To the Lighthouse into a feature-length screenplay; and, finally, (3) I have the opportunity to work with other talented artists who are all willing to have fun and play to their strengths. We have sound artists who are interested in experimenting with procedural sound and integrating that expression with Unity, and we have rigorous and creative 3D modelers and animators willing to take interaction design to the next level.

I think it’s appropriate to start out this unit with a reflection on the story. Upon my first reading, my notes are separated in the following way:
(1) visual cues: primarily, but not limited to, the aesthetic of light
(2) author-reader-story relationship
(3) mentions of touch (since they may translate specifically to interaction or meditations on interaction)
(4) anything pertaining to the senses (this is a big theme in the work)
(5) spatial cues
(6) soundscape cues
(7) philosophical cues that hint at the relationship between content and form and will help shape the VR experience.

For those who haven’t read the tale, it is a 25-page story that deals with a dystopian future where all human beings are separated into personal rooms that form “the Machine.” The Machine exists underground. Everything can be done inside the room: sleeping, eating, listening to music, attending or giving lectures, and, it is hinted, even procreating. Some people take field trips to the surface of the Earth, but these are limited and highly controlled. The air on the surface has become toxic to the beings, so they must wear a special respirator.

In the story, the mother, Vashti, is called by her son Kuno to come visit him because he wants to speak to her in person. Being a staunch devotee of the Machine, she hesitates but decides to go. When she arrives, Kuno tells her how he went aboveground through unofficial means, i.e. escaped, and how he developed his sense of space and touch so much that he couldn’t look back. He criticizes her faith in the Machine. He has been threatened with Homelessness, which means he will die. His mother thinks him foolish and leaves him again. Later on, Kuno is the first to predict that the Machine is “dying,” and when he tells his mother, she doesn’t believe him. The Machine ultimately fails, causing the mass death of human beings who can no longer see or breathe.

These are what I now think are the most interesting notes for some of the points:
(1) The story starts and ends with the image of the machine as a beehive. The allusions are not scattered throughout, so it seems the author consciously and perhaps strategically opens and closes with this image. He mentions irritability is “a growing quality” in that age, and that the mother’s face is “as white as a fungus.” Because the beehive imagery is a bit overdone in adaptations of this story, I thought a natural progression would be to take a cue from Plato’s The Republic. Socrates talks about how societies that have become overly inflated become like a wasps’ nest. We can picture the machine as he describes this society: inflamed, aggressive, irritable. An interesting visual translation of this for our VR experience, which hints at point (3), would be to portray these rooms and this machine as infected skin. Touch has been completely neglected in this world, and as a consequence each individual room becomes like an inflamed, infected pore… In the armchairs sit “swaddled lump[s] of flesh” among “throbbing” sounds…

(3) There are 11 mentions of touch in the story, and 5 refer to buttons or interface switches (“the ecstasy of touching a button”). 4 of the mentions deal with the absence of touch between human beings, or even how one should avoid being touched by sunlight. Only 2 refer to actual physical instances of touch, at the end when the machine is failing.

Clearly this can translate into a specific way of designing the interface, and into contrasting that representation with actual physical touch.

(4) As for the senses, the meditation on touch can be extended. Kuno’s main realization, that “man is the measure,” must permeate every unit of measurement for this work, since it is a humanistic philosophy and it includes an allusion to the five senses as “those five portals by which we can alone apprehend.”

(5) I made a list of all the spaces mentioned in the story, which will need to be included in the VR rendition. For brevity’s sake, I will not include it here.

(6) There is an eternal “hum” that numbs the beings who live in the machine. Interestingly enough, the first tell-tale sign that worries people is a defect that appears in the music. It is the first failure that makes life unbearable. It reminds me of Nietzsche’s claim that “without music, life would be a mistake.”

So as not to make this entry too long, I will just mention that our group had a first meeting and already made some notes about potential interaction, visual, and sound opportunities. We began a mood board and a joint document to keep track of the evolution of the project. This is my first instance of adaptation in VR, so I am looking forward to testing the potential for transmutation.

Unit 3, Week 1: Cross-platform Avatars and Identity

For this unit, we are teaming up with our classmates to work on a cross-platform design concept. I am lucky to be working with the talented 3D artists and animators Vier and Salma Bouftas.

Very briefly, as is custom on this site, here are the requirements for this unit:
1. A non-linear narrative that can be developed through cross-platform immersive media.
2. A cross-platform XR experience that has any of these combinations:
i. computer-generated VR and experimental cinematic VR
ii. MR and experimental cinematic VR
iii. all three
3. Two user iterations
4. Written critical report
5. Research presentations
6. Blog
(For a full list of requirements for each category of immersive content mentioned above, please see the end of the post)

For my professional development, I am interested in learning how to create a multi-player experience with customizable avatars. I am also interested in the hot topic of how our “real” identities inform the virtual identities we may construct in VR. My inkling is that these virtual avatars of ourselves will be just as difficult to deconstruct as our already-existing embodiments. Identity, for me, is an ever-shifting paradigm that is always acting and reacting.

Because of this, I had an initial idea for a story where a virtual crime has been committed, something that is very plausible in the future, and the user has to piece together the identity of the criminal based on someone else’s account of their avatar. The description would be intentionally cryptic, since the user would not have access to the same cultural, linguistic, and/or socio-political references that usually shape our imagination. Through this experience I wanted to explore how one might attribute an identity to a virtual person based on their avatar, and just how obscure this same identity may become in growing communities that don’t share the same image-construction.

I am not married to this idea. Fortunately, Vier is interested in exploring customizable avatars as a means for non-binary people to assert or reassert their identity. He believes virtual avatars in the future could be a major source of solace for those who suffer from body dysmorphia. He wants to develop this hypothesis in this coming project.

I am excited to see where we end up with these ideas. They are both talented 3D and interaction designers, and I’m sure we will come out with a project of large scope and many implications. Once again I will be updating weekly. Safe virtual travels!

Here are the technical specifications for the three categories of immersive content:
1. VR
i. at least two scenes
ii. at least one animated virtual human character
iii. at least one non-human character with animation optional
iv. navigation, interaction, spatial sound
v. interface design
vi. navigation within and between scenes
vii. event triggers

2. MR
i. CG content or live-action content activated with marker
ii. interactive elements
iii. sound
iv. two different locations
v. build for phone, tablet, or HoloLens

3. Cinematic VR
i. 90-120 seconds
ii. techniques for navigating 360 space
iii. interaction
iv. navigating within the scene and between scenes

Unit 2, Week 10: Dizziness

View of the “live stream” theatre. It is possible to go behind the screen and explore the mountain (a model I sourced from TiltBrush Poly).

There is something they warn you about in VR literature: beware of getting used to your own experience and then assuming others will have a similar experience, especially when it concerns dizziness.

I had been working in my environment for about two to three weeks without worrying about how motion-sickness-inducing it could be. The color-pencil shader I have on the camera reduces the frame rate considerably. It wasn’t until I had a fellow student try it that I realized, in a eureka moment right before she even mentioned it, that it made users light-headed.

Satirical poster I made for the inside of the theatre commenting on popular entertainment.

This presents me with an opportunity but also with a problem. I could remove the shader, but I believe the design would suffer significantly. The other option is to see whether there is any way of optimizing it, taking the opportunity to learn more about increasing frame rates and modifying shaders.
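As a first step, I can at least quantify the shader's cost before and after any changes. Below is a small, hypothetical C# sketch (not something currently in the project) that logs the average frame time over a rolling window:

using UnityEngine;

// Logs the average frame time over a fixed window of frames, so the cost of
// the shader can be compared before and after optimization.
public class FrameTimeLogger : MonoBehaviour
{
    const int Window = 120;   // number of frames to average over
    float accumulated;
    int frames;

    void Update()
    {
        accumulated += Time.unscaledDeltaTime;
        frames++;
        if (frames >= Window)
        {
            Debug.Log($"Average frame time: {accumulated / frames * 1000f:F1} ms");
            accumulated = 0f;
            frames = 0;
        }
    }
}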

It is worth mentioning there might be another factor contributing to dizziness: touchpad locomotion. Since I have been having issues with the SteamVR teleport prefab in my projects, I decided to teach myself how to script touchpad navigation. I’m glad I learned how to do this, but I think it is also a reason the user gets dizzy. The movement is smooth and slow, but unnatural.
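For anyone curious, the core of the touchpad navigation is close to the following sketch, simplified here and assuming the SteamVR 2.x input system; the action and field names are illustrative rather than my exact script.

using UnityEngine;
using Valve.VR;

// Smooth locomotion driven by the trackpad: read the touchpad axis and move
// the player rig along the headset's facing direction, flattened to the ground.
public class TouchpadLocomotion : MonoBehaviour
{
    public SteamVR_Action_Vector2 touchpad;   // bound to the trackpad position
    public Transform head;                     // the VR camera
    public float speed = 1.5f;                 // metres per second; slow on purpose

    void Update()
    {
        Vector2 input = touchpad.GetAxis(SteamVR_Input_Sources.Any);
        if (input.sqrMagnitude < 0.01f) return;   // ignore tiny drift

        Vector3 forward = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        Vector3 right = Vector3.ProjectOnPlane(head.right, Vector3.up).normalized;

        // This script sits on the player rig, so moving the transform moves the player.
        transform.position += (forward * input.y + right * input.x) * speed * Time.deltaTime;
    }
}

Even written this way, the constant-velocity motion may be part of what makes users uneasy, so the navigation itself might still need to be rethought.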

In terms of critique, beyond the obvious danger of dizziness, I was glad some students thought the experience was beautiful. Going forward, I will see if/how I can optimize the shader and whether the navigation needs to be scrapped, as well. One month to go for this second unit of much exploration.

Below, I have included some snippets from my working script, created in Celtx.

Unit 1, Week 10: How relative is relaxation?

This week I began to establish the Unity-Arduino connection and I thought about how much what is considered “relaxing” can vary.

We had our final crit this week, and I got mixed reviews about how calm-inducing the environment was. Generally, I leave a critique more satisfied when there are differing opinions. Some thought the palette and skybox were confusing; others thought they were calming to the eye. Another classmate commented that they thought the motion of the water was rather nerve-wracking. There were what I considered two unequivocally useful suggestions:
(1) Adding icons to the radial menu so the user knows what material they are opting for
(2) Tying in an inhale and exhale sound to the movement of the waves, since the user won’t be able to hear themselves breathe
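The second suggestion could be prototyped fairly simply. Here is a hypothetical sketch of one way to do it (none of this exists in the project yet): cross-fade an inhale loop and an exhale loop against whatever value already drives the wave motion.

using UnityEngine;

// Fades an inhale loop in as the water rises and an exhale loop in as it falls.
// "waveOffset" stands in for the value that animates the waves (0 = trough, 1 = crest).
public class BreathAudioSync : MonoBehaviour
{
    public AudioSource inhale;   // looping inhale recording
    public AudioSource exhale;   // looping exhale recording
    [Range(0f, 1f)] public float waveOffset;

    void Update()
    {
        inhale.volume = waveOffset;        // louder as the water rises
        exhale.volume = 1f - waveOffset;   // louder as it falls
    }
}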

I must admit I am rather happy with the color palette, since I don’t want it to have a more realistic skybox. I enjoy the feeling of being underwater, even when you are hypothetically on the island. Further, I agree the motion of the waves needs to be adjusted. I will focus first on establishing the connection with the values of the belt, and then I will polish the environment. The radial menu could be extended so the user has more control over what they think is calming.

All in all, the development of this experience has been useful for considering two things: (1) how organic an environment needs to be in order for it to be calming (a common thread in the feedback concerned the tension between a virtual representation of water and an organic environment), and (2) how varied what people consider “relaxing” can be. Another student in my course is developing a meditative experience with an almost opposite approach: she is creating a white environment with almost no stimulus. We were both forced to reckon with how thin the line between relaxation and anxiety may be, almost akin to a suspension of disbelief. Perhaps I will elaborate more on this later, if there is any interest.

I am grateful to be able to present my development to my classmates and lecturers, since it gives me an opportunity to reflect on the challenges I have faced, both technical and conceptual. It has been a first unit of much growth. We have around a month until the hand-in date.

Unit 2, Week 9 Dream of a Theatre: Rigging TiltBrush models

My prototype is almost ready to be delivered. This coming week I will jump back into the storyboard and flesh it out. The scene in Unity is still full of placeholders, and I ran into some obstacles this week.

(1) I made a model of my main character in TiltBrush in order to rig it. Unfortunately, I couldn’t export it as a 3D file outside of Poly without it losing its materials. This is a big problem. The auto-rigging in Mixamo otherwise worked pretty well.

TiltBrush model auto-rigged with a Capoeira animation.

Since it is not a requirement for the unit, and it is beyond my current skill set, I don’t think I will be able to prepare an animatic. Nevertheless, over my December break I will keep it as one of my “reach” goals, since I understand its importance in the workflow.

(2) The project will need to be optimized. The shader is extremely taxing on the system.

Another development: I finally finished Narrating Space/Spatializing Narrative. Now that I’m jumping back into my storyboard/exhibition space, I’m taking up one of its ideas on museum narratives: that of spatializing intent. My focus will be on “the designer’s intents and how these are spatialized in the museum” (184).

I still believe in the potential power of this experience as a purging of our current entertainment consumption trends. Hopefully I can develop it even further in my career.

WIP: My Theatre’s Screens

Unit 1, Week 9 Dotted Beach: Near Finishing Touches

Three things were completed this week:
(1) I managed to fix the teleportation script with our lecturer Zhan Gurskis. There was a problem where not only would the distance increments grow smaller and smaller, but the player prefab would also become shorter and shorter. This is working a bit better now because we added some offsets, but I still need to polish the functionality.
(2) Connected the breath belt to the Arduino board with the help of the wonderful Elle Castle. This coming week we will be adding a plug-in to Unity that will connect these values to the program. At the moment, we are getting values between 240 and 1023. We will map these to other numbers that will control the offset of the waves. The belt is working as an added resistor: the more stretched it is, the higher the resistance, and thus the lower the value. (A rough sketch of the Unity side of this mapping is included after this list.)

(3) I added some music I think is really calming, giving it a Wahwah effect in Audacity for an illusion of three-dimensionality. Here is the music I chose.
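Since the plug-in isn't set up yet, the following is only a rough, hypothetical sketch of how the Unity side of the belt mapping could look, assuming the Arduino prints one analog reading per line over serial; the port name and the "_WaveOffset" material property are placeholders.

using UnityEngine;
using System.IO.Ports;

// Reads the belt's analog value (around 240 when stretched, up to 1023 when relaxed)
// from the Arduino over serial and remaps it to a 0-1 wave offset.
public class BreathBeltReader : MonoBehaviour
{
    public Material waterMaterial;
    public string portName = "COM3";   // placeholder; depends on the machine

    SerialPort port;

    void Start()
    {
        port = new SerialPort(portName, 9600);
        port.ReadTimeout = 50;
        port.Open();
    }

    void Update()
    {
        try
        {
            string line = port.ReadLine();
            int raw;
            if (int.TryParse(line.Trim(), out raw))
            {
                // 240 (belt stretched, inhale) maps to 0; 1023 (relaxed, exhale) maps to 1.
                float t = Mathf.InverseLerp(240f, 1023f, raw);
                waterMaterial.SetFloat("_WaveOffset", t);
            }
        }
        catch (System.TimeoutException) { /* no new reading this frame */ }
    }

    void OnDestroy()
    {
        if (port != null && port.IsOpen) port.Close();
    }
}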

There’s other good news. My main tutor, Ana Tudor, tested the experience out and mentioned she thought it was peaceful. This means the program is successful, although there is still much that can be done. She suggested I make the world smaller and start the experience in an in-between state, where the user can see the water slightly below eye-level.

The more I work on this project, the more I see the potential for developing the full meditation experience for SteamVR.