October 2019

Unit 2, Week 4: Virtual Storyboards

This week I thought about two things relating to my first VR narrative. The first was working up a second draft of the storyboard by the end of the week, and the second was the whole concept of embodiment in VR experiences.

Embodiment is something VR developers see as necessary whenever it can be achieved: the user feels they are “inside” a body, which is in turn “inside” a virtual environment. Research suggests the degree of embodiment varies with how much that body can interact with the environment and how purposeful it seems to the user.

I find this fascinating: VR users seem to complain that a virtual body does not positively add to an experience unless it has a meaningful purpose. In a more interactive narrative, if I am given a hand, I need to be able to do something with that hand. In a less interactive narrative, if I am given a body, I need to know why I am seemingly embodied in a space where I can’t move or talk. This is just proof of the old adage: equip someone with a hammer and all they will see are nails.

We have to be very careful with what we equip users with in these experiences. The ornamental will easily be read as such and could jeopardize the utility of the experience. Every hammer given must have a nail around to hit, or a very good reason for being useless in the given space.

Another interesting effect of this concept is that it places standards on virtual embodiments that we don’t place on our own “real” bodies. Each day more people identify less with their embodiment, and we exist on a sort of spectrum between feeling we have a body and feeling we are a body. The very wording of feeling we are “inside” a body implies entrapment, whereas being a body entails movement and identification with that movement. I imagine this spectrum will eventually map onto virtual worlds, where we will be able to consciously reimagine and reenact the muddled relationships we have with our own bodies.

Ok, enough musings about embodiment. This relates to my narrative project because I want to create the opposite of a sense of embodiment: an out-of-body experience (think of lucid dreaming, when one floats away from one’s own body).

I plan to achieve this in two ways: (1) letting the user literally switch between a personal POV and a more omniscient POV where characters walk below their nose, doll-house style, and (2) mimicking the visual changes one can have in a consciousness-altering experience. These changes include magnification, where objects seem to protrude towards you and some of their details appear magnified in a dynamic way. (If you’re interested in how this can be done: I intend to create this particular effect using a shader with a light source that constantly moves. Dramatic forms and their changing shadows could recreate this visual impression of a magnifying form.)
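Since the moving-light idea is easier to see in numbers than in prose, here is a minimal sketch of the underlying math: a Lambertian diffuse term with a light that orbits over time, so a static form brightens and darkens as its shading shifts. This is illustrative Python only (the real effect would live in a Unity shader), and every name and constant below is mine, not from the project.

```python
import math

def diffuse_intensity(normal, t, orbit_speed=1.0):
    """Lambertian diffuse term for a light orbiting in the XZ plane.

    normal: unit surface normal (nx, ny, nz)
    t: time in seconds; the light direction rotates with t, so the
    same surface point brightens and darkens over time, which reads
    as shifting shadows on a static form.
    """
    angle = orbit_speed * t
    light_dir = (math.cos(angle), 0.5, math.sin(angle))
    # normalize the light direction so intensity stays in [0, 1]
    mag = math.sqrt(sum(c * c for c in light_dir))
    light_dir = tuple(c / mag for c in light_dir)
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, dot)  # clamp: faces turned away get no light
```

A surface facing straight up keeps a constant intensity while side-facing surfaces pulse as the light swings past them, which is exactly the dynamic-shadow behavior I am after.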

The storyboard is still unfinished, but given my time constraints I will dip in and out of it over the next six weeks. Right now it is complete with place-holders, and I will continuously retouch it with details until it looks fleshed out.

https://vimeo.com/369118149
Second draft of VR storyboard.

My next step is to create all of the models in the next two weeks so that I can build the scene in Unity. I am not required to fully animate this for January, but of course I will try my best. I intend to trace over models from Mixamo in TiltBrush that I can later auto-rig. Hopefully by the seventh week I can start animating and have the remaining three weeks to complete the interface.

This is all for this week, and I hope you’re enjoying this journey with me. Thank you to all who see these updates on social media and send me encouraging comments!

Unit 1, Week 4: What can meditation in VR look like?

Welcome back to another week of VR development! This week my challenge was to reimagine meditation (and I still haven’t done this successfully). How can we create a meditative space in VR that is (1) easy on the eyes and (2) shies away from more conventional meditation approaches that rely on green, organic forms from nature?

Because I want this to be a sort of escape from the natural world, I am aiming for more of a glitch aesthetic. Earlier this week I had been working in TiltBrush for several hours when it suddenly glitched. The resulting image was beautiful: the models I was working on stood larger than life on both sides of me, mirroring each other. In between them, below my feet, a strand of repetitive lights curled around itself like a DNA strand. I loved the feeling of witnessing a glitch. I felt apart (not a part) from it, and I stood in a sort of awe for many minutes. The moment was strangely peaceful.

So this is what I want to recreate: a sudden suspension of reality. The subtle feeling that something is not happening as it should, but is nonetheless worthy of taking a step back and letting it in. An invitation to observe an apparent lapse in space and time.

This is all very easy to express in words; let us see how I can explore the gap between the elemental/geometric and the natural/organic. Here are some concrete features I am going to add in the coming weeks:

(1) Breath belt! I just sourced a skater’s cloth belt that will be perfect for the breath measurement. I am lucky enough to be able to work with the intelligent and crafty people at the Creative Technology Lab at UAL, and Elle Castle from Physical Computing will be helping me build the belt. We will take some conductive rubber and sew it onto the belt, which will then wrap around the user and register the contraction and expansion of the stomach just below the ribcage. We will integrate Arduino with Unity to store these values and convert them into those of the virtual ocean.

(2) Object interaction: the user will be able to control features like dot size and density with hand gestures, not buttons on the controller. This will make for a customizable and more intuitive experience, since everyone has different preferences and should have a measure of control over their meditative beach.

The user will be able to control the dot size and density with simple hand gestures.

(3) Gaze interaction: after having spent ten minutes in the space, the user will have the ability to float over the ocean waves. Their position in virtual space will rise, and they will move wherever they look. This will give the sense of floating over one’s own breath levels. I am curious to try it out myself!

(4) The ability to go underwater and swim with whales. I will include whales with simple animations, taking care not to make them look too realistic. Perhaps I will keep them a solid color. I am considering recreating an abstracted ocean floor where apparent light is refracted all around you. An algae forest could be nice.
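To make feature (1) a bit more concrete: my current thinking is that the belt’s raw readings get normalized against a short calibration phase before driving the ocean. The function names and numbers below are hypothetical (the real calibration will depend on the sensor and the Arduino sketch), and I am sketching it in Python for brevity rather than the eventual C#.

```python
def normalize_breath(raw, raw_min, raw_max):
    """Map a raw stretch-sensor reading onto [0, 1].

    raw_min/raw_max would come from a short calibration phase
    (full exhale / full inhale); readings outside that range
    are clamped so a loose belt can't break the scene.
    """
    if raw_max <= raw_min:
        raise ValueError("calibration range must be positive")
    clamped = min(max(raw, raw_min), raw_max)
    return (clamped - raw_min) / (raw_max - raw_min)


def wave_offset(breath, max_offset=0.5):
    """Convert a normalized breath value to an ocean offset.

    0.0 = full exhale -> waves crash against the shore,
    1.0 = full inhale -> the sea recedes by max_offset units.
    """
    return -max_offset * breath
```

On the Unity side, a C# script would read these values over serial each frame and hand the offset to the water shader.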

During all of this, I have to keep in mind that my deadline is early January, so I want to set defined and simple objectives for each of the six weeks I have before then. I want to have a demo I can later expand.

This is all for this week, and I hope you enjoy the progress! Let me know in the comments if there is a feature you are interested in or can improve upon, or something altogether new you think is essential for meditation in VR.

Unit 2, Week 3: Scripting and Storyboarding VR

This week I managed to get a script onto paper. It isn’t written in the form of a traditional script but is nearer to a treatment. I thought this was more appropriate for my short experience (60-90 seconds), and it gave me much more liberty to deconstruct interactivity.

I won’t include the full breakdown in this post, but the experience takes place in a movie theatre and features three characters: a concession-stand employee, a movie-goer, and an Ophelia-like screen actress. As of now, my main objective is to induce an altered state of consciousness, akin to that of dreaming. In its current state, the script gives the user the constant possibility to switch between the movie-goer’s POV and a bird’s-eye view. I have chosen to limit this freedom at two critical moments in the story: the beginning, or set-up of the action, and the climax/ending. I believe this will make for a more satisfying experience, since it provides a circular paradigm (given the specific context of the experience).

As for the storyboard, I went into TiltBrush for a couple of hours and sketchily explored an idea I have for what a storyboard can be in VR. Since space will become more and more narrative in VR, I thought a storyboard could become a structure that reflects the story. I made a theatre that incorporates the ambiance and color palette of the world I want to create, and turned it into an exhibition space. Inside, there will be a color key for the different characters, the interface, and the different POVs. The corridors diverge when viewpoints diverge.

https://vimeo.com/367557809
Dream of a Theatre: my first pass at storyboard as exhibited narrative space.

I opted for this method because I think of story as structure and structure as story. Wasn’t it Goethe who said to think of architecture as frozen music and music as liquid architecture? There’s an undeniable relationship between how we perceive spaces and how we perceive abstract expressions of emotion. For now, I am exploring this idea in the context of VR experiences: the storyboard as an exhibited narrative space.

Unit 1, Week 3: Visual Strains and Gaze interaction

It’s going to be a short update this week. Two new things:
(1) The water now responds to input. When the space bar is held down, the ocean recedes and rises slightly, and once it is released, it returns to its initial position. At this stage I can start preparing the breath-monitoring belt so we can later swap that keyboard input for the numerical values the belt will produce.
(2) I added a small island where the user will be able to teleport around.
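For the curious, the space-bar behavior in (1) boils down to easing the ocean’s offset toward a target that depends on whether the key is held. Here is a minimal sketch of that easing step, in Python rather than the actual Unity C#; the names and rates are illustrative only.

```python
import math

def step_toward(current, target, speed, dt):
    """Move `current` toward `target` at `speed` units per second.

    Called once per frame with the frame time `dt`; this is the
    frame-rate-independent way to make the ocean glide between
    its raised and resting positions rather than snap.
    """
    max_step = speed * dt
    delta = target - current
    if abs(delta) <= max_step:
        return target  # close enough: land exactly on the target
    return current + math.copysign(max_step, delta)

# Per frame: target = RAISED if the space bar (later, the belt
# reading) says "inhale", else RESTING.
```

In Unity this would sit in `Update()`, with `Time.deltaTime` as `dt` and the key state read from the input system.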

Some conceptual challenges consumed most of my time this week. I need to figure out what sort of gaze interaction would be appropriate for this space, given my objectives of, first, prioritizing breath and, second, increasing awareness of the relationship between breath frequency and visual perception. I want the virtual environment to be neither challenging nor taxing on the senses, so I am considering working in a greyscale world with low contrast. Perhaps the gaze interaction could consist of making small RGB values appear in the TV-static-like ocean. Unfortunately, this could prove counterproductive given my aim of avoiding eye strain, since it could distract from the central focus of the world: our lungs.

I am only starting to think through all of this. As a side note, since it’s not part of the requirements of this unit, I also found it compelling to see the mesh used for the water as a dome above us. Have a look for yourself:

https://vimeo.com/367554906
“Underwater”: under the water shader

I think this is an interesting texture to work with from many perspectives, yet it is most likely I will refrain from using it anywhere else, since it would detract from the bio-feedback ocean.

As for my MR app, the main challenge I now face is ensuring it always has a correct perspective. If the virtual hand does not correlate with the phantom limb user’s perspective, the visual feedback will have no efficacy. For this I have been looking at cylindrical trackers and what they can do to ensure the virtual arm and hand’s correct orientation.

This is all for this week, and I hope you enjoy seeing this project come to virtual life. Comment any suggestions or reactions. I would love to know what would make this environment more useful.

Unit 2, Week 2: Cinematic VR

I was lucky enough to watch a 14-minute VR film some days ago, which changed my whole approach to this story-telling unit. I had been convinced that this 60-90 second story of mine needed an interactive element to be successful in VR, but this film proved me wrong. I watched it with a cardboard headset on my phone, and my eyes had a hard time adjusting to the camera focus, yet it still moved me to tears. The camera angles and movements are compelling and intuitive. The use of close-ups is ultra-successful and lands right on the necessary emotional beats. It proved, for me, that cinematic VR could very well be defined by the simple choice of letting yourself go as a VR user and being led. Of course, leading well as a director is no easy matter.

This realization comes at a good time because my priority for this unit is narrating space. I am now convinced camera movement will be my protagonist, and the user will only need to sit and look around. This may seem boring to users who are used to more of a gamification of experiences, but trusting a narrative to move for you has great potential. It is imperative, however, that it is a camera that thinks. My instructor has recommended a book on spatializing narrative, which I ordered for my personal library and am very keen to read, but it still hasn’t arrived.

Here is a sketch I did exploring character design and palette.

The main character (right) and a laid-back girl in the theatre, whose face is a mirror (inspired by Maya Deren’s Meshes of the Afternoon).

I have been exploring some character design, but my goal for the coming week, now that I have decided to fully limit interaction to head movement, is to write a starting script. This will raise many questions about scripting 360 stories and is sure to be a feat. Wish me strength and flexibility as I start building this narrative world.

Unit 1, Week 2: Dotted ocean, bio-feedback and phantom limb pain treatments

For the first part of this unit, I have decided to pursue a meditative environment, where an ocean responds to the values of a user’s breathing. I will scrap the idea of including a hammock as a physical object or possibility in the program, since it might be outside of the scope of this project.

The ocean will be an abstracted sea, made to look like TV static with an extra dimensionality. Here are some updates:
(1) I went down to the Creative Technology Lab at UAL to see what would be my best option for breath detection. I was told of an experiment another student had done before, where she made a belt that translated the expansion and contraction of the lungs into values. We decided that might be my best option, since other, more accessible options, like a microphone, would only register very exaggerated breathing and completely miss subtle cues like gentle inhales.
I was told to first worry about getting the ocean to move and respond to some sort of input, and then we could worry about the bio-feedback. So I got to work on the dotted sea.

(2) Zhan Gurskis, our lecturer in immersive and interactive design, helped me add a script to a plane that makes it move like an ocean. This was done first by aligning a series of spheres and adding a transform function to the x position.

(3) I like this form of the dotted ocean, but I would prefer more of a density, and more of the illusion of volume. Zhan showed me another approach, which instead imports a simple plane from Maya and uses a shader to animate its points. This ended up looking something like this:

Zhan added some sliders to control dot size and density.
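My understanding of the shader approach (this is my own reconstruction of the math, not Zhan’s actual code) is that each vertex of the plane gets displaced vertically by a sum of sine waves. In Python, for illustration:

```python
import math

def ocean_height(x, z, t, waves=((1.0, 0.15, 0.0), (2.3, 0.05, 1.3))):
    """Height of the dotted ocean surface at point (x, z) and time t.

    Each wave is (frequency, amplitude, phase); summing a few sines
    at different frequencies gives a rolling, less repetitive look.
    These particular values are illustrative, not the shader's.
    """
    y = 0.0
    for freq, amp, phase in waves:
        y += amp * math.sin(freq * (x + z) + t + phase)
    return y
```

In the real shader, the sliders for dot size and density would be separate uniforms; only the height displacement is sketched here.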

Going forward I want to play around with the 2D texture on the dots to give more of an illusion of their three-dimensionality. Ideally users could see the dots as if TV static were protruding out of a screen. The sea would retract once they inhale and crash against the shore as they exhale. Maybe they can control the color of the noise in the audio (there’s a whole rainbow of noise audio).

For the AR part of the assignment, I have carried on with the idea of direct visualization as a treatment for PLP (phantom limb pain). Mirror therapy (where unilateral amputees use a mirror to duplicate their working limb) has been shown to be a promising alternative. Most subjects report that they can feel their phantom limb moving when viewing the reflection of their intact limb.

I want to create an AR app where the missing limb can become an avatar limb, with no aim at photo-realism. I want to do this because I think an explicit suspension of reality might be more effective than an attempt at fully tricking the brain into believing it is the original limb. Further, it seems more visually striking to me and might add a symbolic dimension to the pain that could prove useful. I imagine the avatar would look something like this:

The user would have the option to touch where it “hurts” and get visual validation of the pain.

From an article on the effect of direct observation on PLP:

The results of this specific study showed that direct visualization was more effective than mental visualization. Because of this, I would like to add some basic animations the user can choose from and observe as if they were happening in their own body, i.e. like those cited above.

For the animation of the limb, I haven’t decided if I also want it to be able to mirror another limb in the case of unilateral amputees. Nevertheless, I do not exclude the possibility of making it accessible to those who lack both limbs, be they arms or legs. This last use will require a different set-up which I will need to take into consideration.

This is what is going on so far around this unit, and I’m excited to see how it progresses.

Unit 2, Week 1: Storytelling in Virtual Reality

“If the poet’s duty is to reveal all their secrets, then I rather do so in my sleep.”                             (Jean Cocteau, I think? I can’t remember where I read or saw this, and if anybody knows the source, please leave a comment! I’ve been racking my brain for months now.) 

“The ultimate desire of any artist is to get someone to listen to their dream. With film, I can force you into it.”
                            (Another whose authorship or precise wording I can’t recall, yet sticks with me, possibly also Cocteau…)

Our second unit involves VR as an evolving storytelling medium. Many creators are rethinking the way they work in order to bring storytelling to this new technology, which necessarily means coming to terms with its limits as it stands today: not breaking plausibility (once it is broken, it is possibly lost forever), inducing a sense of spatial presence that may fluctuate (here users are a bit more forgiving), avoiding motion sickness, designating clear points of focus and interest, and the list goes on…

Here are the requirements for this unit:
(1) A short script for a linear narrative VR experience (60-90 seconds)
(2) A storyboard in one of the available VR apps (with an integration of VR related concepts, like interaction, presence, or the uncanny valley, etc.)
(3) A VR environment based on the storyboard
(4) A critical report exploring
(a) the concept (500 words)
(b) why it’s best suited for a 360 immersive space compared to linear media/ how it reflects the relevant concepts (max 1200 words)

For this unit I have a clear objective. I want to prioritize the concept of spatial story, since I think it is an essential ingredient of this new medium. I want to create a sense of constructive and productive mystery that will blur the lines between the physical world and the virtual world. This I will achieve by creating a space and characters with levels of abstraction (not photo-realistic, but painterly). The script in itself will revolve around the language of dreams, where most of our visceral reactions are produced.

My objective is heavily influenced by both my experience with dreaming (I have kept a dream journal for more than a year now) and my recent research of immersive narratives. The term “environmental storytelling” was popularized in the gaming world, but may prove to be relevant in VR narratives, as well. According to Henry Jenkins of MIT,

“Environmental storytelling creates the preconditions for an immersive narrative experience in at least one of four ways:
(1) spatial stories can evoke preexisting narrative associations
(2) they can provide a staging ground where narrative events are enacted
(3) they may embed narrative information within their mise-en-scene
(4) or they provide resources for emergent narratives.”

(This citation was taken from John Bucher’s book Storytelling for Virtual Reality, p. 66)

Here is the space I will be recreating, literally from a dream I had of a theatre:

Dream of a Theatre, sketched in Procreate, 2019

I believe the 90-second story will go something like this: the user begins in front of a theatre curtain and has to push it aside to enter. Inside, they will suddenly get a view of the theatre with its three characters. The one lying down may look at the viewer if the viewer chooses to pass by them, while another will beckon them to sit beside them. I’m still not sure how I will make sure they choose to do that, since an uncanny-valley effect may be induced and I don’t necessarily want to force discomfort.

I may need to explore the concept of vection as it relates to VR, since it creates the illusion of self-motion. An example of this is when one is sitting in a stationary train and sees another depart. One may perceive their own train as moving. If this illusion is achieved (without motion sickness!), I believe it may be a great victory for all VR storytelling.

The problem of getting the user to sit by the character notwithstanding, this narrative may also be an experience where the ending is set in motion no matter what the user chooses: in this case, a flooding of the theatre, with the screen as the spring. If the user does choose to sit by the man and listen to what he has to say, they will be subject to a reflection on dreams that brings the unreality to the forefront while the theatre floods. This interaction can serve as a verbal confirmation of the visual experience.

I have chosen this approach because one of my main interests is to use VR to create new mythologies. For this, I believe, a spatial story and a visceral presence are necessary to induce later reflection on the narrative. I want emotion to be the primary effect, in this case a sense of wonder and curiosity. Should it fail, I think it will still be a worthwhile resource for understanding how users respond to this kind of abstract spatial narrative.

For my creation of atmosphere, I am currently looking at these examples taken from traditional 2D media. 

One of the most potent effects VR may have is a redefining of our understanding of authorship and story… cheers to that! I would love to hear reactions to this sort of approach I’m taking for Unit 2, as I’m sure they would inform the evolution of this story (if one can even call it that yet…).

Unit 1, Week 1: VR Environments and MR Apps for Mobile Devices

I had two motives when coming into VR. The first was to create meditative environments that serve as a breath of fresh air for the mind. The second, loosely related but with less explicit medical intentions, was to bring painting to this medium.

We will be working with Unity throughout the course. Here are the limitations for this unit:
(1) The VRE (Virtual Reality Environment) must include:
(a) a computer-generated scene including terrain, lights, and a skybox
(b) navigation of the space, be it teleportation or any other sort of locomotion
(c) some sort of interaction, including but not limited to controller-based and gaze interaction
(d) optional 360 sound

(2) The MR content must include:
(a) computer-generated content or live-action content activated via QR code
(b) computer-generated content must be 2D or 3D animation (max 5 seconds) or a 3D object
(c) exportation for phone or tablet on a platform of choice, Android or iOS
(d) sound

As of week one, my idea for the VRE is a recreation of my favorite hammock on the beach. I decided for this approach since it remains true to my vision for a meditative environment. Further, should it fail, I have it as a point of departure for the rest of my journey. Some initial objectives:
(1) To prepare an environment that can later be extended with features for a meditation program, such as validation of time passed (i.e. for every ten minutes spent there, there is a gratification in the form of pleasing visuals) and/or unlocked potential within the world
(2) A crucial one: biofeedback or the integration of breath monitoring. I intend on making a shader in Unity for water whose texture is white noise. The waves crashing on the shore should correspond to the breath of the user. This is meant to increase awareness of the effects of breath on perception, since there seems to be a correlation between anxiety or lack thereof and conscious breathing. This is seen in people who have frequent panic attacks and learn to control their breathing to prevent them at the onset.
(3) Manipulation of graphic quality so as to disturb the eyes as little as possible: this can be done by offering the option of a dark mode, as is done in many modern apps. This ensures the user can remain in the environment for longer periods of time. It is important to note that, because it is a meditation app, I do not preclude the possibility of the user closing their eyes as part of the experience of the beach space. This is an essential part of meditation, and the user must feel at liberty to do so and still remain a part of the environment.
(4) The physical object of a hammock could be integrated into the experience. I think this raises difficulties in computer animation, as well as physical issues with lying down while wearing the HMD. A quick idea for overcoming the latter is to present the experience in an exhibition space where a special headrest cradles the back of the head and diminishes discomfort.

Enough about the VRE thus far. Admittedly, I have not given much thought to MR before this unit, and for some reason its applications come less naturally to me. Nevertheless, my idea for the MR application is designed for visualization techniques in the treatment of phantom limb pain. The idea is to create an app where a person missing a limb may visualize said limb and touch trouble areas, which would light up. The user may then have the option of “treating” them through massage or other purely symbolic techniques. I need to do more empirical research on visualization techniques for phantom limb patients, and that is sure to come. My concept for this section is inspired by my belief that visualization is a powerful technique that can rewire perception of the human condition. Let’s see if this idea survives my research or if it is forced to evolve into something more useful and/or purely amusing.

That’s it for Week 1, stay tuned to see where these experiments may go. Be sure to check out my post on Unit 2 coming this week, as well, since it tackles the story-telling aspect of more cinematic VR. Safe travels to all virtual heads!

Introduction

Hi all,

As of this week I am beginning my journey into artistic VR. As part of my year-and-a-half MA program in VR, I will document my projects on a weekly basis. This will serve two purposes: (1) provide evidence to my instructor that I am doing the required coursework and (2) provide a future reference for myself and other VR creators on our way to defining this new medium.

I choose the symbol of the Datura to accompany my endeavor. The obvious reason is my love for flowers and how they hold in their blossoming a universal truth of growth and fruition. The less obvious reason is how they too, like VR, hold both a great potential and a great danger. The Datura flower’s poison, in the best of cases, blurs the boundaries between fantasy and reality. In the worst of cases, it can be lethal.

Join me on this tightrope between the planes of intoxication and sobriety so that we can begin to fully understand how VR can illuminate perception as it stands today and/or shape it for the future.