
Unit 1, Week 10: How relative is relaxation?

This week I began establishing the Unity-Arduino connection, and I thought about how much what is considered “relaxing” can vary.

We had our final crit this week, and I got mixed reviews on how calm-inducing the environment was. I generally leave a critique more satisfied when there are differing opinions. Some thought the palette and skybox were confusing; others found them calming to the eye. Another classmate commented that the motion of the water was rather nerve-racking. There were what I considered two unequivocally useful suggestions:
(1) Adding icons to the radial menu so the user knows what material they are opting for
(2) Tying an inhale and exhale sound to the movement of the waves, since the user won’t be able to hear themselves breathe (a rough sketch of how this might work follows below)
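Suggestion (2) is concrete enough to prototype already. Below is a speculative sketch of one way to tie a breath recording to the waves: an audio source whose volume follows the same offset value that drives the ocean. The “_WaveOffset” property name and the linear volume curve are my assumptions, not settled design.

```csharp
using UnityEngine;

// Speculative sketch: a looping inhale/exhale recording whose volume
// follows the same offset value that drives the waves.
// "_WaveOffset" is an assumed shader property name.
[RequireComponent(typeof(AudioSource))]
public class BreathSound : MonoBehaviour
{
    public Material water;   // the ocean material that exposes the wave offset
    AudioSource breath;      // looping inhale/exhale recording

    void Start() => breath = GetComponent<AudioSource>();

    void Update()
    {
        // Louder as the wave offset (and so the breath) deepens
        float offset = water.GetFloat("_WaveOffset");
        breath.volume = Mathf.Clamp01(offset);
    }
}
```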

I must admit I am rather happy with the color palette, since I don’t want a more realistic skybox. I enjoy the feeling of being underwater, even when you are hypothetically on the island. I do agree, though, that the motion of the waves needs to be adjusted. I will focus first on establishing the connection with the values of the belt, and then I will polish the environment. The radial menu could be extended so the user has more control over what they find calming.

All in all, the development of this experience has been useful for considering two things: (1) how organic an environment needs to be in order for it to be calming (a common thread in the feedback concerned the struggle between a virtual representation of water and an organic environment) and (2) how varied what people consider “relaxing” can be. Another student in my course is developing a meditative experience with an almost opposite approach: she is creating a white environment with almost no stimulus. We were both forced to reckon with how there may be a thin line between relaxation and anxiety, almost akin to a suspension of disbelief. Perhaps I will elaborate more on this later, if there is any interest.

I am grateful to be able to present my development to my classmates and lecturers, since it gives me an opportunity to reflect on the challenges I have faced, both technical and conceptual. It has been a first unit of much growth. We have around a month until the hand-in date.

Unit 1, Week 9: Dotted Beach Nears Finishing Touches

Three things were completed this week:
(1) I managed to fix the teleportation script with our lecturer Zhan Gurskis. There was a problem where not only did the distance increments grow smaller and smaller, but the player prefab also became shorter and shorter. It is working a bit better now because we added some offsets, but I still need to polish the functionality.
(2) I connected the breath belt to the Arduino board with the help of the wonderful Elle Castle. This coming week we will be adding a plug-in to Unity that will connect these values to the program. At the moment, we are getting values between 240 and 1023, which we will map to numbers that control the offset of the waves (a sketch of what that mapping might look like follows this list). The belt works as an added resistor: the more it is stretched, the higher the resistance, and thus the lower the reading.

(3) I added some music I think is really calming, giving it a Wahwah effect in Audacity for an illusion of three-dimensionality. Here is the music I chose.
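As promised above, here is a minimal sketch of how the belt readings might be mapped in Unity. It assumes the Arduino prints one analog reading per line over serial; the port name, baud rate, and the “_WaveOffset” shader property are placeholders rather than the plug-in we will actually use.

```csharp
using System.IO.Ports;
using UnityEngine;

// Minimal sketch: read the belt's analog values from the Arduino over
// serial and remap them onto a wave-offset parameter. Port name, baud
// rate, and property name are assumptions.
public class BreathBeltReader : MonoBehaviour
{
    public Material waterMaterial;                   // material driven by the breath
    SerialPort port = new SerialPort("COM3", 9600);  // placeholder port settings

    void Start()
    {
        port.ReadTimeout = 50;   // don't stall the frame waiting for data
        port.Open();
    }

    void Update()
    {
        try
        {
            // One analog reading per line, e.g. "712"
            int raw = int.Parse(port.ReadLine());

            // Readings run from ~240 (belt stretched) to 1023 (slack);
            // invert so a deeper inhale gives a larger offset in 0..1.
            float t = Mathf.InverseLerp(1023f, 240f, raw);
            waterMaterial.SetFloat("_WaveOffset", t);
        }
        catch (System.TimeoutException) { /* no new reading this frame */ }
    }

    void OnDestroy() => port.Close();
}
```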

There’s other good news. My main tutor, Ana Tudor, tested the experience and mentioned she thought it was peaceful. This suggests the program is on the right track, although there is still much that can be done. She suggested I make the world smaller and start the experience in an in-between state, where the user can see the water slightly below eye level.

The more I work on this project, the more I see the potential for developing the full meditation experience for SteamVR.

Unit 1, Week 8: Ramp Bridges Island and Seabed

Another short update this week since I was traveling for a wedding and didn’t have access to my computer. In brief, issues still needing resolution:
(1) Teleport script only works once, then the sphere pointer nearly blinds the user.
(2) Breath belt still needs to be finished.
Once these two are resolved, I can build the project for my deadline.

What I managed to do was set up a ramp connecting the island to the seabed, so that the user can go up and down as they wish. I haven’t been able to test it, but here is what it’s looking like so far.

I’m taking advantage of this coming week to forge ahead with the belt and finish it within the next two weeks. I will also be looking at the teleport script to see where the problem lies.

Ambitious goals for the January deadline:
(1) Have the gaze-triggered animation involve the whales circling the user.
(2) Make the sea-inspired aesthetic more cohesive by taking inspiration from the Czech surrealist film Malá Morská Víla, specifically its treatment of underwater scenes.

Unit 1, Week 7: Short update

This week I had to focus on my other projects, as well as my health, so I only got three things done for Unit 1:
(1) I scripted the gaze interaction for my meditative beach; the whales in the scene now respond to the user’s gaze and dive (a minimal sketch appears at the end of this update). I will try to make more sophisticated animations in the coming weeks, like having certain whales swim around the user when looked at.
(2) I started rigging my AR phantom arm. This was my first time 3D modeling and rigging, so it has taken longer than I expected. I am still not done, but once the animations are in place, setting up the actual program via Vuforia and Unity should only take a couple of hours.
(3) I sewed some conductive thread into my breath belt and confirmed we were getting values. My next step will be to connect the Arduino board to this simple circuit.

Conductive thread in my breath belt.
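For those curious about the gaze scripting in (1), here is a minimal sketch of the approach. It assumes each whale has a collider tagged “Whale” and an Animator with a “Dive” trigger; those names and the raycast distance are my placeholders, not necessarily what ended up in the project.

```csharp
using UnityEngine;

// Minimal gaze-interaction sketch: a ray cast straight out of the
// headset camera triggers a whale's dive animation when it hits one.
// The "Whale" tag and "Dive" trigger are assumed names.
public class GazeWhaleTrigger : MonoBehaviour
{
    public Camera hmdCamera;          // the headset's camera
    public float maxDistance = 100f;  // how far the gaze reaches

    void Update()
    {
        Ray gaze = new Ray(hmdCamera.transform.position, hmdCamera.transform.forward);

        if (Physics.Raycast(gaze, out RaycastHit hit, maxDistance)
            && hit.collider.CompareTag("Whale"))
        {
            // The Animator may sit on a parent of the collider
            var animator = hit.collider.GetComponentInParent<Animator>();
            if (animator != null) animator.SetTrigger("Dive");
        }
    }
}
```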

Unit 1, Week 6: Radial Menu for Object Interaction

Hello, again! This week’s update will be far from prosaic.
(1) First, let’s talk about my object interaction.
My intention was to have an intuitive way to control the water shader’s properties, which then turned into an attempt to control said properties via the touchpad. For the sake of time, however, I have settled on another method. I found a tutorial for setting up a radial menu that activates once the user touches the touchpad. With this menu, the user can change the material of the water shader. As it stands, the user can choose from two different dot sizes and two different levels of dot density (a sketch of the slice-selection logic follows this list).
(2) Secondly, transport.
For some reason the SteamVR teleport prefab was not working in my project. I even tried it in new scenes in the same project and in a new project altogether. I had to script my own teleportation following another tutorial. This one features a sphere which I parented to the left controller. I prefer this to the prefab since I can control the material of the sphere and avoid the more grid-like interaction that could clash with the atmosphere of my meditation world. Right now, however, the teleportation works once and then the sphere appears at the same level as the user, moving back and forth. A press of the touchpad still ushers the user forward, but there is less control and, of course, vision is compromised. I will fix this in the coming weeks (a rough sketch of the core teleport step appears after the caption below).
(3) Finally, gaze interaction.
I had mentioned that I wanted to include a floating feature. I will slowly work up to this, but for now I am striving for the gaze to trigger the whale animations.
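Here is the slice-selection sketch promised in (1): it converts the touchpad’s 2D axis into one of four slices centred on the cardinal directions and assigns the matching water material. The SteamVR input wiring is left out, and the material slots are placeholders.

```csharp
using UnityEngine;

// Sketch of the radial menu's selection step: the touchpad position
// picks one of four slices, each mapped to a water material.
// Input wiring and the concrete materials are assumptions.
public class RadialMaterialMenu : MonoBehaviour
{
    public Renderer water;
    public Material[] options = new Material[4];  // e.g. two dot sizes x two densities

    // Call with the touchpad position (x, y in -1..1) while it is touched.
    public void Select(Vector2 touch)
    {
        if (touch.sqrMagnitude < 0.1f) return;  // ignore touches near the centre

        // Angle clockwise from "north", then rotated 45 degrees so each
        // 90-degree slice is centred on a cardinal direction
        float angle = Mathf.Atan2(touch.x, touch.y) * Mathf.Rad2Deg;
        angle = (angle + 45f + 360f) % 360f;

        int slice = Mathf.FloorToInt(angle / 90f);  // 0 = N, 1 = E, 2 = S, 3 = W
        water.material = options[slice];
    }
}
```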

The sphere is connected to the user’s left controller, and its position determines where the user will be teleported upon clicking.
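And a minimal sketch of the teleport step the caption describes: on a touchpad press, the rig jumps to the marker sphere’s position. The height-preserving line is one guess at the kind of offset that fixed the shrinking bug mentioned in the Week 9 update above; the rig and marker references, and the input wiring, are assumptions.

```csharp
using UnityEngine;

// Sketch of the sphere-pointer teleport: the marker sphere is parented
// to the left controller, and a touchpad press moves the rig to it.
// Keeping the rig's own height is one guess at the offset fix.
public class SphereTeleport : MonoBehaviour
{
    public Transform rig;     // the [CameraRig] root
    public Transform marker;  // the sphere parented to the left controller

    // Call when the touchpad is pressed.
    public void Teleport()
    {
        Vector3 target = marker.position;
        target.y = rig.position.y;  // preserve height so the player doesn't shrink
        rig.position = target;
    }
}
```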

I finally began my breath belt!
This week Elle Castle and I started working on the belt I sourced. We cut the belt and punched a hole in one of the ends. I sewed conductive rubber to the other side. Next, we will sew in some conductive wire and connect it to an Arduino board. Can’t wait to connect the device to Unity!

Now let’s talk about my AR app. As a requirement for the course, I had to model my own arm. I completed it this past week and am now ready to rig and animate it. Then the building of the app should be fairly straightforward.

Some new limitations for the prototype:
(a) it will be for unilateral forearm amputees
(b) the user’s position will be limited to placing the elbow area on a surface
(c) I will create a simple rig, perhaps of wire, which will have the marker and serve as a prosthetic forearm
(d) I will also consider the potential uses for stroke patients, who also struggle with coordinating their hand and forearm movements

This is all for this week. A hearty thanks to all those reading and sending messages!

Unit 1, Week 5: Ocean Floor

Expanded terrain with more corals that will also serve as teleport areas (users can station themselves on the corals, whether they choose to stand or sit in the real physical space).

This past week I have been working on changing the controller bindings with SteamVR Input and came across loads of bugs. This coming week I might reconstruct the whole project in order to reorganize all my assets and find out where some of the SteamVR prefabs stopped working. In the meantime, I can show what the world looks like now after some post-processing effects and a new ocean floor where the user will be able to move about.

Some whales in the scene will have an animation that is triggered with gaze.

I wish I had more to show, but there is a lot of debugging to do. I am trying to control the ocean shader’s properties via the touchpad on the HTC Vive Pro controller, which will make the experience more customizable. The idea is as follows (a rough sketch appears after the list):
North: Increase dot size
South: Decrease dot size
East: Increase dot density
West: Decrease dot density
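As a rough sketch of this mapping, the handler below nudges the shader’s dot-size or dot-density property depending on which quadrant of the touchpad is pressed. The property names “_DotSize” and “_DotDensity” and the step size are assumptions, and the SteamVR input binding itself is left out.

```csharp
using UnityEngine;

// Sketch of the touchpad mapping: north/south adjusts dot size,
// east/west adjusts dot density. Property names are assumptions.
public class TouchpadShaderControl : MonoBehaviour
{
    public Material ocean;
    public float step = 0.05f;  // how much one press changes a property

    // Call once per press with the touchpad position (x, y in -1..1).
    public void Apply(Vector2 pad)
    {
        if (Mathf.Abs(pad.y) > Mathf.Abs(pad.x))
        {
            // North increases dot size, south decreases it
            float size = ocean.GetFloat("_DotSize") + Mathf.Sign(pad.y) * step;
            ocean.SetFloat("_DotSize", Mathf.Clamp01(size));
        }
        else
        {
            // East increases dot density, west decreases it
            float density = ocean.GetFloat("_DotDensity") + Mathf.Sign(pad.x) * step;
            ocean.SetFloat("_DotDensity", Mathf.Clamp01(density));
        }
    }
}
```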


A refresher of what this could look like (from the Unit 1, Week 4 update).

Unit 1, Week 4: What can meditation in VR look like?

Welcome back to another week of VR development! This week my challenge was to reimagine meditation (and I still haven’t done this successfully). How can we create a meditative space in VR that is (1) easy on the eyes and (2) shies away from more conventional meditation approaches that rely on green, organic forms from nature?

Because I want this to be a sort of escape from the natural world, I am aiming for more of a glitch aesthetic. Earlier this week I had been working in TiltBrush for several hours when it suddenly glitched. The resulting image was beautiful: the models I was working on stood larger than life on both sides of me, mirroring each other. In between them, below my feet, a strand of repetitive lights curled around itself like a DNA strand. I loved the feeling of witnessing a glitch. I felt apart (not a part) from it, and I stood in a sort of awe for many minutes. The moment was strangely peaceful.

So this is what I want to recreate: a sudden suspension of reality. The subtle feeling that something is not happening as it should, but is nonetheless worthy of taking a step back and letting it in. An invitation to observe an apparent lapse in space and time.

This is all very easy to express in words; let us see how I can explore the gap between the elemental/geometric and the natural/organic. Here are some concrete features I am going to add in the coming weeks:

(1) Breath belt! I just sourced a skater’s cloth belt that will be perfect for the breath measurement. I am lucky enough to be able to work with the intelligent and crafty people at the Creative Technology Lab at UAL, and Elle Castle from Physical Computing will be helping me build the belt. We will take some conductive rubber and sew it onto the belt, which will then wrap around the user and register the contraction and expansion of the stomach just below the ribcage. We will integrate Arduino with Unity to store these values and convert them into values that drive the virtual ocean.

(2) Object interaction: the user will be able to control features like dot size and density with hand gestures, not buttons on the controller. This will provide for a customizable and more intuitive experience, since everyone has different preferences and should have a measure of control over their meditative beach.

The user will be able to control the dot size and density with simple hand gestures.

(3) Gaze interaction: the user, after having spent ten minutes in the space, will have the ability to float over the ocean waves. Their position in virtual space will rise, and wherever they look, they will move (a rough sketch follows this list). This will give the sense of floating over one’s own breath levels. I am curious to try it out myself!

(4) The ability to go underwater and swim with whales. I will include whales with simple animations, taking care not to make them look too realistic. Perhaps I will keep them a solid color. I am also considering recreating an abstracted ocean floor where apparent light is refracted all around you. An algae forest could be nice.
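A rough sketch of the floating idea in (3), with everything provisional: once an unlock timer has elapsed, the rig rises gently and then drifts in whatever direction the user looks. The rig reference, speeds, float height, and the ten-minute threshold are all placeholders.

```csharp
using UnityEngine;

// Speculative sketch of the floating feature: after ten minutes, the rig
// rises slowly and then drifts in whatever direction the user looks.
// The rig reference, speeds, and unlock time are placeholders.
public class GazeFloat : MonoBehaviour
{
    public Transform rig;             // the player rig to move
    public Camera hmdCamera;          // the headset camera (gaze source)
    public float riseSpeed = 0.2f;    // metres per second upward
    public float driftSpeed = 0.5f;   // metres per second along the gaze
    public float floatHeight = 3f;    // target height above the start
    public float unlockAfter = 600f;  // ten minutes, per the plan above

    float startY;

    void Start() => startY = rig.position.y;

    void Update()
    {
        if (Time.timeSinceLevelLoad < unlockAfter) return;

        Vector3 pos = rig.position;

        // Rise gently until the float height is reached
        pos.y = Mathf.MoveTowards(pos.y, startY + floatHeight, riseSpeed * Time.deltaTime);

        // Drift horizontally along the gaze direction
        Vector3 gaze = hmdCamera.transform.forward;
        gaze.y = 0f;
        pos += gaze.normalized * driftSpeed * Time.deltaTime;

        rig.position = pos;
    }
}
```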

During all of this, I have to keep in mind that my deadline is early January, so I want to be sure to have defined, simple objectives for each of the six weeks I have before then. I want to have a demo I can later expand.

This is all for this week and I hope you enjoy the progress! Let me know in the comments if there is a feature mentioned you are interested in, can improve upon, or maybe something altogether new you think is essential for meditation in VR.

Unit 1, Week 3: Visual Strain and Gaze Interaction

It’s going to be a short update this week. Two new things:
(1) The water now responds to input. When the space bar is held down, the ocean recedes and rises slightly, and once it is released, it returns to its initial position (a rough sketch of this stand-in follows the list). At this stage I can start preparing the breath monitoring belt so we can later translate that keyboard input into the numerical values the belt will produce.
(2) I added a small island that the user will be able to teleport around.
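Here is the promised sketch of the space-bar stand-in, assuming the ocean exposes a single offset property: while the key is held, the offset eases toward a “drawn back” value and eases home on release. The “_WaveOffset” name and the speeds are my placeholders.

```csharp
using UnityEngine;

// Sketch of the keyboard stand-in for breath input: holding space eases
// the water's offset toward a "drawn back" value; releasing eases it home.
// The property name and speeds are assumptions.
public class KeyboardBreathStub : MonoBehaviour
{
    public Material water;
    public float inhaleOffset = 1f;  // offset while the key is held
    public float speed = 2f;         // how quickly the water responds

    float current;                   // current offset value

    void Update()
    {
        float target = Input.GetKey(KeyCode.Space) ? inhaleOffset : 0f;
        current = Mathf.MoveTowards(current, target, speed * Time.deltaTime);
        water.SetFloat("_WaveOffset", current);
    }
}
```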

Some conceptual challenges consumed most of my time this week. I need to figure out what sort of gaze interaction would be appropriate for this space, given my objectives of, first, prioritizing breath and, second, increasing awareness of the relationship between our breath frequency and visual perception. I want the virtual environment not to be challenging or taxing on the senses, so I am considering working in a greyscale world with low contrast. Perhaps the gaze interaction could consist in making small RGB values appear in the TV-static-like ocean. Unfortunately, this could prove counterproductive given my aim of avoiding eye strain, since it could distract from the central focus of the world, our lungs.

I am starting to think all of this through. As a side note, since it’s not a part of the requirements of this unit, I also found it compelling to see the mesh used for the water as a dome above us. Have a look for yourself:

https://vimeo.com/367554906
“Underwater”: under the water shader

I think this is an interesting texture to work with from many perspectives. Yet I will most probably refrain from using it anywhere else, because it would detract from the bio-feedback ocean.

As for my MR app, the main challenge I now face is ensuring that it always has the correct perspective. If the virtual hand does not align with the phantom limb user’s perspective, the visual feedback will have no efficacy. For this I have been looking at cylindrical trackers and how they can ensure the virtual arm and hand’s correct orientation.

This is all for this week, and I hope you enjoy seeing this project come to virtual life. Comment any suggestions or reactions. I would love to know what would make this environment more useful.

Unit 1, Week 2: Dotted ocean, bio-feedback and phantom limb pain treatments

For the first part of this unit, I have decided to pursue a meditative environment, where an ocean responds to the values of a user’s breathing. I will scrap the idea of including a hammock as a physical object or possibility in the program, since it might be outside of the scope of this project.

The ocean will be an abstracted sea, made to look like TV static with an extra dimensionality. Here are some updates:
(1) I went down to the Creative Technology Lab at UAL to see what my best option for breath detection would be. I was told of an experiment another student had done before, where she made a belt that translated the expansion and contraction of the lungs into values. We decided that might be my best option, since other more accessible options, like a microphone, would only record very exaggerated breathing and completely miss subtler cues like quiet inhales.
I was told to first worry about getting the ocean to move and respond to some sort of input, and then we could worry about the bio-feedback. So I got to work on the dotted sea.

(2) Zhan Gurskis, our lecturer in immersive and interactive design, helped me add a script to a plane that makes it move like an ocean. This was done first by aligning a series of spheres and animating them with a transform function based on the x position (a rough reconstruction of this first approach appears below, after the caption).

(3) I like this form of the dotted ocean, but I would prefer more of a density, and more of the illusion of volume. Zhan showed me another approach, which instead imports a simple plane from Maya and uses a shader to animate its points. This ended up looking something like this:

Zhan added some sliders to control dot size and density.
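Here is the rough reconstruction of the first, sphere-based approach mentioned in (2). It is only a guess at the method: a grid of spheres, each bobbed by a sine of its x position; all counts, spacings, and wave parameters are illustrative.

```csharp
using UnityEngine;

// Guess at the first sphere-based approach: lay out a grid of spheres
// and bob each one with a sine wave driven by its x position.
// Counts, spacing, and wave numbers are illustrative only.
public class DottedOcean : MonoBehaviour
{
    public GameObject spherePrefab;
    public int rows = 30, columns = 30;
    public float spacing = 0.5f, amplitude = 0.2f, frequency = 1f;

    Transform[] dots;

    void Start()
    {
        dots = new Transform[rows * columns];
        for (int z = 0; z < rows; z++)
            for (int x = 0; x < columns; x++)
            {
                var dot = Instantiate(spherePrefab, transform);
                dot.transform.localPosition = new Vector3(x * spacing, 0f, z * spacing);
                dots[z * columns + x] = dot.transform;
            }
    }

    void Update()
    {
        foreach (var dot in dots)
        {
            Vector3 p = dot.localPosition;
            // Height is a sine of the dot's x position, scrolled over time
            p.y = amplitude * Mathf.Sin(frequency * p.x + Time.time);
            dot.localPosition = p;
        }
    }
}
```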

Going forward I want to play around with the 2D texture on the dots to give more of an illusion of three-dimensionality. Ideally users could see the dots as if TV static were protruding out of a screen. The sea would retract as they inhale and crash against the shore as they exhale. Maybe they could also control the color of the audio noise (there’s a whole rainbow of noise colors).

For the AR part of the assignment, I have carried on with the idea of direct visualization as a treatment for PLP (phantom limb pain). Mirror therapy (where unilateral amputees use a mirror to duplicate their working limb) has been shown to be a promising alternative. Most subjects report that they can feel their phantom limb moving when viewing the reflection of their intact limb.

I want to create an AR app where the missing limb becomes an avatar limb, with no aim at photo-realism. I want to do this because I think an explicit suspension of reality might be more effective than an attempt at fully tricking the brain into believing it is the original limb. Further, it seems more visually striking to me and might add a symbolic dimension to pain that could prove useful. I imagine the avatar would look something like this:

The user would have the option to touch where it “hurts” and get visual validation of the pain.

From an article on the effect of direct observation on PLP:

The results of this specific study showed that direct visualization was more effective than mental visualization. Because of this, I would like to add some basic animations the user can choose among and observe as if they were happening in their own body, like those cited above.

For the animation of the limb, I haven’t decided whether I also want it to be able to mirror the intact limb in the case of unilateral amputees. Nevertheless, I do not exclude the possibility of making it accessible to those who lack both limbs, be they arms or legs. This last use will require a different set-up, which I will need to take into consideration.

This is what is going on so far around this unit, and I’m excited to see how it progresses.

Unit 1, Week 1: VR Environments and MR Apps for Mobile Devices

I had two motives when coming into VR. The first was to create meditative environments that serve as a breath of fresh air for the mind. The second, loosely related but with less explicit medical intentions, was to bring painting to this medium.

We will be working with Unity throughout the course. Here are the limitations for this unit:
(1) The VRE (Virtual Reality Environment) must include:
(a) a computer-generated scene including terrain, lights, and a skybox
(b) navigation of the space, be it teleportation or any other sort of locomotion
(c) some sort of interaction, including but not limited to controller-based and gaze interaction
(d) optional 360 sound

(2) The MR content must include:
(a) computer-generated content or live-action content activated via QR code
(b) computer-generated content must be 2D or 3D animation (max 5 seconds) or a 3D object
(c) export for phone or tablet on a platform of choice, Android or iOS
(d) sound

As of week one, my idea for the VRE is a recreation of my favorite hammock on the beach. I decided on this approach since it remains true to my vision for a meditative environment. Further, should it fail, I have it as a point of departure for the rest of my journey. Some initial objectives:
(1) To prepare an environment that can later be added onto with features for a meditation program, such as validation of time passed (e.g. every ten minutes spent there, there is a gratification in the form of pleasing visuals) and/or unlocked potential within the world
(2) A crucial one: biofeedback, or the integration of breath monitoring. I intend to make a water shader in Unity whose texture is white noise. The waves crashing on the shore should correspond to the breath of the user. This is meant to increase awareness of the effects of breath on perception, since there seems to be a correlation between anxiety (or the lack thereof) and conscious breathing. This is seen in people who have frequent panic attacks and learn to control their breathing to head them off at the onset.
(3) Manipulation of graphic quality so as to disturb the eyes as little as possible: this can be done by offering the option of a dark mode, as in many modern apps. This ensures the user can remain in the environment for longer periods of time. It is important to note that, because it is a meditation app, I do not preclude the possibility of the user closing their eyes as part of the experience of the beach space. This is an essential part of meditation, and the user must feel at liberty to do so and still remain a part of the environment.
(4) The physical object of a hammock could be integrated into the experience. I think this raises difficulties in computer animation, as well as physical issues with lying down while wearing the HMD. A quick idea for addressing the latter is to present the piece in an exhibition space where a special headrest has been created to cradle the back of the head and diminish discomfort.

Enough about the VRE thus far. Admittedly, I have not given much thought to MR before this unit, and for some reason its applications come less naturally to me. Nevertheless, my idea for the MR application is designed for visualization techniques in the treatment of phantom limb pain. The idea is to create an app where a person missing a limb may visualize said limb and touch trouble areas, which would light up. The user may then have the option of “treating” them through massage or other purely symbolic techniques. I need to do more empirical research on visualization techniques for phantom limb patients, and that is sure to come. My concept for this section is inspired by my belief that visualization is a powerful technique that can rewire perception of the human condition. Let’s see if this idea survives my research or is forced to evolve into something more useful and/or purely amusing.

That’s it for Week 1, stay tuned to see where these experiments may go. Be sure to check out my post on Unit 2 coming this week, as well, since it tackles the story-telling aspect of more cinematic VR. Safe travels to all virtual heads!