Unit 1, Week 2: Dotted ocean, biofeedback and phantom limb pain treatments

For the first part of this unit, I have decided to pursue a meditative environment where an ocean responds to a user's breathing. I have scrapped the idea of including a hammock, either as a physical object or within the program, since it is likely outside the scope of this project.

The ocean will be an abstracted sea, made to look like TV static with an added dimensionality. Here are some updates:
(1) I went down to the Creative Technology Lab at UAL to see what my best option for breath detection would be. I was told about an experiment another student had done, in which she made a belt that translated the expansion and contraction of the lungs into values. We decided that might be my best option, since more accessible alternatives, like a microphone, would only pick up very exaggerated breathing and miss subtler cues such as quiet inhales.
I was advised to focus first on getting the ocean to move and respond to some sort of input, and to worry about the biofeedback afterwards. So I got to work on the dotted sea.
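For when I do get to the biofeedback stage, here is roughly how I picture turning the belt's raw readings into a usable breath value. This is only a minimal sketch in TypeScript; the belt's output, and every name and constant here, are my own assumptions to be tested:

```ts
// Hypothetical sketch: turn raw stretch-belt readings into a normalized
// breath value. Assumes the belt streams a number that grows as the chest
// expands; the smoothing factor is a placeholder to tune.
class BreathSignal {
  private smoothed = 0;
  private min = Infinity;
  private max = -Infinity;

  constructor(private alpha = 0.1) {} // exponential smoothing factor, 0..1

  // Feed one raw reading, get back a value in 0..1
  // (0 = fully exhaled / belt slack, 1 = fully inhaled / belt stretched).
  update(raw: number): number {
    // exponential moving average to suppress sensor jitter
    this.smoothed = this.alpha * raw + (1 - this.alpha) * this.smoothed;
    // track the observed range so the signal self-calibrates per user
    this.min = Math.min(this.min, this.smoothed);
    this.max = Math.max(this.max, this.smoothed);
    const range = this.max - this.min;
    return range > 0 ? (this.smoothed - this.min) / range : 0;
  }
}
```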

(2) Zhan Gurskis, our lecturer in immersive and interactive design, helped me add a script that makes a surface move like an ocean. This first version worked by aligning a series of spheres and applying a transform function based on each sphere's x position.
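I didn't keep the exact script, but the idea was something like the following Three.js sketch, where each sphere's height is driven by a sine of its x position and the elapsed time (the names and numbers here are my own stand-ins, not the actual code):

```ts
import * as THREE from "three";

// Minimal sketch of the first approach: a row of aligned spheres
// whose heights follow a sine wave travelling along x.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  60, window.innerWidth / window.innerHeight, 0.1, 100
);
camera.position.set(0, 2, 8);

const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
document.body.appendChild(renderer.domElement);

const spheres: THREE.Mesh[] = [];
const geo = new THREE.SphereGeometry(0.08, 8, 8);
const mat = new THREE.MeshNormalMaterial();
for (let i = 0; i < 60; i++) {
  const s = new THREE.Mesh(geo, mat);
  s.position.x = (i - 30) * 0.2; // align the spheres along x
  scene.add(s);
  spheres.push(s);
}

const clock = new THREE.Clock();
renderer.setAnimationLoop(() => {
  const t = clock.getElapsedTime();
  for (const s of spheres) {
    // wave height as a function of x position and time
    s.position.y = 0.4 * Math.sin(1.5 * s.position.x + 2 * t);
  }
  renderer.render(scene, camera);
});
```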

(3) I like this form of the dotted ocean, but I would prefer more density and a stronger illusion of volume. Zhan showed me another approach, which instead imports a simple plane from Maya and uses a shader to animate its points. This ended up looking something like this:

Zhan added some sliders to control dot size and density.
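As I understand it, this approach could look roughly like the sketch below: a subdivided plane rendered as points and displaced in a vertex shader, where the point-size uniform and the subdivision count stand in for Zhan's sliders. This is my own reconstruction, not his actual code, and all values are guesses:

```ts
import * as THREE from "three";

const density = 128; // more subdivisions = denser dots (Zhan's density slider)
const geometry = new THREE.PlaneGeometry(10, 10, density, density);
geometry.rotateX(-Math.PI / 2); // lay the plane flat like a sea surface

const material = new THREE.ShaderMaterial({
  uniforms: {
    uTime: { value: 0 },
    uPointSize: { value: 3.0 }, // Zhan's dot-size slider
  },
  vertexShader: /* glsl */ `
    uniform float uTime;
    uniform float uPointSize;
    void main() {
      vec3 p = position;
      // a simple sum of sines standing in for the ocean motion
      p.y += 0.3 * sin(1.5 * p.x + uTime) + 0.2 * sin(2.0 * p.z + 1.3 * uTime);
      gl_Position = projectionMatrix * modelViewMatrix * vec4(p, 1.0);
      gl_PointSize = uPointSize;
    }
  `,
  fragmentShader: /* glsl */ `
    void main() {
      gl_FragColor = vec4(vec3(1.0), 1.0); // plain white dots for now
    }
  `,
});

const sea = new THREE.Points(geometry, material);
// add `sea` to a scene and tick material.uniforms.uTime.value each frame
```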

Going forward, I want to play around with the 2D texture on the dots to strengthen the illusion of their three-dimensionality. Ideally, users would see the dots as if TV static were protruding out of a screen. The sea would retract as they inhale and crash against the shore as they exhale. Maybe they could also control the color of the audio noise (there's a whole rainbow of noise colors).
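In code, that breath-to-sea mapping might be as simple as interpolating a couple of parameters from the normalized breath value. A sketch with made-up ranges, consistent with the breath signal above:

```ts
// Hypothetical mapping from the normalized breath value (1 = full inhale,
// 0 = full exhale) to sea behavior: the sea pulls back on the inhale and
// pushes toward the shore on the exhale. All ranges are placeholders.
function lerp(a: number, b: number, t: number): number {
  return a + (b - a) * t;
}

function seaFromBreath(breath: number) {
  const shoreOffset = lerp(1.0, -1.0, breath); // inhale retracts the waterline
  const amplitude = lerp(0.5, 0.2, breath);    // calmer swell at full inhale
  return { shoreOffset, amplitude };
}
```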

For the AR part of the assignment, I have carried on with the idea of direct visualization as a treatment for phantom limb pain (PLP). Mirror therapy, where unilateral amputees use a mirror to duplicate their working limb, has been shown to be a promising treatment: most subjects report that they can feel their phantom limb moving when viewing the reflection of their intact limb.

I want to create an AR app where the missing limb becomes an avatar limb, with no aim at photorealism. I want to do this because I think an explicit suspension of reality might be more effective than an attempt to fully trick the brain into believing it is the original limb. It also seems more visually striking to me, and it might add a symbolic dimension to the pain that could prove useful. I imagine the avatar would look something like this:

The user would have the option to touch where it “hurts” and get visual validation of the pain.
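I haven't built this yet, but the interaction could be as simple as a raycast against the avatar mesh. A minimal sketch, assuming a Three.js-style AR scene where `limbMesh`, `camera`, and `scene` already exist (all names hypothetical):

```ts
import * as THREE from "three";

// Hypothetical sketch: tap the avatar limb to mark where it "hurts".
const raycaster = new THREE.Raycaster();
const pointer = new THREE.Vector2();

function onTap(
  event: PointerEvent,
  limbMesh: THREE.Mesh,
  camera: THREE.Camera,
  scene: THREE.Scene
) {
  // convert the tap to normalized device coordinates (-1..1)
  pointer.x = (event.clientX / window.innerWidth) * 2 - 1;
  pointer.y = -(event.clientY / window.innerHeight) * 2 + 1;
  raycaster.setFromCamera(pointer, camera);

  const hits = raycaster.intersectObject(limbMesh);
  if (hits.length > 0) {
    // place a marker at the touched point as visual validation of the pain
    const marker = new THREE.Mesh(
      new THREE.SphereGeometry(0.01, 16, 16),
      new THREE.MeshBasicMaterial({ color: 0xff3355 })
    );
    marker.position.copy(hits[0].point);
    scene.add(marker);
  }
}
```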

From an article on the effect of direct observation on PLP:

The results of this specific study showed that direct visualization was more effective than mental visualization. Because of this, I would like to add some basic animations the user can choose from and observe as if they were happening in their own body, i.e. like those cited above.

For the animation of the limb, I haven't decided whether I also want it to be able to mirror another limb in the case of unilateral amputees (a rough idea of how that might work is sketched below). In any case, I don't rule out making it accessible to those who lack both limbs, be they arms or legs; that last use will require a different set-up, which I will need to take into consideration.
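If I do go the mirroring route, the core of it might be as small as reflecting the tracked intact limb's pose across the body's midline. A minimal sketch, assuming the midline is the x = 0 plane in user space (the function and its inputs are hypothetical):

```ts
import * as THREE from "three";

// Hypothetical mirroring sketch for unilateral amputees: reflect the intact
// limb's tracked pose across the body's midline, assumed here to be x = 0,
// so the avatar limb mirrors it, much like the mirror in mirror therapy.
function mirrorPose(pos: THREE.Vector3, quat: THREE.Quaternion) {
  const mirroredPos = new THREE.Vector3(-pos.x, pos.y, pos.z);
  // reflecting a rotation across the x = 0 plane negates its y and z parts
  const mirroredQuat = new THREE.Quaternion(quat.x, -quat.y, -quat.z, quat.w);
  return { mirroredPos, mirroredQuat };
}
```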

That is what is going on around this unit so far, and I'm excited to see how it progresses.