Unit 1, Week 3: Visual Strain and Gaze Interaction

It’s going to be a short update this week. Two new things:
(1) The water now responds to input. While the space bar is held down, the ocean recedes and rises slightly, and once it is released, it returns to its initial position. At this stage I can start preparing the breath-monitoring belt, so that later I can translate that keyboard input into the numerical values the belt will produce (there is a small sketch of that mapping just after this list).
(2) I added a small island that the user will be able to teleport around.
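Since the belt is not wired in yet, here is the sketch I mentioned: a minimal, engine-agnostic illustration (plain Python, outside any engine) of the mapping I have in mind. A breath signal, whether the binary space-bar press I have now or the continuous value I expect the belt to give me, gets normalised and eased into a small vertical offset for the water. The function names, the 0–1 belt range, the recession distance and the smoothing constant are all assumed placeholders, not anything already implemented.

```python
# Sketch of mapping a breath signal to a water-height offset.
# Assumptions (placeholders, not from the actual project): the belt reports a
# value normalisable to [0, 1], the space bar counts as 0 (released) or
# 1 (held), and the water mesh accepts a vertical offset in metres.

MAX_RECESSION = 0.5   # how far the ocean recedes at full breath (assumed, metres)
SMOOTHING = 0.1       # exponential easing factor per update (assumed)

def breath_from_spacebar(space_held: bool) -> float:
    """Placeholder input: a binary keyboard signal standing in for the belt."""
    return 1.0 if space_held else 0.0

def breath_from_belt(raw_reading: float, rest: float, full: float) -> float:
    """Normalise a raw belt reading to [0, 1] between rest and full expansion."""
    if full == rest:
        return 0.0
    return min(max((raw_reading - rest) / (full - rest), 0.0), 1.0)

def update_water_offset(previous_offset: float, breath: float) -> float:
    """Ease the water toward its target so it recedes and returns smoothly."""
    target = -MAX_RECESSION * breath   # negative = the ocean recedes
    return previous_offset + SMOOTHING * (target - previous_offset)

if __name__ == "__main__":
    offset = 0.0
    # Simulate holding the space bar for 30 frames, then releasing it for 30.
    for frame in range(60):
        breath = breath_from_spacebar(space_held=frame < 30)
        offset = update_water_offset(offset, breath)
        if frame % 10 == 0:
            print(f"frame {frame:2d}: offset {offset:+.3f} m")
```

The idea behind keeping the input separate from the water update is that swapping breath_from_spacebar for breath_from_belt later should leave the ocean behaviour untouched.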

Some conceptual challenges consumed most of my time this week. I need to figure out what sort of gaze interaction would be appropriate for this space, given my objectives: first and foremost, prioritizing breath; second, increasing awareness of the relationship between our breath frequency and visual perception. I want the virtual environment not to be challenging or taxing on the senses, so I am considering working in a greyscale world with low contrast. Perhaps the gaze interaction could consist of making small amounts of RGB colour appear in the TV-static-like ocean (a rough sketch of that blend is below). Unfortunately, this could prove counterproductive given my aim of avoiding eye strain, since it could distract from the central focus of the world, our lungs.
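To make the idea a bit more concrete, here is a rough sketch of the kind of blend I mean: the ocean stays greyscale, and a small amount of colour is mixed in only where the gaze lands, fading out with distance from the gaze point. The tint, the falloff radius and the 10% colour cap are all assumed values for illustration, not anything implemented yet.

```python
import math

# Assumed placeholders: a faint blue tint, a narrow falloff around the gaze
# point, and at most 10% colour mixed into the otherwise greyscale static.
TINT = (0.2, 0.6, 1.0)     # RGB tint revealed under the gaze (assumed)
GAZE_RADIUS = 0.15         # falloff radius in normalised screen units (assumed)
MAX_COLOUR = 0.10          # cap on how much colour ever appears (assumed)

def gaze_weight(pixel_uv, gaze_uv):
    """How strongly the gaze affects this pixel: 1 at the gaze point, 0 far away."""
    dist = math.dist(pixel_uv, gaze_uv)
    return max(0.0, 1.0 - dist / GAZE_RADIUS)

def shade(static_value, pixel_uv, gaze_uv):
    """Blend a little tint into a greyscale static value based on gaze proximity."""
    w = MAX_COLOUR * gaze_weight(pixel_uv, gaze_uv)
    grey = (static_value, static_value, static_value)
    return tuple((1.0 - w) * g + w * (t * static_value) for g, t in zip(grey, TINT))

# Example: a mid-grey static pixel directly under the gaze vs. far from it.
print(shade(0.5, (0.50, 0.50), (0.50, 0.50)))  # slightly tinted
print(shade(0.5, (0.90, 0.10), (0.50, 0.50)))  # unchanged grey
```

In practice this would of course live in the ocean's shader on the GPU; the point here is only how subtle the blend would be, which is what I am weighing against the eye-strain concern.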

I am only starting to think all of this through. As a side note, even though it's not part of the requirements of this unit, I also found it compelling to see the mesh used for the water as a dome above us. Have a look for yourself:

https://vimeo.com/367554906
“Underwater”: under the water shader

I think this is an interesting texture to work with from many perspectives. Still, I will most probably refrain from using it anywhere else, because it would detract from the bio-feedback ocean.

As for my MR app, the main challenge I now face is ensuring that it always renders from a correct perspective. If the virtual hand of the phantom-limb user does not align with their point of view, the visual feedback will have no efficacy. For this I have been looking at cylindrical trackers and how they can keep the virtual arm and hand correctly oriented (a rough sketch of the pose maths involved is below).
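For my own notes, here is a rough, engine-agnostic sketch of what "correct perspective" means in pose terms: given a world-space pose for the head (the user's viewpoint) and one for a tracker worn on the limb, the virtual arm needs to be expressed in the head's frame so it is always drawn relative to where the user is actually looking from. The 4x4-matrix representation and the numbers are placeholders, not anything from a particular tracker SDK.

```python
import numpy as np

def pose_matrix(rotation: np.ndarray, position: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a position."""
    m = np.eye(4)
    m[:3, :3] = rotation
    m[:3, 3] = position
    return m

def arm_in_head_frame(head_world: np.ndarray, tracker_world: np.ndarray) -> np.ndarray:
    """Express the tracker (and the virtual arm anchored to it) relative to the head.

    If this relative transform is what drives the rendered arm, the arm stays
    consistent with the user's perspective however the head or body moves.
    """
    return np.linalg.inv(head_world) @ tracker_world

# Placeholder example: head 1.6 m up with identity rotation, tracker 30 cm to
# the right, 40 cm below and 20 cm in front of the head.
head = pose_matrix(np.eye(3), np.array([0.0, 1.6, 0.0]))
tracker = pose_matrix(np.eye(3), np.array([0.3, 1.2, -0.2]))

relative = arm_in_head_frame(head, tracker)
print(relative[:3, 3])  # arm anchor as seen from the head: [ 0.3 -0.4 -0.2]
```

The rotation block of the tracker pose is exactly what I am hoping a cylindrical tracker worn on the arm can supply reliably, which is what I am reading up on.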

This is all for this week, and I hope you enjoy seeing this project come to virtual life. Leave a comment with any suggestions or reactions; I would love to know what would make this environment more useful.