March 2020

Week 8: Putting it all together

This entry will be a joint entry for both units, since there is not much to expound on for either. The Fire Spirit is still on hold as we all split into our separate groups for the unit that is due next week: The Machine Stops.

As for The Love Machine Never Stops, we sat with the sound students for two whole days this past week to integrate their work into our project. We have mainly been using the Resonance Audio plug-in to implement reverb rooms that respond to whatever sounds we want, e.g. worm movement and dialogue. Our sound artists have also been working on levels.
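
For context, this is roughly how a reverb room is set up on our side. The sketch below assumes the Resonance Audio Unity SDK's ResonanceAudioRoom component; the field names and values are assumptions for illustration, since in practice we tune the rooms in the Inspector.

```csharp
using UnityEngine;

// Minimal sketch: give a scene room its own reverb character so any
// Resonance-driven sound played inside it (worm squishes, dialogue) picks up
// that room's acoustics. Field names/values below are assumptions; our actual
// rooms are configured in the Inspector.
public class ReverbRoomSetup : MonoBehaviour
{
    void Awake()
    {
        var room = gameObject.AddComponent<ResonanceAudioRoom>();
        room.size = new Vector3(6f, 3f, 6f); // rough dimensions of the reverberant volume (assumed)
        room.reverbTime = 1.2f;              // longer tail for a larger, emptier space (assumed)
    }
}
```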

We also got some updated theme music tracks for different parts of the experience. Vier and I have been finishing up all the dialogue and animation triggers, as well as putting all of the buttons in place for the simulation scene and the pleasure simulator scene. It has been a lot of adjust, test, and repeat. There is still much to do before our deadline on Thursday, but we are confident we will be able to deliver an experience you can run through fully.

It is worth mentioning that our team fully intends to keep on developing and polishing this experience after we deliver its first iteration. We will do user tests and begin thinking about distribution.

We were able to make our first build this past week, and it was successful.

Week 7: VR Chat Compatibility

This week, unfortunately, was rather uneventful for this unit, since our deadline for The Love Machine is looming and we have a couple more months for this one. Nevertheless, we gave a little more thought to what the redirection means for our project, both technically and in terms of research.

I found out that VRChat users are just as protective of VRChat interactions as its developers, but there is still an overwhelming number of people who want to be able to script their own interactions in their custom worlds. In fact, it’s a feature 400+ users voted for. You can read some of the hesitations in that feature request thread as well; these mostly revolve around controlling abuse.

VRChat responded to this feature request by working on their own programming language, called Udon, which compiles within Unity. They spent two years working on it and, perhaps luckily for us, released it about two months ago. We will be looking into how useful this new release could be for our custom world.

In terms of research, we are embracing this redirection as a way to learn not only about an existing social platform and its ethics, but also about the possibility of future role-playing games being hosted on virtual platforms. Social virtual platforms have proven to host all sorts of activities (baptisms, yoga, meditation, business meetings, you name it), so it is not such a long shot to imagine games being designed specifically for these umbrella platforms.

Excited to see what we can achieve within these limits we have set for ourselves.

Week 7: Reverb, music, and lots of scripting!

This was an exciting week for the Love Machine team, VR and sound alike, since we began to integrate everything. We got to add some room-specific reverb and our theme music, and to script a lot of behaviors: some we knew we needed, and some whose necessity arose as we went along…

These scripts included:
(1) Our worm movement sound: the sound team gave us some squishy sounds to add to our worm avatar every time the user moves. We achieved this by adding an audio source to the avatar prefab, along with a Worm Movement script that checks whether the transform's x position has moved beyond a certain threshold and, if so, plays the audio clip.
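
As a rough illustration (class, field names, and the threshold value here are placeholders, not necessarily what is in our repo), the worm movement sound boils down to something like this:

```csharp
using UnityEngine;

// Sketch of the worm-movement sound described above. An AudioSource sits on
// the avatar prefab; the squishy clip plays whenever the avatar has moved far
// enough along x since the last time the clip was triggered.
[RequireComponent(typeof(AudioSource))]
public class WormMovementSound : MonoBehaviour
{
    public AudioClip squishClip;
    public float movementThreshold = 0.2f; // x distance before another squish (assumed value)

    private AudioSource source;
    private float lastX;

    void Start()
    {
        source = GetComponent<AudioSource>();
        lastX = transform.position.x;
    }

    void Update()
    {
        // Compare the current x position against the position we last played from.
        if (Mathf.Abs(transform.position.x - lastX) > movementThreshold)
        {
            source.PlayOneShot(squishClip);
            lastX = transform.position.x;
        }
    }
}
```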

(2) A whopping two new ones for our ladder: haptics + randomized audio on collision. The haptics were straightforward to script, since SteamVR includes a vibration action you can call with parameters like duration, frequency, and amplitude. The randomized audio draws from an array of audio clips, one of which is picked at random upon collision with the ladder. This gives the climbing a more realistic feel, since it is a long section of the experience and hearing the same clunk over and over again would be off-putting…
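
Here is a rough sketch of both ladder behaviours, assuming the SteamVR Unity plugin's SteamVR_Action_Vibration action; the pulse values, names, and where the script sits (on the ladder, with colliders on the hands) are placeholders:

```csharp
using UnityEngine;
using Valve.VR;

// Sketch of the two ladder behaviours: a short haptic pulse plus a randomly
// chosen rung sound whenever a hand collides with the ladder. The action and
// the clip array are assigned in the Inspector.
public class LadderFeedback : MonoBehaviour
{
    public SteamVR_Action_Vibration hapticAction;   // SteamVR vibration action
    public SteamVR_Input_Sources handType;          // which controller to buzz
    public AudioClip[] rungClips;                   // pool of collision sounds
    public AudioSource audioSource;

    void OnCollisionEnter(Collision collision)
    {
        // Short, fairly strong pulse: start now, 0.1 s long, 75 Hz, 80% amplitude (assumed values).
        hapticAction.Execute(0f, 0.1f, 75f, 0.8f, handType);

        // Pick a different clunk each time so the long climb doesn't repeat itself.
        if (rungClips.Length > 0)
        {
            var clip = rungClips[Random.Range(0, rungClips.Length)];
            audioSource.PlayOneShot(clip);
        }
    }
}
```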

(3) Material changing: our project already has two general interaction scripts, one for gaze and one for collision with buttons. These are identical scripts which call on arrays of game objects to activate or deactivate, and animations to trigger. This week we added a material changer, where the script finds the mesh renderer of the object whose material we want to change and feeds it a new material. This will be useful for the pleasure simulator scene, where the user touches buttons and gets reactions from the main character and, now, from the room as a whole.
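
The material-changing part, reduced to a standalone sketch (in the project it lives inside the gaze and collision interaction scripts, and the names here are placeholders):

```csharp
using UnityEngine;

// Sketch of the material changer: look up the mesh renderer of a target
// object and swap in a new material when triggered, e.g. from a button
// collision in the pleasure simulator scene.
public class MaterialChanger : MonoBehaviour
{
    public GameObject target;       // object whose look should change
    public Material newMaterial;    // material to feed it

    public void ChangeMaterial()
    {
        var meshRenderer = target.GetComponent<MeshRenderer>();
        if (meshRenderer != null)
        {
            meshRenderer.material = newMaterial;
        }
    }
}
```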

(4) State machine behaviour script: some of our gaze interactions were triggering animations before their time, so we had to make sure a script checked whether earlier ones had finished first. We added a state machine behaviour script to the animator controller that sets a bool once an animation state has finished; the next animation can only be triggered if that bool is true.
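
In Unity terms this is a StateMachineBehaviour attached to the relevant animation state. A minimal sketch (the class and parameter names are placeholders):

```csharp
using UnityEngine;

// Sketch of the state machine behaviour described above. Attached to an
// animation state in the animator controller, it flips a bool parameter when
// that state finishes, so the gaze interactions can check the parameter
// before triggering the next animation.
public class AnimationFinishedFlag : StateMachineBehaviour
{
    public string finishedParameter = "PreviousAnimDone";

    public override void OnStateExit(Animator animator, AnimatorStateInfo stateInfo, int layerIndex)
    {
        // Mark this state as done so dependent interactions can proceed.
        animator.SetBool(finishedParameter, true);
    }
}
```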

As a little preview: here is our theme tune for this experience. We will have variations on it throughout. Enjoy 😉

The tune was made by the sound artist Joey Sergi.