The Love Machine Never Stops Teaser (2020)
Et voilà, finally, a teaser for the VR experience we have been working on.
Can you dig it? 😉
This will be a joint entry for both units, since there is not much to expound on in either. The Fire Spirit is still on hold as we all split into our separate groups for the unit due next week: The Machine Stops.
As for The Love Machine Never Stops, we sat with the sound students for two whole days this past week to integrate their work with our project. We have mainly been using the Resonance Audio plug-in to implement reverb audio rooms that respond to whatever sounds we want, e.g. worm movement and dialogue. Our sound artists have also been working on levels.
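For context, getting a clip picked up by those reverb rooms mostly comes down to spatialising its source. Here is a minimal sketch using only core Unity audio settings, assuming Resonance Audio is selected as the spatializer plugin in the project settings; the component name is a placeholder, not one of our actual scripts.

```csharp
using UnityEngine;

// Hypothetical helper: makes sure a clip (a worm squelch, a dialogue line)
// is fully spatialised so the spatializer plugin (Resonance Audio in our
// project settings) and the surrounding audio room's reverb can process it.
[RequireComponent(typeof(AudioSource))]
public class SpatialisedClipPlayer : MonoBehaviour
{
    AudioSource source;

    void Awake()
    {
        source = GetComponent<AudioSource>();
        source.spatialize = true;    // hand the source to the spatializer plugin
        source.spatialBlend = 1f;    // fully 3D, so position and room reverb apply
        source.playOnAwake = false;
    }

    // Called from dialogue or movement scripts with whichever clip is needed.
    public void Play(AudioClip clip)
    {
        source.PlayOneShot(clip);
    }
}
```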
We also got some updated theme music tracks for different parts of the experience. Vier and I have been finishing up all the dialogue and animation triggers, as well as putting all of the buttons in place for the simulation scene and the pleasure simulator scene. It has been a lot of adjust, test, and repeat. There is still much to do before our Thursday deadline, but we are confident we will be able to deliver an experience you can run through fully.
It is worth mentioning that our team fully intends to keep on developing and polishing this experience after we deliver its first iteration. We will do user tests and begin thinking about distribution.
We were able to make our first build this past week, and it was successful.
This was an exciting week for the Love Machine team, VR and sound alike, since we began to integrate everything. We got to add some room-specific reverb and our theme music, and to script a lot of behaviors: some we knew we needed, and some whose necessity arose as we went along…
These scripts included:
(1) Our worm movement sound: the sound team gave us some squishy sounds to add to our worm avatar every time the user moves. We achieved this by adding an audio source to the avatar prefab along with a Worm Movement script that checks whether the transform's x position has moved beyond a certain threshold and, if so, plays the audio clip.
(2) A whopping two new ones for our ladder: haptics plus randomized audio on collision. The haptics were straightforward to script, since SteamVR includes a vibration action you can call with parameters such as duration, frequency, and amplitude. The randomized audio pulls from an array of audio clips, one of which is chosen at random on each collision with the ladder (a rough sketch of this script, together with the one from point 4, follows after this list). This gives the climbing a more realistic feel: it is a long section of the experience, and hearing the same clunk over and over again would be off-putting…
(3) Material changing: our project already has two general interaction scripts, one for gaze and one for collision with buttons. These are near-identical scripts that take arrays of game objects to activate or deactivate, plus animations to trigger. This week we added a material changer, where the script finds the mesh renderer of the object whose material we want to change and feeds it a new material. This will be useful for the pleasure simulator scene, where the user touches buttons and gets reactions from the main character, and now from the room in general as well.
(4) State machine behavior script: some of our gaze interactions were triggering animations before their time, so we had to make sure a script checked whether the others had finished beforehand. We added a script to the animator controller that hooks into the state behaviors and sets a bool once an animation state has finished. The next animation can only be triggered if that bool is true.
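Since points (2) and (4) are the fiddliest to explain in prose, here is a rough sketch of both. It assumes the SteamVR 2.x input system with its default haptic action; the class names, parameter names, and the hard-coded hand are placeholders rather than our actual scripts.

```csharp
using UnityEngine;
using Valve.VR;

// Sketch of point (2): on each collision with the ladder, play a random clip
// from an array and fire a short haptic pulse.
[RequireComponent(typeof(AudioSource))]
public class LadderFeedback : MonoBehaviour
{
    [SerializeField] AudioClip[] gripClips;        // pool of clunks/grabs to pick from
    [SerializeField] float hapticDuration = 0.08f; // seconds
    [SerializeField] float hapticFrequency = 80f;  // Hz
    [SerializeField] float hapticAmplitude = 0.6f; // 0..1

    AudioSource source;

    void Awake()
    {
        source = GetComponent<AudioSource>();
    }

    void OnCollisionEnter(Collision collision)
    {
        // Random clip so a long climb doesn't repeat the same clunk.
        if (gripClips.Length > 0)
            source.PlayOneShot(gripClips[Random.Range(0, gripClips.Length)]);

        // Default SteamVR vibration action; the right hand here is an
        // assumption, in practice it would come from the colliding controller.
        SteamVR_Actions.default_Haptic.Execute(
            0f, hapticDuration, hapticFrequency, hapticAmplitude,
            SteamVR_Input_Sources.RightHand);
    }
}

// Sketch of point (4): attached to an animator state, flips a bool parameter
// when that state finishes, so later gaze triggers can check it first.
public class StateFinishedFlag : StateMachineBehaviour
{
    [SerializeField] string boolParameter = "IntroFinished"; // hypothetical parameter name

    public override void OnStateExit(Animator animator, AnimatorStateInfo stateInfo, int layerIndex)
    {
        animator.SetBool(boolParameter, true);
    }
}
```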
As a little preview: here is our theme tune for this experience. We will have variations on it throughout. Enjoy 😉
The tune was made by the sound artist Joey Sergi.
Accomplished this week:
(1) Ladder movement scripted with our ladder model
(2) Avatar embodiment scripted
(3) Constellation scene modeled in TiltBrush
(4) Large machine atmosphere modeled in Maya
(5) Final sunset scene modeled in TiltBrush
(6) Main voice actor sent all recordings
(7) Second round of sound deliverables
(8) Main button functions scripted: trigger audio, trigger an animation state, change the render settings for fog, swap materials, and activate/deactivate game objects (a rough sketch of this handler follows below the list)
(9) Everything starting to be put together in Unity
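As an illustration of point (8), the button handling boils down to one component whose fields are filled in per button in the inspector and whose single method is called by the gaze and collision scripts. This is a simplified sketch with hypothetical names, not our exact script.

```csharp
using UnityEngine;

// Sketch of a generic button reaction: audio, an animation trigger, a fog
// tweak, a material swap, and toggling game objects, all optional per button.
public class ButtonReaction : MonoBehaviour
{
    [SerializeField] AudioSource audioToPlay;
    [SerializeField] Animator animatorToTrigger;
    [SerializeField] string animationTrigger = "Pressed"; // hypothetical trigger name
    [SerializeField] float targetFogDensity = -1f;        // < 0 means "leave the fog alone"
    [SerializeField] Renderer rendererToRecolor;
    [SerializeField] Material newMaterial;
    [SerializeField] GameObject[] toActivate;
    [SerializeField] GameObject[] toDeactivate;

    // Called from the gaze or collision interaction scripts.
    public void Trigger()
    {
        if (audioToPlay != null) audioToPlay.Play();

        if (animatorToTrigger != null) animatorToTrigger.SetTrigger(animationTrigger);

        if (targetFogDensity >= 0f)
        {
            RenderSettings.fog = true;
            RenderSettings.fogDensity = targetFogDensity; // render-settings fog change
        }

        if (rendererToRecolor != null && newMaterial != null)
            rendererToRecolor.material = newMaterial;      // material swap

        if (toActivate != null)
            foreach (var go in toActivate) go.SetActive(true);
        if (toDeactivate != null)
            foreach (var go in toDeactivate) go.SetActive(false);
    }
}
```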
We had been considering making our treatment of gender more consistent, but after a chat with our team and tutors we decided it isn't necessary. The main theme of the work deals with desire and its role in our relationship with technology, so gender (albeit present) is a bit of a red herring.
This coming week we will be integrating everything into a single Unity project, finishing animations, and sitting down with the sound team to work the sounds into the experience. We have two weeks before the delivery date.
First, a succinct update of this week’s progress.
(1) We received the first round of recordings from the voice actor.
(2) Vier and I have begun animating the main worm character.
(3) Our sound colleagues are due today for their first round of deliverables, which include:
(a) voice editing of the dialogue between mother + Kuno.
(b) the sound of worm eyelids (the user’s) opening at the beginning.
(c) variations for all sorts of buttons.
(d) the atmosphere of the larger machine, as experienced while the user climbs the ladder in the huge, dark shaft.
We had our mid-term crit this past week, and we prepared our presentation alongside the sound artists. An interesting thread came up that runs under the whole project: a sort of gender fluidity. As I presented, I kept misgendering the main mending worm, despite being adamant about making the character ambiguous. Of course, a deep Barry-White-like voice is gendered in itself. Nevertheless, the body is less straightforward. The contrast itself is what remains interesting to me at this point: that out of a worm-like machine, a hearty (and perhaps genderless) voice projects itself.
We later realized there was another instance in which we had played with gender subconsciously: one of our male sound colleagues recorded the voice for the mother. Next week we will have a conversation about this as a group so we can clearly define how we are applying this theme of gender fluidity to romance, technology, and dystopia.
Fortunately, more than a handful of things were accomplished this week.
(1) We recorded dialogue for the mother and Kuno.
(2) Vier Nev finished the concept art for the rooms, the escape shaft, and the final scene (find them below the list).
(3) Sound artists experimented with procedural sound for the outside scene, made some ambisonic recordings for the simulation scenes, created some custom reverb for different spaces, and started experimenting with variations on sounds for buttons, loading scenes, and the heart fractal generator.
(4) Vier scripted the room-scale portal exploration (a rough sketch of the crossing logic follows after this list).
(5) We modeled and rigged the mending apparatus, the room, and the barn scene for one of the simulations.
And here’s a sample of the barn, made in TiltBrush:
And, finally, (6) we set specific deadlines for deliverables, leaving two weeks for debugging.
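Because point (4) is hard to picture from the list alone, here is a very rough sketch of how the portal crossing can be handled on the C# side. The stencil-masked rendering of the destination space lives in the shaders; the names and the simple world-toggling used here are assumptions for illustration, not Vier's actual script.

```csharp
using UnityEngine;

// Sketch: when the user's head crosses the portal plane, swap which "world"
// hierarchy is treated as the real one. Toggling the roots is a stand-in;
// in a stencil set-up the swap would re-mask the geometry instead.
public class PortalCrossing : MonoBehaviour
{
    [SerializeField] Transform head;            // the VR camera
    [SerializeField] GameObject worldInFront;   // geometry seen through the portal
    [SerializeField] GameObject worldBehind;    // geometry the user is leaving

    bool wasInFront;

    void Start()
    {
        wasInFront = IsInFront();
    }

    void Update()
    {
        bool isInFront = IsInFront();
        if (isInFront != wasInFront)
        {
            // The head passed through the portal plane this frame: swap worlds.
            worldInFront.SetActive(isInFront);
            worldBehind.SetActive(!isInFront);
            wasInFront = isInFront;
        }
    }

    // Which side of the portal plane is the head on?
    bool IsInFront()
    {
        return Vector3.Dot(transform.forward, head.position - transform.position) > 0f;
    }
}
```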
We did five significant things this week:
(1) Sourced the voice actor, who will be the voice for the main worm, Finn.
(2) Started building a prototype for the main type of exploratory movement; it will be a room-scale kind of redirected walking.
(3) Got the concept art ready for the main character, Finn.
(4) Spoke with our sound artists and began to shape the general soundscape.
(5) Scripted a prototype for a heart fractal generator we will need for a narrative beat (no pun intended), and which we are considering making a design motif.
(1) We have already had a read-through with Víctor Emanuelle, with excellent results. He will be recording the lines in separate files and sending them over this week.
(2) Vier began building a prototype for the movement we are looking for. At the moment, we are using stencil shaders to create portals the user can go into, giving the illusion of space expanding.
(3) Here is another of the pages Vier delivered this week. Try and find our favorite so far.
(4) In our meeting this week with the sound artists, we created an asset list and heard some samples for general atmosphere and button sounds. We are looking to make the worm machines sound more human-like than the actual humans in the story.
(5) Here is a clip of the heart fractal generator.
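For the curious, the generator is essentially a recursive spawner: each heart spawns smaller, rotated copies of itself until a depth limit is hit. Here is a stripped-down sketch along those lines; the names and numbers are placeholders rather than the actual prototype.

```csharp
using UnityEngine;

// Sketch of the heart fractal generator: spawn a heart, then recursively
// spawn smaller, rotated copies around it until a depth limit is reached.
public class HeartFractal : MonoBehaviour
{
    [SerializeField] GameObject heartPrefab;     // heart mesh to duplicate
    [SerializeField] int maxDepth = 4;           // recursion limit
    [SerializeField] int childrenPerHeart = 3;   // branches per level
    [SerializeField] float childScale = 0.5f;    // each level shrinks by this factor
    [SerializeField] float spread = 1.5f;        // distance of children from their parent

    void Start()
    {
        Spawn(transform.position, Quaternion.identity, 1f, 0);
    }

    void Spawn(Vector3 position, Quaternion rotation, float scale, int depth)
    {
        var heart = Instantiate(heartPrefab, position, rotation, transform);
        heart.transform.localScale = Vector3.one * scale;

        if (depth >= maxDepth) return;

        // Arrange the children in a ring around the parent heart.
        for (int i = 0; i < childrenPerHeart; i++)
        {
            float angle = i * 360f / childrenPerHeart;
            Quaternion childRotation = rotation * Quaternion.Euler(0f, angle, 0f);
            Vector3 offset = childRotation * Vector3.up * spread * scale;
            Spawn(position + offset, childRotation, scale * childScale, depth + 1);
        }
    }
}
```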
We are so excited for this project.
This past week Vier and I discussed how to make our chosen chapter more relevant to the actual state of technology today. We thought it was interesting we were assigned such an anti-technology narrative. More than a century has passed since it was written and our technology is far more advanced, yet we haven’t suffered the grim fate it announces.
We decided to make the idea of simulation the core of our experience, since it would then comment on virtual reality itself. It is more probable today that we would stay in rooms and “lose touch” because of our ability to simulate outside spaces. We looked at the Rick and Morty “simulation inside a simulation” episode for inspiration.
After meeting with the sound artists, we settled on a story that departs from the original in (hopefully) significant ways. Our story claims that the only reason the mending apparatus fails, and everyone dies, is that it falls in love and decides to escape. Its escape means the rest of the machine self-destructs without their much-needed help.
The idea is for the user to enter the experience in medias res. After putting on the headset, the user “wakes up” in a room before a worm-like machine, like the ones described in the story, which looks attentively at them. We want to explore the second-person creator-user relationship. The user will be mostly passive, since they can't control much of what will happen. The worm will have a Barry White voice and will try to seduce the user (and succeed!).
At this point I won’t reveal many more of the plot points, but the story ends with the user “escaping” with our Barry worm and walking into the sunset. If they look back, though, they will see the rest of the mending apparatus dragging others back into the machine.
This story is meant to make the anti-technology rhetoric a little bit less black and white. We have a machine with emotions and human ambitions, but we also have the collateral damage associated with this desire. The interaction design will feature ways for the user to explore simulation inside the machine.
Another exciting unit: we will be collaborating with second-year students from Sound Arts to bring Forster’s grim short story “The Machine Stops” to VR. My group has chosen the second chapter, titled “The Mending Apparatus”, and has begun discussing potential opportunities.
This is an exciting project for several reasons: (1) my background in comparative literature has prepared me like nothing else for the close reading of texts, as well as for considering their intertextuality with other media, (2) beyond detailed interpretation, I also have experience adapting literature to the screen, for example adapting Woolf’s To the Lighthouse into a feature-length screenplay, and, finally, (3) I have the opportunity to work with other talented artists who are all willing to have fun and play to their strengths. We have sound artists who are interested in experimenting with procedural sound and integrating that work with Unity, and we have rigorous, creative 3D modelers and animators willing to take interaction design to the next level.
I think it’s appropriate to start this unit with a reflection on the story. After my first reading, my notes fall into the following categories:
(1) visual cues: primarily, but not limited to, the aesthetic of light
(2) author-reader-story relationship
(3) mentions of touch (since they may translate specifically to interaction or meditations on interaction)
(4) anything pertaining to the senses (this is a big theme in the work)
(5) spatial cues
(6) soundscape cues
(7) philosophical cues that hint at the relationship between content and form and will help shape the VR experience.
For those who haven’t read the tale, it is a 25-page story set in a dystopian future where all human beings are separated into personal rooms that form “the Machine.” The Machine exists underground. Everything can be done inside the room: sleeping, eating, listening to music, attending or giving lectures, and, it is hinted, even procreating. Some people take field trips to the surface of the Earth, but these are limited and highly controlled. The air there has become toxic to them, so they must wear a special respirator.
In the story, the mother, Vashti, is called by her son Kuno to come visit because he wants to speak to her in person. Being a staunch devotee of the Machine, she hesitates but decides to go. When she arrives, Kuno tells her how he went aboveground through unofficial means, i.e. escaped, and had developed his sense of space and touch so much that there was no looking back. He criticizes her faith in the Machine. He has been threatened with Homelessness, which means he will die. His mother thinks him foolish and leaves him again. Later on, Kuno is the first to predict that the Machine is “dying,” and when he tells his mother, she doesn’t believe him. The Machine ultimately fails, causing the mass death of human beings who can no longer see or breathe.
These are what I now think are the most interesting notes on some of those points:
(1) The story starts and ends with the image of the machine as a beehive. The allusions are not scattered throughout, so it seems the author consciously, perhaps strategically, opens and closes with this image. He mentions that irritability is “a growing quality” in that age, and that the mother’s face is “as white as a fungus.” Because the beehive imagery is a bit overdone in adaptations of this story, I thought a natural progression would be to take a cue from Plato’s The Republic, where Socrates talks about how societies that have become overly inflated become like a wasps’ nest. We can picture the machine the way he describes such a society: inflamed, aggressive, irritable. An interesting visual translation of this for our VR experience, which hints at point (3), would be to portray these rooms and this machine as infected skin. Touch has been completely neglected in this world, and as a consequence, each individual room becomes like an inflamed, infected pore… In the armchairs sit “swaddled lump[s] of flesh” among “throbbing” sounds…
(3) There are 11 mentions of touch in the story, and 5 refer to buttons or interface switches (“the ecstasy of touching a button”). 4 of the mentions deal with the absence of touch between human beings, or even how one should avoid being touched by sunlight. Only 2 refer to actual physical instances of touch, at the end when the machine is failing.
Clearly this can translate into a specific way of designing the interface, and into contrasting that representation with actual physical touch.
(4) As for the senses, the meditation on touch can be extended. Kuno’s main realization, that “man is the measure,” must permeate every unit of measurement in this work, since it is a humanistic philosophy and it includes an allusion to the five senses as “those five portals by which we can alone apprehend.”
(5) I made a list of all the spaces mentioned in the story, which will need to be included in the VR rendition. For simplicity’s sake, I will not include it here.
(6) There is an eternal “hum” that numbs the beings who live in the machine. Interestingly enough, the first tell-tale sign that worries people is a defect that appears in the music. It is the first failure that makes life unbearable. It reminds me of Nietzsche’s claim that “without music, life would be a mistake.”
To keep this entry from getting too long, I will just mention that our group had a first meeting and already made some notes about potential interaction, visual, and sound opportunities. We began a mood board and a joint document to keep track of the project’s evolution. This is my first attempt at adaptation in VR, so I am looking forward to testing the potential for transmutation.