Hello people,
Following the previous posts, [APPROVED] Generative Sound Design for Motion Landscape in VR #1 and [APPROVED] Generative Sound Design for Motion Landscape in VR #2, I set up the pipeline for running Pure Data inside Unity for generative and spatialized sound, and designed a phase vocoder patch as the main sonic tool for the project.
Landscape #1
I used the phase vocoder tool to mix the spectra of natural recordings of stones interacting with other materials, as a gesture of transposing some aspects of Chico's work into the sound-making process. The result was used to create an interactive sound for the moving stones.
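For the curious, here is a minimal sketch in Python (NumPy/SciPy), not the actual Pd patch, of the kind of spectral mixing the phase vocoder tool enables: take the STFT of two sources, interpolate their magnitude spectra, and resynthesize using one source's phases. The test signals and the `blend` parameter are stand-ins.

```python
import numpy as np
from scipy.signal import stft, istft

def spectral_mix(a, b, fs=44100, nperseg=2048, blend=0.5):
    """Blend the magnitude spectra of two signals, keeping the phase of the first."""
    n = min(len(a), len(b))
    _, _, A = stft(a[:n], fs=fs, nperseg=nperseg)
    _, _, B = stft(b[:n], fs=fs, nperseg=nperseg)
    mag = (1 - blend) * np.abs(A) + blend * np.abs(B)  # interpolated magnitudes
    phase = np.angle(A)                                # phase taken from source a
    _, y = istft(mag * np.exp(1j * phase), fs=fs, nperseg=nperseg)
    return y

# Stand-in material: a "stone-like" decaying noise burst and a simple tone.
fs = 44100
t = np.arange(fs) / fs
stones = np.random.randn(fs) * np.exp(-8 * t)
material = np.sin(2 * np.pi * 220 * t)
hybrid = spectral_mix(stones, material, fs=fs, blend=0.5)
```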
For the musical background I created an algorithmic composition process with nine different layers of sound, so that a complex relationship unfolds over time. It was set to a "suspended" mood to keep it light. I also added a spectral morphing of watery sounds to the background.
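As a rough illustration of the layered approach (only the nine-layer idea comes from the patch; the pitch set, densities, and seeds below are invented), one way to drive such layers is to have each one draw sparse events from a "suspended"-sounding pitch set at its own rate:

```python
import random

SUSPENDED_SET = [0, 2, 5, 7, 9]   # degrees avoiding strong resolutions (illustrative)
BASE_NOTE = 48                    # arbitrary reference pitch (C3)

def layer_events(layer_id, duration=60.0, rate=0.2, octave_span=3, seed=None):
    """Yield (onset_time, midi_note, layer_id) events for one generative layer."""
    rng = random.Random(seed)
    t = rng.uniform(0, 1 / rate)
    while t < duration:
        degree = rng.choice(SUSPENDED_SET)
        octave = rng.randrange(octave_span)
        yield (round(t, 2), BASE_NOTE + 12 * octave + degree, layer_id)
        t += rng.expovariate(rate)   # sparse, irregular onsets

# Nine layers, each with its own density and seed, merged into one timeline.
score = sorted(
    e for i in range(9) for e in layer_events(i, rate=0.1 + 0.05 * i, seed=i)
)
```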
Landscape #2
In this case, the floating objects each loop a fixed sonic material created with the phase vocoder tool. Each stone carries a portion of the spectrum of the "complete" sound, and the soundscape emerges from the simultaneous auditory perception of all the sources.
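A minimal sketch, assuming simple FFT-bin masking (the actual patch may partition the spectrum differently), of how one sound can be split into complementary bands, one per stone, so the full spectrum is only heard when all the sources play together:

```python
import numpy as np

def split_into_bands(signal, n_bands):
    """Return n_bands signals whose spectra tile the spectrum of `signal`."""
    spectrum = np.fft.rfft(signal)
    edges = np.linspace(0, len(spectrum), n_bands + 1, dtype=int)
    bands = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        masked = np.zeros_like(spectrum)
        masked[lo:hi] = spectrum[lo:hi]        # keep only this band's bins
        bands.append(np.fft.irfft(masked, n=len(signal)))
    return bands

# Stand-in "complete" sound; in the project each band would feed one stone.
fs = 44100
full_sound = np.random.randn(fs)
per_stone = split_into_bands(full_sound, n_bands=8)   # 8 is a placeholder count
assert np.allclose(sum(per_stone), full_sound, atol=1e-8)  # bands sum back to the whole
```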
The musical background was darker for this landscape, opening the algorithm to some minor intervals as well and adding more string and granulated layers. The background also had an interactive layer, which basically consists of spectrally morphing the composition with a water sample, making the sound more diffuse as the user moves toward the "highest" point of the landscape.
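A hedged sketch of how such an interactive morph could be driven: a normalized height value (assumed here to be sent from Unity to the patch, 0 at the lowest point and 1 at the highest) sets the interpolation between the background's magnitude spectrum and the water sample's, with a little smoothing so head movement does not cause zipper noise. Names and the smoothing constant are illustrative.

```python
import numpy as np

def morph_frame(bg_frame, water_frame, height, smooth_state, alpha=0.05):
    """Spectrally interpolate one FFT-sized frame toward the water sample."""
    smooth_state += alpha * (np.clip(height, 0.0, 1.0) - smooth_state)  # smoothed control
    BG, WA = np.fft.rfft(bg_frame), np.fft.rfft(water_frame)
    mag = (1 - smooth_state) * np.abs(BG) + smooth_state * np.abs(WA)
    phase = np.angle(BG)                      # keep the background's phase
    out = np.fft.irfft(mag * np.exp(1j * phase), n=len(bg_frame))
    return out, smooth_state

# Example: the morph amount rises as the listener climbs.
frame = 2048
state = 0.0
bg = np.random.randn(frame)       # stand-in background frame
water = np.random.randn(frame)    # stand-in water-sample frame
for height in (0.0, 0.5, 1.0):
    out, state = morph_frame(bg, water, height, state)
```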
Let me know if you have any questions.
Thank you,
nico