Deepscape: Transversal is a radiophonic computational artwork consisting of a planetary soundscape generated live by a deep neural network.
Deepscape: Transversal draws on 16 hours of soundscape transversally recorded online across 28 places worldwide in late April 2022, used as the dataset for a RAVE deep neural network. Planetary spacetimematterings are contained in the bio/geo/anthropophonies of the dataset. Yet the strangely rumbling soundscape that continuously unfolds from the deep neural network seems to produce a whole new world when we listen deeply through it.
By fostering spatial modes of sonic resonance with a deep neural network, the work draws attention to the scape of AI, understood as the global flows of media generated by deep neural networks throughout the Internet, entangled with the material, human, and cultural resources they capitalise on across the infrastructures of AI. Who terraforms this deepscape? Whose scapes are threatened as AI sucks our attention away from planetary concerns?
The project is being developed with Axel Chemla-Romeu-Santos, in collaboration with the ACIDS group at IRCAM and the Locus Sonus group at ESAAix, in the context of a postdoctoral fellowship supervised by Emanuele Quinz at EUR ArTeC (Université Paris 8), in collaboration with the Reflective Interaction group of EnsadLab.