Sitting in a latent space, this work explores the deterioration of information through the use of generative AI models.
Starting from images of glaciers, the work investigates how cyclically repeating the training process leads to a progressive loss of detail and meaning. Each iteration synthesizes from what the previous one produced, resulting in an increasingly distorted representation of reality.
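The generational degradation described above can be illustrated with a minimal, purely illustrative sketch. It does not use the artwork's actual models or images; it stands in for the recursive process with a 1-D signal that is repeatedly re-synthesized through a smoothing step, showing how fine detail (a sharp edge plus high-frequency texture) is progressively erased. The `degrade` function, the kernel, and the detail metric are all assumptions chosen for the demonstration.

```python
import numpy as np

def degrade(signal, kernel):
    # One "generation": re-synthesize the signal from a smoothed
    # summary of the previous generation -- a stand-in for training
    # a model on synthetic data it produced itself.
    return np.convolve(signal, kernel, mode="same")

# A 1-D stand-in for an image: a sharp edge plus fine texture.
x = np.linspace(0.0, 1.0, 256)
signal = (x > 0.5).astype(float) + 0.1 * np.sin(80 * np.pi * x)

kernel = np.ones(5) / 5.0  # simple moving-average "re-encoding"

def detail(s):
    # Proxy for fine detail: variability of point-to-point differences.
    return np.std(np.diff(s))

d0 = detail(signal)
s = signal
for generation in range(50):
    s = degrade(s, kernel)
d50 = detail(s)

print(f"detail before: {d0:.4f}, after 50 generations: {d50:.4f}")
```

Each pass keeps the broad shape of the signal while attenuating its high frequencies, so after enough generations only a blurred memory of the original structure remains.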
The work invites reflection on the future of our relationship with the environment, suggesting that, once glaciers have vanished, there will be no new images of them, and we will have to settle for a digital memory of what we are losing in the present.
This digital memory, built from synthetic data generated from other synthetic data, will increasingly distort what once was, until it becomes unrecognizable.