Yulia Vergazova
Nikolay Ulyanov
Svan/Nectar
About the Item
Drawing on scientific research into how plants perceive sound and light, the authors created a film addressed primarily to plant ‘hearing’ and ‘vision’.
Plants are known to respond to the sounds of pollinating insects by increasing the concentration of sugar in their nectar. The film reproduces the sound range of a single hovering honey bee, with a peak frequency of 200–500 Hz, the band to which plants are most responsive.
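For readers curious how a stimulus in this band might be produced, here is a minimal Python sketch that synthesizes a buzz-like tone wandering through the 200–500 Hz range. The film itself works with recorded bee samples; the sweep rate, harmonics and amplitudes below are purely illustrative assumptions.

```python
import numpy as np
from scipy.io import wavfile

SR = 44100        # sample rate in Hz
DURATION = 5.0    # seconds

t = np.linspace(0, DURATION, int(SR * DURATION), endpoint=False)

# Wingbeat-like frequency slowly wandering between 200 and 500 Hz.
freq = 350 + 150 * np.sin(2 * np.pi * 0.3 * t)

# Integrate the instantaneous frequency to obtain the phase.
phase = 2 * np.pi * np.cumsum(freq) / SR

# A few harmonics give the tone a rougher, wing-like timbre.
buzz = 0.6 * np.sin(phase) + 0.25 * np.sin(2 * phase) + 0.1 * np.sin(3 * phase)

wavfile.write('buzz.wav', SR, (buzz * 32767).astype(np.int16))
```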
Caucasian honey bees, native to the highlands of Georgia, are highly rated by beekeepers all over the world, so the authors chose microtonal Svan songs from the same region as the film’s soundtrack: Elia Lrde, Kriste Agsdga and Dale Kojas. The first is a hymn sung on non-lexical syllables and vowels. The second is an Easter hymn. The third is a ballad about the fatal golden-haired goddess Dal, who lives in the highlands of Georgia (mainly in Svaneti). The defining features of these songs (three voices, four verses, an algorithmically built grid of notes derived from the original chords) are reproduced and played on samples of buzzing and pollination sounds, as loosely sketched below.
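The note grid itself is not specified in detail. As a loose illustration only, the following hypothetical sketch builds a microtonal pitch grid between assumed chord tones and naively retunes a buzz sample to each step; the chord frequencies, grid resolution and retuning method are all assumptions, not the artists’ actual algorithm.

```python
import numpy as np

# Assumed three-voice chord frequencies in Hz (not the actual Svan chords).
chord_hz = [196.0, 247.5, 294.0]

def microtonal_grid(base_freqs, steps=4):
    """Insert equal log-pitch (microtonal) steps between consecutive chord tones."""
    grid = []
    for lo, hi in zip(base_freqs, base_freqs[1:]):
        ratios = (hi / lo) ** (np.arange(steps) / steps)
        grid.extend(lo * ratios)
    grid.append(base_freqs[-1])
    return grid

def retune(sample, src_hz, dst_hz):
    """Naively resample a recorded buzz so its pitch shifts from src_hz to dst_hz."""
    ratio = dst_hz / src_hz
    idx = np.arange(0, len(sample) - 1, ratio)
    return np.interp(idx, np.arange(len(sample)), sample)

print([round(f, 1) for f in microtonal_grid(chord_hz)])  # the expanded pitch grid
```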
The video sequence, generated in accordance with the music, consists of output from the BigGAN neural network – one of the first neural networks to generate high-fidelity images across a wide variety of domains, conditioned on the 1,000 classes of the ImageNet dataset. The authors selected the classes related to bees and the plants they pollinate, nine in total: a bee, chamomile, yellow lady’s slipper, rosehip, rapeseed, orange, lemon, strawberry and cucumber.
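As an illustration of this step, here is a minimal sketch of class-conditional BigGAN sampling using the publicly available pytorch-pretrained-biggan package. The model variant, truncation value and class mapping are assumptions: ImageNet has no chamomile or rosehip class, so ‘daisy’ and ‘hip’ stand in for them below.

```python
import torch
from pytorch_pretrained_biggan import (BigGAN, one_hot_from_int,
                                       truncated_noise_sample,
                                       convert_to_images)

# Load a pretrained BigGAN (pip install pytorch-pretrained-biggan);
# the exact variant the artists used is an assumption here.
model = BigGAN.from_pretrained('biggan-deep-256')

# ImageNet class indices nearest to the film's nine motifs.
motifs = {'bee': 309, 'daisy': 985, "yellow lady's slipper": 986,
          'hip': 989, 'rapeseed': 984, 'orange': 950,
          'lemon': 951, 'strawberry': 949, 'cucumber': 943}

ids = list(motifs.values())
class_vec = one_hot_from_int(ids, batch_size=len(ids))
noise = truncated_noise_sample(truncation=0.4, batch_size=len(ids))

with torch.no_grad():
    images = model(torch.from_numpy(noise),
                   torch.from_numpy(class_vec), 0.4)

for name, img in zip(motifs, convert_to_images(images)):
    img.save(name.replace(' ', '_') + '.png')
```

In practice, one frame per class is only a starting point; a video sequence like the film’s would interpolate through the latent space between such samples.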
To adapt the image to the ‘vision’ of plants, the artists used optical flow – a computer-vision technique for estimating apparent motion between video frames. The results resemble the colour spectrum of a photographic lamp, and the principle of such vision is a speculative version of a bee’s sight: for example, only areas in motion are visible, or the scene appears as a temperature map. Based on video documentation of bees pollinating plants, the authors computed the optical flow fields, compiled a dataset of 330 images and trained the pix2pix neural network, which reinterpreted the video originally generated by BigGAN and brought it as close as possible to a form suitable for a plant cinema.
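The flow-rendering step can be sketched with standard tools. The example below uses OpenCV’s Farneback dense optical flow, encoding motion direction as hue and speed as brightness so that only moving regions remain visible, in the spirit of the speculative ‘plant vision’ described above. The artists’ actual flow method, footage and pix2pix training setup are not public here, and the file names are placeholders.

```python
import cv2
import numpy as np

cap = cv2.VideoCapture('pollination_footage.mp4')   # hypothetical input
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

i = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Dense per-pixel motion field between consecutive frames.
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag, ang = cv2.cartToPolar(flow[..., 0], flow[..., 1])

    # Direction -> hue, speed -> brightness; static regions stay dark.
    hsv = np.zeros_like(frame)
    hsv[..., 0] = ang * 180 / np.pi / 2
    hsv[..., 1] = 255
    hsv[..., 2] = cv2.normalize(mag, None, 0, 255, cv2.NORM_MINMAX)

    cv2.imwrite(f'flow_{i:04d}.png', cv2.cvtColor(hsv, cv2.COLOR_HSV2BGR))
    prev_gray = gray
    i += 1
```

Images rendered this way, paired with the source frames, are the kind of dataset on which a pix2pix model could then be trained.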
About the Artist
Yulia Vergazova is an interdisciplinary artist, curator, and educator based in Germany. Having studied at the Cologne Academy of Media Arts, the Moscow Rodchenko Art School, and the Salzburg International Summer Academy of Fine Arts, Vergazova now works at the boundary between biology and technology, natural and synthetic intelligence. Her practice focuses on research and on deep dives into the context of a mutating, accelerating reality. Lately, she has been developing projects in collaboration with data scientist and musician Nikolay Ulyanov.
Nikolay Ulyanov is an artist, musician, poet, and programmer. He is a member of the Coincidental Institute and the creator of the Noise Environment Synthesizer for the 12th Prepared Environments Festival.