In physical space, two networked computers communicate, exchanging video data in sync with each other. Meanwhile, somewhere in a data centre, web servers listen for user input. Together, these works form the network of Synthetic Perception.
As computing technologies increasingly mediate everyday life, their use by tech corporations as sources of advertising data becomes a major concern. Because the algorithms that manipulate this data are opaque, everyday users are increasingly distanced from knowing how their data is being used, and by whom. Synthetic Perception is a research project that critiques the algorithmic processing of users through code-based video and sonic outcomes. It questions the techniques used to conduct technological surveillance and automate humanity.
What once may have been distinctly human behaviour, such as exploring a place, connecting with friends and family, and seeking out information about the world around you, is increasingly mediated by the online platforms that facilitate it. The difference between the organic impulses that drive these behaviours and the tools that supposedly facilitate them is that online platforms have been built to encourage consumption. Google autofills searches and suggests places to go; Facebook tags images with your and others' faces and feeds you the content an algorithm has determined will keep you engaged; YouTube autoplays the next video in an endless stream. Every interaction creates a feedback loop between organic and suggested behaviour, training the algorithm in how best to exploit you.
Extraction Network (2019)
Autoplay True (2019)
Restorative Circulation (2019)