Text on the project Bodily, by Paolo Cirio, 2020
The interactive video installation Bodily aims to problematize body scan technology. In particular, the work looks at advances in thermal infrared imaging and at how this technology is used to profile individuals and diagnose health conditions. [1]

During the global pandemic, devices such as thermal cameras became ubiquitous. Although these devices are meant to protect public health, they can also turn into instruments of social control. [2] Thermal cameras can be particularly privacy-violating because they can scan the inside of human bodies from a distance, even through clothes or walls.

This artwork raises awareness about the detection and collection of sensitive biometrics concerning our bodies and organs. Such biometric data can establish distinct bodily fingerprints of individuals, leaving their privacy open to abuse by anyone in public space and by the health industrial complex in private. Algorithms can subsequently process the collected data to determine diagnoses in health care contexts or to serve further commercial interests. Moreover, body scanners and the related data about organs can be used to detect human feelings. Recent research has established physical maps of human emotions, measured with thermal cameras that scan the temperatures of body parts and organs. [3]

Biometric technology is prone to surveillance, bias, and discrimination. This concerns not only the sensitive data of individuals with chronic illnesses or disabilities, which can be used to discriminate against them, but the bodily condition of everyone. A simple swing in an internal organ's temperature or size can be detected instantly by body scans and eventually used to discriminate in access to jobs, health care, transportation, and even membership in society. [4]

The ethics of medical technology are central to the artwork Bodily, which refers to the unregulated and defective advancement of the artificial intelligence and computer vision used to diagnose patients' health and mental states.

After the Covid global pandemic, these ethical questions must be popularized further to produce public debate and legal frameworks. Medical technology has now entered the public sphere and everyday life. The unrestrained collection of medical data and the opacity of diagnostic algorithms remain untackled and are often justified by the pseudoscience introduced with new technology. These flaws in the technical, scientific, legal, and ethical frameworks of medical technology can result in biased diagnoses or even in fatal consequences. [5]

For instance, recent scientific research using thermal cameras to detect human emotions is ethically highly questionable. [6] By creating physical maps of temperature variations across the body's parts, the study identified a unique topography of bodily sensations. The body's spontaneous thermal irradiation is then used to determine individuals' health and emotions via bodily maps of different subjective feelings.

The research claims to make it plausible to speak of a thermal expression of emotions and to classify bodily events by profiling mental and health states at a distance through thermal cameras. The poor ethics of this kind of advanced medical and psychological study bear paradoxical similarities to the primitive idea of bodily fluids determining human temperaments. Hippocrates' temperament theory suggested that four bodily fluids (called humors), namely black bile, yellow bile, phlegm, and blood, directly affect an individual's personality, behavior, and health, a belief that persisted for centuries and was used in medieval science and medicine. [7]

Beyond profiling citizens and monitoring their emotions through bodily fingerprints, medical devices using body scanners and algorithms can eventually decide someone's health destiny. Medical diagnoses are being computed by algorithms that also automate the related medical decision-making processes. There have already been numerous cases in which biased algorithms decided patients' medical diagnoses incorrectly or even suggested deadly transplant decisions. These algorithms often discriminate by cross-referencing data on preconditions, ethnicity, gender, age, and even class status. [8]

These grim consequences can occur because the use of such medical devices and the processing of the data they collect are yet to be regulated and protected. Today, medical data is sold and transferred to commercial entities without public oversight or patients' consent. At the same time, today's artificial intelligence and computer vision are allowed to make decisions about our health without transparency or accountability, while liability is deflected onto the machines rather than onto the technology's creators and those who decide to use it.

The playful installation Bodily mimics the biased and faulty algorithms that can decide on patients' lives, judge emotions, and profile individuals' bodies. In the installation, diagnostic algorithms work like slot machines, determining the fate of a patient.

With this artwork, the purported advantages of body scans are debunked, showing how the science of reading our organs' temperatures with thermal cameras can be misused in society and can mislead in medicine. Through the subtle irony of slot machines, an imperfect algorithm evaluates the bodily condition and infers the feelings of participants standing in front of the interactive installation. The diagnosis, based on the pseudoscience of bodily sensation maps, is screened on a monitor with both ironic and severe graphics overlapping the live video from the thermal camera.

This video installation screens the inside of the body like a magic mirror, using colors as aesthetic and conceptual elements to see the viewers' thermal images in a unique way. The viewers' thermal colors and shapes are integrated as dynamic interactive features, experienced as personal information being exposed and judged arbitrarily. Ultimately, the artwork Bodily shows how, aesthetically, the colors of body temperatures in computer vision can carry political significance about the discrimination, surveillance, and ethics of technology.



[1] “Thermal Imaging to Diagnose Disease” by W. Hardin, Association Vision Information, August 2018:

[2] “A.C.L.U. Warns Against Fever-Screening Tools for Coronavirus” by N. Singer, The New York Times, May 2020:

[3] “Thermal expression of intersubjectivity” by A. Merla, Frontiers in Psychology, July 2014:

[4] “Ban Biometric Mass Surveillance” by EDRi European Digital Rights, May 2020:

[5] “Health Care AI Systems Are Biased” by A. Kaushal, R. Altman, C. Langlotz, Scientific American, November 2020:

[6] “Maps of subjective feelings” by L. Nummenmaa, R. Hari, J. K. Hietanen, E. Glerean, PNAS, September 2018:

[7] “Humorism”, Wikipedia:

[8] “How an Algorithm Blocked Kidney Transplants to Black Patients” by T. Simonite, Wired, October 2020:
