Intuitive Assessment of Mortality Based on Facial Characteristics: Behavioral, Electrocortical, and Machine Learning Analyses

📄 Original study ↗
Delorme, Arnaud, Pierce, Alan, Michel, Leena, Radin, Dean • 2018 Current Era • mediumship

Plain English Summary

Can people tell whether someone is alive or dead just by looking at their photo? That's exactly what this study tested. Twelve self-described intuitives (people who claim heightened perceptive abilities) looked at over 400 photographs, half of living people and half of deceased, carefully matched so there were no obvious visual giveaways like image quality or age differences. Overall, they were right 53.6% of the time versus the 50% expected from pure guessing. That might sound tiny, but it was statistically solid (p = 0.005), and five of the twelve participants were individually significant. Here's where it gets really interesting: accuracy jumped to nearly 57% for recently deceased individuals but dropped to essentially chance for people who died long ago. The researchers also recorded brain activity with EEG and found a telltale difference in brain waves just 100 milliseconds after the photo appeared, far too fast for conscious reasoning. This early neural blip appeared over the right back of the head specifically when participants correctly identified deceased faces, hinting at some kind of pre-conscious recognition process. To make sure people weren't just picking up on subtle visual cues like lighting or skin tone, the team trained machine learning algorithms on eleven image features, and the computers couldn't beat chance. Whatever signal these intuitives were detecting, it wasn't anything a random forest could see. The effect is modest but genuinely puzzling, and the mechanism remains a mystery.
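To get a feel for the numbers, here is a minimal stdlib-only sketch of an exact two-sided binomial test applied to one hypothetical session of 404 trials at the reported 53.6% hit rate. This is purely illustrative: the paper's p = 0.005 comes from the pooled analysis across all twelve participants, not from a single-session test like this one.

```python
from math import comb

def binom_two_sided_p(k: int, n: int, p: float = 0.5) -> float:
    """Exact two-sided binomial test: sum the probabilities of all
    outcomes no more likely than the observed count k."""
    pmf = [comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(n + 1)]
    thresh = pmf[k]
    return min(1.0, sum(x for x in pmf if x <= thresh + 1e-12))

# Hypothetical single session: 404 trials at the study's overall 53.6% accuracy.
n = 404
hits = round(0.536 * n)  # ~217 correct responses
pval = binom_two_sided_p(hits, n)
```

Note that 53.6% over a single 404-trial session is not individually significant; the study's significance comes from aggregating thousands of trials across participants, which is why only 5 of 12 participants reached significance on their own.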

Actual Paper Abstract

Studies of various characteristics of the human face indicate that it contains a wealth of information about health status. Most studies involve objective measurement of facial features as correlated with historical health information. But some individuals also claim to be adept at intuitively gauging mortality based solely upon a quick glance at a person's photograph. To test this claim, we invited 12 such individuals to see if they could tell if a person was alive or dead based solely on a brief examination of his or her photograph. All photos used in the experiment were transformed into a uniform gray scale and counterbalanced across eight categories as follows: gender, age, gaze direction, glasses, head position, smile, hair color, and image resolution. Participants examined 404 photographs displayed on a computer monitor, one photo at a time, each shown for a maximum of 8 seconds. Half of the individuals in the photos were deceased, and half were alive at the time the experiment was conducted. Participants were asked to indicate if they thought the person in a photo was living or deceased by pressing an appropriate button. Overall, mean accuracy on this task was 53.6%, where 50% was expected by chance (P = .005, two tail). Statistically significant accuracy was independently obtained in 5 of the 12 participants. We also collected 32-channel electrocortical recordings and observed a robust difference between images of deceased individuals correctly vs. incorrectly classified in the early event related potential at 100 ms post-stimulus onset. We then applied machine learning techniques to classify the photographs based on 11 image characteristics; both random forest and logistic regression machine learning approaches were used, and both classifiers failed to achieve accuracy above chance level. Our results suggest that some individuals can intuitively assess mortality based on some as-yet unknown features of the face.

Research Notes

Tests mediumistic ability to discern mortality, relevant to the survival hypothesis. Combines behavioral accuracy with EEG correlates. Machine learning control strengthens design. Modest effect (53.6%) but statistically robust. Part of IONS anomalous cognition program. EEG difference at 100ms suggests pre-conscious discrimination. Connects to Wahbeh mediumship studies and Beischel's mental mediumship research.

Twelve self-identified intuitives viewed 404 photographs (50% deceased, 50% alive) balanced across 8 visual characteristics. Overall accuracy 53.6% vs. 50% chance (p=0.005); 5/12 participants individually significant. Performance best with recent deaths (56.8%, p < 0.002) vs. old (51.7%) and very old (50.2%). 32-channel EEG showed early visual ERP difference (~100ms, right parieto-occipital) for correct vs. incorrect classification of deceased photos (cluster-corrected p < 0.05). Machine learning (random forest, logistic regression) on 11 image features failed to exceed chance, ruling out simple visual cues. Results suggest some individuals can weakly discriminate mortality status from facial photographs via unknown mechanism.
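The machine learning control can be illustrated with a toy version of one of the paper's two classifiers, logistic regression (the other was a random forest). The sketch below is stdlib-only and uses synthetic stand-in data: 11 Gaussian "image features" per photo and labels drawn independently of the features, mimicking the paper's finding that the measured image statistics carried no mortality signal. The feature values, sample sizes, and training settings here are illustrative assumptions, not the study's actual pipeline.

```python
import math
import random

random.seed(0)

# Synthetic stand-in data: 11 image features per photo, with
# alive/deceased labels unrelated to the features (the null scenario
# the paper's classifiers ended up in).
n_photos, n_features = 400, 11
X = [[random.gauss(0, 1) for _ in range(n_features)] for _ in range(n_photos)]
y = [random.randint(0, 1) for _ in range(n_photos)]

# Simple train/test split.
X_tr, y_tr = X[:300], y[:300]
X_te, y_te = X[300:], y[300:]

def predict_prob(w, b, x):
    """Logistic model probability, with the logit clamped for safety."""
    z = b + sum(wi * xi for wi, xi in zip(w, x))
    z = max(-30.0, min(30.0, z))
    return 1.0 / (1.0 + math.exp(-z))

# Plain batch gradient descent on the logistic loss.
w, b, lr = [0.0] * n_features, 0.0, 0.1
for _ in range(200):
    gw, gb = [0.0] * n_features, 0.0
    for x, t in zip(X_tr, y_tr):
        err = predict_prob(w, b, x) - t
        for j in range(n_features):
            gw[j] += err * x[j]
        gb += err
    for j in range(n_features):
        w[j] -= lr * gw[j] / len(X_tr)
    b -= lr * gb / len(X_tr)

# Held-out accuracy hovers near 0.5 because labels are independent
# of the features -- the classifier has nothing real to learn.
acc = sum((predict_prob(w, b, x) > 0.5) == t
          for x, t in zip(X_te, y_te)) / len(X_te)
```

Chance-level accuracy from a setup like this is what lets the authors argue that the intuitives' 53.6% was not driven by any of the eleven measured image characteristics.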


📋 Cite this paper
APA
Delorme, A., Pierce, A., Michel, L., & Radin, D. (2018). Intuitive assessment of mortality based on facial characteristics: Behavioral, electrocortical, and machine learning analyses. Explore. https://doi.org/10.1016/j.explore.2017.10.011
BibTeX
@article{delorme_2018_intuitive,
  title = {Intuitive Assessment of Mortality Based on Facial Characteristics: Behavioral, Electrocortical, and Machine Learning Analyses},
  author = {Delorme, Arnaud and Pierce, Alan and Michel, Leena and Radin, Dean},
  year = {2018},
  journal = {Explore},
  doi = {10.1016/j.explore.2017.10.011},
}