
Title:
Object correspondence in audition echoes vision: Not only spatiotemporal but also feature information influences auditory apparent motion.
Authors:
Kriegeskorte MC; Department of Psychology, University of Tübingen, Schleichstrasse 4, D-72076, Tübingen, Germany., Rolke B; Department of Psychology, University of Tübingen, Schleichstrasse 4, D-72076, Tübingen, Germany., Hein E; Department of Psychology, University of Tübingen, Schleichstrasse 4, D-72076, Tübingen, Germany. elisabeth.hein@uni-tuebingen.de.
Source:
Attention, perception & psychophysics [Atten Percept Psychophys] 2025 Dec 04; Vol. 88 (1), pp. 29. Date of Electronic Publication: 2025 Dec 04.
Publication Type:
Journal Article
Language:
English
Journal Info:
Publisher: Springer Country of Publication: United States NLM ID: 101495384 Publication Model: Electronic Cited Medium: Internet ISSN: 1943-393X (Electronic) Linking ISSN: 19433921 NLM ISO Abbreviation: Atten Percept Psychophys Subsets: MEDLINE
Imprint Name(s):
Publication: 2011- : New York : Springer
Original Publication: Austin, Tex. : Psychonomic Society
Contributed Indexing:
Keywords: Apparent motion; Auditory perception; Grouping; Object correspondence; Perceptual organization; Ternus display
Entry Date(s):
Date Created: 20251204 Date Completed: 20251205 Latest Revision: 20251207
Update Code:
20251207
PubMed Central ID:
PMC12678477
DOI:
10.3758/s13414-025-03175-7
PMID:
41345757
Database:
MEDLINE

Further Information

A crucial ability of our cognition is the perception of objects and their motion. We can perceive objects as moving by connecting them across space and time, even when the objects are not continuously present, as in apparent-motion displays such as the Ternus display, which consists of two sets of stimuli, shifted to the left or right and separated by a variable inter-stimulus interval (ISI). This display is ambiguous: it can be perceived either as both stimuli moving uniformly to one side (group motion) or as one stimulus moving across the stationary center stimulus (element motion), depending on which stimuli are connected over time. Which percept is seen can be influenced by the ISI and by the stimulus features. Previous experiments have shown that the Ternus effect also exists in the auditory modality and that the auditory Ternus effect likewise depends on the ISI, a first indication that correspondence might work similarly in the visual and auditory modalities. To test this idea further, we investigated whether the auditory Ternus effect also depends on stimulus features by creating a frequency-based bias, using a high and a low sine-wave tone as Ternus stimuli. This bias was compatible either with the element-motion or with the group-motion percept. Our results showed an influence of this feature bias in addition to an ISI effect, suggesting that the visual and the auditory modalities might both use the same mechanism to connect objects across space and time.
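
To make the paradigm concrete, the sketch below (Python/NumPy) generates a two-frame auditory Ternus sequence with a variable ISI and a frequency bias that can be set to favor either group motion or element motion. It is not the authors' stimulus code: the sample rate, tone frequencies, tone duration, and the stereo-panning stand-in for spatial position are illustrative assumptions.

# Illustrative sketch (not the published stimulus code): a two-frame auditory
# Ternus sequence. "Location" is approximated by stereo panning; the frequency
# bias pairs a low and a high sine tone so that feature identity is compatible
# with either group motion or element motion.
import numpy as np

SR = 44100  # sample rate in Hz (assumed)

def tone(freq_hz, dur_s, sr=SR):
    """Sine tone with a short raised-cosine on/off ramp to avoid clicks."""
    t = np.arange(int(dur_s * sr)) / sr
    y = np.sin(2 * np.pi * freq_hz * t)
    ramp = int(0.005 * sr)
    env = np.ones_like(y)
    env[:ramp] = 0.5 * (1 - np.cos(np.pi * np.arange(ramp) / ramp))
    env[-ramp:] = env[:ramp][::-1]
    return y * env

def pan(mono, position):
    """Constant-power pan; position in [-1 (left), +1 (right)]."""
    theta = (position + 1) / 2 * (np.pi / 2)
    return np.stack([mono * np.cos(theta), mono * np.sin(theta)], axis=1)

def ternus_frame(tones_and_positions, dur_s=0.05):
    """One Ternus frame: two simultaneous tones at two 'locations'."""
    frame = sum(pan(tone(f, dur_s), p) for f, p in tones_and_positions)
    return frame / np.max(np.abs(frame))

def ternus_trial(isi_s, bias="group", low=400.0, high=1200.0):
    """Two frames separated by a silent ISI (all parameter values assumed).

    bias="group":   each tone shifts one position, so feature identity is
                    consistent with both elements moving together.
    bias="element": the center tone keeps its identity across frames,
                    consistent with the outer tone jumping across it.
    """
    if bias == "group":
        f1 = ternus_frame([(low, -0.8), (high, 0.0)])
        f2 = ternus_frame([(low, 0.0), (high, 0.8)])
    else:
        f1 = ternus_frame([(low, -0.8), (high, 0.0)])
        f2 = ternus_frame([(high, 0.0), (low, 0.8)])
    silence = np.zeros((int(isi_s * SR), 2))
    return np.concatenate([f1, silence, f2])

# Example: a 100-ms ISI trial whose frequency bias favors group motion.
trial = ternus_trial(isi_s=0.100, bias="group")

In such a sketch, crossing the ISI (short vs. long) with the bias ("group" vs. "element") reproduces the logic of the design described in the abstract: the ISI and the feature bias can independently push the percept toward element or group motion.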
(© 2025. The Author(s).)

Declarations. Conflicts of interest/Competing interests: There are no conflicts of interest. Ethics approval: The ethics committee of the University of Tübingen approved the experiments in this study (Labor_Rolke_2022_0413_252). Consent to participate: All participants signed an informed consent form in accordance with the ethical guidelines of the Declaration of Helsinki (World Medical Association, 2013). Consent for publication: All participants agreed that their anonymized data may be used for research purposes, in particular for journal publications, and may be made publicly accessible in a scientific online data archive such as zenodo.org.