
In the classical Turing test, participants are challenged to tell whether they are interacting with another human being or with a machine. The interaction, however, is not direct: it takes place as a distant conversation through messages on a computer screen. Basic forms of interaction, by contrast, are face-to-face and embodied, context-dependent and based on the detection of reciprocal sensorimotor contingencies. Our idea is that interaction detection requires the integration of proprioceptive and interoceptive patterns with sensorimotor patterns, within quite short time lapses, so that they appear as mutually contingent, as reciprocal: I react to your movement; you react to mine. When I notice both components, I come to experience an interaction. In other words, the experience of interaction takes place when sensorimotor patterns are contingent upon one’s own movements, and vice versa.
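
To make the idea concrete, here is a minimal sketch (in Python, purely illustrative and not part of the experimental procedure) of one way reciprocal contingency could be operationalized: each of two one-dimensional movement traces is checked for correlation with the other’s recent past within a short window. The function names, window length, maximal lag, and threshold are assumptions chosen for illustration, not values estimated from our data.

```python
import numpy as np

def directional_contingency(driver, follower, max_lag=30):
    """Peak correlation of `follower` with the recent past of `driver`,
    i.e. how strongly the follower tracks the driver at some short positive lag."""
    return max(np.corrcoef(driver[:-lag], follower[lag:])[0, 1]
               for lag in range(1, max_lag + 1))

def reciprocal_contingency(a, b, window=120, max_lag=30, threshold=0.4):
    """Treat the last `window` samples as an interaction only when each trace
    is contingent on the other one, i.e. contingency holds in both directions."""
    a_win, b_win = a[-window:], b[-window:]
    return (directional_contingency(a_win, b_win, max_lag) > threshold and
            directional_contingency(b_win, a_win, max_lag) > threshold)
```

On this construal, a one-sided dependence (you track me, but I do not track you) fails the test; only mutual, short-latency dependence counts as an interaction.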

Therefore, we designed a “minimal” Turing test to investigate how much information is required to detect these reciprocal sensorimotor contingencies. Using a new version of the perceptual crossing paradigm, we tested whether participants relied on interaction detection to tell apart human and machine agents in repeated encounters with them. In two studies, we presented participants with the movements of a human agent, either online or offline, and with the movements of a computerized oscillatory agent, across three different blocks.
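
The logic of the three blocks can be caricatured in a toy simulation (again illustrative: the signal model, coupling gain, delay, and window below are arbitrary assumptions, not the actual stimuli). An oscillatory agent and a replayed, offline human trace cannot react to the participant, whereas an online human closes the sensorimotor loop, so only that block should show contingency in both directions.

```python
import numpy as np

rng = np.random.default_rng(1)
n, delay, window, max_lag = 600, 12, 120, 30   # illustrative values only

def lagged_peak(x, y, max_lag):
    """Largest correlation of y with the recent past of x (does x drive y?)."""
    return max(np.corrcoef(x[:-k], y[k:])[0, 1] for k in range(1, max_lag + 1))

participant = rng.normal(size=n)                       # toy participant movement
oscillator = np.sin(2 * np.pi * np.arange(n) / 90)     # block 1: moves on its own
offline_human = rng.normal(size=n)                     # block 2: replayed, cannot react

# Block 3: toy closed loop in which each side tracks the other with a short delay.
p_live, h_live = np.zeros(n), np.zeros(n)
for t in range(delay, n):
    p_live[t] = 0.8 * h_live[t - delay] + rng.normal(scale=0.3)
    h_live[t] = 0.8 * p_live[t - delay] + rng.normal(scale=0.3)

for name, p, a in [("oscillator", participant, oscillator),
                   ("offline human", participant, offline_human),
                   ("online human", p_live, h_live)]:
    p_w, a_w = p[-window:], a[-window:]
    print(f"{name:15s} participant->agent {lagged_peak(p_w, a_w, max_lag):+.2f}   "
          f"agent->participant {lagged_peak(a_w, p_w, max_lag):+.2f}")
```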
