
Accepted for ICCA 2023!

We’re proud to announce that our paper has been accepted for ICCA 2023!

See: International Conference on Conversation Analysis

We will participate in the panel “Perception in Interaction” organized by Brian Due.

Here is the abstract for the panel that Brian Due proposed to the CA community:

"The majority of research into perception is conducted within a neurobiological paradigm, focusing on the different sensory systems, the motor functions and the relation to cognition, e.g. focusing on multisensory integration in the brain. The mind-body distinction is still evident in this line of research despite half a century of critique: Wittgenstein (1953) showing the language problems with perceiving, e.g. , what it means to “see” and Gurwitsch (1966) showing the problem of perceptually recognising order. Later, Gibson (1979) problematised the whole idea of perception as pure cognition and instead tied it to concrete situations, a move elaborated on in anthropology (Ingold, 2000).

In this panel, we relate specifically to the ways Garfinkel revisited and purposefully “misread” (Eisenmann & Lynch, 2021) previous descriptions of the subject with the intent of grounding what Coulter and Parsons (1990) called the praxeology of perception. A few EMCA studies have examined perception as both action and an interactionally configured phenomenon, especially Goodwin (2007), Nishizaka (2000) and, to some extent, the line in discursive psychology that generally opposes cognitivism (Edwards & Potter, 2005). The objective of this panel is to bring together researchers studying perception as both practical action and as embedded within interactional contexts. The goal is to further the discussion and development of perception as a non-cognitive phenomenon and to shed light on the variety of settings in which perception as action and interaction occurs.

Two larger issues remain to be studied further: First, research has predominantly focused on perception as visual perception and thus related to visual sensation. However, perception might also be based on other senses (Cekaite, 2020). This panel seeks to explore how perception is also based on other senses and thus seeks to connect more thoroughly with the line of research that studies multisensoriality (Mondada, 2021) in relation to perception. Second, research has focused on perception as related to joint (visual) attention between humans. However, other non-human agents might also sense and produce information that can be integrated into situated contextual configurations (Due, 2021). The panel, therefore, also invites papers to discuss semiotic agency and distributed forms of sensory information for co-constructing perceptual fields."

Our paper will be entitled “Beyond the speech recognition, (doing) being perceived by a robot.”

Here is our abstract:

"This paper will investigate how humans make meaning of unscripted bodily conducts (such as head movements, gaze contact, rotations or micro-movements of the torso and mechanical sounds) produced by a humanoid robot (Pepper) as resources for establishing that they are "perceived" by it. As these resources are treated as embedded into a larger environment by the humans (walking trajectories, spatial arrangement of the group, sequence positioning, being followed...), they are interpreted as timely-produced by the robot. Thus we see that humans designate these conducts as accountable phenomena (whereas producing responses, but also comments, change-of-state tokens or response cries) in order to "become co-present" and somehow achieve "mutual perceptibility" (Pillet-Shore 2018) in the opening of interactions. They also seek for these perception cues when troubles or failures occur.

The dataset is composed of 491 naturally occurring interactions with students at a university library. We placed the Pepper robot in the hallway and let the students interact freely with it. The robot's main task is to provide information and orientation services. Most of these interactions involve groups of students. The production of this corpus is the first step of a larger project that brings together researchers in artificial intelligence and conversation analysis. We aim to analyze task-oriented HRI in order to change the way conversational programs are designed, dealing with multimodality, sequentiality, repairs and accounts.

Through the analysis of several pieces of data, we highlight the following phenomena: (1) students who are demonstrably "perceived" by the robot treat this event as preparing the opening (Harjunpää et al. 2018); (2) within groups, students deploy dedicated practices to resolve the discrepancy between who initiates the interaction with the robot and who is displayed as perceived; (3) when failure occurs, students repair perception as a last resort; (4) students accomplish "perception tests" as the sole purpose of the encounter, displaying avoidance of follow-up exchanges.

These analyses allow us to understand how humans locally and contingently humanize the robot by indexing preferences, rights and obligations in interaction, and how they orient to its alignment or non-alignment for the practical purpose of building humorous activities or complaints addressed to the other humans.

Our corpus is characterized by the fact that interactions are based upon the users' initiations while the fully autonomous robot displays time-limited availability, unlike entertainment activities led by the robot (e.g. Gehle et al. 2017 for the analysis of gaze patterns with a Nao robot in a museum). In such circumstances, where students might simply be passersby or leave whenever they want, being perceived, recognized and selected is decisive for the success of openings and for sustaining the interaction. We argue that the robot's displayed perception of the human is a key dimension for the success of the whole human-robot interaction. These phenomena distinguish the robot as a device from other non-human agents such as voice-user interfaces (Porcheron et al. 2018)."

Stay tuned!