Article Open Access

From idle to interaction – assessing social dynamics and unanticipated conversations between social robots and residents with mild cognitive impairment in a nursing home

  • Mehrbod Manavi, Felix Carros, David Unbehaun, Clemens Eisenmann, Lena Müller, Rainer Wieching and Volker Wulf
Published/Copyright: March 20, 2025

Abstract

This paper examines the potential impact of social robots on people with mild cognitive impairments in a nursing home. Within a 4-month design case study, we investigated the practices, attitudes, and social contexts of residents with mild cognitive impairment and their caregivers and designed two prototype apps for a robotic-based system. Subsequently, 10 residents, the former nursing home manager, and one social caregiver participated in a prototype evaluation study for 10 weeks. The goal was to assess group-based user experience and social interactions. Qualitative results indicate that the system can support participants in their individual, social, and daily activities and, therefore, consequently initiate potentially meaningful interactions. One key observation from the video analysis was that the participants initiated unanticipated conversations with the robot, which we discuss regarding the prompting character, design appearance, and affordances of the robot in interaction.

1 Introduction and background

Social changes, such as an increase in life expectancy and persistently low birth rates, result in a growing population of older adults, many of whom require increased help and care. Changes in family dynamics, including the rise in one-person households, a shortage of skilled workers, greater mobility, and growing distances between parents and children, coupled with advances in medical treatment such as robot-supported surgery or AI-based early cancer detection, are creating both new opportunities and unprecedented challenges for current health and care systems. Furthermore, as societies age, long-term care needs, mild cognitive impairments (MCI), and dementia are becoming more prevalent. 1 The WHO outlines in its report on active aging that the number of older adults is expected to rise from 1 billion in 2020 to 1.4 billion in 2030 and to an estimated 2 billion by 2050. 2

Consequently, even more people will need care, while there is already too little time to provide it and an existing shortage of skilled workers across various professions in the care sector. Society therefore faces several challenges in providing resources and possibilities for good care, social activities, and medical treatment to those in need of care, especially in the future. 3 , 4 , 5 This global development will be accompanied by a focus on innovation, technology, and assistance systems in the debate on care and medical treatment. Technical assistance systems in the form of modern information and communication technologies to support relatives, professional caregivers, and people in need of care will likely characterize future care settings. 6

Due to the demographic changes mentioned above and their economic and sociodemographic impacts, more people will likely be unable to access appropriate professional care and social support in the future. Technical assistive systems, especially social robots for people needing care and their caregivers, could help provide customized, needs-based therapy concepts and activities. Social robotics in care has recently been a much-discussed topic across scientific fields and countries. Several studies in recent years have indicated promising results, and caregivers as well as future care recipients are keen to use such systems. 7 In a German survey, 83 % of respondents believed service robotics could enable them to live longer at home. 8 However, robotic systems in care are still not very widespread in Germany; for the most part, they are prototypes currently being developed primarily in research projects. It is also necessary to recognize that while the technology of social robots is promising for the field, the systems still have many flaws at both the development and implementation levels. There is often a mismatch in the design affordances of a robot, i.e., a discrepancy between what a robot’s design promises and what it can actually do. We will show some of these design issues in this paper, but they are also part of many other HRI discussions (e.g., Refs. [9], [10], [11], [12]).

Social assistive robots are one of the leading domains within HRI. They sit at the intersection of robots that can assist a user and robots that communicate with a user through social interaction. 13 Social assistive robots in nursing settings were reported to have positive effects on health by reducing stress levels, increasing positive mood, reducing loneliness, increasing communication activity with others, and supporting remembering the past. 14 , 15 , 16 It is also reported that older people become less lonely due to the intervention of companion robots. 14 , 15 , 16 In terms of health, these robots are said to reduce stress (e.g., measured by stress hormones in urine) and increase the immune system’s response. 17

The use of socially assistive robots in care has become more common in recent years, as they can provide a degree of social companionship. 17 , 18 Social robots engage with people on a social level, promoting both cognitive and physical activation; in this way, they are intended to provide entertainment and motivation. 19 , 20 In the areas of enjoyment of life, quality of life, physiology, interaction, and communication, various studies have found positive results when using Paro. 21 , 22 , 23 , 24 , 25 Still, as a review by Andtfolk revealed, there have been only limited studies on social robots used in older adult care over the last decade. 26 Based on this, the purposes of using social robots in facilities for older adults can be categorized into providing (a) interaction, (b) cognitive training, and (c) physical training. 18 , 27 , 28

The design of social robots and communicative AI concerns not only how conversations with the robot can be initiated, but also how they can be continued and maintained without interruptions. 29 Already in the 1960s, Joseph Weizenbaum developed the computer program ELIZA, one of the first very simple natural-language chatbots, which simulated a Rogerian therapist, to study human-computer interaction. 30 Weizenbaum showed and cautioned against how easily participants engaged in meaningful conversations with ELIZA despite its substantial limitations. 31 , 32 The participants entrusted ELIZA with their private affairs and engaged in significant and satisfying interactions that felt human to them. On the one hand, the success of the simulation relied on the social dynamics of client-centered therapy in which, as Weizenbaum argued, therapists are “free to assume the pose of knowing almost nothing of the real world” (Ref. 32, p. 42), allowing for the expected evasive behaviors in such interactions. On the other hand, Garfinkel showed how the success of ELIZA depended on (the exploitation of) human sense-making practices, i.e., the assumption of mutual trust conditions and the turn-by-turn production of sense that participants depend on in social interactions, even when the program or robot does not completely fulfill these conditions. 33 , 34 Similarly, the success of more contemporary voice interface technologies, such as Lenny, can be seen to depend on the actual work of participants, who make sense of displays of availability, such as expressing the willingness to talk and listen in the opening of the conversation. 35 , 36 Regarding the sociability display of embodied robots and its accomplishment, see, e.g., Refs. 37 and 38.

Subsequent studies have found that some people open up to robots and tell them stories. 39 It was observed that older adults entrusted robots with sensitive information that they had withheld from caregivers. 40 A study of the Paro robot found that older adults began talking to the seal and telling it about themselves. 41 It was also found that they are willing to tell robots secrets that they would not tell other people close to them. 42 , 43

In this study, the focus is on the evaluation of the robotic-based system Pepper 44 and applications that were co-designed with and for residents and their caregivers in a large nursing home in Siegen, Germany. Pepper was the first commercially available robot designed to communicate and interpret human emotions. 44 Despite its limitations and the end of its production, it is still a popular robot in care settings. Pepper is considered a humanoid robot due to its human-like appearance, with a face with large eyes, arms, and a human-like figure. 18 , 44 The 10-inch tablet display on the robot’s chest enables developers to create content that is visible to most residents. We introduced the system to a group of older adults with MCI and conducted interviews and design workshops with the older adults’ relatives, caregivers, a social caregiver, and a former nursing home manager. This allowed us to gather insights into their attitudes, practices, and expectations regarding the robot, as well as its impact on individual and group interactions (see also Ref. 45 ). Our findings offer practical lessons for the future use of robots in aged care, providing valuable insights for researchers, developers, and professionals. During that process, residents and caregivers provided ideas and feedback regarding the system. The paper presented here examines how the residents engaged with the robotic system and how it influenced individual perception. In the results section, we investigate the participants’ experience with a particular focus on group dynamics, joyful interactions, and emotional interactions. Thus, our results may help designers better understand where to put more emphasis when designing such systems in sensitive care settings. This paper seeks to address the following research question: To what extent can a robotic-based system with various applications foster activity, initiate meaningful interactions, and enable social impacts for the social care network?
The presented work contributes to and expands the current discourse on HCI and HRI in care settings by examining the impact of a suite of applications designed to support the daily life activities of residents of care facilities and their caregivers.

2 Research approach and technology

This chapter introduces the collaborative study in a nursing home that explored how technology, particularly applications tailored for the Pepper robot, can support older adults. By involving diverse stakeholders through a co-design approach, the study emphasizes understanding the routines, social dynamics, and individual needs of users in order to develop meaningful, health-focused socio-technological solutions.

2.1 Research approach and stages

Our study was conducted in a large nursing home over 4 months, including a 10-week evaluation phase, and followed a co-design approach. Residents, their carers, a former manager, and researchers from various disciplines were actively involved in order to obtain helpful information on daily activities, memories, social environments, experiences with technology, and attitudes towards its use.

Developing appropriate technologies, such as exergames 46 performed by humanoid robots, to facilitate physical activity and interactions in the context of nursing home residents and their professional caregivers requires involving potential users from the outset. 18 , 47 , 48 It is necessary to recognize different viewpoints, institutional habits, social and individual routines, and how they combine with technology. Therefore, this study was set up as a design case study, as originally outlined by Wulf et al. 49 This approach consists of three phases (see Figure 1): (1) a pre-study involving an empirical analysis of existing individual and social practices within the empirical setting; (2) the co-design of innovative ICT-based artifacts informed by the findings of the pre-study; and (3) a long-term investigation into the interaction with and appropriation of the developed technological solution. 49 , 50

Figure 1: 
Timeline of the study.

In the pre-study phase, we examined existing practices, organizational and social perspectives, and the challenges residents and their care networks face in their daily environments. The second phase involved an iterative process of developing and deploying robotic-based prototype applications for the Pepper robot and its tablet. These were integrated within the nursing home, allowing residents and caregivers to utilize the system on a daily basis. In the final phase, we conducted a 10-week evaluation, deploying the system in the nursing home environment. The sensitivity of this environment requires close collaboration with the care home and the participants. Therefore, we visited the nursing home twice a week, conducted training sessions, and regularly spoke to participants in their environment. This phase concentrated on developing a concrete understanding of how technological solutions can be used effectively in ways that meet the needs of caregivers and residents.

2.2 Technical infrastructure and implementation

Based on the apps available for the Pepper robot from previous work, 18 , 51 , 52 , 53 we developed additional memory games, reaction games, and cognitive games for people with MCI. These new apps focused on the system’s effectiveness in improving health, its integration into daily routines, and the contexts of use of a potential system for older users. 54 , 55

Based on our findings from the early intervention, we re-designed the prototype apps to meet the special needs of the residents and caregivers. This informed the final design, applications, and development of the digital intervention scenarios. In what follows, we describe the modifications and upgrades we made to the final apps. Our robotic-based system consists of several technical components; the technical setup is centered around the humanoid robot Pepper. Our digital applications were developed and selected from the fields of sports science and gerontology and were further informed by practical recommendations from our pre-study. The aim of the applications was to support physical activity and cognitive resources while remaining as autonomous as possible, lessening dependence upon care professionals. The program includes a set of exercises necessary for the execution of everyday activities (e.g., climbing stairs, carrying bags, and sit-to-stand transfers). The movement applications included exercises for the arms, legs, head, and torso. They were accompanied by music to promote well-being and motivation during the exercises. The applications for cognitive activation included a variety of games combining movement, creativity, and cognitive tasks to encourage participants to engage not only in individual quizzes but also in group discussions and dynamics. “Snow Waltz”[1] is one of the movement applications designed in this process; it encourages residents to follow the robot’s dance movements. The other application, called “Proverbs,” belongs to the cognitive training category. The app’s logic is to present the first part of a proverb and have the residents complete it.
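The two application categories described above can be thought of as a small catalogue that a session moderator draws from. The following is a purely illustrative Python sketch; the class, names, and selection logic are our own assumptions and not taken from the actual project:

```python
from dataclasses import dataclass

@dataclass
class App:
    """One activation app on the robot (hypothetical model)."""
    name: str
    category: str          # "movement" or "cognitive"
    with_music: bool = False

# The two apps described in the text, modeled as catalogue entries
CATALOGUE = [
    App("Snow Waltz", "movement", with_music=True),
    App("Proverbs", "cognitive"),
]

def plan_session(apps, category=None):
    """Return the apps for a session, optionally filtered by category."""
    if category is None:
        return list(apps)
    return [a for a in apps if a.category == category]
```

Under these assumptions, a session focused on cognitive activation would be assembled with `plan_session(CATALOGUE, "cognitive")`, which returns only the “Proverbs” app.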

3 Methods, participants and ethics

This section describes the data collection and analysis process used to explore interactions with robotic systems, details the participants and the setting in the nursing home, as well as ethical, legal and social considerations.

3.1 Data collection and analysis

Regarding data collection, we conducted semi-structured interviews and design workshops with different residents, caregivers, social caregivers, and the former nursing home manager. We then used the PraxLab 56 approach to gather valuable information from the mentioned actors in order to identify their expectations, interests, and ideas. PraxLab relies on long-term participatory research in real-world living environments to understand and design technology that aligns with older adults’ socio-cultural needs and diverse experiences. 48 , 56 , 57 The qualitative data of the pre-study and appropriation study consisted of audio recordings and field notes collected during the interviews and workshops, as well as video recordings of 15 (out of 17) group sessions with Pepper (ca. 11 h), followed by video analysis to obtain more detailed and comprehensive insights, including into non-verbal interactional dimensions. 58 , 59 The overall qualitative data analysis was performed using a reflexive Thematic Analysis (TA) approach for all materials, 60 followed by a more detailed sequential analysis (see Ref. 61 ) of thematically pre-selected sequences, which was conducted at the end. We used TA to analyze the data from the interviews and videography. This process began with transcribing and iteratively coding the data to identify patterns and relationships. This made it possible to identify key themes and find connections between the ideas of the participants and others involved in the study. We identified the following principal themes during the coding sessions: group-based interaction, attitudes and gender affirmations toward the social robot, emotions and reactions of residents to the robot, reactions from caregivers, and individual conversations with Pepper in idle mode. These overarching themes were derived from our original codes. Coding differences were discussed and resolved by adding, editing, or deleting codes according to the outcome of the discussion.

3.2 Participants and setting

In our study, 10 residents (see Table 1), one social caregiver, and the former nursing home manager were involved. They engaged in interviews, design workshops, and interventions that took place in the robotic-based group sessions. The participants, aged between 78 and 91, included nine females and one male. Participation was voluntary. Participants were excluded from the study if they suffered from chronic diseases such as cardiovascular illness or cancer, or if their physical fitness was impaired such that they could not move without assistance.

Table 1:

List of the participants with mild cognitive impairment, the social caregiver, and the former nursing home manager in the interventions.

ID Gender Age State of health Impairments Number of interventions Experience with technology
P1 Female 83 Physically restricted Use wheelchair, incontinence 14 None
P2 Female 88 Physically fit Arthrosis 10 None
P3 Female 87 Physically fit Diabetes, unsteady gait 10 None
P4 Female 78 Physically restricted Use wheelchair, arthrosis, rheumatism 10 Tablet
P5 Male 80 Physically fit Osteoarthritis 10 Tablet
P7 Female 84 Physically restricted Use wheelchair, breathing problems, visual impairment 7 None
P8 Female 89 Physically restricted Balance disorders 5 None
P9 Female 91 Physically restricted Use wheelchair, hearing and visual impairment 3 None
P10 Female 84 Physically fit 2 None
M11 Male 65 Former nursing home manager 10 Familiar
M12 Male 63 Social caregiver 14 Familiar

In addition, we engaged with the doctors, family members, and other caregivers of the participants in the nursing home. We do not list them here, as these interactions were informal and short-lived.

The study took place in a large nursing home in Siegen, Germany, with a PraxLab approach. The interventions with the residents were done within the nursing home in the common areas.

Group sessions

The interventions were designed to take place in a group atmosphere. A supervisor started the program and motivated the participants to follow the robot’s instructions. Figure 2 shows how participants and researchers were seated. The robot stood in the center of a semicircle of participants so that everyone could see it. A camera was set up behind the robot so that a video analysis could be conducted later. The researchers sat beside the participants at a table, taking field notes and observing the robot’s performance. The interventions took place in a room surrounded by window walls, which meant that people walking along the corridor could see into the room. The participants, however, sat facing the terrace of the nursing home, so they were less distracted by what was happening in the corridor.

Figure 2: 
Sketch structure of the interventions in the nursing home.

Here, training sessions were conducted in groups of 4–5 participants. Pepper was programmed to conduct the moderated sessions interactively. An example session sequence from the cognitive training: Pepper starts the session by greeting and introducing the game: “Hello everyone. I would like to play a game with you. I know a few proverbs. Unfortunately, I have forgotten the second part of my proverbs. I need you to help me complete them. Here we go!” The robot then presents the first proverb: “The first part of the proverb is: ‘Rome wasn’t built …’.” The program then jumps to the Speech Recognition Box, where the correct solution is stored. The participants are then encouraged to respond via speech, completing the phrase with “in a day.” When this happens, the robot says: “That was exactly right. Very good. Would you like to guess another proverb?”
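The turn-taking logic of this proverb exchange can be sketched as a small simulation. This is not the actual Choregraphe/Speech Recognition Box implementation; the function name and the fallback phrasing for wrong answers are our own assumptions:

```python
def proverb_turn(first_part, expected_completion, heard):
    """Simulate one turn of the 'Proverbs' game: the robot presents the
    first part of a proverb and checks the residents' spoken completion."""
    prompt = f"The first part of the proverb is: '{first_part} ...'."
    # Normalize the recognized speech before comparing, since a speech
    # recognition result may differ in case or spacing
    if heard.strip().lower() == expected_completion.strip().lower():
        reply = ("That was exactly right. Very good. "
                 "Would you like to guess another proverb?")
        return prompt, reply, True
    # Fallback phrasing is a hypothetical addition, not from the study
    return prompt, "Not quite. Would you like to try again?", False
```

Calling `proverb_turn("Rome wasn't built", "in a day", "in a day")` reproduces the success path described above.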

In total, we conducted 17 visits with an average duration of 1 h each. From time to time, informal and professional caregivers, as well as social workers, also participated (mostly passively) in the moderated sessions.

3.3 Ethical, legal, and social considerations

We ensured the privacy and data security of the participants. The interviews with the residents and caregivers were conducted in accordance with the guidelines on privacy and personal data of the university’s ethical commission. This included informed consent forms, a directory of processing activities, and an ethics application. These documents were prepared according to legal provisions and were valid only in combination with a personal declaration of consent, which the participants could revoke at any time. The mentioned documents were part of the ethical approval issued by the commission of the University of Siegen. The study was conducted in accordance with the Declaration of Helsinki to ensure the health and well-being of the participants. We did this by involving health professionals in the preparation of the study and by including them in the study itself, following their guidance with regard to the health and well-being of the residents. Because of its sensors, especially Pepper’s cameras, the system was connected to the internal network only for updates, in order to protect the privacy of residents, staff, and visitors.

4 Results

In this section, we mainly present the findings of the 10-week evaluation study conducted in the nursing home. We show results on interaction and group behavior, general attitudes and gender affirmations towards the robot, the impact of interactions on the residents, and perspectives from caregivers. In the last sub-chapter, we highlight a conversation between one participant and the robot, in which the participant talked to the robot for an extended period despite not receiving any verbal responses from it. The intervention phase occurred in a group setting where participants were invited to a common area room to interact with the robot (see Figures 3 and 4).

Figure 3: 
The left side shows anonymized visualization of the group activities, while the right side depicts two forms of Pepper’s gestural movements during exercises and talk.

Figure 4: 
The sketch shows P1 talking to the robot when they are alone in the room.

4.1 Group based interaction

In the first session, only one participant appeared. There was no pressure on the residents to participate in each session; however, the caregivers informed us that some had simply forgotten about the project. Participation increased with time, although we often had to pick up some residents for the course. As the number of participants grew, those who had taken part in the previous session became less active in the group setting. When we asked whether the robot’s performance was unclear or whether they needed more instructions, one participant answered: “That’s gymnastics. He [the robot] can do it beautifully.” (P7) It seemed that they were initially not confident enough to imitate the robot in a large group. To break the ice, one researcher started imitating Pepper’s movements and asked the participants to join in if they liked. They then slowly started to follow Pepper’s movements. In some interventions, it became clear that participants only felt encouraged to do the exercises themselves once another participant had started. In one session, for example, P1 and P8 started doing the exercises, after which the other participants also mirrored the robot.

During the physical exercises, it quickly became apparent that the residents mirrored the movements shown by the robot. Participants performed those exercises according to their physical abilities. It was noticeable that in most cases, the residents performed only the movements that the robot could itself demonstrate. A verbally announced leg movement, which Pepper cannot demonstrate, was rarely performed. Only if one participant carried out the robot’s verbal instructions would the others follow too. However, this behavior changed over time. The participants not only mirrored the physical exercises performed by the robot, but also sometimes imitated the body language of the Pepper robot while it was speaking,[2] without these gestures being part of the exercises. This showed that the residents were strongly fixated on the visual level and the physicality of the robot during the movement exercises, and that they followed the robot’s acoustic instructions far less. This fixation on the visuals may also explain why one participant frequently waved at the robot during the movement exercises instead of continuing to perform them. For example, during one session, P2 tried to establish personal contact with the robot and attract its attention.

In the middle of the study, the music and dance applications were integrated into the physical movement activities and rhythmically coordinated with the respective music. With these applications, it took a little longer for participants to perform the movements, but all participants took part in the movement dances. In contrast to their behavior during the movement exercises, the participants interacted more with each other during the dances: they laughed with and applauded each other after the dances. These reactions were probably due to the music. Besides the physical activity, a quiz game with regional and general knowledge questions was played in some courses. The caregivers reported that this game made the participants talk to each other about these topics outside the courses. They could even remember events related to the questions and talked about the past with whomever they had contact.

At the end of the study, some participants no longer needed to mirror the robot or follow the instructions. In one exercise, Pepper instructed the participants to stand up and walk in a circle in the room. As Pepper has no legs, it could not demonstrate this exercise; instead, it showed a pre-recorded movie of the participants walking in the room. However, after the initial instruction, P3 moved her legs appropriately without paying attention to the movie displayed on Pepper’s tablet. The other participants then joined in the second round after watching her. The regular structure of the interventions also had an effect on the participants: in later interventions, they repositioned themselves as soon as the physical exercises were announced, so that they could carry out the exercises correctly without bumping into each other. In the last sessions, participants already knew what to do when Pepper announced the name of an exercise. For example, they moved their chairs apart when the robot announced movement exercises, to have enough space for their arms. Most of the participants also carried out the movements demonstrated by the robot more confidently. In most cases, the participants seemed to accept the robot as an exercise instructor during the activities, as they mirrored its movements. They saw the exercises as helpful in staying physically fit.

4.2 Attitudes and gender affirmations toward the social robot

Most participants had had no contact with robots prior to the first intervention, and only two participants had brief previous experiences. P9 described the robot as something new and interesting: “That was always a closed book. It’s not my world. But now I have the chance here. Maybe I’ll learn something.” Despite the uncertainty, P9 wanted to learn and get to know something new, displaying a level of curiosity that we also observed in other participants.

A recurring irritation in contact with the robot was the topic of gender. The participants tried to place the robot within familiar role patterns. In the first intervention, one participant asked herself whether Pepper was a woman or a man. In doing so, she humanized the robot and tried to assign it a gender, since in her opinion a robot should also fit into such a role. She came to the conclusion, however, that Pepper could embody both genders. Gender assignment also took place with other participants. There, Pepper was described as a “nice man,” although the robot had introduced itself as Paula before each intervention and spoke in a rather feminine voice. While most of the participants perceived the robot as human-like, its gender remained unclear: “It’s more of a woman. […] It can be a man and a woman” (P9); elsewhere, she referred to the Pepper robot as a little animal. P2 described the robot as a nice man that she liked: “I think he’s a nice man.” One participant even described Pepper as a friend. Unlike all the other participants, she also wanted to spend time alone with Pepper. This became evident when she arrived in the room too early a few times and talked to the robot or played with it. None of the participants used the term ‘machine.’

4.3 Emotions and reactions of residents to the robot

The emotions that the participants in our study displayed towards the robot were quite diverse. P1, in particular, showed many non-verbal reactions. She waved to the robot once or twice in almost every session when the robot looked at her. She even referred to the robot as a friend and often stopped her exercises to wave at it. P2, on the other hand, was visibly startled when the robot started talking to her for the first time. She only settled down after P1 had a conversation with the robot, and later she also initiated gestural greetings towards it. P8 was quite afraid of the robot and did not want it anywhere near her in the beginning. When the robot asked the participants if they wanted to do any exercises, P8 responded: “Okay, just do not come to me.” P5 was also skeptical at the beginning, stating in his first sessions: “I can’t relate to that little animal at all.” This was also the case for P3, who commented during the same session: “now he’s looking at you intensely, isn’t he? I don’t know anything about that kind of stuff. I’m not learning any more about it either.” However, after the researcher had shown her how to use the tablet, she came to a different conclusion: “I guess practice makes it perfect.” Positive emotions were also evoked in the participants. In some exercises, the participants were thrilled when Pepper announced an exercise they wanted to perform and were good at. The robot’s apps were designed to give motivational encouragement when participants responded or answered correctly. In another session, P3 commented on this again: “I’d like to do better. That’s fun for me.” P1 giggled in response to praise from the robot. She kissed the robot several times in two sessions when it congratulated her after she guessed the correct answer in the quiz game. P1 also looked grimly at the robot when it switched off in intervention four.
However, she was also playfully cheeky with the robot, which became apparent when she stuck out her tongue at it or toasted it with a glass to attract its attention when Pepper’s body was oriented towards another participant but its head was pointed at her (P1). At the beginning of each intervention, the robot greeted the participants and asked how they felt. When Pepper asked them a question in the quiz, the participants did not seem to know whether they should answer it or not. This became visible to us when they looked at each other questioningly. By the middle of the interventions, the participants had become familiar with the robot and usually nodded to it to acknowledge their readiness for the exercises when it gave instructions or made statements. One participant even demanded that the robot should make her healthy; for her, a robot should have more of a medically useful character.

4.4 Reactions from caregivers

The staff at the nursing home where the study took place were very interested in the project. Some watched from the outside or even took part in the interventions. They also saw a preventive benefit in the applications, such as the ‘catching the rabbit’ game,[3] as it could improve the residents’ ability to react. The staff reported that the residents with MCI evidently had a lot of fun interacting with the robot. This is valuable information, as the staff know the residents and their behavior very well from other activities without robots. It was also noted that interaction with the robot changed over the course of the study: the residents with MCI perceived the robot as increasingly natural. One employee compared the residents’ interaction with the robot to the way they interact with stuffed animals or dolls. The robot evoked emotions in the residents. When asked whether they could imagine using the Pepper robot in the nursing home in the long term, the employees all said that they could, but only if it was accompanied by a person familiar with handling both the robot and the residents.

4.5 Individual conversation with Pepper in idle mode

On several occasions, before and during the interventions, there were situations in which the participants conversed with the robot. This occurred both alone and in group sessions, as reported by caregivers and observers on site. It was often not captured on video, as the recording usually started once all participants had arrived. On one occasion in the third session, while the other participants were still being picked up, P1 was filmed conversing alone with the robot (R) for more than 10 min (see Figure 4). At one point, a researcher came into the room to check on her but did not stay. She used the time to talk to the robot, and Table 2 shows the opening sequence of this conversation. Even before the beginning of the transcript, when P1 was wheeled into the room, she immediately began to greet the robot by waving to it. The moderator then placed the robot in front of her and left the room to look for further participants, which is when the transcript begins.

Table 2:

Translated transcript of opening sequence.

Line Speaker Quote
1 R: (looks up to the left)
2 P1: Hello!
3 R: (orients head towards P1)
4 P1: Good day (nods towards the robot)
5 R: (nods)
6 P1: (nods again more slowly) Do you still recognize me?
7 From yesterday? (3.0)
8 Certainly! (nods toward robot)
9 If you have such big eyes, then I have such big eyes too.
10 Yes (nodding) then [saying her name loudly] has big eyes TOO
11 R: (moves head quickly up to the right)
12 P1: Huch! (laughs, and then looks to the side)

Sitting in a wheelchair in front of the robot, P1 starts greeting Pepper (R) verbally (lines 2, 4), asking if the robot remembers her from the previous session (lines 6, 7). However, Pepper is only turned on in standby mode, and since no application is running, it does not verbally respond to her questions. Nevertheless, the robot orients toward her voice (and the lights in the room), makes small movements with its head, and blinks its eyes (by switching the LEDs in its eyes off and on). These orientations initially (seem to) correspond sequentially to P1’s greeting attempts, since Pepper moves its head in her direction (line 3) and initially seems to respond to her nodding (line 5). However, there are no further mutual orientations or verbal responses from the robot. She continues to observe it and starts to mirror the robot’s expressions, commenting jokingly on Pepper’s blinking eyes (lines 9, 10). Her loudly emphasizing her name, and that she can make big eyes as well, prompts the robot to reorient (searching for a potentially standing participant in the room) and look abruptly up to the right (line 11). This seems to surprise and startle P1, who exclaims with a kind of response cry, “Huch!” (line 12), before laughingly realizing that everything is okay, while Pepper also reorients its head towards her.

But even without any further verbal response from the robot, resident P1 kept talking to it for several more minutes; some of her further statements are quoted in Table 3.

Table 3:

Further quotes of P1 from the 13 min conversation while she was alone with the robot. Pauses are indicated in bracketed seconds, as is the duration of omissions in the course of the conversation.

Line Quote
13 Say: ‘good day!’ (14.0) I am [her name]. (omission, 21.0)
14 No! don’t we want to talk? (omission, 11.5) Then [her name] will be sad. Then [her name] has to cry. (omission, 76.0) Are you sad? Why is that? (5.5)
15 Am I talking too much? (omission, 46.0) Are you tired? Then you need to take a nap and go to bed. (omission, 78.0)
16 Are you thirsty, too? (omission, 68.0) Do you know the song ‘Hänschen Klein’? ((she starts to sing)) ‘Little Hans went alone.’ You can sing well! (omission, 67.0)
17 What is your name? (2.1) Don’t you have any name? No? (omission, 15.0)
18 Do you smoke, too? or what? It’s not good either. My husband died from it. (2.3) He passed away now. (3.4) He’s lying down over there in the cemetery (omission, 4 min)

She tries to get the robot to speak by asking it to say “good day” and begins to introduce herself to the robot again (line 13). When the robot still does not answer, she tells Pepper that she would get sad about not getting any responses (line 14). Then, she begins to look for reasons why the robot is not talking to her today (line 15). She speaks to the robot as if it were a child, uses childlike expressions, worries about the robot’s well-being, and asks if it is doing well. P1 knows that the robot played that song in the previous session and tries to encourage Pepper to sing it for her again (line 16). Although the robot does not respond in those moments, its autonomous mode, which is activated by default, still makes the robot show some gestures and head movements that only at times seem to correspond to her verbal attempts to establish contact. Despite the robot’s idle behavior, it seems to remain interesting to her, and she continues to ask it more questions (line 17). With a curious facial expression, she asks whether the robot smokes too and starts to tell it a rather private story (line 18): having become comfortable with the robot, she shares very personal things, telling Pepper that her husband died because of smoking. However, when the other residents enter the room, this flow is interrupted and she stops talking to Pepper.

5 Discussion

This section discusses the study’s findings and focuses on mismatched interactions, nonverbal communication, and design affordances, while also reflecting on ethical considerations.

5.1 Mismatching interactions of robotic systems and nursing residents

Interactions between the robot and residents frequently exhibited mismatches in our study. Tian and Oviatt 62 view these as errors and divide them into performance errors and social errors: social errors concern inappropriate social behavior (e.g., interrupting a user mid-sentence), while performance errors concern a robot’s inability to perform a certain task (e.g., not hearing someone). For example, we observed residents speaking or making gestures without receiving a response from the robot; in these cases, the robot was not able to understand the social situation. Additionally, the robot does not recognize changes in situations, remaining in its current program until it finishes or is manually stopped, oblivious to any situational shifts, such as someone sharing a relevant anecdote. The robot also cannot comprehend a change in interaction partners, such as when a resident starts talking to another resident instead of to the robot. Group dynamics are invisible to the robot, which responds only to what its sensors and programs can detect. Although algorithms exist to make sense of sensor data, they cannot distinguish whether the input is directed at the robot or at someone else nearby. 12 , 63

These mismatches also occur in reverse (robot to resident) (cf. 64 ). For example, if the robot says something and the resident does not notice or hear it, the robot waits for a response, unable to understand that its prompt was not received by the interaction partner. Although humans also experience misunderstandings during interactions, the robot’s design exacerbates these issues. The robot’s human-like appearance suggests certain capabilities, and fiction further contributes to the belief that robots possess specific interactive skills. People also tend to extrapolate from successful interactions, assuming that if a robot responds appropriately to one question (like “How are you?”), it will do so for other questions as well, which is often not the case, as the robot might be programmed to answer only that specific question. This leads to unrealistic expectations, such as residents who expect the robot to shake hands in response to a greeting, which it cannot do. This type of behavior has also been observed in other areas, such as the automotive sector, where users place too much trust in the capabilities of autonomous driving. 65
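As a minimal illustration of this extrapolation problem (a hypothetical sketch, not the software deployed in the study), a scripted dialogue handler answers only the exact phrases it was programmed with; a superficially similar question falls through without any response:

```python
# Hypothetical sketch of a keyword-scripted dialogue handler. It mimics the
# limitation described above: only explicitly programmed phrases get answers,
# and there is no fallback "understanding" for anything else.

SCRIPTED_RESPONSES = {
    "how are you?": "I am fine, thank you. How are you?",
}

def respond(utterance):
    """Return the scripted answer, or None when the phrase is not programmed."""
    return SCRIPTED_RESPONSES.get(utterance.strip().lower())

# The one programmed question succeeds ...
print(respond("How are you?"))      # scripted answer
# ... while a superficially similar question receives no response at all.
print(respond("How old are you?"))  # None
```

A user who sees the first exchange succeed naturally infers a general conversational ability, even though the system contains exactly one answer.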

Another aspect of this mismatch is mutual intelligibility. The robot’s standard voice was not adapted to nursing home residents. Many participants simply did not understand it at first, because the robot spoke too quickly, its tone was too high, or they were not used to this type of artificial voice. This critical design mismatch was corrected during the design phase: Pepper’s voice was modulated before the evaluation phase. This shows that the human-like appearance of the robot does not necessarily come with a human-like, adaptable voice that can be understood in different situations. This detail is crucial for this user group. 66

Ideally, caregivers (or moderators) should clarify these mismatches and explain the robot’s limitations, especially since people with MCI place a high level of trust in robots, as Pino et al. point out. 67 However, such mismatches occur often and have already been explained by caregivers many times. Over time, caregivers may become desensitized and stop explaining why the robot is not responding appropriately. This is not out of malice, but due to the frequency of such incidents and the multiple tasks caregivers must handle. It may even seem easier to ignore a resident’s confusion than to explain every situation, as the explanation itself can disrupt the flow of interaction, which – as the idle situation with P1 shows – is not necessarily interrupted by such mismatches.

5.2 Nonverbal (re-)actions and storytelling

P1 showed the most pronounced nonverbal reactions to the robot. She waved to it very often. In some interventions, she also blew kisses with her hands or whistled at the robot, as one typically does with dogs. She also stuck her tongue out at the robot and looked at it grimly when it switched off. With these gestures, the participant tried to attract the robot’s attention, treating it like a living being, such as a dog or a child. It is striking that she repeated these gestures over and over again across several interventions, even though the robot showed no reaction to them. Similar behavior has been observed in animal therapy with dementia patients, where much is communicated via gestures and facial expressions and less via speech. 68 The robot likewise did not respond to questions and requests; nevertheless, these situations occurred repeatedly, and here, too, the focus was on gaining the robot’s attention through various requests. In the first five interventions, most participants were unsure whether to talk to or answer the robot. This could be due to fear of the unknown: in Europe, people were initially rather skeptical of and fearful towards robots. 69 In Germany, by contrast, 41 % of people can now imagine being cared for by a robot. 70 The participant who showed the robot many nonverbal reactions apparently had no fear of it. She often spoke to the robot when she was alone in the room, introduced herself to it, and kept asking questions. She also told the robot how her husband died and where he was buried. Other studies have also observed this kind of storytelling (telling secrets) to robots: residents without cognitive impairments were observed telling robots sensitive information that they had not previously told a caregiver or relative. 39 , 40 , 42 , 43 In another study, children were asked whether they would tell robots secrets; half of them – significantly more girls than boys – would be willing to do so, hoping to find a friend in the robot. 71 The participant who told her stories to the robot also called it a friend. The reasons why she told the robot very private things about herself remain unknown. However, it should be noted that she did not share private information of this kind with anyone else during the study. As in the research on Paro, this information can be used for biographical work with people with dementia. 72 Biographical work with people with dementia is very important so that they can better identify with past events or former hobbies when other people talk to them about these and carry out corresponding activities. 73

5.3 Investigating design affordances in social interaction

The individual interaction between P1 and Pepper in its idle mode allows us to further scrutinize social robotic design affordances in their interactional dimensions. Although Pepper does not actually respond to resident P1 in what would usually be considered a meaningful interaction, P1 continuously makes sense of its assumed and displayed responsiveness in the opening sequence of the interaction: Pepper orienting its head towards her, blinking its eyes, and nodding in a sequentially relevant place become meaningful for P1, who nods accordingly (see the transcript in Table 2) – and she then initiates the further storytelling to the robot described above. Rudaz et al. 74 have shown how important such pre-beginnings and openings, i.e., the first moments of co-presence between participant and robot, are for the possible interactional achievement and the emergence of robots as social agents versus inanimate objects. They point out that these first moments of the encounter are often not recorded in human-robot studies but should systematically be considered, since the initial behavior of the robot can be relevant for subsequent sequential trajectories, as is the case in our example. Furthermore, Pelikan and Broth 75 show not only how interactional troubles and mismatches persist (and could be tackled), but also how participants adjust their turns to the limited capacities of robots. In the case of Pepper’s idle mode, these capacities are limited to a bare minimum; however, this does not stop P1 from continuing to probe, engage, and open up to the robot. Mutual orientation and presumed eye contact, in combination with minor movements (that can become sequentially relevant), already appear to afford the mutual responsiveness that is crucial for media of cooperation (cf. 76 ).
As in the case of the success of ELIZA, an ethnomethodological and conversation-analytic perspective offers a re-specification of human-machine and human-robot interaction that zeros in on the actual practices of its use, thereby explicating how technology relies on human social interactional competencies. 77 , 78 , 79 , 80 These studies show “how that technology is embedded in and relies on human social practices for its sense” (Ref. 33, p. 6; cf. also Ref. 34). In this vein, Lipp 81 , p. 660 argues that it is not so much robots caring “for (older) people but rather, the other way around – people need to care for robots.” This re-situates the interfacing between robots and humans, much as Suchman 82 already suggested rethinking and reconfiguring the design of computational systems in light of interactions with them (see also Ref. 83, p. 57ff.; 75 ). In view of this, the social agency as well as the thing-like character of robots – also with regard to their materiality – can be seen as an actual interactional accomplishment. 38 , 84 However, the discussion of the particular case in this paper calls for and highlights the relevance of further studies that more systematically analyze video data in the nursing context in its interactional and praxeological detail.

5.4 Ethical reflections

The use of technology with vulnerable users, such as nursing home residents, is a sensitive topic that requires ethical reflection. While robots are now often viewed as supportive ‘tools’ for specific caregiver tasks (e.g., Ref. 85 ), one of the main drivers of using robots in nursing homes is still to achieve some sort of automation. However, it is questionable whether robots can be an adequate solution to the shortage of skilled staff and the increase in people in need of nursing. 86 , 87 Instead, the main questions are what role they can play in helping the field and how we can ensure humane care. 86 Among other things, care work is about being empathetic, attentive, and caring towards the patient or resident. Because these requirements are so important in this field, many people are unsure whether a robot should be used in care at all. It is also questionable whether robots will ever be able to replace human beings. With the technical setup that we had during this study, we can say with certainty that the robot cannot replace a human care worker and can only serve as an assistant. In our study, the robot produced no time savings, but it did expand capabilities: robots can have many applications installed, providing residents with a wide variety of entertainment and training options. However, there needs to be some form of moderation – a person to assist residents in using the robot. In order to recognize the extent to which a robotic system can be used in care, Grunwald recommends informing and educating everyone involved in detail. 88 Only then can well-informed and “ethically responsible decisions be made at a social level”. 89 Given the wide variety of robot types that could be used in care, a blanket ethical judgment is not possible – nor should it be the goal in the first place. Ethical judgments are always culturally rooted and differ based on robot design, robot functionality, user group, and context. 90 , 91 Therefore, ethics should be addressed at several points during development and implementation. 92 , 93

A person with MCI could, for example, perceive a threat to their privacy if they are monitored by a robot. 94 , 95 When using socially interactive robotics in the environment of people with MCI, the question arises as to whether the affected person is deceived by the robot – for example, if the robot Pepper satisfies the person’s psycho-social needs by creating some sort of perceived social bond, which may be seen as a deception by the robot or by the way people introduce it. People with MCI are often at risk of having difficulties distinguishing between a machine and a living being. Studies have shown that interaction with Paro evokes positive feelings in people with dementia. 23 , 24 , 25 It is questionable whether this can be seen as a deception of people with dementia and whether that is a legitimate approach. 96 However, when users know that they are interacting with a machine, it may still evoke positive emotions that are also perceived as okay to feel. 97 In this situation, residents may still struggle to navigate the decision-making process. It is therefore essential to enable residents, even those with mild cognitive impairments, to make informed decisions about their interactions with robots. Interventions must be carefully designed to provide clear and accessible information about the robot’s nature and function in order to minimize the risk of misunderstandings and potential emotional harm, such as feelings of betrayal or disappointment. Hence, the nature of the robot should be made apparent from the outset rather than only later.

Although some participants in the interventions referred to the Pepper robot as a living being, using terms such as “little man” or “little animal”, they mostly spoke of the “robot” in conversations about Pepper. Based on our data, we believe that the participants did not mistake the robot for a person or an animal (see also Ref. 98 ); those terms were rather a kind of nickname. According to Kolling, the fact that the robot evoked emotions in the participants can be considered legitimate, 97 as the participants appear to create an emotional bond based on conscious interactions.

6 Conclusions

In this paper, we used a ‘design case study’ approach to explore how a social robotic system affects nursing home residents with MCI. To this end, we conducted a 4-month study in a nursing home. We gathered valuable information from a social caregiver, a nursing home manager, and nursing home residents with MCI in order to adapt existing robot applications and create new ones for this target group. Finally, we tested those applications with 10 residents twice a week for 10 weeks. Each course’s program changed, but individual applications were repeated regularly to reveal a progression process. Despite the evaluation’s short duration of only 10 weeks, changes in residents’ progression were evident.

The various reactions of residents to robots and their applications have important implications. For example, physical exercises with music increased residents’ participation in group settings, while their activities gradually became independent of the robot’s movement instructions as they paid more attention to the other participants. Some applications increased interaction between the residents by making the robot a kind of “boundary object”. 99 , 100 Playful applications sparked a competitive spirit in some participants, resulting in eagerness to participate in the robot exercises. The musical applications encouraged residents to participate in the courses despite their mobility issues. Perhaps one of the most remarkable moments in the study was when one of the participants spoke to the robot for more than 10 min without receiving verbal feedback. The participant initiated a conversation with the robot several times, even telling it about family issues she was experiencing. This and other situations demonstrate a mismatch between the robot’s capabilities and its design affordances in social interactions. These discrepancies between expected robot behavior, actual performance, and reactions in specific situations are crucial, and they necessitate future studies that focus on human sense-making practices.

Thus, we see potential for future research in the area of human-robot conversations. In this regard, the lack of good speech recognition and chatbot systems on the Pepper robot is a limitation of this study that may not be the case with other robots or software. In particular, we see potential in using large language models on such devices. However, this raises ethical concerns regarding residents’ privacy when interacting with such AI services.


Corresponding author: Mehrbod Manavi, Business Informatics and New Media, University of Siegen, Siegen, Germany, E-mail: 

Funding source: European Union H2020 Programme and the Japanese Ministry of Internal Affairs and Communication

Award Identifier / Grant number: 101016453

Award Identifier / Grant number: JPJ000595

Acknowledgment

We would like to thank all of the participants and the institution who were involved. This research would not have been possible without their generous support.

  1. Research ethics: The study was conducted in accordance with the Declaration of Helsinki. The ethical approval was issued by the commission of the University of Siegen under ethics number Er_16_2024 on 3 September 2024.

  2. Informed consent: Informed consent was obtained from all individuals included in this study, or their legal guardians or wards.

  3. Author contributions: All authors have accepted responsibility for the entire content of this manuscript and approved its submission.

  4. Use of Large Language Models, AI and Machine Learning Tools: We did not use AI, LLM, or ML tools in this project or to write the manuscript, apart from using Grammarly and ChatGPT to correct spelling.

  5. Conflict of interest: The authors state no conflict of interest.

  6. Research funding: The presented work has received funding from the European Union H2020 Programme under grant agreement no. 101016453.

  7. Data availability: The data that support the findings of this study are available from the corresponding author, Mehrbod Manavi, upon reasonable request.

References

1. Meyer, C. Menschen mit Demenz als Interaktionspartner. Eine Auswertung empirischer Studien vor dem Hintergrund eines dimensionalisierten Interaktionsbegriffs [People with Dementia as Interactional Partners: An Analysis of Empirical Studies Based on a Dimensional Concept of Interaction]. Z. Soziol. 2014, 43 (2), 95–112. https://doi.org/10.1515/zfsoz-2014-0203.

2. World Health Organization. Ageing and Health, 2023. https://www.who.int/news-room/fact-sheets/detail/ageing-and-health (accessed 2024-08-08).

3. Heinemann, S.; Manavi, M.; Taugerbeck, S.; Bräuer, J.; Wolf, A.; Colak, C.; Müller, D.; Sauerwald, J.; Unbehaun, D.; Wulf, V. The Narrative Future of (Digital) Care – Envisioning Care Fiction(s) in Education-Based and Professional Care Settings. In Infrahealth 2023 – Proceedings of the 9th International Conference on Infrastructures in Healthcare; European Society for Socially Embedded Technologies (EUSSET), 2023.

4. Unbehaun, D.; Aal, K.; Wieching, R.; Wulf, V.; Vaziri, D. D.; Jahnke, S.; Wulf, B. Development of an ICT-Based Training System for People with Dementia. In Companion Publication of the 2019 Designing Interactive Systems Conference, DIS ’19 Companion; Association for Computing Machinery: New York, NY, USA, 2019; pp. 65–68. https://doi.org/10.1145/3301019.3325153.

5. Unbehaun, D.; Wulf, V.; Schädler, J.; Lewkowicz, M.; Bassetti, C.; Ackerman, M. The Role of Digitalization in Improving the Quality of Life in Rural (Industrialized) Regions. In Proceedings of the 14th Biannual Conference of the Italian SIGCHI Chapter, CHItaly ’21; Association for Computing Machinery: New York, NY, USA, 2021. https://doi.org/10.1145/3464385.3467686.

6. Carros, F. Design, Development and Sensemaking of Human-Robot Interaction in Care Settings; Springer Fachmedien Wiesbaden: Wiesbaden, 2024.

7. Youssef, K.; Said, S.; Alkork, S.; Beyrouthy, T. A Survey on Recent Advances in Social Robotics. Robotics 2022, 11 (4), 75. https://doi.org/10.3390/robotics11040075.

8. Forsa. Service-Robotik: Mensch-Technik-Interaktion im Alltag. Ergebnisse einer repräsentativen Befragung [Service Robotics: Human-Technology Interaction in Everyday Life. Results of a Representative Survey]. Chancen der Digitalisierung für mehr Teilhabe und Partizipation im Alter, 2016. https://www.bmbf.de/de/service-roboter-statt-pflegeheim-2727.html (accessed 2025-03-07).

9. Nesset, B.; Robb, D. A.; Lopes, J.; Hastie, H. Transparency in HRI: Trust and Decision Making in the Face of Robot Errors. In Companion of the 2021 ACM/IEEE International Conference on Human-Robot Interaction, 2021; pp. 313–317. https://doi.org/10.1145/3434074.3447183.

10. Stower, R.; Calvo-Barajas, N.; Castellano, G.; Kappas, A. A Meta-Analysis on Children’s Trust in Social Robots. Int. J. Soc. Rob. 2021, 13 (8), 1979–2001. https://doi.org/10.1007/s12369-020-00736-8.

11. Natarajan, M.; Gombolay, M. Effects of Anthropomorphism and Accountability on Trust in Human Robot Interaction. In Proceedings of the 2020 ACM/IEEE International Conference on Human-Robot Interaction, 2020; pp. 33–42. https://doi.org/10.1145/3319502.3374839.

12. Goetz, J.; Kiesler, S.; Powers, A. Matching Robot Appearance and Behavior to Tasks to Improve Human-Robot Cooperation. In The 12th IEEE International Workshop on Robot and Human Interactive Communication, ROMAN 2003; IEEE, 2003; pp. 55–60. https://doi.org/10.1109/ROMAN.2003.1251796.

13. Feil-Seifer, D.; Matarić, M. J. Socially Assistive Robotics. IEEE Robot. Autom. Mag. 2011, 18 (1), 24–31. https://doi.org/10.1109/mra.2010.940150.

14. Summerfield, M. R.; Jacob Seagull, F.; Vaidya, N.; Xiao, Y. Use of Pharmacy Delivery Robots in Intensive Care Units. Am. J. Health-Syst. Pharm. 2011, 68 (1), 77–83. https://doi.org/10.2146/ajhp100012.

15. Göransson, O.; Pettersson, K.; Larsson, P. A.; Lennernäs, B. Personals Attitudes Towards Robot Assisted Health Care – A Pilot Study in 111 Respondents. Stud. Health Technol. Inform. 2008, 137, 56–60.

16. Tiwari, P.; Warren, J.; Day, K. J.; MacDonald, B. Some Non-Technology Implications for Wider Application of Robots to Assist Older People. Health Care Inform. Rev. Online 2010, 14, 2–11.

17. Broekens, J.; Heerink, M.; Rosendal, H. Assistive Social Robots in Elderly Care: A Review. Gerontechnology 2009, 8, 94–103. https://doi.org/10.4017/gt.2009.08.02.002.00.

18. Carros, F.; Meurer, J.; Löffler, D.; Unbehaun, D.; Matthies, S.; Koch, I.; Wieching, R.; Randall, D.; Hassenzahl, M.; Wulf, V. Exploring Human-Robot Interaction with the Elderly: Results from a Ten-Week Case Study in a Care Home. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 2020; pp. 1–12. https://doi.org/10.1145/3313831.3376402.

19. Mirza, M. Zora, der Roboter auf der Kinderstation [Zora, the Robot on the Children’s Ward], 2017 (accessed 2025-02-07).

20. Janowski, K.; Ritschel, H.; Lugrin, B.; André, E. Sozial interagierende Roboter in der Pflege [Socially Interactive Robots in Care]; Springer Fachmedien Wiesbaden: Wiesbaden, 2018. https://doi.org/10.1007/978-3-658-22698-5_4.

21. Wangmo, T.; Duong, V.; Felber, N. A.; Tian, Y. J. A.; Mihailov, E. No Playing Around with Robots? Ambivalent Attitudes Toward the Use of Paro in Elder Care. Nurs. Inq. 2024, 31 (3), e12645. https://doi.org/10.1111/nin.12645.

22. Granier, K.; Oltz, K.; Ingram, R.; Segal, D. Getting the Seal of Approval: A Critical Literature Review of the Evidence for the Use of the Paro Robotic Companion Seal with Older Adults with Cognitive Impairment in Long-Term Care. J. Aging Long-Term Care 2023, 6 (2), 57–79. https://doi.org/10.51819/jaltc.2023.1243669.

23. Moyle, W.; Beattie, E.; Draper, B.; Shum, D.; Thalib, L.; Jones, C.; O’Dwyer, S.; Mervin, C. Effect of an Interactive Therapeutic Robotic Animal on Engagement, Mood States, Agitation and Psychotropic Drug Use in People with Dementia: A Cluster-Randomised Controlled Trial Protocol. BMJ Open 2015, 5 (8), e009097. https://doi.org/10.1136/bmjopen-2015-009097.

24. Robinson, H.; MacDonald, B.; Broadbent, E. Physiological Effects of a Companion Robot on Blood Pressure of Older People in Residential Care Facility: A Pilot Study. Australas. J. Ageing 2015, 34 (1), 27–32. https://doi.org/10.1111/ajag.12099.

25. Sung, H.-C.; Chang, S.-M.; Chin, M.-Y.; Lee, W.-L. Robot-Assisted Therapy for Improving Social Interactions and Activity Participation Among Institutionalized Older Adults: A Pilot Study. Asia Pac. Psychiatry 2015, 7 (1), 1–6. https://doi.org/10.1111/appy.12131.

26. Andtfolk, M.; Nyholm, L.; Eide, H.; Fagerström, L. Humanoid Robots in the Care of Older Persons: A Scoping Review. Assist. Technol. 2022, 34 (5), 518–526. https://doi.org/10.1080/10400435.2021.1880493.

27. Carros, F.; Bürvenich, B.; Browne, R.; Matsumoto, Y.; Trovato, G.; Manavi, M.; Homma, K.; Ogawa, T.; Wieching, R.; Wulf, V. Not That Uncanny After All? An Ethnographic Study on Android Robots Perception of Older Adults in Germany and Japan. In Social Robotics; Cavallo, F.; Cabibihan, J.-J.; Fiorini, L.; Sorrentino, A.; He, H.; Liu, X.; Matsumoto, Y.; Ge, S. S., Eds.; Springer Nature Switzerland: Cham, 2022; pp. 574–586. https://doi.org/10.1007/978-3-031-24670-8_51.

28. Carros, F.; Schwaninger, I.; Preussner, A.; Randall, D.; Wieching, R.; Fitzpatrick, G.; Wulf, V. Care Workers Making Use of Robots: Results of a Three-Month Study on Human-Robot Interaction Within a Care Home. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, CHI ’22; Association for Computing Machinery: New York, NY, USA, 2022. https://doi.org/10.1145/3491102.3517435.

28. Carros, F.; Schwaninger, I.; Preussner, A.; Randall, D.; Wieching, R.; Fitzpatrick, G.; Wulf, V. Care Workers Making Use of Robots: Results of a Three-Month Study on Human-Robot Interaction Within a Care Home. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, CHI ‘22; Association for Computing Machinery: New York, NY, USA, 2022.10.1145/3491102.3517435Search in Google Scholar

29. Onyeulo, E. B.; Gandhi, V. What Makes a Social Robot Good at Interacting with Humans? Information 2020, 11 (1), 43. https://doi.org/10.3390/info11010043.Search in Google Scholar

30. Shrager, J. Eliza Reinterpreted: The World’s First Chatbot Was Not Intended as a Chatbot at All. arXiv preprint arXiv:2406.17650, 2024.Search in Google Scholar

31. Weizenbaum, J. Computer Power and Human Reason: From Judgment to Calculation; W. H. Freeman and Company: San Francisco, 1976.Search in Google Scholar

32. Weizenbaum, J. Eliza–A Computer Program for the Study of Natural Language Communication Between Man and Machine. Commun. ACM 1966, 9 (1), 36–45. https://doi.org/10.1145/365153.365168.Search in Google Scholar

33. Eisenmann, C.; Mlynář, J.; Turowetz, J.; Rawls, A. W. “Machine Down”: Making Sense of Human–Computer Interaction–Garfinkel’s Research on Eliza and Lyric from 1967 to 1969 and its Contemporary Relevance. AI Soc. 2023, 39, 1–19. https://doi.org/10.1007/s00146-023-01793-z.Search in Google Scholar

34. Saha, D.; Brooker, P.; Mair, M.; Reeves, S. Thinking Like a Machine: Alan Turing, Computation and the Praxeological Foundations of Ai. Sci. Technol. Stud. 2023, 37, 66–88; https://doi.org/10.23987/sts.122892.Search in Google Scholar

35. Sahin, M.; Relieu, M.; Francillon, A. Using Chatbots against Voice Spam: Analyzing {Lenny’s} Effectiveness. In Thirteenth Symposium on Usable Privacy and Security (SOUPS 2017), 2017; pp. 319–337.Search in Google Scholar

36. Ivarsson, J.; Lindwall, O. Suspicious Minds: the Problem of Trust and Conversational Agents. Comput. Support. Coop. Work 2023, 32 (3), 545–571. https://doi.org/10.1007/s10606-023-09465-8.Search in Google Scholar

37. Müller, M. R. Social Displays–Creating Accountability in Robotics. Österreichische Z. Soziol. 2023, 48 (4), 469–487. https://doi.org/10.1007/s11614-023-00534-2.Search in Google Scholar

38. Tuncer, S.; Licoppe, C.; Luff, P.; Heath, C. Recipient Design in Human–Robot Interaction: The Emergent Assessment of a Robot’s Competence. AI Soc. 2024, 39 (4), 1795–1810. https://doi.org/10.1007/s00146-022-01608-7.Search in Google Scholar

39. Reben, A. Boxie the Story Gathering Robot, 2010. https://areben.com/project/boxie-the-story-gathering-robot (accessed Feb. 7, 2025).

40. Hartzog, W. Unfair and Deceptive Robots. Md. L. Rev. 2014, 74, 785.

41. Hermann, T. Die Möglichkeiten und Auswirkungen von sozial-emotionalen Robotern, insbesondere der Robbe Paro, im Einsatz in der Pflege. Schule für allgemeine Gesundheits- und Krankenpflege am Sozialmedizinischen Zentrum Ost der Stadt Wien, 2015. https://www.researchgate.net/publication/339433425_DIE_MOGLICHKEITEN_UND_AUSWIRKUNGEN_VON_SOZIAL-_EMOTIONALEN_ROBOTERN_INSBESONDERE_DER_ROBBE_PARO_IM_EINSATZ_IN_DER_PFLEGE/citations (accessed Mar. 7, 2025).

42. Bethel, C. L.; Stevenson, M. R.; Scassellati, B. Secret-Sharing: Interactions Between a Child, Robot, and Adult. In 2011 IEEE International Conference on Systems, Man, and Cybernetics; IEEE, 2011; pp. 2489–2494. https://doi.org/10.1109/ICSMC.2011.6084051.

43. Westlund, J. K.; Breazeal, C.; Story, A. Deception, Secrets, Children, and Robots: What’s Acceptable? In Workshop on The Emerging Policy and Ethics of Human-Robot Interaction, Held in Conjunction with the 10th ACM/IEEE International Conference on Human-Robot Interaction, 2015.

44. Pandey, A. K.; Gelin, R. A Mass-Produced Sociable Humanoid Robot: Pepper: The First Machine of Its Kind. IEEE Robot. Autom. Mag. 2018, 25 (3), 40–48. https://doi.org/10.1109/mra.2018.2833157.

45. Paluch, R.; Müller, C. “That’s Something for Children”: An Ethnographic Study of Attitudes and Practices of Care Attendants and Nursing Home Residents towards Robotic Pets. Proc. ACM Hum.-Comput. Interact. 2022, 6, 1–35. https://doi.org/10.1145/3492850.

46. Oh, Y.; Yang, S. Defining Exergames & Exergaming. In Proceedings of Meaningful Play 2010, 2010; pp. 21–23.

47. Unbehaun, D.; Vaziri, D. D.; Aal, K.; Wieching, R.; Tolmie, P.; Wulf, V. Exploring the Potential of Exergames to Affect the Social and Daily Life of People with Dementia and Their Caregivers. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, 2018; pp. 1–15. https://doi.org/10.1145/3173574.3173636.

48. Raß, E.; Unbehaun, D.; Wulf, V.; Lenz, G.; Fischer, A.; Eilers, H.; Lüssem, J. Envisioning Social Assistive Robotics in Long-Term Care Settings: Insights and Challenges Arising within a Praxlab. In Proceedings of the Conference on Pervasive Technologies Related to Assistive Environments, 2023.

49. Wulf, V.; Rohde, M.; Pipek, V.; Stevens, G. Engaging with Practices: Design Case Studies as a Research Framework in CSCW. In Proceedings of the ACM 2011 Conference on Computer Supported Cooperative Work, CSCW ‘11; Association for Computing Machinery: New York, NY, USA, 2011; pp. 505–512. https://doi.org/10.1145/1958824.1958902.

50. Stevens, G.; Rohde, M.; Korn, M.; Wulf, V. Grounded Design: A Research Paradigm in Practice-Based Computing. In Socio-Informatics; Oxford University Press: Oxford, 2018; pp. 23–46. https://doi.org/10.1093/oso/9780198733249.003.0002.

51. Carros, F. Design, Development and Sensemaking of Human-Robot Interaction in Care Settings; Springer Vieweg: Wiesbaden, 2024. https://doi.org/10.1007/978-3-658-45233-9.

52. Unbehaun, D.; Aal, K.; Carros, F.; Wieching, R.; Wulf, V. Creative and Cognitive Activities in Social Assistive Robots and Older Adults: Results from an Exploratory Field Study with Pepper. In Proceedings of the 17th European Conference on Computer-Supported Cooperative Work: The International Venue on Practice-Centred Computing and the Design of Cooperation Technologies – Demos and Posters; Reports of the European Society for Socially Embedded Technologies, 2019.

53. Lehmann, J.; Carros, F.; Unbehaun, D.; Wieching, R.; Lüssem, J. Einsatzfelder der sozialen Robotik in der Pflege. In Digitale Transformation im Krankenhaus. Thesen, Potenziale, Anwendungen; Mediengruppe Oberfranken: Kulmbach, 2019; pp. 88–113.

54. Vaziri, D. D.; Aal, K.; Gschwind, Y. J.; Delbaere, K.; Weibert, A.; Annegarn, J.; de Rosario, H.; Wieching, R.; Randall, D.; Wulf, V. Analysis of Effects and Usage Indicators for an ICT-Based Fall Prevention System in Community Dwelling Older Adults. Int. J. Hum. Comput. Stud. 2017, 106, 10–25. https://doi.org/10.1016/j.ijhcs.2017.05.004.

55. Unbehaun, D. Designing, Implementing and Evaluating Assistive Technologies to Engage People with Dementia and Their Caregivers. Ph.D. Thesis; Universität Siegen, 2020.

56. Müller, C.; Schorch, M.; Wieching, R. Praxlabs as a Setting for Participatory Technology Research and Design in the Field of HRI and Demography. In Proceedings of the Workshop Socially Assistive Robots for the Aging Population: Are We Trapped in Stereotypes?, 2014.

57. Ogonowski, C.; Jakobi, T.; Müller, C.; Hess, J. Praxlabs: A Sustainable Framework for User-Centered Information and Communication Technology Development – Cultivating Research Experiences from Living Labs in the Home. In Socio-Informatics: A Practice-Based Perspective on the Design and Use of IT Artifacts; Wulf, V., Pipek, V., Randall, D., Rohde, M., Schmidt, K., Stevens, G., Eds.; Oxford University Press: Oxford, 2018; pp. 319–360. https://doi.org/10.1093/oso/9780198733249.003.0011.

58. Tuma, R.; Schnettler, B.; Knoblauch, H. Videographie; Springer Fachmedien Wiesbaden: Wiesbaden, 2013. https://doi.org/10.1007/978-3-531-18732-7.

59. Knoblauch, H.; Tuma, R. Videography: An Interpretative Approach to Video-Recorded Micro-Social Interaction. In The SAGE Handbook of Visual Research Methods; SAGE Publications: London, 2020; pp. 129–142. https://doi.org/10.4135/9781526417015.n8.

60. Braun, V.; Clarke, V. Thematic Analysis; Springer International Publishing: Cham, 2023; pp. 7187–7193. https://doi.org/10.1007/978-3-031-17299-1_3470.

61. Mondada, L. Understanding as an Embodied, Situated and Sequential Achievement in Interaction. J. Pragmat. 2011, 43 (2), 542–552. https://doi.org/10.1016/j.pragma.2010.08.019.

62. Tian, L.; Oviatt, S. A Taxonomy of Social Errors in Human-Robot Interaction. J. Hum.-Robot Interact. 2021, 10 (2), 1–32. https://doi.org/10.1145/3439720.

63. Bassetti, C.; Blanzieri, E.; Borgo, S.; Marangon, S. Towards Socially-Competent and Culturally-Adaptive Artificial Agents: Expressive Order, Interactional Disruptions and Recovery Strategies. Interact. Stud. 2022, 23 (3), 469–512. https://doi.org/10.1075/is.22021.bas.

64. Stommel, W.; de Rijk, L.; Boumans, R. “Pepper, What Do You Mean?” Miscommunication and Repair in Robot-Led Survey Interaction. In 2022 31st IEEE International Conference on Robot and Human Interactive Communication (RO-MAN); IEEE, 2022; pp. 385–392. https://doi.org/10.1109/RO-MAN53752.2022.9900528.

65. Chan Lee, S.; Ji, Y. G. Calibration of Trust in Autonomous Vehicle. In Human-Automation Interaction: Transportation; Springer International Publishing: Cham, 2022; pp. 267–280. https://doi.org/10.1007/978-3-031-10784-9_16.

66. Schilberg, D.; Schmitz, S. Informationsmodell für intentionsbasierte Roboter-Mensch-Interaktion. In Pflegeroboter; Springer Fachmedien Wiesbaden: Wiesbaden, 2018; pp. 23–36. https://doi.org/10.1007/978-3-658-22698-5_2.

67. Pino, M.; Boulay, M.; Jouen, F.; Rigaud, A.-S. “Are We Ready for Robots that Care for Us?” Attitudes and Opinions of Older Adults Toward Socially Assistive Robots. Front. Aging Neurosci. 2015, 7, 141. https://doi.org/10.3389/fnagi.2015.00141.

68. Hegedusch, E.; Hegedusch, L. Tiergestützte Therapie bei Demenz: die gesundheitsförderliche Wirkung von Tieren auf demenziell erkrankte Menschen; Schlütersche: Hannover, 2007.

69. Lau, Y. Y. C.; van’t Hof, C.; Van Est, R. Beyond the Surface: An Exploration in Healthcare Robotics in Japan; Rathenau Instituut: The Hague, 2009.

70. Paulsen, N. Große Offenheit für digitale Helfer in der Pflege, 2018 (accessed Feb. 7, 2025).

71. Fior, M.; Nugent, S.; Beran, T. N.; Ramirez-Serrano, A.; Kuzyk, R. Children’s Relationships with Robots: Robot is Child’s New Friend. J. Phys. Agents 2010, 4, 9–17. https://doi.org/10.14198/jopha.2010.4.3.02.

72. Kreis, J. Umsorgen, überwachen, unterhalten – sind Pflegeroboter ethisch vertretbar? In Pflegeroboter; Springer Fachmedien Wiesbaden: Wiesbaden, 2018; pp. 213–228. https://doi.org/10.1007/978-3-658-22698-5_12.

73. Radenbach, J. Aktiv trotz Demenz: Handbuch für die Aktivierung und Betreuung von Demenzerkrankten; Schlütersche: Hannover, 2016.

74. Rudaz, D.; Tatarian, K.; Stower, R.; Licoppe, C. From Inanimate Object to Agent: Impact of Pre-Beginnings on the Emergence of Greetings with a Robot. J. Hum.-Robot Interact. 2023, 12 (3), 1–31. https://doi.org/10.1145/3575806.

75. Pelikan, H. R. M.; Broth, M. Why That Nao? How Humans Adapt to a Conventional Humanoid Robot in Taking Turns-at-Talk. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, CHI ‘16; Association for Computing Machinery: New York, NY, USA, 2016; pp. 4921–4932. https://doi.org/10.1145/2858036.2858478.

76. Eisenmann, C.; Englert, K.; Schubert, C.; Voss, E. Varieties of Cooperation: Mutually Making the Conditions of Mutual Making; Springer Nature: Wiesbaden, 2023. https://doi.org/10.1007/978-3-658-39037-2.

77. Button, G.; Coulter, J.; Lee, J.; Sharrock, W. Computers, Minds and Conduct; Polity Press: Cambridge, 1995.

78. Mlynář, J.; de Rijk, L.; Liesenfeld, A.; Stommel, W.; Albert, S. AI in Situated Action: A Scoping Review of Ethnomethodological and Conversation Analytic Studies. AI Soc. 2024, 1–31. https://doi.org/10.1007/s00146-024-01919-x.

79. Reeves, S. Some Conversational Challenges of Talking with Machines. Companion of the 20th ACM Conference on Computer-Supported Cooperative Work & Social Computing (CSCW ’17) 2017, 431–436. https://doi.org/10.1145/3022198.3022666.

80. Porcheron, M.; Fischer, J. E.; Reeves, S.; Sharples, S. Voice Interfaces in Everyday Life. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, CHI ‘18; Association for Computing Machinery: New York, NY, USA, 2018; pp. 1–12. https://doi.org/10.1145/3173574.3174214.

81. Lipp, B. Caring for Robots: How Care Comes to Matter in Human-Machine Interfacing. Soc. Stud. Sci. 2023, 53 (5), 660–685. https://doi.org/10.1177/03063127221081446.

82. Suchman, L. A. Human-Machine Reconfigurations: Plans and Situated Actions; Cambridge University Press: Cambridge, 2007. https://doi.org/10.1017/CBO9780511808418.

83. DiSalvo, C. Adversarial Design; The MIT Press: Cambridge, MA, 2012. https://doi.org/10.7551/mitpress/8732.001.0001.

84. Alač, M. Social Robots: Things or Agents? AI Soc. 2016, 31, 519–535. https://doi.org/10.1007/s00146-015-0631-6.

85. Pfadenhauer, M.; Dukat, C. Robot Caregiver or Robot-Supported Caregiving? The Performative Deployment of the Social Robot Paro in Dementia Care. Int. J. Soc. Rob. 2015, 7, 393–406. https://doi.org/10.1007/s12369-015-0284-0.

86. Kehl, C. Wege zu verantwortungsvoller Forschung und Entwicklung im Bereich der Pflegerobotik: Die ambivalente Rolle der Ethik. In Pflegeroboter; Springer Fachmedien Wiesbaden: Wiesbaden, 2018; pp. 141–160. https://doi.org/10.1007/978-3-658-22698-5_8.

87. Geyer, J.; Börsch-Supan, A. H.; Haan, P.; Perdrix, E. Long-Term Care in Germany. Technical Report; National Bureau of Economic Research, 2023. https://doi.org/10.3386/w31870.

88. Grunwald, A. Ethische Aufklärung statt Moralisierung. Zur reflexiven Befassung der Technikfolgenabschätzung mit normativen Fragen. In Ethisierung der Technik - Technisierung der Ethik, 1st ed.; Nomos: Baden-Baden, Vol. 11, 2013; pp. 232–247. https://doi.org/10.5771/9783845245621-232.

89. Grunwald, A. Einleitung und Überblick. In Handbuch Technikethik; J.B. Metzler: Stuttgart, 2013; pp. 1–11. https://doi.org/10.1007/978-3-476-05333-6_1.

90. Chan Kok Yew, G. Trust in and Ethical Design of Carebots: The Case for Ethics of Care. Int. J. Soc. Rob. 2021, 13 (4), 629–645. https://doi.org/10.1007/s12369-020-00653-w.

91. Papadopoulos, I.; Wright, S.; Koulouglioti, C.; Ali, S.; Lazzarino, R.; Martín-García, Á.; Oter-Quintana, C.; Kouta, C.; Rousou, E.; Papp, K.; Krepinska, R.; Tothova, V.; Malliarou, M.; Apostolara, P.; Lesińska-Sawicka, M.; Nagorska, M.; Liskova, M.; Nortvedt, L.; Alpers, L.; Biglete-Pangilinan, S.; Oconer-Rubiano, F.; Chaisetsampun, W.; Wichit, N.; Ghassemi, A.; Jafarjalal, E.; Zorba, A.; Kuckert-Wöstheinrich, A.; Malla, R.; Özlem Akman, T.; Öztürk, C.; Puvimanasinghe, T.; Ziaian, T.; Eldar-Regev, O.; Nissim, S. Socially Assistive Robots in Health and Social Care: Acceptance and Cultural Factors. Results from an Exploratory International Online Survey. Jpn J. Nurs. Sci. 2023, 20 (2), e12523. https://doi.org/10.1111/jjns.12523.

92. Carros, F.; Störzinger, T.; Wierling, A.; Preussner, A.; Tolmie, P. Ethical, Legal & Participatory Concerns in the Development of Human-Robot Interaction. i-com 2022, 21 (2), 299–309. https://doi.org/10.1515/icom-2022-0025.

93. Störzinger, T.; Carros, F.; Wierling, A.; Misselhorn, C.; Wieching, R. Categorizing Social Robots with Respect to Dimensions Relevant to Ethical, Social and Legal Implications. i-com 2020, 19 (1), 47–57. https://doi.org/10.1515/icom-2020-0005.

94. Remmers, H. Ethische Implikationen der Nutzung alternsgerechter technischer Assistenzsysteme: Expertise zum Siebten Altenbericht der Bundesregierung; Deutsches Zentrum für Altersfragen: Berlin, 2016.

95. Carros, F.; Jokisch, S.; Manavi, M.; Wulf, V. Fears About Social Robots in Nursing. In InfraHealth 2023 – Proceedings of the 9th International Conference on Infrastructures in Healthcare, 2023.

96. Turner, A.; Eccles, F.; Keady, J.; Simpson, J.; Elvish, R. The Use of the Truth and Deception in Dementia Care Amongst General Hospital Staff. Aging Ment. Health 2017, 21 (8), 862–869. https://doi.org/10.1080/13607863.2016.1179261.

97. Kolling, T.; Baisch, S.; Schall, A.; Selic, S.; Rühl, S.; Kim, Z.; Rossberg, H.; Klein, B.; Pantel, J.; Oswald, F.; Knopf, M. What is Emotional About Emotional Robotics? In Emotions, Technology, and Health; Academic Press: San Diego, 2016; pp. 85–103. https://doi.org/10.1016/B978-0-12-801737-1.00005-6.

98. Seaborn, K.; Frank, A. What Pronouns for Pepper? A Critical Review of Gender/ing in Research. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems, CHI ‘22; Association for Computing Machinery: New York, NY, USA, 2022. https://doi.org/10.1145/3491102.3501996.

99. Leigh Star, S.; Griesemer, J. R. Institutional Ecology, ‘Translations’ and Boundary Objects: Amateurs and Professionals in Berkeley’s Museum of Vertebrate Zoology, 1907–39. Soc. Stud. Sci. 1989, 19 (3), 387–420. https://doi.org/10.1177/030631289019003001.

100. Malinverni, L.; Valero, C.; Schaper, M. M.; Garcia de la Cruz, I. Educational Robotics as a Boundary Object: Towards a Research Agenda. Int. J. Child-Comput. Interact. 2021, 29, 100305. https://doi.org/10.1016/j.ijcci.2021.100305.

Received: 2024-08-09
Accepted: 2025-02-21
Published Online: 2025-03-20

© 2025 the author(s), published by De Gruyter, Berlin/Boston

This work is licensed under the Creative Commons Attribution 4.0 International License.
