Qualitative content analysis
Seven main categories emerged from the data and are described below, along with quotes from the focus groups and interviews.
Main category 1: dignity
In this category, the participants’ fears about how the robot’s use might affect the sense of value or reputation of persons or institutions were analysed. The participants could imagine that those in need of care might feel less worthy and less perceived as human beings when touched by a robot instead of by a human caregiver. One participant justified this with parallels to industry: "The patient might feel being the subject of mass processing because at the moment a robot arm is mainly known as part of the production line in the automobile industry. That is, he would not feel valued or perceived as a person" (Professional Nurse (PN) 1, Focus Group (FG) 3). In this regard, a patient also expressed discomfort "that such a soulless being should do all these movements" (P 4).
The participants further argued that the robot could even frighten those in need of care: "That can sometimes lead to the patient being frightened, when there is suddenly such a device at his bedside" (PN 10, FG 2). This was mentioned with regard to all people in need of care, but particularly those with cognitive impairments and those not generally accustomed to modern technologies.
Furthermore, the possible effects on the reputation of professional caregivers and institutions were discussed. Some participants feared that their work might be valued less, as people might think that human care could be replaced by robots or that the work would no longer be as demanding when a robot is used. One participant stated: "It reduces our, my sense of value, or affects the whole profession, because it sounds the same as when someone says 'Anyone can nurse'" (PN 12, FG 1).
Main category 2: autonomy
The participants discussed how the robot might threaten nurses’ and patients’ ability to decide and act independently. On the one hand, they talked about the freedom to decide whether the robot is used at all. One participating nurse expressed concern about pressure from the employer, who might focus on the economic aspect: "If a lot of money has been paid for it, then it ought to be used" (PN 9, FG 1). On the other hand, they indicated that abilities of both nurses and people in need of care might be lost as a consequence of the robot’s regular use. One participant stated that "if a residual ability or resource is there and the resident […] or because the robot takes over, the resources become more limited or are lost completely" (PN 4, FG 1). Concerning the nurses’ side, one participant raised the concern that "the technology can very quickly be overestimated and that too much reliance is placed on the technical aspect" (PN 10, FG 2).
Main category 3: personal privacy
Discussion in this category revolved around possible negative consequences for the privacy of people in the robot’s environment. Some participants inclined to the view that patients and caregivers might feel uncomfortable because of the cameras and microphones the robot needs in order to function. They asked themselves what kind of data the robot’s sensors might capture, such as recordings of private conversations or images of patients’ intimate areas. In this context, a professional nurse thought: "One might feel that one is being monitored a bit due to the camera" (PN 11, FG 3).
Further concerns focused on the use of the data. Some nurses worried that data might be used to monitor their work and hold them accountable. One nurse linked this to the relationship with the employer: "Have I got a boss who looks at these recordings […] or is my professionalism being trusted, and the material is not being used" (PN 5, FG 3). More generally, participants asked themselves who might have access to the data. One participant asked whether "the health insurance companies [would] be able to make use of it or something like that?" (PN 12, FG 1).
Main category 4: relationship level
In this category, participants’ thoughts about possible negative effects of the robot on human relationships were analysed. On the one hand, participants considered how contact with the person in need of care could suffer: patients might be unsettled by the robot’s presence, or nurses might be too busy handling the robot. One relative suggested there might be "less human contact perhaps or fewer conversations, less direct approach, because perhaps the nurse communicates more with the robot at that moment and not with the patient" (Relative (R) 4).
On the other hand, participants thought of possible consequences for the relationships between nurses. Nurses from a clinical setting explained that they often work together with colleagues and therefore worried about reduced teamwork when working with the robot: "And there is togetherness and teamwork at the bedside when staff help each other to move patients or make the beds, and that would then all be lost" (PN 13, FG 1).
Main category 5: safety
Issues of safety were not explicitly addressed by the question schedule but were raised spontaneously by participants early in the discussions. As an interface to classical risk analysis, participants mentioned that people in need of care could suffer harm such as skin lesions and pain caused by the robotic end effector when it touches the patient. For instance, nurses raised the risk of fractures when moving a patient: "If there is muscular resistance or something, that it doesn't break your bones later on" (PN 14, FG 2).
Participants also mentioned doubts concerning the robot’s abilities and worried that malfunctions or breakdowns could endanger patients. Nurses indicated that a second perspective might be missing, as the robot does not have the observation and communication skills of human colleagues: "Well, if the robot has to turn a patient over […] and there's a wound or something, the nursing professional can't see that, and the robot can't pass it on either, like 'Look here, there's a wound'" (PN 7, FG 1).
During the discussions, nurses raised further concerns about the robot’s level of autonomy and the possibilities for controlling it: "But the robot is voice-controlled and then makes his own actions and that is, again, if I'm not stood right next to it, impossible to control it that way" (PN 14, FG 2).
Main category 6: organizational matters
All fears that could not be assigned to the other categories in terms of content were dealt with here. The only topic that emerged predominantly not only in the focus groups but also in the individual interviews was the robot’s effect on the staffing ratio, which was mentioned by six of the eight patients and relatives. Participants raised concerns that the robot might be counted as a human caregiver rather than serving as additional support for the nurses. One relative stated: "For nursing in general, I see the danger that the care robot will replace the nursing staff, who will be redundant" (R 1).
Another concern of the participants was the question of efficiency. They discussed the fear that the robot might not bring relief but rather more work, for example in bringing it to the patient, handling it during use, or documenting its use. One relative furthermore mentioned "the time that is needed to teach the patients, explain everything so that they accept it. […] Probably having to check each morning if it is still working properly and charge it" (R 4).
Apart from that, nurses raised the question of liability in the event of mistakes while working with the robot: "If it's really because it's not controlled properly, then I think the liability issue is that the nurse is responsible" (PN 2, FG 2). Finally, issues of distributive justice were mentioned in view of the robot’s limited availability.
Main category 7: positive aspects
The focus groups were conducted to identify ethical concerns regarding the robot from the users’ perspective. Accordingly, the moderation and analysis focused on negative aspects that could result from the robot’s use. Nevertheless, many participants mentioned positive effects of the robot or contradicted the negative effects discussed. Regarding nurses’ sense of professional value, one participant stated: "It can't do holistic nursing. It can do the—the mechanical thing, but not everything else, it can't carry out the whole process" (PN 12, FG 1). In the context of autonomy, one nurse highlighted a possible positive effect of the robot: "I don't need to wait for a colleague, I can perform tasks directly one after the other" (PN 7, FG 1). Opinions relating to the robot’s sensors and privacy in particular differed greatly among participants. Whereas some expressed concerns in this area, as described in category 3, others did not see any problems here: "In principle, it is completely unproblematic, since the camera only captures a moment in time" (PN 10, FG 2).
Patients in particular very rarely raised concerns about the robot’s use. On the subject of fear or insecurity, as discussed in category 1, one patient explained she would not be afraid of the robot: "I'm not afraid because when a person is there too, everything is okay" (Patient (P) 2). On the matter of human relationships (see category 4), one nurse stated: "I think it can't disturb a relationship […] because this robot does not replace the nursing and the human touch during care" (PN 16, FG 3). One patient even suggested a possible positive effect of the robot: "And the robot is never in a bad mood" (P 3).
Prioritising of ethical issues
At the end of each focus group discussion, the moderators asked the participating nurses to name up to three aspects of the discussion that they considered most important. Nurses named issues of safety and of the effort involved in use most frequently. Other frequently prioritised issues were the loss of functional patient resources and deterrence due to the robot’s appearance; in this regard, the participants suggested giving the robot a name and making it look less industrial. By contrast, issues concerning personal privacy, the relationship level, or the staffing ratio were rarely named as most important. The results of the voting are summarized and sorted by frequency in Table 3.
Table 3 Most important aspects (nurses' perspective)

Aspect | Number of votes
Safety | 10
Efficiency/effort in use | 8
Deterrence due to robot's appearance | 5
Loss of functional patient resources | 4
Costs | 2
Personal privacy | 1
Relationship level | 1
Staffing ratio | 1
Small number of available robots | 1
Identified ethical risks and requirements
The complete table with all identified risks, mitigating requirements and defined validation/verification methods is presented in Additional file 3. The structure of the ethical issues follows the categories from the analysis of the focus groups and individual interviews. Risks that arose in the focus groups or individual interviews are included in the table.
There was a large overlap between the risks identified in the focus groups/individual interviews and those identified in the literature. For example, the issues of staffing ratio [15, 16], reduction of human contact [17] and liability [18] have also been emphasized in the literature. A risk that did not arise in the focus groups or individual interviews but did in the literature was that of an incomplete representation of phenomena in the population, leading to errors in use for persons with certain characteristics [6]. For the project this could specifically mean that detecting the position of a patient’s leg fails for patients with dark skin. Further risks discussed in the literature revolve, for instance, around machine learning and a lack of transparency concerning robotic actions [6, 10, 19] or the misuse of robots [6, 10]. Additional risks contributed by members of the research team included the fear that holistic care might be pushed into the background and the idea that efficiency could take precedence. Furthermore, environmental issues were not discussed in the focus groups or individual interviews. Members of the research team therefore added the risks that a partial defect could render the robot entirely useless and that the resources used for the conception and use of the robot might damage the environment.
Mitigating requirements were assigned to the ethical risks. For example, to mitigate the risk of deception, anthropomorphic design should be limited, perhaps by means of a functional design oriented towards the technical properties of the robot [20]. Regarding the risk of errors in use for persons with specific characteristics, the data used for adjusting the robot should be inclusive and represent different population groups [19]. The specific requirement for the project is therefore that the robot should be able to process male and female voices, different accents and different skin colours equally well. With regard to the risks of inappropriate control by the robot, it is suggested to limit robotic self-learning without human supervision [10] and to keep algorithms verifiable [19]. To mitigate problems in the field of personal privacy, legal requirements need to be respected. Accordingly, personal data should only be processed with the consent of those concerned [21]. Furthermore, the principle of data minimization applies: only data necessary for fulfilling the purpose should be collected, and it should be deleted immediately afterwards [21]. For further requirements, see Additional file 3.
To define validation or verification methods, hardware/software verification was assigned to requirements addressing the functions or design of the robot, such as limiting anthropomorphic appearance or ensuring that sensors can deal with different human characteristics. Social assessment was chosen when the corresponding risk or mitigation concerned social outcomes such as the reputation of patients or nurses, and legal assessment when aspects of law must be respected, as in the field of personal data or liability. User validation was defined as the validation method when a personal opinion from users is needed, e.g. to answer the question of how persons in need of care experience treatment by the robot.