Assistive Care Robots and Older Adults: Employing a Care Ethics Lens



ABSTRACT
To date, ethical critiques of the use of assistive healthcare robotics have not closely examined the purported care relationship between such robots and their users. Drawing upon the work of care ethics scholars, I argue that authentic care relies upon capacities inherently reciprocal and responsive in nature, which ultimately precludes socially assistive robots from being useful caring tools.

INTRODUCTION
Care of a rapidly aging population has presented numerous challenges to health systems worldwide. In Canada alone, there will be over 9.5 million seniors by the year 2030 (1). The World Health Organization (2) notes that by 2050 the world's population aged 60 years and older is expected to total two billion. Consequently, all countries face increased challenges in ensuring that their health and social systems can accommodate this demographic shift.
The accelerated change in the world's aging population has resulted in a shortage of both professionally trained and informal (e.g., family and friends) caregivers to provide older adults assistance with daily living (3,4). One proposed solution gaining substantial traction is the use of assistive healthcare robots. There are a variety of models of assistive robots that can provide support and assistance for a number of tasks, including fetching household items for the user, helping to move an older adult from bed to wheelchair, and, more recently, providing social companionship (5). Developers of socially assistive robots (SARs) assert that this technology can contribute to "therapeutic protocols aimed at improving or maintaining residual social, cognitive and global functioning of older adults" (6 p.140) while also being a more time- and resource-effective intervention compared to investing in human caregivers.
By drawing upon the works of scholars who specialize in the ethics of care, we can more closely examine the relationship between assistive robotics technology and the older adults for whom they are designed to provide caring assistance. Care ethics is a stream of ethical theory that asserts that "there is moral significance in the fundamental elements of [our] relationships and dependencies" (7). Care ethics provides an ethical framework from which to evaluate caring tasks to determine when the provision of care is merely adequate and when it is truly authentic and comprehensive (8). Moreover, care ethics understands the practice of caring as one that involves "both particular acts of caring and a general 'habit of mind'" (9 p.127) such that the care provider should understand their actions within the broader care context, including recognizing conflicting and evolving care needs.
In what follows, I review some models of care robots and show that many SARs are designed to carry out certain tasks that we recognize as part of traditional care. I critically examine this function from a care ethics framework. I argue that there are significant limitations to the way SARs execute these tasks and that they will often fail to be useful caring tools because they cannot approach care in a holistic manner, given their limited means of communication and interaction. I argue that useful and authentic care relies upon caring qualities that are inherently reciprocal and responsive in nature. By engaging a care ethics lens, we see that care ought to be understood as a reactive practice that can address the evolving needs of the care receiver, in addition to any conflicts or limitations to resources that may be hindering the care process. In showing that SARs have a much more limited means of interaction than is presented in descriptive literature from SAR developers, I argue that SARs do not possess the appropriate capacities to recognize caring needs and address them in a meaningful way. Thus, if we allow assistive robotics technology to be implemented in an extensive way, without the involvement of human carers, we will be promoting a substandard provision of care to older adults in our communities.

MODELS OF CARE ROBOTS
This paper focuses on robotics technology that has been designed to carry out complex human-robot interactions. Pino and colleagues (10) describe six broad categories of assistive robot designs: machine-like robots, human-like robots, androids, mechanical human-like robots, animal-like robots, and mechanical animal-like robots. I focus here on the categories of human-like robots and mechanical human-like robots, "whose form resembles a human body and/or have human facial features (e.g., eyes, nose, mouth, eyelids, etc.)" (10 p.2). Within that group, I am particularly concerned with those human-like robots that have been designed to be socially interactive. Kachouie and colleagues (11) helpfully draw a distinction between SARs and other kinds of assistive robots, noting that devices like 'smart' wheelchairs, artificial limbs, and exoskeleton technology count as assistive robots but are "principally not communicative" (11 p.369). SARs are those that emphasize "the importance of social interaction in the process of providing specific assistance" (11 p.369).
While existing SAR models vary greatly in their particular features, an examination of some of the most recent models to have gained significant research and media attention makes clear that such technology is meant to resemble human beings both physically and psychologically. For example, consider the assistive robots Brian™, Casper™ and the Care-O-Bot™ (see Figure 1). These robots are intended to resemble humans from the waist up; they have been built with a human-like torso with two arms and a head, and with a synthetic 'face' or mask that mimics emotions like 'happy,' 'neutral' and 'sad' (6). SARs such as Brian™ (6), Casper™ (12), and the Care-O-Bot™ (13) are designed to carry out a variety of daily living tasks, including (but not limited to) assisting with meal preparation, engaging in social activities like playing card games, and offering reminders to take medications. All three robots depicted in Figure 1 are programmed with some variety of verbal interaction (e.g., encouraging meal preparation and consumption) and are anthropomorphic in nature. For example, the Casper™ robot can say phrases like, "My favorite food is pizza, it's delicious," and "We're finished making the sandwich, it looks very delicious" (14). Similarly, Brian™ also imitates human affect; it can say to the user, "The main dish smells amazing. Why don't you pick up some food with your spoon?" (6 p.79).

Figure 1. Examples of Socially Assistive Robots
The essential observation here is that these robots are meant to resemble human beings both physically and psychologically. The physical resemblance is clear from their visual designs, while the psychosocial resemblance is evident in the robots' simulated emotional responses and speech scripts that include affective and descriptive language. Designers of these robots describe this proximity to human capabilities as necessary both for the robot's therapeutic goals of providing social stimulation while assisting an older adult and for generating acceptance from this user base (6,15,16). McColl and Nejat (17) assert that designing robots so that they can both read and reproduce emotive verbal and body language ostensibly imbues SARs with the capacity to "share information with, relate to, and understand and interact with people in human-centred environments" (17 p.261). A study exploring user acceptance of SARs amongst older adults found that human-like communication was preferred over human-like appearance, but that participants also expressed positive feelings towards the robot's humanized 'face' and emotional communication abilities (18 p.147-148). Robots are no longer merely performing caring tasks; they are being designed to complete these tasks while bearing a likeness to human behaviour. Moreover, this likeness is advertised as providing a crucial psychosocial presence in the lives of older adult users.
However, much of the descriptive literature written by SAR developers overstates the actual caring capacities that such robots possess. This is argued by Sparrow and Sparrow (19), who contend that "discussions of human-robot interactions, or the higher-order properties of robots, are plagued by equivocations about how genuine the properties attributed to robots are" (19 p.153). They argue that robot developers' choice of language implies the presence of genuine emotion or thought which overstates the actual capabilities that robots have. Sparrow and Sparrow point out that in the proper application of terms like 'happiness,' 'companionship,' and 'emotion,' there is an intuition that "to be a real friend, or to really love someone, or to possess genuine rather than ersatz intelligence, is not something which can be exhaustively specified or captured by any algorithm or set of algorithms" (19 p.154).

DEFINING AN ETHICAL PRACTICE OF CARING
If we are to recognize how and when care is being done well, it is essential that we have a definition of what care is, in addition to caring practices and mindsets. Gadamer (20), in his philosophical commentary on the practice of medicine in a modern technological age, wrote that the art of medicine is nebulous to define, as it begins in a "particular kind of doing and making which produces nothing of its own and has no material of its own to produce something from" (20 p.34). A similar issue befalls the task of defining the practice of care. Some scholars of the ethics of care have proceeded on the "tacit assumption that […] we know what we are talking about when we speak of taking care of a child or providing care for the ill" (8 p.29). However, the wide variety of caring tasks has historically made it a challenge to determine what commonality lies at the heart of all good caring practices. Care ethics scholar Virginia Held writes: "Dressing a wound so that it will not become infected is not much like putting up curtains to make a room attractive and private. Neither are much like arranging for food aid to be delivered to families who need it half a world away. Yet all care involves attentiveness, sensitivity and responding to needs" (8 p.39).
Held advances an understanding of care as practice and value, stating that care should not merely be thought of as a set of individual actions or behaviours, but rather a "practice that develops" (8 p.42). She describes caring as a moral activity with "attributes and standards" (8 p.42) such that we can judge when the provision of care is merely adequate from when it is truly good. Moreover, she also writes that care ought to be thought of as a value in and of itself, to be used to "pick out the appropriate clusters of moral considerations, such as sensitivity, trust and mutual concerns" (8 p.38) in order to evaluate care activities on whether or not they are morally deficient. Held contends that care without an ethical framework can too easily veer into practices that are harmful or domineering. She asserts that "the various aspects and expressions of care and caring relations need to be subjected to moral scrutiny and evaluated, not just observed and described [emphasis author's own]" (8 p.11).
We may further nuance this definition by looking to Joan Tronto's (9) complementary construction of the four elements of care: attentiveness, which involves recognizing the needs of others; responsibility, understood as having more than a mere obligation towards another; competency, which is carrying out caring acts correctly and effectively; and responsiveness, which is being aware of the care receiver's own perceptions and evolving needs. The four elements related to this definition exist concurrently as goals, stages, or virtuous frames of mind. I believe these elements of care to be especially useful in helping to generate a standard of care with which to evaluate our caring practices. The elements function to reify what an ethical practice of care is by providing qualities generalizable to the wide variety of care practices, in addition to informing what a good and caring "general 'habit of mind'" (9 p.127) ought to look like. Tronto also describes a sort of grounding principle of care, which she calls integrity. She argues that good care "requires that the four phases of the care process must fit together into a whole," and that this is only achievable by incorporating integrity into the caring process (9 p.136). One element cannot exist without the others, and in many respects, depending on the care context, these elements will be in conflict. Tronto asserts that care as an ethical practice involves having to make judgments about how best to resolve such potential conflicts between our care values. The injunction that one ought to act with these four elements in one's practice of caring does not itself describe what that injunction amounts to. Tronto subsequently describes what I believe is a sort of moral discernment in the practice of caring:

Care as a practice involves more than simply good intentions. It requires a deep and thoughtful knowledge of the situation, and of all of the actors' situations, needs and competencies. To use the care ethic requires a knowledge of the context of the care process. Those who engage in a care process must make judgments: judgments about needs, conflicting needs, strategies for achieving ends, the responsiveness of care-receivers, and so forth (9 p.136-137).
It is only when care incorporates these kinds of thoughtful moral judgments that we can say that care is done with integrity. Good care requires that we integrate the broader social, political and personal contexts in which caring exists in order to truly understand the needs that we are to meet.
In sum, care ethics scholarship understands care as more than simply completing individual caring duties or activities. Certainly, these tasks partly define care, but even more so, good care is about the ability of the caregiver to use the practice of caring activities to evolve into a better caregiver for the future. The elements of care allow a caregiver to treat the caring scenario holistically and make thoughtful considerations about conflicting care needs, practical constraints, and the care receiver as an individual with a unique history and disposition. Good caregivers have the ability to discern variations in vulnerability, whether these stem from physical decline or from isolation due to a difference in sexuality or ethnic background, and to incorporate this into their caring approach. By contrast, the limited, pre-programmed interactive skills of SARs do not inspire much confidence in their ability to recognize and respond to the personal factors that come to bear on a person's care needs.

THE COSTS OF 'CARING' ROBOTS
In light of this definition of care, I argue that socially assistive robots do not currently possess any of the capabilities required to provide the kind of substantial and comprehensive care, discussed earlier, that addresses the care needs of the older adults they have been designed to assist. I agree with Tronto's (9) understanding of the process of caring, wherein using the concept of care as an ideal "can serve us analytically as we try to determine whether care is being well provided" (9 p.110). This analytical process cannot be achieved without the discerning and reflective qualities I have just discussed.
It is unrealistic to expect that the interactions of current SARs with older adults could ever address the complex variety of care needs older adults have in their later years. The following subsections outline the care needs of older adults and the disconnect between those needs and SARs' actual capacities.

The Care Needs of Older Adults
SAR developers are interested in using their technology to alleviate the care crisis for older adults. The growing population of aging persons presents unique considerations when conducting an ethical analysis of assistive technology. Veazie and colleagues (21) note that older adults are more susceptible to becoming socially isolated due to age-related changes in health and social status, including impaired hearing and vision, reduced mobility, and the deaths of family and friends. These factors limit an older adult's ability to create "sustained meaningful connection[s] to other people" (21 p.1), which in turn results in poorer overall health, an increased risk of dementia and mortality, as well as more physician visits overall. Older adults are also disproportionately prone to developing depression, anxiety, and other mood disorders (22).
While the developers of SARs have rightly recognized the above as care issues for older adults that require remediation, I argue that SARs do not have the capacity to help address these concerns. None of the SAR models explored here has any truly substantial capacity for reciprocal listening or communication. All the models are restricted in their communication to the input cues (whether vocal or physical gestures or button and touchscreen interactions) that their developers have integrated into their algorithms. Additionally, these SARs are limited to the scripts and scenarios with which they have been programmed. Arguably, the successful completion of a useful interaction like cooking a meal presumes a perfectly cooperative and predictable interacting agent. For example, Casper™ is intended to be used with older adults with cognitive impairments (12). But while its developers claim that the robot is capable of displaying "a range of emotions that are needed for social interactions" (12 p.2), it is not clear that Casper™ is equipped to address the emotional outbursts or confusion (typical symptoms of cognitive impairments such as dementia and Alzheimer's disease) that an older adult may experience during the interaction. Moreover, it seems very likely that an adult in the later stages of a degenerative disease would have trouble becoming familiar with an entirely new technology.
Similarly, all other programmed scenarios are contingent upon the idea that the older adult will react to each step of the scenario in a way that the robot's algorithm can predict, which seems highly unlikely. Consequently, there remains a question as to the overall usefulness of such robots outside of their narrow, pre-programmed schemas, and, more crucially, a question as to the appropriateness of their use in caring scenarios.

Exercising Caring Capacities
It is important to note that in discussing caring capabilities, I am not claiming that human beings are infallible agents in care relationships. Of course, not all care is good care. The purpose of contrasting the current caring capabilities of SARs with those of human carers is to reflect on what underpins authentic and comprehensive care. I argue that assistive robots can only ever hope to participate in one aspect of good care practice: assisting with the completion of singular care tasks. However, I have argued that caring ought not to be reduced to singular care tasks; furthermore, caring tasks and activities ought to be understood as opportunities to meet other, deeper human needs. For example, eating is not merely about ensuring a person meets a certain caloric intake for the day; it is also an opportunity for socialization. Similarly, bathing and assistance with personal grooming provide the possibility of comforting physical touch.
To reiterate, caring activities partly define the practice of care in the sense that there are tasks that we come to typically associate with caring labour. However, good care is also about the ability of the caregiver to carry out these tasks while providing a moral presence for the one being cared for, and also working to refine their caring capacities for the future. Consequently, while one could argue that SARs can help carry out tasks that we recognize as part of traditional caring duties, current robots do not possess the shared humanity that makes possible things like "empathic witnessing, listening to the illness narrative, and providing moral solidarity through sustained engagement" (23 p.1551).
I argue that SARs do not possess the moral discernment required to organically recognize the individual differences that affect the provision of care. Further, I argue that SARs do not possess the existential presence and solidarity that are essential to the practice of caring. Those providing care ought to possess the qualities and capabilities that allow them not only to appreciate the caring comforts they share with the care receiver, but also to recognize the care receiver's ever-evolving care needs.

CONCLUSION
I have appealed to the limited nature of existing SAR models to demonstrate not only that they have restricted practical application in the completion of relevant caring tasks, but, more fundamentally, that they will always fall short in caring for human beings simply because they are non-human. We should not mistake simulated facsimiles of conversation, humour and compliments for genuine communication. Socially assistive robotics technology ultimately cannot perceive care in a comprehensive manner, wherein caring actions are understood within the larger context of a care relationship. I have provided strong reasons why it is important to treat all caring tasks as possibilities for addressing a person's deeper care needs. If we hold that human beings should be treated as ends in themselves, we ought to be at least somewhat concerned by a technology that may take a demographic already at risk for social isolation and other psychosocial health concerns and further remove opportunities for their care needs to be vocalized or perceived by their fellow human beings.
It is important to note that this paper is not committed to the claim that there are no useful applications for robotics technology. However, I argue that robots designed merely to mimic the complex, emotional care relationships between human beings while assisting with day-to-day living are ultimately not enough. SARs are just one proposed solution to the crisis of care resources for aging persons. We ought to consider other options that challenge traditional care models for aging persons. Such alternative solutions include, among others, integrated living programs (e.g., housing a day care in a long-term care facility) (24,25), intergenerational friendship programs for young children and older adults in residential care (26), therapy animal programs (27), and changing long-term care facility design and construction to avoid the feel of a clinical setting (28).
In sum, caring is fundamentally about "attending, enacting, supporting and collaborating" (23 p.1551) and ought not to be reduced and compartmentalized into individual tasks abstracted from a person's particular lived experiences, which can and will affect their care needs. Moreover, these caring needs fluctuate and are transformed within our social structures. While developers of SARs would likely say otherwise, I have argued that the health needs of older adults are unlikely to be met by this technology. Future research must examine the financial and practical feasibility of SARs and conduct thorough longitudinal consultation with the older adults whom such technology is intended to serve.