Marientina Gotsis is an Associate Professor of the Practice of Cinematic Arts at the University of Southern California. She has a broad background in arts, design, and engineering, with a special interest in interactive entertainment applications for health, happiness, and rehabilitation. Here she talks with Despoina Limniotaki, in the #ConversationsWithFriends series, about compassionate technology and its application to public health, especially mental health care.
In which ways can compassionate technology be employed in mental healthcare?
Mental health services are a space where there is a high need for compassionate technology. Consider two variables: stigma and access. Telehealth, combined with privacy, can help address both. Asking people who are unwell to leave their comfort zone during a crisis is asking a lot. When someone is so depressed that they have not bathed in a week, or so distressed by social settings that they cannot be around crowds, or so worried about being seen entering the office of a psychologist or psychiatrist that they will not go, what can we do to help? In the absence of in-person home services, telehealth can be a very compassionate technology. From virtual sessions with various therapists, to self-paced therapy programs and workbooks, to mobile health assessments, to virtual conversational agents that can combat loneliness, people can be offered many options.
Today I read a harrowing news article about a woman who died from COVID-19 in a nursing home facility; in her desperation, she kept asking her Amazon Echo for help. Despite all the criticism these devices get for lack of privacy and for surveillance issues, they could be redesigned to triage and respond to emergencies. At a bare minimum, we can use our phones to call the police or an ambulance. Why not everything else? Closing the gap between urgency and help is another way to think of compassionate technology. Alleviating human suffering does not always need to be very complicated, but someone needs to keep that in mind, always, when designing even seemingly unrelated technology.
Can you give us an example of a successful intervention that involved both technology and compassionate caring?
Robotic partners for the elderly, such as PARO the Seal and related products, have always been controversial but are an excellent example of something relatively simple that can help combat loneliness. They are not a substitute for real human care, but what if such care is not possible 24/7? And what if the person who needs the company is no longer capable of taking care of a real pet who could be a good companion for them? Although PARO is not the active carer of the elder, the robot's behavior and its "aliveness" activate the caring and empathy of the elder, who feels wanted and needed. In Japan, where robotic companions are popular, the influence of Shinto spirituality and "animism" -- the belief that everything has a soul -- is pervasive in how technology is developed. In my mind, this practice is no more controversial than giving plush animals to children. Fantasy friends can serve real emotional needs at every age, especially as cognition falters.
Do you think that compassionate technology programs could be implemented by anyone interested in them?
I do think so, but some planning and cooperation are needed. Technology that has been tested early on with its target audience and related stakeholders (health professionals, family members, etc.) throughout design, development, and implementation is far more likely to be compassionate. The intersection of user research and healthcare certainly points to this potential, but these spaces still do not intersect as much as they should.
What is the future of applications and online services that are based on compassionate technology?
The newer generation of designers, health professionals, and technologists I work with are far more attuned to this philosophical and methodological orientation. In some ways, they take it for granted because they are a born-digital generation for whom many tools seem to cater. Yet when they encounter healthcare services, they are shocked by the usability and accessibility deficits across software and processes. In a post-COVID-19 world, the potential for technology to be cruel is very great, as all pandemics have shown us. Technology unchecked can widen disparities, enable oppression, and modify behavior in unimaginable ways. Consider the cooperation of mobile giants like Apple and Google to provide their users with an opt-in way to notify others, and be notified, of having been exposed to the virus. Without adequate privacy protection, and also psychological support, these tools could bring more despair and paranoia instead of safety and comfort. More data does not equal more wisdom. Imparting wisdom is a more compassionate approach, but how will we translate "announcing bad news" into these new types of non-clinical encounters that have clinical significance? Imagine the technology we have now if it had been available in the era of AIDS. Would more deaths have been averted if people knew exactly how much sex their partners were having and how many were infected, without revealing their identities? Instead, these people were shamed and fingers were pointed at them. We've gone from sexual activity being stigmatized to day-to-day behavior being stigmatized: grocery shopping, exercise, socializing in person, etc. Fear has to be counteracted with compassion, or we will end up at odds with each other and the entire world very quickly.
Could you give us a description of a project you are currently working on?
I have an incredible team of collaborators on a project for gait rehabilitation of people with Parkinson's using virtual reality. We just celebrated six years of working together, and I marveled at the level of cooperation and compassion needed in our collaboration, not only among ourselves but with all the stakeholders in our project: the patients, families, physical therapists, etc. It has been a labor of love to develop a joint process of working together that can hold everyone's dreams and satisfy their goals. Before we developed any games, we invited some people with Parkinson's into class to try out very new games that had just hit the shelves and to hear about their daily lives. Teaching listening before action is a very hard skill. Research takes a long time, and what we do may never manage to directly benefit the people we started working with. Listening to their lived experience, however, is what stays with us and motivates us to keep working on what sometimes seems mundane and even insignificant in the grand scheme of the problem's complexity. Yet even tiny problems have to be solved for any progress toward a big goal. Science for science's sake is not always motivating enough. Not for everyone. A lot of us are driven by the memories of stories people have told us. Some people dismiss these as mere "anecdotes," but they are a lot more than that. These stories are the window to compassion. Until we can become better listeners, how will our technology do that for us?