The future of Mixed Realities and mental health
Of all the UX jobs, HealthTech is a unique opportunity to impact people’s lives in a positive way. –Diana Gonzalez, Wellsmith
If you care about the future of mental health, you’ve got an abundance of career options lying ahead of you. You might consider becoming a psychotherapist, nurse practitioner, doctor, social worker, or wellness coach. But one route you may not have considered is becoming a designer or developer in the mental health field.
The future of mental health depends in large part on designers and developers. Along with researchers and clinicians, these people are at the forefront of the health tech revolution, keeping it human.
Digital therapy is already becoming an accessible, affordable, and effective solution to a range of mental health issues. As the cost of Virtual Reality (VR) therapy tools dips lower and DIY mental health treatment becomes more mainstream, demand for product designers will skyrocket.
Here’s an example of an active job opening with neurotech startup Flow Neuroscience in Malmö, Sweden (note: they’re looking for remote developers, based anywhere in the world):
“As an app developer you will be responsible for developing and maintaining our mobile frontends. You will work closely together with UX Designers and psychologists to develop the best digital therapy solution there is.”
And now that you know you can, here’s why you should.
By some estimates, the digital health market will be worth over $500 billion by 2025. That means it’s time to get real about planning ahead.
“Responsibility and ethics must be designed into the way XR (Extended Reality) tools are built and deployed, starting today,” write Accenture Research in their 2019 report, Waking Up to a New Reality: Building a Responsible Future for Immersive Technologies. “Incremental change is not sufficient. We need to boldly start building a responsible future today.”
Note the language here: these ethics must be “designed into” the process. That’s a call for more designers who truly care about people’s wellbeing, not just those interested in wealth or status.
You can easily imagine how augmented health innovation could backfire in the case of poorly designed therapy apps that aren’t backed by clinical research. But there are other concerns:
What about anonymised brain-wave data collected during a VR session? Could information about our thoughts be collected and used by someone else, somewhere else, years later?
“The risk assessment lens needs to allow for a broader view of impact, incorporating individual and societal wellbeing,” Accenture writes. “And the lens must reach further into the future, including systemic second-and-third-order risks.”
Another challenge is mitigating the risks inherent in immersive experiences themselves.
The “Proteus Effect,” coined by Nick Yee and Jeremy Bailenson at Stanford University, describes how the experience of occupying a virtual body (or avatar) changes people’s behaviour in both the virtual and real worlds.
Sometimes this presents opportunities:
“When someone adopts an avatar that is more muscular than his actual self, he will act as if he is stronger in real life. People who see the effects of exercise on their bodies in the virtual world might exercise more in the real world.”
One study found that people taking the form of a taller avatar “negotiated more aggressively than subjects embodying a shorter avatar.” People whose avatars looked like themselves in old age “tended to allocate more money for their retirement after leaving the virtual environment.” Subjects whose avatars had superpowers (like flying) were “more likely to show altruistic behaviour after the experiment.”
On the other hand, prolonged exposure to immersive environments might have damaging effects on mental health, such as addiction and social isolation.
Understanding the Proteus Effect, says Accenture, will help reveal the opportunities and risks inherent in immersive experiences.
“How can we ensure the design of XR tools and experiences is human-centred, with the wellbeing of the user in mind? How can we ensure confidence that physical and mental faculties of users will not suffer from the correct usage of any new XR tool? How can we technically be prepared to safeguard against harm from incorrect usage?”
It’s up to designers and developers to hold companies accountable for their role in responsible innovation, questioning processes like these and offering creative solutions to these challenges.
How does the science work?
At first, VR simulations were used to treat common phobias such as fear of heights, flying, and spiders. Now researchers can tailor more complex experiences for people, helping them overcome all kinds of mental ailments from social anxiety to psychosis to PTSD.
“Why are immersive experiences so effective for mental health?” write the researchers at Accenture. “The short answer is the deep, direct connection they make between our mental perceptions of ourselves and the world around us.”
Mental health is inseparable from our environment, and that’s one reason immersion works so well.
“The brilliant thing about virtual reality,” says Daniel Freeman, University of Oxford psychologist who is leading the first large-scale trial of virtual reality therapy for serious mental health conditions, “is that you can provide simulations in the environment and have people repeatedly go into them.”
Here are a few ways this works to support mental health:
Palo Alto-based LimbixVR lets therapists gradually expose patients to the sources of their phobias, controlling exposure levels to help patients face increasingly stressful situations.
“I began testing VR after joining Limbix as Director of Psychology last year,” says Dr Sean Sullivan. “Now I use VR in my private practice to teleport patients to a range of virtual locations and exposures. As therapists, we can use VR treatments to enhance a patient’s emotional recall, facilitate stress reduction, demonstrate relaxation strategies, teach anxiety tolerance and treat a range of fears, phobias and other personal challenges.”
This is a method for overcoming fear and emotional trauma by gradually desensitising someone to the source of their anxiety, such as using a VR tool to overcome a fear of public speaking.
“Though patients know these experiences aren’t real,” writes Jeffrey Rindskopf in an interview with psychologist Albert Rizzo, “that doesn’t change the preconscious response and fear activation of their limbic systems, manifesting an increased heart rate and production of the stress hormone cortisol. Our emotional command centres naturally suspend disbelief even when our logical mind knows better, putting VR on par with real-life exposure therapy in clinical effectiveness, but with none of the travel costs or physical danger.”
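The graded exposure approach described above can be sketched in code. This is a minimal, illustrative controller (not any vendor’s actual algorithm): the exposure level steps up the hierarchy only once the patient’s measured stress stays below a comfort threshold, and steps back down if stress runs consistently high. The function name, thresholds, and stress scale are all assumptions for illustration.

```python
# Illustrative sketch of graded-exposure control, as in systematic
# desensitisation: advance only when recent stress stays comfortably low.
# All names, thresholds, and the 0..1 stress scale are assumptions.

def next_exposure_level(current_level: int,
                        stress_readings: list[float],
                        comfort_threshold: float = 0.4,
                        max_level: int = 10) -> int:
    """Return the next step on the exposure hierarchy given recent
    stress readings (0 = calm, 1 = maximum distress)."""
    if not stress_readings:
        return current_level  # no data: hold the current level
    if all(s < comfort_threshold for s in stress_readings):
        # Patient is comfortable: advance one step, capped at the top.
        return min(current_level + 1, max_level)
    if all(s > 2 * comfort_threshold for s in stress_readings):
        # Consistently high stress: step back down, never below zero.
        return max(current_level - 1, 0)
    return current_level  # mixed readings: hold steady
```

In practice a therapist, not an algorithm, would make the final call; the point is that the exposure “dose” is an explicit, controllable parameter of the session.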
“The brain creates an embodied simulation of the body in the world used to represent and predict actions, concepts, and emotions,” writes Riva et al., in their research report, Neuroscience of Virtual Reality: From Virtual Exposure to Embodied Medicine. “VR works in a similar way: the VR experience tries to predict the sensory consequences of an individual’s movements, providing to him/her the same scene he/she will see in the real world.”
To achieve this, the VR system, like the brain, maintains a model (simulation) of the body and space around it.
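That predict-then-correct loop can be sketched in miniature. This is a simplified illustration, not a real tracking system: the “body model” is reduced to a one-dimensional head position and velocity, the prediction is plain dead reckoning, and the correction is a crude blend with the tracker measurement. All structures and the gain value are assumptions.

```python
# Illustrative sketch of the predictive loop described above: like the
# brain's embodied simulation, a VR system keeps a model of the body
# (here just 1-D head position/velocity) and predicts the next sensory
# state before rendering, then corrects against the tracker reading.

from dataclasses import dataclass

@dataclass
class HeadState:
    position: float  # 1-D position, for simplicity
    velocity: float

def predict_next(state: HeadState, dt: float) -> HeadState:
    """Dead-reckoning prediction: assume velocity is constant over dt."""
    return HeadState(state.position + state.velocity * dt, state.velocity)

def correct(predicted: HeadState, measured_position: float,
            gain: float = 0.5) -> HeadState:
    """Blend the prediction with the tracker measurement (a crude filter)."""
    blended = predicted.position + gain * (measured_position - predicted.position)
    return HeadState(blended, predicted.velocity)
```

Real headsets use far more sophisticated sensor fusion, but the principle is the same: render what the model predicts the user will see, then reconcile it with what the sensors report.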
“If the presence in the body is the outcome of different embodied simulations, concepts are embodied simulations, and VR is an embodied technology, this suggests a new clinical approach discussed in this article: the possibility of altering the experience of the body and facilitating cognitive modelling/change by designing targeted virtual environments able to simulate both the external and the internal world/body.”
Barcelona’s VirtualBodyWorks recently conducted a study showing that patients who had a conversation with themselves embodied as Dr Sigmund Freud experienced significant mood improvements compared to talking through their problems in a virtual conversation with pre-scripted comments. The effect is known as “body-swapping,” and researchers found that “those in the body-swapping group got better knowledge, understanding, control, and new ideas about their problem compared to the control group (no body-swapping).”
By quantifying physiological responses like heart rate, facial muscles, electrodermal activity (GSR), and respiration, researchers can get a better idea of what creates fear, and how to provide successful treatment.
For instance, addiction treatment could benefit from the combined use of biometrics and VR-based therapy. “The self-reported withdrawal symptoms and increased heart rate of Nicotine-dependent cigarette smokers have been found to be predicting of craving experiences in a VR environment. This study showed how real craving can be elicited in a VR environment, which can be important for future studies predicting who will respond well to addiction treatment.”
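To make the biometric idea concrete, here is an illustrative sketch (not a clinical algorithm) of how two of those signals, heart rate and electrodermal activity, might be folded into a single arousal score during a VR session. The baselines, weights, and 0-to-1 scale are all assumptions for illustration.

```python
# Illustrative sketch: combine heart rate and GSR deviations above a
# resting baseline into a crude 0..1 arousal score. Baselines and
# equal weighting are assumptions; real systems calibrate per person.

def arousal_index(heart_rate: float, gsr: float,
                  hr_baseline: float = 70.0,
                  gsr_baseline: float = 2.0) -> float:
    """Return a crude 0..1 arousal score from deviations above baseline."""
    hr_component = max(0.0, (heart_rate - hr_baseline) / hr_baseline)
    gsr_component = max(0.0, (gsr - gsr_baseline) / gsr_baseline)
    # Equal weighting, capped at 1.0.
    return min(1.0, 0.5 * hr_component + 0.5 * gsr_component)
```

A score like this could, for example, feed the exposure-level decisions a therapist makes during a session, turning raw physiology into something a treatment protocol can act on.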
VR therapy tools are now accessible to most practicing therapists at a reasonable price. Whereas a full VR headset cost upwards of $15,000 in the 90s, you can now buy a standalone headset for around $200.
VR holds promise because “it depends less on the patient’s memory (recollection bias) and the interpretation of the clinician (interviewer bias),” says Dutch researcher van Bennekom.
One Cambridge study found that the VR-based navigation test was more accurate in diagnosing mild Alzheimer’s-related impairment than traditional cognitive tests, such as figure recall and symbol tests. Lead researcher and neuroscientist Dennis Chan says, “VR could take on a bigger role in diagnosing mental disorders as VR gear becomes cheaper and easier to use.”
“VR has immense potential to improve the assessment of mental health conditions but it is not currently used in clinics,” adds Daniel Freeman, a professor of clinical psychology at the University of Oxford, and the co-founder of Oxford VR, which is developing VR-based treatments for a variety of psychiatric conditions. More research is needed on proposed uses of VR to diagnose mental conditions, he adds. Until then, “clinics would not use VR for diagnosis.”
Freeman notes that VR is being used increasingly in labs and research institutions, primarily “to inform understanding of causes rather than diagnose [conditions].” Freeman’s team is hoping to develop a VR-based test that will better diagnose paranoia, by showing people “neutral” social situations. “If they see hostility from the VR characters, then we know that it is genuinely unfounded and therefore instances of paranoid thinking.”
Problems we need more people to solve
The gap between clinicians and tech entrepreneurs
After interviewing therapists, psychiatrists, psychologists, and social workers, UX designer and content strategist Marli Mesibov found two reasons why design for mental health is lagging behind design for other healthcare needs.
“Many designers told me they were starting from scratch,” she writes. “They did research with patients and learned what patients thought they needed from an app. But very few spoke with healthcare providers. As a result, the designers were missing the clinical expertise.”
One clinician shared with her the insight that “what people say they want is not often what they want.”
The same goes the other way around: Patricia Areán, a licensed clinical psychologist at the University of Washington, says many digital mental-health services designed by medical practitioners suffer from an “uptake” problem: A user may install an app but only use it once. “We need to start working with people who know how to actually design these things,” she adds.
That’s a clarion call for more designers who put people before tech.
Fast iteration in a slow HealthTech world
“Designers are often up against deadlines,” Mesibov notes. “Some work for large healthcare companies that want to launch in time for a specific trade show, or before a competitor gets to market. This is very different from the world of health care, which tends to move very slowly, waiting for compliance or FDA approval, clinical trials, and multiple rounds of validation.”
Designers can pave the way for the health tech field to move forward more quickly.
To add to that, the Accenture report identifies six risks we need future designers and product developers to solve:
Misuse of personal data
Personal data will go beyond people’s credit card numbers, purchase history, or social media activity. Their feelings, behaviours, judgements, and physical likeness will all be exposed to potential cyber theft and manipulation.
Distorted reality
When news and information are consumed through immersive experiences, it will be harder to separate reality from falsehood, profoundly influencing behaviours, opinions and decisions.
New forms of crime
Not only could avatars be used to create new forms of identity-related crime, but ransomware and extortion risks will rise as more critical tasks, like surgery or engineering, become dependent on smooth, real-time XR procedures.
Over-dependence on virtual worlds
Over-dependence could significantly impact mental health and wellbeing by extending the gap between reality and what life could be like. New mental-health disorders related to extended periods in virtual worlds are still being explored.
Harassment and trolling
Trolls could go from intimidating with words on social media to physically intimidating targets in a virtual world with avatars. And undesirable behaviour that is normalised in a virtual environment can creep into real-world behaviours.
Digitally divided worlds
Unequal access to new educational or working experiences amplifies social divisions. And increasing time spent in virtual worlds can disengage people from real-world societal problems.
Developers and designers can help mental healthcare become more accessible to a wider demographic.
“Artificial Intelligence (AI) could help improve access to care in three key ways,” says Adam Miner, PsyD, an instructor and co-director of Stanford’s Virtual Reality-Immersive Technology Clinic. “First, it could reach people who aren’t accessing traditional, clinic-based care for financial, geographic or other reasons like social anxiety.
“Second, it could help create a ‘learning health care system’ in which patient data is used to improve evidence-based care and clinician training.
“Lastly, I have an ethical duty to practice culturally sensitive care as a licensed clinical psychologist. But a patient might use a word to describe the anxiety that I don’t know and I might miss the symptoms. AI, if designed well, could recognise cultural idioms of distress or speak multiple languages better than I ever will.”
The future doesn’t need more tech; it needs more designers and developers in mental health to solve the challenges tech is already presenting. You can be a part of that solution.
Author: Saga Briggs.
Saga Briggs is a journalist covering trends in learning, creativity, intelligence, and educational technology. Follow her @SagaMilena