James Kirkup of the Telegraph has a very interesting article up, titled "Google wants to monitor your mental health. You should welcome it into your mind." Although the title is rather alarming, it isn't your mind that Google wants to get into but your smartphone or similar device, for the purpose of continuously tracking the state of your mental health. Like many advances in technology, this offers both great potential to help people and a great risk to our basic right to privacy.
The article discusses how Dr. Tom Insel, once the head of the National Institute of Mental Health, is leaving his position for a spot at Google Life Sciences, where he plans to explore how technology can be used to diagnose and treat mental health problems.
In case you're unfamiliar, Google Life Sciences is the division of Alphabet, Inc. that is trying, among other things, to use technology to improve human health.
As it turns out, Google Life Sciences isn't the first company to have this kind of idea. According to Kirkup, IBM, in partnership with Columbia University, found that computer analysis of someone's speech patterns was a better predictor of psychosis than traditional tools like brain scans. And our internet histories and online shopping habits, long mined by marketers, could also be used to monitor someone's mental health. As Kirkup puts it, "computers can now tell when something is about to go terribly wrong in someone's mind."
Consider the fact that many of us are already using wearable technology, such as Fitbits, to constantly monitor our physical health. Dr. Insel simply thinks similar tools could do the same thing for your mind. And such monitoring may be dramatically helpful for those who need it. Kirkup notes that "the symptoms of depression are inconstant, ebbing and rising without obvious pattern. A short consultation with a doctor once every few weeks is thus a poor means of diagnosis. But wearable technology allows continuous monitoring."
This continuous monitoring is the key. A small portable device like a Fitbit, or your smartphone, could establish your normal baseline and detect when changes in speech pattern and behavior suggest a mental health crisis may be imminent. That's all well and good in a mental health system that is perfectly benevolent, but in an imperfect world where the interests of doctors, patients, tech companies, and the drug industry can often conflict, things are much more complicated.
Or, as Kirkup puts it, "If you don't find [the] prospect [of continuous monitoring] disturbing, you're either fantastically trusting of companies and governments or you haven't thought about it enough."
We must ask what the risks of such technological monitoring would be. Currently, about 70% of Americans are on some sort of prescribed medication, many of them for psychological problems. Would such a monitoring system raise that already alarming percentage? And what are the risks of misdiagnosis?
My thinking would be to make such a system strictly opt-in. I'm no mental health expert, so please correct me if I'm wrong, but there are few places where a person's mental health is continuously monitored, and the people in those places are usually not there voluntarily.
Any new tool can be used and misused, so we should proceed with the utmost caution. If we don't, we risk further pathologizing behavior that isn't dangerous simply because it falls outside a given norm.