Should we trust Apple with mental health data?

“The new coaching service — codenamed Quartz — sounds like an expansion of the Apple Watch play from physical health to mental health, Bloomberg reported. It is “designed to keep users motivated to exercise, improve eating habits and help them sleep better” using “AI and data from an Apple Watch to make suggestions and create coaching programs tailored to specific users.” [..]

About five years ago, I wrote about the various ways that the Apple Watch failed as a behavioral intervention. There’s some behavioral science behind that critique, but there’s also firsthand experience: because I was using the watch myself, I discovered that the constant nudging for achievement made me miserable. I began to think of it as my failure bracelet.

It is half a decade later, and none of that has changed. There are no rest days. The default notifications are all switched on in attention-shattering ways. And while I was test-driving the watch all those years ago, I got guilt-tripped for being sick, an experience I repeated in January when HealthKit told me I was moving less than usual during a week in which I was laid out with covid and during the weeks I spent recovering from it. [..]

I bring up therapy because Quartz is meant, in part, to track emotions. This is a real cause for concern. Think for a moment about the person who wants to track their emotions the most — perhaps someone vulnerable, maybe with mental health issues. Do you think a pressure-laden notification nightmare is going to make that better or worse? [..]

I know “touch grass” is an online insult, but it’s also not bad advice: you can always go outside and feel better. Somehow, I don’t think a VR headset encouraging mindfulness is going to be more effective than lying on your back in a park and watching the clouds roll by — not least because the clouds aren’t going to make you nauseated.

I’m emphasizing non-screen interventions, particularly for health, because I’ve been watching what happened with research on social media: it makes people feel bad. I love computers (duh, I write online), and I love the stuff people do with them. But I am increasingly convinced we need to get the hell outside because we are weird primates who evolved with outside, not computers. We also need face-to-face time, as we all discovered the hard way in 2020. If you want to feel calmer, happier, and more connected, I feel confident that the best way to do that is to log off. No AI can possibly meet those needs because, as social animals, what we need is other people.

[..] even if I am wrong about that — and I might be! — I am still concerned by Apple’s notification approach and its science-blind approach to behavioral health. That 12-hour stand goal on the Apple Watch? Someone just decided that was important. There’s no research behind it, as Apple told me all those years ago, just vibes. The decision to track calories as the default for the Move goal is dangerous for people with eating disorders. The focus on streaks can create compulsive behavior. I am doubtful the subscription services are going to be any better.

Behavioral health interventions are notoriously difficult. They require a grasp of psychology, sure, but they also require a certain amount of flexibility because people’s lives are complicated. Apple’s ham-handed approach to physical health has been bad enough; the idea that it is now going to take the same approach to mental health does not fill me with confidence. I’m sure there are plenty of people out there who won’t mind letting Apple toy with their emotions. But we’ve got a lot of evidence now that too much screen time is linked to worse health — and for Apple, its entire business is getting you to spend more time with its software and gadgets, not less.”

Full article: E. Lopatto, The Verge, April 26, 2023