
The Ethics of Digital Babysitting

To listen to all of those social media, cell phone, and computer commercials, one would think that we are living in the happiest, most interconnected time in history. Unfortunately, childhood depression, loneliness, and alienation have never been higher. What are the ethics of digital babysitting?

Clinical depression (not “sadness”) among teenagers aged 13 to 19 is expected to affect about 3 million teens, and the rates are climbing each year. In 2017, 13 percent were depressed, as opposed to 8 percent in 2010. Even sadder, teen suicide rates are climbing. The suicide rate among girls has risen to more than 5 per 100,000, and among boys it has tragically exploded to 14 per 100,000.

Numerous psychological studies have clearly linked digital device and social media use to teenage depression. Part of the problem is the increasing pressure to always put the best “psychological foot” forward despite deep-seated problems. Kids must keep up with their peers not only socially but digitally. It is a hamster wheel that spins out of control.

A New Movement – The Ethics of Digital Babysitting

There is a new movement afoot, and ethically, there are many questions to be asked. It is being dubbed the “smartphone psychiatry movement,” led by Dr. Thomas Insel, former head of the National Institute of Mental Health, and other psychiatrists who have partnered with Silicon Valley app developers to create new software to detect mental illness. The developers and the psychiatrists are wondering whether the same smartphones that are contributing to teen depression could be used to alleviate it and help prevent teen suicide.

Dr. Insel has stated: “There might be as many as 1,000 smartphone ‘biomarkers’ for depression.”

According to Insel, researchers believe that variables such as changes in typing speed, voice tone, word choice and how often kids stay home could signal potential psychological problems.

A co-researcher in this area, Dr. Alex Leow, who is both an app developer and an associate professor of psychiatry, said: “We are tracking the equivalent of a heartbeat for the human brain.”
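To make the idea concrete, here is a minimal, purely illustrative sketch (in Python) of how behavioral signals like those described above might be combined into a single screening score. The feature names, weights, and threshold are hypothetical, invented for this example; real research systems would rely on validated clinical instruments and far more sophisticated models.

```python
# Purely illustrative sketch: combining hypothetical smartphone "biomarkers"
# (change in typing speed, hours spent at home, tone of messages) into a
# simple screening score. The features, weights, and threshold are invented
# for this example and are NOT a validated clinical tool.

from dataclasses import dataclass


@dataclass
class DailySignals:
    typing_speed_change: float   # fractional slowdown vs. personal baseline (0.0 to 1.0)
    hours_at_home: float         # hours per day spent at the home location
    negative_word_ratio: float   # share of negatively toned words in messages (0.0 to 1.0)


def screening_score(signals: DailySignals) -> float:
    """Weighted sum of the hypothetical signals, scaled to roughly 0-1."""
    score = (
        0.4 * signals.typing_speed_change
        + 0.3 * min(signals.hours_at_home / 24.0, 1.0)
        + 0.3 * signals.negative_word_ratio
    )
    return round(score, 2)


def should_flag(signals: DailySignals, threshold: float = 0.6) -> bool:
    """Return True if the illustrative score crosses an arbitrary threshold."""
    return screening_score(signals) >= threshold


if __name__ == "__main__":
    today = DailySignals(typing_speed_change=0.5, hours_at_home=20, negative_word_ratio=0.4)
    print(screening_score(today), should_flag(today))  # 0.57 False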

While the thought of “if you can’t beat them, then join them” comes to mind, placing an app on the very device that causes the problem is fraught with ethical peril.

Researchers admit that accurately measuring mental status through smartphone usage patterns may be years away, and it would also require consent. While the developers claim that users could easily withdraw their consent, they cannot yet say where this information might reside.

Technology may be enabling our lives, but it also potentially comes with a price.

Dr. Insel has said he wants to reach kids not when they are in crisis and very late in the course of mental illness, but to identify mental illness at its earliest signs. The psychiatrists envision smartphones sending texts to guardians, parents, and even first responders to offer help when a problem is suspected. So far, several thousand college-age volunteers have signed up for various apps to serve as early-stage test subjects.

Let’s Talk About the Ethics

Is this a noble idea? I would guardedly say yes, were it not for larger concerns that do not seem to be addressed.

I cannot picture Silicon Valley developers operating purely out of altruism. To be competitive, apps are normally sold at a relatively low price. Even if 3 million or more teens download these apps, the revenue stream alone will probably not be enough to satisfy greed.

We live in an age where information seems to be stored and hacked, or stored and sold, with apologies lagging far behind. Every app is linked to a phone number, and every phone number is linked to a name. Who assures the privacy of this information? It seems no one does. Will a child’s name wind up in a database that is sold to local mental health providers? Pharmaceutical companies? Psychiatric care facilities? Will a name in a database be accessible to future employers, the military, or law enforcement agencies?

Will the app do the opposite of what it is intended to do? Will it expose children to bias against mental health treatment? In case after case, we have seen supposedly confidential information widely exposed.

If my line of questioning seems far-fetched, we need only consider how Facebook abused the information it collected after users were assured that their privacy would be protected.

However, the much larger question is how so many kids have become so far removed from their parents that the only way to reach them is through a belated warning on a smartphone. If parents give their permission to load such an app on their children’s smartphones, will they then abdicate all responsibility to talk to and monitor their kids? The ethics of digital babysitting should be termed bad ethics.
