It’s a question as frustrating as a hangnail, asked virtually every time I give a public lecture on teen brain development. It’s some form of: “Is the digital world bad for the adolescent brain?”
I’m frustrated because I can’t really give a satisfying answer. I end up misting the auditorium with questions, gentle as rolling fog, and just as obscuring. “What is meant by the ‘digital world’?” I ask. “Xbox? Laptop? iPhone? Texting? Snapchat? Instagram? Facebook (for the two teens who still use it)?” Then I pause. “Perhaps you mean video games? If so, is MarioKart the same thing as Grand Theft Auto?”
Questions about brain effects are equally ambiguous. “Worried about attention spans? Social interactions? Psychopathologies? If there’s harm, how much exposure does it take to see it? Are boys affected more than girls?”
There’s reasoning behind this pea-soup. Most of these variables haven’t been investigated in carefully designed randomized trials. Until they are, the real answer is as familiar as it is, well, frustrating: We need more research. Because we don’t really know.
Not that people aren’t trying. In my most recent book “Attack of the Teenage Brain,” I give an example of papers from two separate research groups examining video games and attentional states. Their findings reveal how not-ready-for-prime-time our answers are. Here’s a quote from each paper:
“Viewing television and playing video games each are associated with increased subsequent attention problems in childhood.”
And …
“(Videogames) have now become tools in research facilities because of their ability to enhance attention.”
So which is it, folks? Hurtful or helpful? Can we currently have it both ways?
Like I said. Frustrating.
Fortunately, the data that do exist point to some barely-there trendlines. And so, with apologies to the audiences I regularly stonewall, here’s a description. There’s both good news and bad news.
Social media, contrary to its reputation, actually seems to improve certain prosocial behaviors—empathy, to name one—in teenage populations. Researchers in one study followed a group of 10- to 14-year-olds for a year, tracking their use of social media, primarily Twitter and Facebook. The experimental design was good old pre/post, using the AMES (Adolescent Measure of Empathy and Sympathy), a well-regarded empathy test. Scores actually improved the more the kids used social media. Said the researchers:
“…adolescents’ social media use improved both their ability to understand (cognitive empathy) and share the feelings of their peers (affective empathy).”
But it’s not all roses and right-swipes. Researchers have also examined the effects of the digital world on adolescents’ ability to accurately decode nonverbal information. By “digital” I mean mobile devices, primarily texting. The research bordered on child abuse: the teens’ cell phones were taken away (with permission) for five days, and the kids were whisked off to a cell-inaccessible wilderness camp. The researchers also employed a pre/post design, assessing performance on a nonverbal communication competency assay (the DANVA2 test). They found teen decoding dramatically improved once the screens were gone. Here’s a quote from this paper:
“After five days interacting face-to-face without the use of any screen-based media, preteen’s recognition of nonverbal emotion cues improved significantly more than that of the control group.”
So we have a dash of “good news,” a pinch of “bad news,” and a potential framework to turn “no news” into “know news.”
But not right now. Controlling the variables necessary to obtain a clear view will take years. Until that happens, I’m afraid my poor audiences will have to put up with my stonewalling. I like the question well enough. I’m not at all wild about my hangnail-answer.