6 Ways Digital Media Impacts the Brain
We are what we spend our time doing. The average adult now spends over twenty hours online each week. Nearly a third of that time is spent on social media platforms, with Facebook alone taking up fifty minutes of each day. According to the Bureau of Labor Statistics, that’s more time than we spend reading, exercising, or socialising most days, and nearly as much as the average person spends eating and drinking each day (1.07 hours).
Teens rack up even more tech time, with some spending nine hours engaging with digital media each day.
“It just shows you that these kids live in this massive 24/7 digital media technology world, and it’s shaping every aspect of their life,” says James Steyer, chief executive officer and founder of Common Sense Media. “They spend far more time with media technology than any other thing in their life. This is the dominant intermediary in their life.”
So what happens to the brain, over time, when we engage with digital technology at such a high rate? Two specialists, neuroscientist Susan Greenfield and cognitive scientist David Chalmers, offer somewhat opposing views on the matter.
Greenfield suspects that we’re generally worse off, with weaker memories, attention spans, and information processing ability.
“As a neuroscientist I am very aware that the brain adapts to its environment,” she says. “If you’re placed in an environment that encourages, say, a short attention span, which doesn’t encourage empathy or interpersonal communication, which is partially addictive or compulsive… all these things will inevitably shape who you are. The digital world is an unprecedented one and it could be leaving an unprecedented mark on the brain.”
Chalmers has a more optimistic perspective.
“Technology is increasing our capacities and providing us with newly sophisticated ways of thinking,” Chalmers says. “In a way, it’s automating work we used to have to do for ourselves painstakingly. It’s taking stuff we had to do consciously and slowly and making it happen fast and automatically.”
Where do these arguments leave us, especially when we have to make daily decisions about the tools and methods we use for learning about and navigating the world?
The bottom line is this: Whether digital media is changing our brains for the better or the worse, we get to choose whether to allow or resist that change.
“Because of the plasticity of our brains… if you change your habits, your brain is happy to go along with whatever you do,” says neuropsychologist Joyce Schenkein. “Our brains adapt, but the process of adaptation is value-neutral—we might get smarter or we might get dumber, we’re just adapting to the environment.”
This means we actually have more control over the impact of digital media than we think. The point is to be mindful of how our brains are being affected so that we can adjust our tech time accordingly. Let’s take a closer look at what kinds of changes can occur, and explore a few ways we can respond when they do.
How Digital Media Impacts the Brain
1. Attention
Digital media encourages us to multi-task, if only because it’s so easy to switch between tasks when you can open multiple windows in your browser or turn on multiple devices. But is this a good thing?
Stanford neuroscientist Russ Poldrack has found that learning new information while multi-tasking can cause that information to go to the wrong part of the brain. Normally, new information goes into the hippocampus, which is responsible for the long-term storage of knowledge. If a student is studying while watching TV, Poldrack warns, that same information might instead go to the striatum, which is responsible for storing new procedures and skills, not facts and ideas. This means it will be stored in a shallower way, preventing quick retrieval in the future.
“Multi-tasking adversely affects how you learn,” Poldrack says. “Even if you learn while multi-tasking, that learning is less flexible and more specialised, so you cannot retrieve the information as easily.”
Nicholas Carr, author of The Shallows: What the Internet Is Doing to Our Brains, agrees:
“What psychologists and brain scientists tell us about interruptions is that they have a fairly profound effect on the way we think. It becomes much harder to sustain attention, to think about one thing for a long period of time, and to think deeply when new stimuli are pouring at you all day long. I argue that the price we pay for being constantly inundated with information is a loss of our ability to be contemplative and to engage in the kind of deep thinking that requires you to concentrate on one thing.”
If you can filter the important from the unimportant, though, shouldn’t instant access to loads of data facilitate the opposite—that is, allow you to devote more brain space to thinking deeply about the things that matter most?
2. Memory
Researchers at the University of California, Santa Cruz and the University of Illinois at Urbana-Champaign have found that “cognitive offloading,” or the tendency to rely on the Internet as an aide-mémoire, increases after each use.
Examining how likely people are to reach for a computer or smartphone to answer trivia questions, Storm et al. divided study participants into two groups: those who were allowed to use only their memory to answer questions and those who were allowed to use Google. Participants were then given the option of answering subsequent easier questions by the method of their choice.
Participants who had previously used the Internet to gain information were “significantly more likely to revert to Google for subsequent questions than those who relied on memory.” In fact, thirty per cent of participants who previously consulted the Internet “failed to even attempt to answer a single simple question from memory.” They also reached for their phones more quickly each time.
“Memory is changing,” Storm says. “Our research shows that as we use the Internet to support and extend our memory, we become more reliant on it. Whereas before we might have tried to recall something on our own, now we don’t bother. As more information becomes available via smartphones and other devices, we become progressively more reliant on it in our daily lives.”
Storm acknowledges that more research needs to be conducted to determine whether these findings spell trouble for the brain: “It remains to be seen whether this increased reliance on the Internet is in any way different from the type of increased reliance one might experience on other information sources, such as books or people.”
The question is, does it matter if the information fails to stick? One study out of Columbia University showed that when people know that they’ll be able to find information online easily, they’re less likely to form a memory of it.
While Storm argues that “the need to remember trivial facts, figures, and numbers is inevitably becoming less necessary to function in everyday life,” others might point out that conversation becomes far less enjoyable when we can’t recall facts or must always pause to look something up.
And what about memory for other types of knowledge, such as social and emotional cues or the kind of scaffolded understanding that defines expertise? Storm concedes that tech is best used as a supplement, not a substitute, for memory in these cases:
“Although the Internet may be effective in helping people access certain types of information, it may be much less effective in helping people access other types of information. In such cases, using the Internet to access information could prove detrimental. Furthermore, there are forms of expertise that require the possession of vast amounts of knowledge, and the ability to rapidly and flexibly use that information is unlikely to be attained when it is stored externally.”
3. Thought
According to a new study published in the proceedings of the ACM Conference on Human Factors in Computing Systems, reading on digital platforms might make you “more inclined to focus on concrete details rather than interpreting information more abstractly.”
The research focused on a person’s “construal levels,” defined as “the fundamental level of concreteness versus abstractness that people use in perceiving and interpreting behaviors, events, and other informational stimuli.” More than 300 participants, aged 20 to 24, took part in the studies. Participants were asked to read a short story by author David Sedaris, either on a physical printout (non-digital) or as a PDF on a PC laptop (digital), and then took an unannounced paper-and-pencil comprehension test.
On the abstract inference questions, participants using the non-digital platform scored higher, averaging 66 percent correct compared with 48 percent for those using the digital platform. On the concrete questions, the pattern reversed: participants using the digital platform averaged 73 percent correct, versus 58 percent for those using the non-digital platform.
Participants were also asked to read a pamphlet of information about four fictitious Japanese car models, either on a PC laptop screen or on a paper printout, and were then asked to select which car model was superior. Sixty-six percent of the participants using the non-digital platform (printed materials) chose the correct answer, compared with 43 percent of those using the digital platform.
Assistant professor Geoff Kaufman, who led the study, said: “Given that psychologists have shown that construal levels can vastly impact outcomes such as self-esteem and goal pursuit, it’s crucial to recognise the role that digitisation of information might be having on this important aspect of cognition.”
His colleague Mary Flanagan added: “Compared to the widespread acceptance of digital devices, as evidenced by millions of apps, ubiquitous smartphones, and the distribution of iPads in schools, surprisingly few studies exist about how digital tools affect our understanding—our cognition. Sometimes it is beneficial to foster abstract thinking, and as we know more, we can design to overcome the tendencies—or deficits—inherent in digital devices.”
The study was inspired by earlier research on the public health strategy game “POX: Save the People,” which found that players of the digital version of the game were more inclined to respond with localised solutions, while players of the non-digital version more often looked at the big picture.
Jordan Grafman, chief of cognitive neuroscience at the National Institute of Neurological Disorders and Stroke, explains it this way: “The opportunity for deeper thinking, for deliberation, or for abstract thinking is much more limited. You have to rely more on surface-level information, and that is not a good recipe for creativity or invention.”
4. Empathy
In The Shallows, Carr includes a study showing that the more distracted you are, the less able you are to experience empathy. “Distractions could make it more difficult for us to experience deep emotions,” he explains. “This kind of culture of constant distraction and interruption undermines not only the attentiveness that leads to deep thoughts, but also the attentiveness that leads to deep connections with other people.”
One method of connecting that’s quickly becoming obsolete is handwriting, especially in the context of written correspondence. Melbourne handwriting analyst Ingrid Seger-Woznicki believes the discipline of writing legibly was once “a mark of respect between author and reader.”
“The lack of writing is reflective of our lack of clarity of communication,” she says. “We don’t see communication as an art as we used to. Writing by hand forces you to stop and think a bit, and it makes you more aware of how you affect others. Poor handwriting used to be seen as a lack of consideration.”
“When you write cursive you are wanting to connect with people’s minds at a deeper level, and as a society we don’t want to do that anymore.”
Some researchers even believe there is an “essential link between the movement of the hand and the creation of thoughts and memories that typing simply cannot replicate.”
Good penmanship takes deliberation, consideration, and concentration—qualities we’re starting to see less and less of as digital media pulls our attention in multiple directions.
5. Meta-awareness
Some studies suggest that heavy digital media use leads to a loss of cognitive control—not just a loss of attention, but a loss of our ability to control our mind and what we think about.
“One researcher from Stanford pointed out that the more you acclimate yourself to the technology and the constant flow of information that comes through it, it seems that you become less able to figure out what’s important to focus on,” Carr says. “Instead, your mind gets attracted just to what’s new rather than what’s important.”
What’s new may be completely devoid of meaning, but the part of the brain that responds to it tends to trick us into thinking it’s significant.
“Each time we dispatch an email in one way or another, we feel a sense of accomplishment, and our brain gets a dollop of reward hormones telling us we accomplished something,” says Daniel J. Levitin, author of The Organized Mind: Thinking Straight in the Age of Information Overload. “But remember, it is the dumb, novelty-seeking portion of the brain driving the limbic system that induces this feeling of pleasure, not the planning, scheduling, higher-level thought centres in the prefrontal cortex.”
So, in a sense, the more we pursue empty rewards like Facebook “likes” and Twitter “favourites,” the dumber we get, and the harder it is to maintain some level of self-awareness over our habits.
“Make no mistake,” Levitin warns, “email, Facebook, and Twitter checking constitute a neural addiction.”
Dr. Nicholas Kardaras, author of Glow Kids: How Screen Addiction Is Hijacking Our Kids—and How to Break the Trance, agrees there’s a very real reason why it’s so hard to coax people away from their devices:
“We now know that those iPads, smartphones and Xboxes are a form of digital drug. Recent brain imaging research is showing that they affect the brain’s frontal cortex — which controls executive functioning, including impulse control — in exactly the same way that cocaine does. Technology is so hyper-arousing that it raises dopamine levels — the feel-good neurotransmitter most involved in the addiction dynamic — as much as sex.”
Grafman says this kind of addiction is especially dangerous for youth.
“The problem is that judicious thinking is among the frontal-lobe skills that are still developing way past the teenage years,” he says. “In the meantime, the pull of technology is capturing kids at an ever earlier age, when they are not generally able to step back and decide what’s appropriate or necessary, or how much is too much.”
The best thing we can do for our brains—and the brains of our students—is to bring the reality of tech addiction to the attention of the people it impacts most.
6. Attitude
“Hundreds of clinical studies show that screens increase depression, anxiety and aggression and can even lead to psychotic-like features where the video gamer loses touch with reality,” says Kardaras.
A Finnish study published last May in the Journal of Youth and Adolescence linked depression and school burnout to adolescents’ excessive internet use. Interestingly, it works both ways: the researchers also found that digital addiction is more likely to develop in adolescents who already lack interest in school and feel cynical toward it.
Some research, though, casts a more positive light on tech-induced attitudes. For instance, the Pew Research Center found last year that Facebook users have more close friends, more trust in people, feel more supported, and are more politically involved than people who don’t use social media. A 2013 study found that teenagers often feel that “social media helps them to deepen their relationships with others.”
The bottom line is that, despite its undeniable boons, digital media does pose a threat to optimal brain function and healthy relationships with others. Remain as aware as possible of the way it influences your behaviour and you’ll be able to ride the wave instead of getting lost in the undertow.