
Think Again


By Dan Gordon '85

Published Apr 1, 2013 8:00 AM


Is technology good for our brains, or is it bad? Does it improve thinking, or imperil it? The debate rages, digitally or otherwise. The experts see it both ways.


Our brave new digital world, with 24/7 access to any and all information, whether we're at home, at work or out and about, is making us all smarter and better informed.

Or more shallow and stupid.

Everybody has their own opinions about whether digital citizens — in this world, there are no more immigrants; we're all natives now — are better or worse off than those who lived in the ante-Internet world.

The digital-equals-smarter caucus claims that since everything we could want to know now can be found through a quick visit to Google, we are embarked on an unpredictable-but-never-dull journey to new knowledge as we navigate our way from one site to the next. We can collaborate across continents, communicate with the world's experts and learn about issues from every possible angle. Liberated from time-consuming and mundane burdens such as the need to travel to the library or a meeting or to memorize anything, we can be more creative and productive than ever before.

Yes, but it's diminished creativity and productivity, the counterargument goes. We've become a society of skimmers, say the digital-equals-dumber camp, rarely digesting a complete paragraph from any piece of writing before we jump to the next site or are diverted by a text, email, video, Tweet or Facebook post. With so much digital media competing for our attention we jump back and forth, our attention easily drifts, and we spend less time engaged in deep thinking — the kind of critical analysis that might help us do a better job of sifting through the often questionable material we encounter in an age when so little of what we see has been vetted.

So which is it? Is the digital world making us smarter or stupider? We fired up our journalistic search engines to locate the experts and aggregate an answer.

And the answer is: Yes.

And no.

Let's log on and surf the science.

Cognition in the Machine

"Our brains are wired to want more and more, so we take on a lot and then it becomes hard to settle down and really think through problems," says Gary Small '73, director of the UCLA Longevity Center and Division of Geriatric Psychiatry and the Parlow-Solomon Professor on Aging, about our love-hate relationship with online life. "The question we're all struggling with is, can we get a handle on this technology so it allows us to be more creative and thoughtful and to lead more enriched and socially connected lives, or is it controlling us?"

In his 2008 book iBrain: Surviving the Technological Alteration of the Modern Mind, Small writes of the growing problem of "continuous partial attention," in which we're "keeping tabs on everything, but not really focusing on anything."

What we gain in breadth, in other words, we sacrifice in depth. What's more, as anyone with teenagers can attest, social media and texting have produced a generation of distracted digital multitaskers.

"We can assume the younger generation is better at it, from doing it more," adds Patricia Greenfield, Distinguished Professor of Psychology and director of the Children's Digital Media Center @ Los Angeles. "But every study that's compared the same task under multitasking conditions with single-task conditions has found that it causes a cognitive decrement."

In a 2009 analysis published in the journal Science, Greenfield cited a study showing that when college students were provided Internet access and encouraged to use it during lectures to further explore topics being discussed, they fared worse on a post-lecture test than students whose laptops remained closed. Multiple studies have shown that as the number of links in an online document goes up, reading comprehension declines.

Greenfield believes that while heavy use of digital technology (including video games) has led to improved visual-spatial intelligence, the price is weakness in higher-order processes such as reflection, critical thinking and imagination.

The Yin and Yang of Digital Culture

In assessing the impact of digital media on children's minds, Yalda T. Uhls M.B.A. '90, M.A. '10, a former film industry executive who is now a doctoral student in developmental psychology and a researcher in Greenfield's center, sees cause for both excitement and concern. The good news is that the ability to connect easily with people all over the world affords new opportunities to learn about other cultures, as well as access to sources of information far beyond what traditional textbooks have provided.

The bad news is that outside the classroom, technology isn't necessarily used to elevate the mind. A 2010 Kaiser Family Foundation study found that children ages 8-18 devote nearly eight hours a day to entertainment media — and when media multitasking is factored in, they're packing 10 hours, 45 minutes of media content into their day. A 2009 MacArthur Foundation study concluded that just 20 percent of children use digital media for interest-driven learning, while the remaining 80 percent use it to socialize and "hang out."

Uhls and Greenfield studied preteen focus groups and found that the content of popular TV shows, combined with the opportunity to post videos and collect friends, followers and "likes" online, was linked to fame rising to the top of the tweens' list of future goals. "The fear is that if they don't understand that fame should be connected to something they do, they won't develop themselves intellectually or through some other kind of skill," Uhls says.

Less Is More

You might think that in a world where news breaks in real time on Twitter and is updated, analyzed and discussed ad nauseam online, we'd at least be better informed about current events. But Michael LaCour M.A. '12, a UCLA doctoral student in political science, finds that the modern digital environment results in less news consumption and a more politically unaware electorate than when we got our news from three broadcast networks and our daily newspaper.

LaCour conducted experiments in which participants were observed in a waiting area, randomly assigned — unbeknownst to them — to conditions simulating one of three historic information environments: broadcast television, cable television or digital. To simulate each environment, he restricted participants' access to cable TV, the Internet and phone service. The research revealed that study participants with no Internet access or cell signal consumed twice as much news on television as those under digital conditions, and scored 40 percentage points higher on a subsequent current-events quiz.

LaCour notes that in the broadcast era, we watched the nightly news because it was the only thing on. "It was information by default, not by choice," he says. "When you introduced more options with cable and digital, that inadvertent audience opted out." The current environment, by contrast, is information-seeking: You don't have to come into contact with information you don't care about. The result is a huge knowledge gap between the political junkies, who can binge all they want, and everyone else. All that competition for people's attention also shapes the way news is delivered.

"In the post-World War II era, when you had three networks and a local newspaper monopoly, if something important happened it was basically covered one way and you had to ‘eat your vegetables,' " says Tim Groeling, UCLA associate professor and chair of communication studies, who teaches a course called "Revolutions in Communication Technology." The need to stand out in the modern era, he contends, has fueled the rise of partisanship, sensationalism and entertainment in the delivery of news.

You're My Obsession

When Johanna Drucker walks across the UCLA campus, she's struck by the number of students who move without looking up from their smartphones, practically bumping into her along the way. "Everyone wants to be in contact with their social network in a continuous stream of exchange," says Drucker, UCLA Bernard and Martin Breslauer Professor of Bibliography in information studies, who teaches the undergraduate "Introduction to Digital Humanities" course and co-authored 2012's Digital_Humanities. Drucker doesn't like to judge one era against another, but she admits there are stark differences in thinking patterns.

"An 18th-century novel was written to engage the imagination to fill long hours by the fire with your family in the winter, when there was nothing else to do," she says. "Today it's endless browsing, and I find that my own capacity for sustained reading has changed."

She points out that when the parents of many of today's students were in college, they wrote letters home, communicated by phone only when it was time to arrange travel and were rewarded just once a day by the delivery of the mail. "Now we're addicted to email and texts," she says, and the reward is immediate.

Brain-imaging studies do suggest that digital media activate some of the same reward pathways as addictive substances, releasing the neurotransmitter dopamine whenever something new and interesting is introduced. On the other hand, Small has found that the Internet can have positive effects on the brain — specifically, that among computer-savvy middle-aged and older adults, searching the Web triggers key centers in the brain related to working memory, a form of short-term memory involved in holding and manipulating information.

Small's team used functional magnetic resonance imaging scans to record the brain-circuitry changes of volunteers ages 55-76 while the subjects performed Web searches. In contrast to those who had little or no online experience, Web-savvy subjects registered activity in the areas of the brain that control decision-making and complex reasoning — neural circuitry not activated when reading a book.

Small points to other studies indicating that improving working memory through computer games can boost fluid intelligence — the ability to problem-solve. "There is the capacity to use the technology to make us smarter," he says. "We could approach it more systematically, the way we go about getting our cardiovascular conditioning and consuming enough fish in our diet. But we're not using it that way."

One way or another, digital media are permanently transforming the way we read, write, teach and communicate. They allow for more participation, more dissemination and a breaking down of barriers.

"If you have an archive that puts up walls and says we have the one copy of this book and we're not going to share it because it's so valuable, then suddenly it's digitized and made widely available to the world, it vastly expands the ways it can be used," says Todd Presner, professor of Germanic languages and comparative literature at UCLA, chair of the Digital Humanities program and a co-author with Drucker and three others of Digital_Humanities. He and his co-authors foresee fluid groups of experts from inside and outside the university, "asking and answering research questions that cannot be reduced to a single genre, medium, discipline or institution." In addition to zooming in on canonical authors like Shakespeare and Chaucer, the new tools allow students and scholars to zoom out, seeing the history of a genre through computational means.

So: smarter or stupider?

There's no doubt digital technology provides powerful tools for us to become a more intelligent society. If the price is that we're more easily distracted, maybe that's just because the world has become much more interesting. As we outsource our memory to devices (who memorizes phone numbers anymore?), perhaps we're redefining intelligence.

"Professors used to memorize sonnets after one or two readings," says Groeling. "Training the mind to store that kind of information is no longer regarded as essential. That doesn't mean we're not as smart; it just means it now makes more sense to devote our brainpower in more productive ways."
