Information Overload!!!

[Image: brain scan]

I don't have the attention span to write this article.

In the course of penning this introductory paragraph, I've taken umpteen email breaks, fallen down several Wikipedia wormholes, and taken an hour-long time-out to watch Frontline documentary clips on YouTube. It has taken me, in toto, seven days to write a five-paragraph article about my generation's shrinking attention span. At least the irony isn't lost on me.

A social researcher tracking my movements across the web might discover that, in the words of University College London professor David Nicholas, director of CIBER (the Centre for Information Behaviour and the Evaluation of Research), I am "skipping over the virtual landscape." Nicholas is among the first people to systematically study how people do research online, by analyzing millions of anonymized data records. According to this Telegraph article, he found that while researching a subject, four out of ten young people never visited the same web page twice, while people who grew up before the age of the internet repeatedly returned to the same source. Which is to say: people who came of age in an era when "research" meant "studious hours in the library stacks" are more accustomed to in-depth work with sources, while young ADD-addled rapscallions like myself skip around, skimming, gleaning bits and pieces of information from disparate sources as we bleep-bloop away on our infernal mobile devices.

This comes as no surprise to me: at any given time, I have over fifteen tabs open in my browser, a couple of PDFs primed, and a handful of half-finished emails waiting to be attended to. And although my methods are less scientific than Professor Nicholas's, I sense the same proclivities among my peers. A friend of mine recently sent me a link to an article, bragging that it was "the longest thing he'd ever read online." I printed it out: it was four pages long.
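
For the curious, Nicholas's core measurement is simple to sketch. Here is a toy Python version with invented records and field names -- emphatically not CIBER's actual pipeline -- that tallies the share of users who never load the same page twice:

```python
from collections import Counter, defaultdict

# Hypothetical anonymized clickstream records: (user_id, url) pairs.
log = [
    ("u1", "example.org/a"), ("u1", "example.org/b"), ("u1", "example.org/a"),
    ("u2", "example.org/a"), ("u2", "example.org/b"), ("u2", "example.org/c"),
    ("u3", "example.org/b"), ("u3", "example.org/b"),
]

# Count how many times each user hit each page.
visits = defaultdict(Counter)
for user, url in log:
    visits[user][url] += 1

# A user "never revisits" if no page in their history was loaded twice.
never_revisit = [u for u, pages in visits.items()
                 if max(pages.values()) == 1]

share = len(never_revisit) / len(visits)
print(f"{share:.0%} of users never returned to the same page")  # -> 33%
```

Run over millions of real records instead of eight fake ones, this is the sort of tally that separates the skimmers from the re-readers.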

What is happening? Well, the information overload we've come to expect from web-surfing just might be reshaping how we think.

Contrary to popular belief, the mind is plastic no matter how old you are. According to leading neuroscientist Michael Merzenich, of the University of California, San Francisco, the plasticity of the brain -- its ability to rewire neurons and reorganize neural networks and their functions in response to new experiences -- persists from cradle to grave. Merzenich is famous for a series of experiments on owl monkeys: after a set of peripheral nerves was cut and surgically rearranged, the corresponding portions of the monkeys' brain maps, which were expected to be in total disarray, turned out to be essentially normal, implying that the brain can reorganize itself in response to new situations and stimuli. Quoth Merzenich: "if the brain map could normalize its structure in response to abnormal input, the prevailing view that we are born with a hardwired system had to be wrong." That is, "the brain had to be plastic." In short, brains can spontaneously reprogram themselves to function more efficiently, or more harmoniously with their environments.
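
For a cartoon of what "reorganize" means here, consider a toy competitive-learning model -- my invention for illustration, not Merzenich's actual methodology. Each unit in a little "cortical map" claims the input it responds to most strongly; scramble the wiring, keep stimulating, and the map re-tiles itself:

```python
import numpy as np

rng = np.random.default_rng(0)
n_sites = n_units = 8                 # skin sites feeding a toy "cortical map"
W = rng.random((n_units, n_sites))    # random initial connection strengths

def stimulate(W, wiring, steps=2000, lr=0.1):
    """Competitive Hebbian learning: whichever unit responds most
    strongly to a stimulus strengthens its claim on that input."""
    for _ in range(steps):
        site = rng.integers(n_sites)           # touch one skin site
        x = np.zeros(n_sites)
        x[wiring[site]] = 1.0                  # wiring: skin site -> input line
        winner = np.argmax(W @ x)              # most responsive unit wins
        W[winner] += lr * (x - W[winner])      # winner moves toward the input
    return W

intact = np.arange(n_sites)            # identity wiring: nerves intact
W = stimulate(W, intact)
print("preferred inputs before rewiring:", np.argmax(W, axis=1))

rewired = rng.permutation(n_sites)     # nerves "cut and rearranged"
W = stimulate(W, rewired)
print("preferred inputs after rewiring: ", np.argmax(W, axis=1))
```

The point of the toy is only this: nothing outside the update rule tells the map how to recover. Re-normalization falls out of the same learning process that built the map in the first place.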

Now consider the swiftly changing information flow that has become the norm in 21st-century life. Certainly, the stimuli we regularly feed our brains have changed tremendously in the last ten years. From the moment we wake up in the morning to the moment we put down our iPhones at night, we're faced with an immeasurable fount of data, opinion, images, videos, text, and ideas pouring at us from our computers, televisions, and devices. Would it be so bizarre to imagine that our brains are slowly molding themselves to work with this information flow -- developing the ability to dart around, manage several chains of thought at once, and power-browse through the data-slop, learning, as Professor Nicholas so quaintly puts it, to "skip over the virtual landscape"? Is the Net programming us?

Media critic Douglas Rushkoff thinks so. In a talk given at SXSW this year, he warns, "if we don't create a society that at least knows there's a thing called programming, then we will end up being not the programmers, but the users -- and, worse, the used." He's discussing open-source software and media literacy, but he might as well be talking about the larger power that our programs hold over us, culturally as well as neurologically.

As Nicholas Carr much more eloquently puts it in this fabulous and terrifying article from The Atlantic, "As we use...our 'intellectual technologies' -- the tools that extend our mental rather than our physical capacities -- we inevitably begin to take on the qualities of those technologies."

He adds:

The process of adapting to new intellectual technologies is reflected in the changing metaphors we use to explain ourselves to ourselves. When the mechanical clock arrived, people began thinking of their brains as operating "like clockwork." Today, in the age of software, we have come to think of them as operating "like computers." But the changes, neuroscience tells us, go much deeper than metaphor. Thanks to our brain's plasticity, the adaptation occurs also at a biological level.

Clocks taught us to compartmentalize time. But what have computers done to us? To be sure, we live in an era when "like computers" no longer means what it once did. In the days of early cybernetic theory, Norbert Wiener and his peers saw parallels between the computer and the human mind: both shared feedback systems and an urge to work upstream against the prevailing forces of entropy; both strove to create function and meaning out of chaos. A "computer" meant something rational, analytic, and methodical.
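
That older sense of "feedback system" fits in a few lines of code. A minimal sketch, with all numbers invented: measure the error, feed a correction back, and hold a set point against random disturbance -- order maintained against entropy, in Wiener's terms:

```python
import random

random.seed(1)
target, temp = 20.0, 12.0     # set point and starting temperature
gain = 0.5                    # how aggressively the loop corrects itself

for step in range(10):
    error = target - temp               # measure the deviation...
    temp += gain * error                # ...feed a correction back in...
    temp += random.uniform(-0.5, 0.5)   # ...against a noisy world
    print(f"step {step}: temp = {temp:.1f}")
```

Note how the loop spends all its effort pulling back toward the target -- the very opposite of the drift described next.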

Now, "like a computer" represents something else entirely: a ceaseless flow of changing, multifaceted, immediately accessible information-particles -- and all the distraction that entails. It's possible that plasticity of our brains allows us to adapt to these factors, but how will that affect the way we think? Or relate to others? Certainly, I'm not advocating a reactionary attitude this new pace of things, or a dismissal of online reading and research, but I do think we should take a step backwards and frankly assess the state of our brains.

Our new brains.

Related (paper) sources:
The Brain That Changes Itself: Stories of Personal Triumph from the Frontiers of Brain Science, by Norman Doidge
The Mind and the Brain: Neuroplasticity and the Power of Mental Force, by Jeffrey M. Schwartz and Sharon Begley
Train Your Mind, Change Your Brain: How a New Science Reveals Our Extraordinary Potential to Transform Ourselves, by Sharon Begley

How do you propose that we "take a step backwards and frankly assess the state of our brains"? Any new experience "reprograms" the brain; that's how we learn (as you noted in discussing Merzenich's work on plasticity). I'm not aware of any evidence to suggest that the internet causes fundamentally different neural changes than other types of media do.

Technocritics like Nicholas Carr have been around for a while. For some amusing old-school TV bashing (starting at ~6:30-7:00), see:

Television: The Ultimate Drug

A fast-paced video magazine show that reflects on the rapid evolution and powerful influence of television on American society. Produced in 1981.

http://diva.sfsu.edu/collections/sfbatv/bundles/187457

[Link via @channelnvideo]

A couple of alternate explanations for young people not visiting links twice:
1. Older people could be revisiting links by accident: they are, on average, more forgetful, and they are less internet-literate, so they may not know that a visited link changes color.
2. Younger people might just keep the page open in a tab if it's interesting. I have no real basis for this beyond the fact that young people tend to be more comfortable with the technology. I currently have 24 tabs open in Firefox, 13 in Chrome, and 6 in Internet Explorer (I use the different browsers for work, personal stuff, and crappy websites that only work in IE). My parents are aware of the tabs functionality but never use it on purpose.

The very genesis of the internet came from the premise that linked information is ultimately more useful. While I agree with this premise, it has certainly led to what you're describing in your article. In fact, I got distracted by some tweets, the link to the Telegraph article, and an IM ping, all in the space of the 1-2 minutes it took to read the full article. Is linked information more useful?

By Laurie Desautels (not verified) on 07 Apr 2010

I joined Team Yacht midway through, but came back and finished the article.

When I consider whether things could be done, I often decide "sure" or "hell yes." Only rarely do I consider whether they should be done. The computer lacks philosophy beyond its own inception.

I fear that the computer age may be spawning an army of poor thought and unlimited resources; basically, I fear interNazis.

Principles are a sticky wicket, but this is a new time in history, fit for new principles.

By Ryan Harackiewicz (not verified) on 07 Apr 2010