Can Our Brains Handle the Information Age?

Bret S. Stetka, MD; Daniel Levitin, PhD

September 24, 2014

Editor's Note: In his new book, The Organized Mind, best-selling author and neuroscientist Daniel Levitin, PhD, discusses our brain's ability—or lack thereof—to process the dizzying flow of information brought upon us by the digital age. Dr Levitin also suggests numerous ways of organizing mass information to make it more manageable. Medscape recently spoke with Dr Levitin about the neuroscience of information processing, as well as approaches potentially useful to overworked clinicians.

The Fear of Information

Medscape: Your new book discusses how, throughout history, humans have been suspicious of increased access to information, from the printing press back to the first Sumerian writings. But I think most would agree that these were positive advancements. Do you think the current digital-age wariness expressed by many is more of the same, and that today's rapid technological progression will end up being a positive development for humanity? Or has the volume of data out there simply grown too big for the human brain to handle?

Dr Levitin: I am of two minds about this. On one hand, there is this "same as it ever was" kind of complaint cycle. In ancient Greece, Socrates complained about the invention of writing—that it was going to weaken men's minds because they would no longer engage in thoughtful conversation. You couldn't interrogate the person who was telling you something, meaning that lies could be promulgated more easily and passed from generation to generation.

And then with the invention of the printing press, people decried the plethora of worthless and useless things that would be written, warning that it would soften people's minds because they would be reading a bunch of garbage instead of good information.

And, of course, television was supposed to rot our minds. But that was when we had Gilligan's Island rather than Breaking Bad and The Sopranos.

So I think there is this cycle of being resistant to new technologies, and there is a settling-in period required where we learn how to use them properly. Among the first things printed on the printing press was pornography, and one of the first uses of the computer was, again, pornography.

But there are loftier uses these developments can be put to. So that's one side of the coin. On the other side of the coin, it does seem as though the amount of available information has approached some kind of maximum in terms of the human brain's capacity to deal with it.

If we look back at our evolutionary history, the amount of information that existed in the world just a few thousand years ago was really just a small percentage of what exists now. By some estimates, the amount of scientific and medical information produced in the last 25 years is equal to all of the information in all of human history up to that point.

The human brain can really only attend to a few things at once, so I think we are reaching a point where we have to figure out how to filter information so that we can use it more intelligently and not be distracted by what is irrelevant. Studies show that people who are given more information in certain situations tend to make poorer decisions because they become distracted or overwhelmed by the irrelevant information.
