If you speak and teach corporate seminars, as I do, then you know what it feels like to look out at a sea of BlackBerrys. And, in many companies, at open-lidded notebooks too.
At first, I took this as a personal insult. That turned out not to be very useful.
Nor was it useful to assume it was simply a comment on the low quality of my teaching and a challenge to improve. Not that I couldn't have improved, but I noticed it wasn't just me being crack-berried; it was everyone.
Now I simply note at session outset that an inability to leave clients and co-workers to fend for themselves until the next break amounts to a neurotic mixture of insecurity and arrogance.
And then I let it go. Well, mostly. (And yes, I'm not without sin either when I'm in an audience).
Little did I know that there was some truth behind my accusation of risk to mental health. The Wall Street Journal's Melinda Beck reports:
"Many cases of Alzheimer's do start out as 'senior moments,'" says P. Murali Doraiswamy, chief of Biological Psychiatry at Duke University Medical School and co-author of "The Alzheimer's Action Plan," a new book for people who are worried...
Names and dates that take time to retrieve "generally aren't well-archived," says Dr. Doraiswamy. You may not have paid much attention to them in the first place - especially if you were multitasking. "Your brain has an inexhaustible amount of storage, but you can't have too many programs running at the same time, or it's hard to attend to them," says Gayatri Devi, a psychiatrist and neurologist who runs the New York Memory Center. That may explain the in-one-ear-and-out-the-other phenomenon that plagues some people.
Paying attention is critical to laying down memories, which scientists now think are distributed all around the brain.
The computer metaphor strikes me as appropriate. If you think of our brains as analogous to computer memory, there's a trade-off between memory and processing speed. Multi-tasking is a choice to allocate more of our brainpower to processing, and hence less to the storage of memory. That choice inevitably lets some data slide by. And since we can't yet add more memory chips to our heads, it's a meaningful choice.
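To push the metaphor one step further, here's a toy sketch (my illustration, not anything from the article): imagine a fixed attention budget split evenly across however many tasks are running. The more concurrent tasks, the weaker the encoding of each incoming item, and weakly encoded items never make it into storage at all. The budget, threshold, and function names are all invented for the illustration.

```python
# Toy model of the attention/memory trade-off described above.
# All numbers here are arbitrary, chosen only to make the point.

ATTENTION_BUDGET = 1.0   # total "brainpower" available at any moment
STORAGE_THRESHOLD = 0.3  # minimum encoding strength needed to store a memory

def remembered(items, concurrent_tasks):
    """Return the items that get stored, given a number of concurrent tasks."""
    # The budget is split evenly; each extra task dilutes the attention
    # paid to every incoming item.
    strength = ATTENTION_BUDGET / concurrent_tasks
    return [item for item in items if strength >= STORAGE_THRESHOLD]

names = ["Alice", "Bob", "Carol"]
print(remembered(names, 1))  # focused: ['Alice', 'Bob', 'Carol']
print(remembered(names, 5))  # multitasking: [] -- nothing clears the threshold
```

The in-one-ear-and-out-the-other phenomenon, in this cartoon version, is simply encoding strength falling below the storage threshold.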
I recall recently reading (in the New Yorker? Ah, I've been multi-tasking too much) a story about the old Hindu spiritual leaders who would memorize days of stories from the holy literature, to be told at large gatherings. With the advent of literacy and tape recordings, the ability to memorize such large amounts of data disappeared from society.
The same was true for rural folk music in the US, which Alan Lomax presciently understood in the '30s and '40s when he did those great field recordings for the Library of Congress. It was true when kids were allowed calculators in school and forgot how to do long division. And it's true now when I recall my loved ones' speed-dial numbers, but not the underlying phone numbers they represent.
The article goes on to say:
"The richer you can make the experience, the more memorable it is...It's just as important to forget extraneous things and minimize mental clutter," says Dr. Devi. You can't dump those 1960s TV jingles from long-term memory, but you can free up your short-term memory by using calendars, lists and personal digital assistants. "Put the burden on gadgets," says Dr. Doraiswamy.
Again, this strikes me as good common sense. You are what you think about.
In The Monk and the Philosopher (thanks, Pierre), a French philosopher and his son, a Buddhist monk, touch on this issue. The father casts doubt on the ability of the human mind to focus for long periods of time, as the monks claim to do in meditation.
The monk-son retorts that Western minds are simply universalizing their own disinclination to pay attention: that in fact, paying attention is simply a habit, which we can choose to cultivate, or not.