Electrodes measure a Tibetan monk's brain activity. Photograph by Cary Wolinsky

Beyond the Brain

By James Shreeve

The ancient Egyptians thought so little of brain matter that they made a practice of scooping it out through the nose of a dead leader and packing the skull with cloth before burial. They believed consciousness resided in the heart, a view shared by Aristotle and by a long line of medieval thinkers. Even when the consensus on the locus of thought moved northward into the head, it was not the brain itself that was believed to be the sine qua non, but the empty spaces within it, called ventricles, where ephemeral spirits swirled about. As late as 1662, philosopher Henry More scoffed that the brain showed "no more capacity for thought than a cake of suet, or a bowl of curds."

Around the same time, French philosopher René Descartes codified the separation of conscious thought from the physical flesh of the brain. Cartesian "dualism" exerted a powerful influence over Western science for centuries, and while dismissed by most neuroscientists today, still feeds the popular belief in mind as a magical, transcendent quality.

A contemporary of Descartes named Thomas Willis—often referred to as the father of neurology—was the first to suggest not only that the brain itself was the locus of the mind, but that different parts of the brain gave rise to specific cognitive functions. Early 19th-century phrenologists pushed this notion in a quaint direction, proposing that personality proclivities could be deduced by feeling the bumps on a person's skull, which were caused by the brain "pushing out" in places where it was particularly well developed. Plaster casts of the heads of executed criminals were examined and compared to a reference head to determine whether any particular protuberances could be reliably associated with criminal behavior.

Though absurdly unscientific even for its time, phrenology was remarkably prescient—up to a point. In the past decade especially, advanced technologies for capturing a snapshot of the brain in action have confirmed that discrete functions occur in specific locations. The neural "address" where you remember a phone number, for instance, is different from the one where you remember a face, and recalling a famous face involves different circuits than remembering your best friend's.

Yet it is increasingly clear that cognitive functions cannot be pinned to spots on the brain like towns on a map. A given mental task may involve a complicated web of circuits, which interact in varying degrees with others throughout the brain—not like the parts in a machine, but like the instruments in a symphony orchestra combining their tenor, volume, and resonance to create a particular musical effect.

Corina's brain: all she is … is here.


Corina Alamillo is lying on her right side in an operating room in the UCLA Medical Center. There is a pillow tucked beneath her cheek and a steel scaffold screwed into her forehead to keep her head perfectly still. A medical assistant in her late 20s, she has dark brown eyes, full eyebrows, and a round, open face.

On the other side of a tent of sterile blue paper, two surgeons are hard at work on a saucer-size portion of Corina's brain, which gleams like mother-of-pearl and pulsates gently to the rhythm of her heartbeat. On the brain's surface a filigree of arteries feeds blood to the region under the surgeons' urgent scrutiny: a part of her left frontal lobe critical to the production of spoken language. Nearby, the dark, dull edge of a tumor threatens like an approaching squall. The surgeons need to remove the tumor without taking away Corina's ability to speak along with it. To do that, they need her conscious and responsive through the early part of the operation. She was anesthetized while they removed a piece of her scalp and skull and folded back the protective membrane underneath. Now they can work directly on her brain, which has no pain receptors.

"Wake up, Sweetie," says another doctor, sitting in a chair under the paper tent with Corina. "Everything is going fine. Can you say something for me?" Corina's lips move as she tries to answer through the clearing fog of anesthesia.

"Hi," she whispers.

The deep red hue of Corina's tumor is plain to see, even to a layperson leaning over the surgeon's shoulder. So is the surrounding tissue of her brain, a three-pound (1.4-kilogram), helmet-shaped bolus of fat and protein, wrinkled like a cleaning sponge and with the consistency of curdled milk.

Corina's brain is the most beautiful object that exists, even more beautiful than Corina herself, for it allows her to perceive beauty, have a self, and know about existence in the first place. But how does mere matter like this make a mind? How does this mound of meat bring into being her comprehension of the doctor's question, and her ability to respond to it? Through what sublime process does electrochemical energy become her hope that the operation will go well, or her fear for her two children if it should not? How does it bring into being her memory of clutching tight to her mother's hand in the hospital room half an hour ago—or 20 years before in a store parking lot? These are hardly new questions. In the past few years, however, powerful new techniques for visualizing the sources of thought, emotion, and behavior are revolutionizing the way we understand the nature of the brain and the mind it creates.

The opening in Corina's skull provides a glimpse into the history of the mind's attempt to understand its physical being. The patch of frontal lobe adjacent to her tumor is called Broca's area, named after the 19th-century French anatomist Paul Broca, one of the first scientists to offer definitive evidence that—while there is no single seat of thought—specific cognitive traits and functions are processed in localized regions of the brain.

Broca defined the area named for him by studying a stroke victim. In 1861 he met a patient who had been given the nickname "Tan," because "tan" was the only syllable the man had been able to utter for the previous 21 years. When Tan died, an autopsy revealed that a portion of his left frontal lobe about the size of a golf ball had been liquefied by a massive stroke years before.

A few years later German neurologist Carl Wernicke identified a second language center farther back, in the brain's left temporal lobe. Patients with strokes or other damage to Wernicke's area are able to talk freely, but they cannot comprehend language, and nothing they say makes any sense.

Until recently, damaged brains were the best source of information about the origins of normal cognitive function. A World War I soldier with a small-bore bullet wound in the back of his head might also, for instance, have a vacancy in his field of vision caused by a corresponding injury in his visual cortex. A stroke victim might see noses, eyes, and mouths, but not be able to put them together into a face, revealing that facial recognition is a discrete mental faculty carried out in the region of cortex destroyed by the stroke. In the 1950s American neurosurgeon Wilder Penfield used an electrode to directly stimulate spots on the brains of hundreds of epilepsy patients while they were awake during operations. Penfield discovered that each part of the body was clearly mapped out in a strip of cortex on the brain's opposite side. A person's right foot, for example, responded to a mild shock delivered to a point in the left motor cortex adjacent to one that would produce a similar response in the patient's right leg. Stimulating other locations on the cortical surface might elicit a specific taste, a vivid childhood memory, or a fragment of a long-forgotten tune.

The two surgeons in the UCLA operating room are now about to apply Penfield's technique to Corina's Broca's area. They're already in the general neighborhood, but before removing her tumor they must find the exact address of Corina's specific language abilities. The fact that she is bilingual requires even greater care than usual: The neural territories governing her English and Spanish may be adjacent, or—more likely, since she learned both languages at an early age—may at least partially overlap. Susan Bookheimer, the neuropsychologist communicating with Corina under the paper tent, shows her a picture on a card from a stack. At the same time, chief surgeon Linda Liau touches Corina's brain with an electrode, delivering a mild shock. Corina feels nothing, but function is momentarily inhibited in that spot.

"What's this, sweetie?" Bookheimer asks. Groggy, Corina stares at the picture.

"Saxophone," she whispers.

"Good!" says Bookheimer, flipping through her stack of cards. The electrode is not touching a point critical to language. Meanwhile Liau moves the electrode a fraction of an inch. "And this one?"

"Unicorn."

"Very good. ¿Y éste?"

"Casa."

"¿Y éste?"

Corina hesitates. "¿Bicicleta?" she says. But it is not a bicycle; it is a pair of antlers. When Corina makes a mistake or struggles to identify a picture of some simple object, the doctors know they have hit upon a critical area, and they label the spot with a square of sterile paper, like a tiny Post-it note.

So far, this is all standard procedure. (Liau, whose own mother died of breast cancer that spread to her brain, has performed some 600 similar operations.) But the mapping of Corina's brain is about to take a turn into the future. There are a dozen people bustling about in the operating room, twice the number needed for a typical brain tumor operation. The extras are here to use optical imaging of intrinsic signals (OIS) during surgery, a technique being developed here at UCLA by Arthur Toga and Andrew Cannestra, one of the surgeons assisting Liau.

A special camera mounted on a boom is swung into position above Corina's frontal lobe. As she continues to name the pictures on the cards or responds to simple questions (What is the color of grass? What is an animal that barks?), the camera records minute changes in the way light is reflected off the surface of her brain. The changes indicate an increase in blood flow, which in turn is an indication of cognitive activity in that exact spot.

When Corina answers "green," or "dog," the precise pattern of neural circuits firing in her Broca's area and surrounding tissue is captured by the camera and sent to a monitor in the corner of the room. From there the image is instantly uploaded to a supercomputer in UCLA's Laboratory of Neuro Imaging, a few floors above. There it joins 50,000 other scans collected from over 10,000 individuals, using an array of imaging technologies. Thus Corina becomes one galaxy in an expanding universe of new information on the human brain.

"Every person's brain is as unique as their face," says Toga, who directs the Laboratory of Neuro Imaging and is observing the operation today from above his surgical mask. "All this stuff is sliding around, and we don't know all the rules. But by studying thousands of people, we may be able to learn more of them, which will tell us how the brain is organized."

Most of the images in UCLA's brain atlas are produced by a groundbreaking new technique called functional magnetic resonance imaging (fMRI). Like OIS, fMRI monitors increases in blood flow as an indirect measurement of cognitive activity. But, while not nearly as precise, fMRI is completely noninvasive and can thus be used to study brain function not just in surgical patients like Corina, but in anyone who can tolerate spending a few minutes in the tubular cavity of an MRI machine. The technique has been used to explore the neural circuitry of people suffering from depression, dyslexia, schizophrenia, and a host of other neurological conditions. Just as important, it has been trained on the brains of hundreds of thousands of subjects while they perform a given task—everything from twitching a finger to recalling a specific face, confronting a moral dilemma, experiencing orgasm, or comparing the tastes of Pepsi and Coke.

What does the new science tell us about how Corina's 28-year-old brain produced Corina's 28-year-old mind? In terms of brain growth, her birth in Santa Paula, a farming community about 50 miles (80.5 kilometers) north of Los Angeles, was a nonevent. In contrast, the previous nine months in her mother's womb were a neurodevelopmental drama of epic proportions.

Four weeks after conception, the embryo that would become Corina was producing half a million neurons every minute. Over the next several weeks these cells migrated to the brain, to specific destinations determined by genetic cues and interactions with neighboring neurons. During the first and second trimesters of her mother's pregnancy the neurons began to reach tentacles out to each other, establishing synapses—points of contact—at a rate of two million a second. Three months before she was born, Corina possessed more brain cells than she ever would again: an overwrought jungle of connections. There were far more than she needed as a fetus in the cognitively unchallenging womb—far more, even, than she would need as an adult.

Then, just weeks away from birth, the trend reversed. Groups of neurons competed with each other to recruit other neurons into expanding circuits with specific functions. Those that lost died off in a pruning process scientists call "neural Darwinism."

The circuits that survived were already partly tuned to the world beyond. At birth, she was already predisposed to favor the sound of her mother's voice over that of strangers, to respond to the cadence of nursery rhymes she might have overheard in the womb, and perhaps to recognize the tastes of her mother's Mexican cuisine, which she had sampled generously in the amniotic fluid. The last of her senses to develop fully was vision. Even so, she clearly recognized her mother's face at just two days old.

For the next 18 months, Corina was a learning machine. While older brains need some sort of context for learning—a reason, such as a reward, to pay attention to one stimulus over another—baby brains soak up everything coming through their senses.

"They may look like they're just sitting there staring at things," says Mark Johnson of the Centre for Brain and Cognitive Development at Birkbeck, University of London. "But right from the start, babies are born to seek information." As Corina experienced her new world, neural circuits that received repeated stimulation developed stronger synaptic connections, while those that lay dormant atrophied. At birth, for instance, she was able to hear every sound of every language on Earth. As the syllables of Spanish (and later English) filled her ears, the language areas of her brain became more sensitive to just those sounds, while losing their responsiveness to the sounds of, say, Arabic or Swahili.

If there is one part of the brain where the "self" part of Corina's mind began, it would be in the prefrontal cortex—a region just behind her forehead that extends to about her ears. By the age of two or so, circuits here have started to develop. Before the prefrontal cortex comes on line, a child with a smudge on her cheek will try to wipe the spot off her reflection in a mirror, rather than understand that the image in the mirror is herself, and wipe her own cheek.

But as scientists are learning about all higher cognitive functions, they're discovering that a sense of self is not a discrete part of the mind that resides in a particular location, like the carburetor in a car, or that matures all at once, like a flower blooming. It may involve various regions and circuits in the brain, depending on what specific sense one is talking about, and the circuits may develop at different times.

So while Corina may have recognized herself in a mirror before she was three years old, it might have been another year before she understood that the self she saw in the mirror persists intact through time. In studies conducted by Daniel Povinelli and his colleagues at the University of Louisiana at Lafayette, young children were videotaped playing a game, during which an experimenter secretly put a large sticker in their hair. When shown the videotape a few minutes later, most children over the age of three reached up to their own hair to remove the sticker, demonstrating that they understood the self in the video was the same as the one in the present moment. Younger children did not make the connection.

If Corina had a sticker caught in her hair when she was three, she doesn't remember it. Her first memory is of the thrill of going to the store with her mother to pick out a special dress, pink and lacy. She was four years old. She does not recall anything earlier because her hippocampus, part of the limbic system deep in the brain that stores long-term memories, had not yet matured.

That doesn't mean earlier memories don't exist in Corina's mind. Because her father left when she was just two, she can't consciously remember how he got drunk sometimes and abused her mother. But the emotions associated with the memory might be stored in her amygdala, another structure in the brain's limbic system that may be functional as early as birth. While highly emotional memories etched in the amygdala may not be accessible to the conscious mind, they might still influence the way we act and feel beyond our awareness.

Different areas of the brain develop in various ways at different rates into early adulthood. Certainly the pruning and shaping of Corina's brain during her early months as a learning machine were critical. But according to recent imaging studies of children conducted over a period of years at UCLA and the National Institute of Mental Health in Bethesda, Maryland, a second growth spurt in gray matter occurs just before puberty.

If Corina was a typical girl, her cortex was thickest at the age of 11. (Boys peak about a year and a half later.) This wave of growth was followed by another thinning of gray matter that lasted throughout her teen years, and indeed has only recently been completed. The first areas of her brain to finish the process were those involved in basic functions, such as sensory processing and movement, at the extreme front and back of the brain. Next came regions governing spatial orientation and language in the parietal lobes on the sides of the brain.

The last area of the brain to reach maturity is the prefrontal cortex, where the so-called executive brain resides—where we make social judgments, weigh alternatives, plan for the future, and hold our behavior in check.

"The executive brain doesn't hit adult levels until the age of 25," says Jay Giedd of the National Institute of Mental Health, one of the lead scientists on the neuroimaging studies. "At puberty, you have adult passions, sex drive, energy, and emotion, but the reining in doesn't happen until much later." It is no wonder, perhaps, that teenagers seem to lack good judgment or the ability to restrain impulses. "We can vote at 18," says Giedd, "and drive a car. But you can't rent a car until you're 25. In terms of brain anatomy, the only ones who have it right are the car-rental people."

Gray-matter maturity, however, does not signal the end of mental change. Even now, Corina's brain is still very much a work in progress. If there is a single theme that has dominated the past decade of neurological research, it is the growing appreciation of the brain's plasticity—its ability to reshape and reorganize itself through adulthood. Blind people who read Braille show a remarkable increase in the size of the region of their somatosensory cortex—a region on the side of the brain that processes the sense of touch—devoted to their right index finger. Violin players show an analogous spread of the somatosensory region associated with the fingers of their left hand, which move about the neck of the instrument playing notes, as opposed to those of their right hand, which merely holds the bow.

"Ten years ago most neuroscientists saw the brain as a kind of computer, developing fixed functions early," says Michael Merzenich of the University of California, San Francisco, a pioneer in understanding brain plasticity. "What we now appreciate is that the brain is continually revising itself throughout life."

While the brain's plasticity begins to degrade in later life, it may never be too late to teach an old brain new tricks. According to preliminary studies in Merzenich's lab, even the memories of pre-senile individuals in their 60s and 70s can, with focused training, be dramatically rejuvenated. Plasticity does have limits, however. If certain critical areas of the cortex—Broca's area, for instance—are destroyed by stroke or tumor, the patient will probably never recover the function once performed by the now silent circuits.

Which brings us back to Corina today. Her tumor has already demolished an egg-size portion of her left frontal lobe containing circuits important to personality, planning, and drive. Fortunately, the brain has some built-in redundancy in these higher functions, and her family has not noticed any change in her personality: The corresponding region of her right frontal lobe is probably shouldering much of the extra load.

But the tumor must now be removed as quickly as possible. The scientists have finished the optical imaging of intrinsic signals, along with another experimental scan using infrared light. The camera's boom has been rolled back.

The operating room empties of all but the personnel critical to the operation itself. Corina is very tired, but she must stay awake just a little longer. Using an electronic scalpel, Dr. Liau carefully begins to cut into the brain flesh at the border between the tumor and Corina's Broca's area. Under the tent, Dr. Bookheimer flashes more cards in front of her face.

"What's this? A door? Good!"

"¿Y éste?…"

As the scalpel cuts deeper, Liau's eyes are tense above her surgical mask. She must excise every scrap of cancerous tissue. Yet one slip, and the damage cannot be undone. Once the cut along the border is finished, Corina's consciousness is no longer needed, and she can rest.

"How's she doing?" asks Liau.

"Perfect," says Bookheimer. "No problems at all."

"Good," says Liau. "Let's put her back to sleep." An anesthesiologist makes the required adjustment to the chemical mix trickling through Corina's IV. I walk around to where I can see her face.

"Corina," I say, as her eyes begin to close, "you have a beautiful brain." She smiles faintly.

"Thank you," she says.

BIGGER BRAIN

Every day, Glen McNeill spends six or seven hours buzzing about the streets of London on his motorbike with a map clipped to the handlebars. McNeill, 28, is a "knowledge boy," engaged in the years-long memory training required to earn his green badge and become a licensed London taxi driver, like his father.

If McNeill fulfills his dream, his brain may be the bigger for it, at least in one part. The hippocampus, a seahorse-shaped structure that is part of the brain's limbic system, is critical to many functions of memory and learning, including processing spatial relationships in the environment. An MRI study published in 2000 by scientists at University College, London, showed that in London taxi drivers the rear portion of the hippocampus was enlarged compared with that of control subjects, confounding the long-held notion that the adult human brain cannot grow. But the bonus in brain tissue may not have come free of charge. On average, the front portion of the hippocampus was smaller than normal in the taxi drivers, suggesting that the effort to build an increasingly detailed mental map of the city had recruited neighboring regions to the cause.

If the hippocampus can grow in human adults, what about other parts of the brain? According to a recent study in Germany, learning how to juggle for three months resulted in an increase in the amount of gray matter in two areas involved in visual and motor activity. When the newly trained jugglers stopped practicing, however, these regions shrank back. Furthermore, neither the driver study nor the juggler study could discern whether the growth in brain volume was due to the reorganization of existing circuits, an increased number of neural connections, or, most intriguingly, the birth of actual new brain cells—an idea thought preposterous until recently. In 1998 Fred H. Gage of the Salk Institute in La Jolla, California, showed that new cells can indeed grow in the adult human hippocampus. Gage believes that stem cells, capable of developing into functioning new neurons, may exist elsewhere in the brain. Better understanding of such nerve regeneration could provide hope for the treatment of Alzheimer's disease, Parkinson's disease, and a host of other degenerative brain disorders.

Meanwhile, Glen McNeill has more work to do with his hippocampus. He has to pass three sets of examinations testing his knowledge of London streets—and then prove familiarity with the surrounding towns.

"The suburbs is the last 'urdle," he says in his thick Cockney accent. "After that you get your lit'l green badge."

IN YOUR FACE

Forty years ago, psychologist Paul Ekman of the University of California, San Francisco, showed photographs of Americans expressing various emotions to the isolated Fore people in New Guinea. Though most of the Fore had never been exposed to Western faces, they readily recognized expressions of anger, happiness, sadness, disgust, and fear and surprise (which are difficult to differentiate). When Ekman conducted the experiment in reverse, showing Fore faces to Westerners, the emotions were again unmistakable. Ekman's now classic study gave powerful support to the notion that the facial expressions of basic emotions are universal, an idea first put forth by Charles Darwin.

According to Ekman, these six emotions (plus contempt) are themselves universal, evolved to prepare us to deal quickly with circumstances we believe will affect our welfare. Some emotional triggers are universal as well. A sudden invasion of your field of vision triggers fear, for instance. But most emotional triggers are learned. The smell of newly mowed hay will conjure up different emotions in someone who spent idyllic childhood summers in the country and someone who was forced to work long hours on a farm. Once such an emotional association is made, it is difficult, if not impossible, to unmake it.

"Emotion is the least plastic part of the brain," says Ekman. But we can learn to manage our emotions better. For instance, the shorter the time between the onset of an emotion and when we become consciously aware of it—what Ekman calls the refractory period—the more likely we are to double-check to see if the emotion is appropriate to the situation. One way to shorten the refractory period is to be aware of what triggers our various emotions.

AFRAID OF WHAT?

It's a jungle out there, or at least it used to be. The prehistoric environment that shaped our brains sizzled with snakes, growling beasts, swooping birds, and other natural perils. Individuals who retreated spontaneously from such threats survived, while those who cogitated—maybe it's a friendly snake, but why is it coiling up like that?—did not live long enough to pass on their genes.

Natural selection may thus long ago have hard-wired the primate brain with a fear response to such dangers. On the other hand, experiments have shown that laboratory-raised monkeys that have never seen a snake in a natural setting show no more alarm at the sight of one than a human baby blithely nestled in the coils of an 11-foot (3.4-meter) python. So is fear of snakes in our nature or a product of our nurture?

In a series of studies at the University of Wisconsin-Madison in the 1980s, researchers tested the question by comparing lab-raised monkeys with monkeys born in the wild. Lab-raised monkeys with no previous fear of snakes began to show fear after watching wild-born monkeys, both live and on videotape, react fearfully to snakes. But when the videos were manipulated so that wild-born monkeys appeared to be afraid of flowers, lab-raised monkeys watching them didn't take the cue. It seems likely that there is indeed, etched into the primate brain, a predisposition to dread natural phenomena that can hurt us, but no predisposition to learn to fear something that will not. The predisposition requires social experience to be activated, however. Just as the lab-raised monkeys learned to fear snakes from other monkeys, the baby stands a good chance of acquiring a fear of snakes after watching other humans react to them.

In more recent research, scientists have traced the neural pathways of fear to a small, almond-shaped structure in the brain's emotional system called the amygdala. It appears to translate the perception of danger into action in two ways.

Our cortex is constantly bombarded with information from our eyes, ears, and other sensory organs. One route for this pipeline sends a torrent of detailed, refined information from high-level cortical regions to the amygdala, which turbocharges the processing of fearful or other emotional stimuli at the expense of less urgent information. If you're driving and listening to the news on the radio when the driver in front of you taps the brakes, your attention will quickly shift from the news report, and a signal will move your foot toward the brake pedal. A second, even faster pathway sends crude information from the senses through subcortical regions directly to the amygdala, bypassing the cortex altogether. If the car in front makes a full-fledged panic stop, this more primitive pathway signals you to slam on the brakes—even before the information reaches the cortical regions that make you conscious of your actions. Those few milliseconds' head start may be enough to mean the difference between life and death.

AUTISTIC GENIUS

Fifteen-year-old Tito Mukhopadhyay squats beside his mother on his bed, rocking, his hands flapping wildly. The gestures are typical of a severely autistic individual, as are his avoidance of eye contact and his unintelligible grunts and moans. But Tito is far from inarticulate. A visitor asks him why he is moving about so much.

"I know it looks different," he answers, using a pencil and paper to scrawl his reply. "But I got into this habit to find and feel my own scattered self."

Initially diagnosed as mentally retarded, he was dragged from one doctor to another in his native India by a mother desperate to find the cause of her son's abnormal behavior and language impairment. Through relentless, sometimes unorthodox, training she broke through the barrier of silence, teaching Tito to add and subtract, to enjoy literature, and eventually to communicate by writing, at first by tying a pencil to his hand. Because of her efforts Tito, rare among low-functioning autistics, can describe with powerful clarity what the condition feels like from the inside.

Tito's vivid autobiographical reflections reveal a sensibility and intelligence greater than his years. In Beyond the Silence, written between the ages of eight and eleven and published in England in 2000 (and in the U.S. as The Mind Tree in 2003), he chronicles his early attempts to cope with the cacophony of disconnected information arriving through his senses and his profound struggle to control his own body and behavior. He wrote of two distinct selves, a thinking self "which was filled with learnings and feelings," and an acting self that was "weird and full of actions," over which he had no more control than if it belonged to another person altogether. "The two selves stayed in their own selves, isolated from each other."

"Tito's remarkable achievements haven't overcome his autism," says Michael Merzenich, a neuroscientist at the University of California, San Francisco, who has studied Tito. "There is still chaos occurring in his brain." Where does that chaos come from? There is no doubt that genes play a role in at least some forms of the disorder. Also, infants who later develop autism often undergo a period of abnormal rapid brain growth in the first year of life, which may be related to an overproduction of cells that carry nerve impulses in the brain's white matter.

Researchers Chris and Uta Frith at University College in London have pinpointed a suite of structures—one above the eyes, another near the ear, and a third high up on the sides of the brain—that allow us to infer what others are thinking and relate to people accordingly. These regions appear to be less active in individuals with autism and Asperger's syndrome, a milder form of the disorder. But other parts of the brain may also be involved, including the amygdala and the hippocampus. It is doubtful that a disorder with such a broad spectrum of symptoms and pathologies has any single cause.

"Men and women are puzzled by what I do," writes Tito. "Doctors use different terminologies to describe me. I just wonder."

PERFECT PITCH

Music is native to the human mind. There is not a culture on Earth that does not have it, and our brains are wired to apprehend and be moved by its magic. By contrast, absolute or perfect pitch—the ability to identify a specific musical tone without hearing it in relation to another one—is an exceedingly rare gift, found in as few as one in 10,000 individuals in Western societies.

People who possess the trait can identify the sound of an E flat or G sharp as effortlessly as anyone else can see that a fire engine is red or the sky is blue. Not surprisingly, it is more common among musicians. Mozart had it, and so did Beethoven. But what accounts for this peculiar faculty?

Some research suggests the phenomenon may not be so unusual after all. Investigators at the University of California, San Diego, found that many people who speak tonal languages, such as Mandarin Chinese and Vietnamese, possess a form of absolute pitch, speaking words and repeating them days later at the same pitch. Another study found that 7 percent of non-Asian freshmen at the Eastman School of Music in Rochester, New York, were endowed with absolute pitch, as opposed to fully 63 percent of their Asian counterparts at the Central Conservatory of Music in Beijing.

But the relationship between absolute pitch and language cannot be the whole story. Not all tonal language speakers have absolute pitch, and not all absolute pitch possessors speak tonal languages. In Japan the trait is relatively common compared with the West, and Japanese is not a tonal language. Perhaps a genetic predisposition for absolute pitch is more common among Asian populations. But a more likely explanation for its prevalence in Japan may be the value the culture places on early music training, exemplified by the young violinists who undergo Suzuki Method training.

BABY KNOWS

What goes on inside a baby's head? Infants cannot communicate their thoughts directly, of course, nor are they likely to lie still in the earsplitting confinement of an MRI machine long enough for researchers to map activity in their brains. At Babylab, part of the Centre for Brain and Cognitive Development at Birkbeck, University of London, researcher Jordy Kaufman takes a direct route to reading a baby's mind.

Kaufman outfits six-month-olds with helmets of electrodes to record electrical activity in their brains while they watch a video cartoon of a train disappearing into a tunnel.

Traditional behavioral studies have implied that infants lack a sense of object permanence: When an object they've been looking at is suddenly hidden from view, they behave as if the object no longer exists. But Babylab's high-tech hairnet records a burst of activity in babies' right temporal lobes as they watch the train disappear, similar to activity measured in adults who are asked to keep an unseen object in mind. And when the tunnel is lifted to reveal no train inside—a violation of object permanence—the electrical activity spikes upward, suggesting that the babies are trying to maintain a mental representation of the train in the face of visual evidence to the contrary.

Does this mean that object permanence is prewired in the brain? Perhaps. But Kaufman prefers to see the development of mind as a fecund interaction between nature and nurture, as an infant's innate predispositions guide it to seek out experience that in turn nourishes and tunes specialized neural networks.

A predisposition to look at faces, for instance, seems to be innate, involving primitive brain regions. But Babylab's Hanife Halit has demonstrated that regions in the higher-level temporal cortex become more specialized in facial recognition through the first year of life, at first responding to upright and upside-down monkey and human faces, and finally just to upright human faces. Normal babies also prefer faces that are looking back at them, while autistic children do not. Halit speculates that without an initial predisposition for engaging faces, a baby's brain might fail to be enriched by the social interactions that guide normal development—leading to the wholesale indifference to social stimuli that is one of the hallmarks of autism.

ALTERED MIND

As recently as the late 1980s the human brain was considered to be a sort of biological computer that, as one scientist put it, "secretes thoughts the way kidneys secrete urine." We now know that the brain is much more malleable and fluidly organized than the analogy to computer hardware suggests, and that it changes with every perception and every action.

Over the past decade compelling evidence for neuroplasticity has come from studies of the blind by Alvaro Pascual-Leone, now a professor of neurology at Harvard University and Boston's Beth Israel hospital.

In the early 1990s Pascual-Leone and his colleagues at the National Institutes of Health showed that as blind adults learned to read Braille, the region of the somatosensory (touch-sensitive) cortex responding to input from the reading finger greatly enlarged. In 1996 the researchers made a more startling discovery: Input from the sensitized finger was lighting up not only the somatosensory cortex on the side of the brain, but parts of the visual cortex near the back of the brain as well.

Could it be that in adult blind people, new nerve connections were reaching out across the brain to occupy neural real estate left vacant by the lack of input from their sightless eyes? Pascual-Leone tested that notion by blindfolding sighted individuals for five days. After as few as two days, fMRI scans showed bursts of activity in their visual cortex when they performed tasks with their fingers, or even when they listened to tones or words. This was far too short a time for any nerve connections to grow from the touch and hearing regions of the cortex to the area processing sight. And after just a few hours with the blindfold removed, the visual cortex again responded only to input from the eyes.

So what accounts for this sudden ability of the brain to "see" with input from the fingers and ears? Pascual-Leone suggests the connections from these senses to the visual cortex may already be there but remain unused so long as the eyes are doing their job. When the eyes shut down, the next best way of getting the same information springs into action.

"It's provocative, but we're arguing that the brain may not be organized into sensory modalities at all," he says. What neuroscientists have been calling the visual cortex for the past century might not be devoted exclusively to the eyes, but should more accurately be defined as the area of the brain best able to discriminate spatial relationships—and it will grab whatever input is available to perform that task.

EXTREME EXPRESSION

On any given morning Alice Flaherty, a neurologist at Massachusetts General Hospital in Boston, is writing at her computer by 4:30 a.m. During the day she may also write on scrap paper, toilet paper, her surgical scrubs, and if nothing else is handy, on her own skin. Some of her best ideas come when she's in the shower, so she keeps a waxed pencil there and writes on the walls. She also has a pen attached to her bicycle, just in case the muse hits her in mid-pedal stroke.

These days Flaherty's writing obsession is as much pleasure as compulsion. But it grew out of the painful loss she experienced in 1998 at the deaths of her prematurely born twin boys. Already prolific, Flaherty developed a full-blown case of hypergraphia, a manic disorder characterized by an irrepressible urge to write—and write, and write. Her writing increased 20-fold. The nagging need to write something down would wake her up in the middle of the night to scribble in the dark, surrounding herself with a litter of scrawled notes. A second episode followed the birth of twin daughters, now healthy five-year-olds.

Flaherty's professional and personal interest in hypergraphia eventually led her to take the perhaps not surprising step of writing a book about it. While she speculates that her own worst episodes were triggered by the hormonal pandemonium accompanying birth, the condition is more commonly a symptom of manic depression, mania, and other mood disorders. It is most often associated with temporal lobe epilepsy, a disorder that may also lead to hyper-religious feelings and a sense that even the most trivial events are filled with heightened meaning and cosmic importance. The hypergraphic patient's compulsion to write all the time is not, alas, accompanied by any increase in talent. The diatribes of the Unabomber, Theodore Kaczynski (or the verbal pablum of some Internet blogs), are typical output.

Nevertheless, the key role of the temporal lobe in hypergraphia may offer a window into the neural underpinnings of literary creativity, and creativity in general. According to popular wisdom, the right hemisphere of the brain is more creative, while the left brain is more logical and objective. While there is some basis for this belief, it is certainly an oversimplification. As Flaherty discusses in her book The Midnight Disease: The Drive to Write, Writer's Block, and the Creative Brain, more important to creativity may be the connections through the limbic system—the more primitive, emotional part of the brain—between the temporal lobes on the sides of the brain and the frontal lobes behind the forehead. While the frontal lobes may be important for providing the judgment and flexibility of thought that underlies talent, structures in the temporal lobes and limbic system supply drive and motivation, which Flaherty believes are more important parts of the creative equation than talent itself. This applies not only to writing, but to all kinds of creative activity.

"To be a truly creative chess player," she says, "probably just loving the game and playing it ten hours a day may be more important than having some special pattern recognition ability in your brain."

SPIRITUAL STATE

For 2,500 years Buddhists have employed strict training techniques to guide their mental state away from destructive emotions and toward a more compassionate, happier frame of being. Spurred by the cascade of new evidence for the brain's plasticity, Western neuroscientists have taken a keen interest. Can meditation literally change the mind?

For the past several years Richard Davidson and his colleagues at the University of Wisconsin-Madison have been studying brain activity in Tibetan monks, both in meditative and non-meditative states. Davidson's group had shown earlier that people who are inclined to fall prey to negative emotions displayed a pattern of persistent activity in regions of their right prefrontal cortex. In those with more positive temperaments the activity occurred in the left prefrontal cortex instead. When Davidson ran the experiment on a senior Tibetan lama skilled in meditation, the lama's baseline of activity proved to be much farther to the left of anyone previously tested. Judging from this one study, at least, he was quantifiably the happiest man in the world.

Davidson recently tested the prefrontal activity in some volunteers from a high-tech company in Wisconsin. One group of volunteers then received eight weeks of training in meditation, while a control group did not. All the participants also received flu shots.

By the end of the study, those who had meditated showed a pronounced shift in brain activity toward the left, "happier" prefrontal cortex. The meditators also showed a healthier immune response to the flu shot, suggesting that the training affected the body's health as well as the mind's.

"You don't have to become a Buddhist," says the Dalai Lama himself, who is closely folowing the work of Western cognitive scientists like Davidson. "Everybody has the potential to lead a peaceful, meaningful life."
