When Did the Human Mind Evolve to What It is Today?
Archaeologists are finding signs of surprisingly sophisticated behavior in the ancient fossil record
- By Erin Wayman
- Smithsonian.com, June 26, 2012
Working memory allows the brain to retrieve, process and hold in mind several chunks of information all at one time to complete a task. A particularly sophisticated kind of working memory “involves the ability to hold something in attention while you’re being distracted,” Wynn says. In some ways, it’s kind of like multitasking. And it’s needed in problem solving, strategizing, innovating and planning. In chess, for example, the brain has to keep track of the pieces on the board, anticipate the opponent’s next several steps and prepare (and remember) countermoves for each possible outcome.
Finding evidence of this kind of cognition is challenging because humans don’t use advanced working memory all that much. “It requires a lot of effort,” Wynn says. “If we don’t have to use it, we don’t.” Instead, during routine tasks, the brain is sort of on autopilot, like when you drive your car to work. You’re not really thinking about it. Based on frequency alone, behaviors requiring working memory are less likely to be preserved than common activities that don’t need it, such as making simple stone choppers and handaxes.
Yet there are artifacts that do seem to relate to advanced working memory. Tools composed of separate pieces, such as hafted spears and bows and arrows, are examples that date to more than 70,000 years ago. But the most convincing example may be animal traps, Wynn says. At South Africa’s Sibudu cave, Lyn Wadley, an archaeologist at the University of the Witwatersrand, has found clues that humans were hunting large numbers of small, and sometimes dangerous, forest animals, including bush pigs and diminutive antelopes called blue duikers. The only plausible way to capture such critters was with snares and traps.
With a trap, you have to think up a device that can snag and hold an animal and then return later to see whether it worked. “That’s the kind of thing working memory does for us,” Wynn says. “It allows us to work out those kinds of problems by holding the necessary information in mind.”
It may be too simple to say that symbolic thinking, language or working memory is the single thing that defines modern cognition, Marean says. And there still could be important components that haven’t yet been identified. What’s needed now, Wynn adds, is more experimental archaeology. He suggests bringing people into a psych lab to evaluate what cognitive processes are engaged when participants make and use the tools and technology of early humans.
Another area that needs more investigation is what happened after modern cognition evolved. The pattern in the archaeological record shows a gradual accumulation of new and more sophisticated behaviors, Brooks says. Making complex tools, moving into new environments, engaging in long distance trade and wearing personal adornments didn’t all show up at once at the dawn of modern thinking.
The appearance of a slow and steady buildup may just be a consequence of the quirks of preservation. Organic materials like wood often decompose without a trace, so some signs of behavior may be too ephemeral to find. It’s also hard to spot new behaviors until they become widely adopted, so archaeologists are unlikely to ever locate the earliest instances of novel ways of living.
Complex lifestyles might not have been needed early on in the history of Homo sapiens, even if humans were capable of sophisticated thinking. Sally McBrearty, an archaeologist at the University of Connecticut in Storrs, points out in the 2007 book Rethinking the Human Revolution that certain developments might have been spurred by the need to find additional resources as populations expanded. Hunting and gathering new types of food, such as blue duikers, required new technologies.
Some see a slow progression in the accumulation of knowledge, while others see modern behavior evolving in fits and starts. Archaeologist Francesco d’Errico of the University of Bordeaux in France suggests certain advances show up early in the archaeological record only to disappear for tens of thousands of years before these behaviors, for whatever reason, get permanently incorporated into the human repertoire about 40,000 years ago. “It’s probably due to climatic changes, environmental variability and population size,” d’Errico says.
He notes that several tool technologies and aspects of symbolic expression, such as pigments and engraved artifacts, seem to disappear after 70,000 years ago. The timing coincides with a global cold spell that made Africa drier. Populations probably dwindled and fragmented in response to the climate change. Innovations might have been lost in a prehistoric version of the Dark Ages. And various groups probably reacted in different ways depending on cultural variation, d’Errico says. “Some cultures for example are more open to innovation.”
Perhaps the best way to settle whether the buildup of modern behavior was steady or punctuated is to find more archaeological sites to fill in the gaps. There are only a handful of sites, for example, that cover the beginning of human history. “We need those [sites] that date between 125,000 and 250,000 years ago,” Marean says. “That’s really the sweet spot.”
Erin Wayman writes Smithsonian.com's Hominid Hunting blog.