For the vast majority of humanity’s existence, and that of humanity’s close relatives and predecessors in the human lineage, we’ve been foragers. For the most part, our ancestors wandered from place to place gathering edible plants, collecting shellfish from rivers and coastlines, hunting wild game, and scavenging the kills of more accomplished predators.
That started to change a little more than 10,000 years ago, when a few groups of people started intentionally growing their own plant foods and domesticating animals, settling down in permanent villages to better do so. First in the Fertile Crescent of the Near East, a bit later in East Asia, a little after that in New Guinea and the Americas, and perhaps another couple of millennia further on in West Africa, some portion of humanity became farmers. They cleared forests to plant fields, displaced native animal species with their herds of cattle and sheep, and altered their environments in profound ways.
The Neolithic had been born. Slowly but inexorably, civilization became visible in the distance.
This is the story, anyway. There are a few kernels of truth to it: In most parts of the world, urbanism, monumental buildings, elaborate social hierarchies, and complex forms of political organization - what we think of as civilization - did follow the emergence of farming.
The rest of it is wrong. The dichotomy between “wild” and “managed” environments is misleading. Foragers have all sorts of impacts on their environment, both intentional and unintentional. Farming is simply an extreme form of what ecologists call “niche construction,” when an animal modifies its environment to better suit its needs. Beavers build dams. Ants build anthills. Humans, even seemingly simple foragers in the distant past, burned foliage, cut down trees, and selectively culled prey animals. There’s no such thing as a fully wild environment, at least not anywhere people have set foot.
Moreover, the adoption of farming wasn’t like flipping an on-off switch; people didn’t wake up one day deciding to stake their lives on tossing some barley seeds into the ground or a few surly aurochs they’d managed to lure into a pen. In most of the places this transition took place, there were long periods in which intentional food production coexisted with gathering wild resources and hunting wild game. For example, groups might sow some seeds in a plot they’ve cleared from a forest, move to a nice river nearby to catch some fish, move again to hunt deer and gather nuts and berries, and then return to harvest their grain. Are these folks farmers? Not really, but food production is one part of their subsistence strategy.
Finally, the narrative is wrong because there is no archetype of the hunter-gatherer, no primeval state of nature we can hold up to our own modern existence as a mirror. People made their living by foraging for much of humanity’s past. Some still do so today, either in whole or in part. Yet there is no single hunter-gatherer lifestyle, subsistence strategy, social organization, or way of child-rearing. There’s no single hunter-gatherer diet that humans evolved to eat. Nor can we simply look at today’s hunter-gatherers and assume they’re living fossils who provide a clear view of humanity’s distant past. They don’t.
Why? Because there are many, many different ways to do all of those things. Subsistence strategies depend on the environment in which a group of people live. If you live in a desert, you’re probably not going to get a lot of your food by fishing. If you live along a river that’s packed with salmon, by contrast, you’ll probably get a lot of your food that way. The environments in which today’s hunter-gatherers live aren’t the environments in which past foragers lived. Their lifestyles are necessarily different because of that.
In fact, most of the environments in which people have lived over the past several hundred thousand years don’t exist at all. The mammoth steppe, once the most common biome on Earth and filled with herds of bison and mammoth, is gone entirely. The huge expanses of low-lying coastal grassland that existed 10,000 years ago in northern Europe and southeast Asia are now under dozens of feet of water, inundated as the glaciers melted and sea levels rose. The lifestyles of today’s foragers in the Kalahari Desert or the Arctic don’t directly tell you much of anything about shellfish-eating coast-dwellers or the Paleolithic hunters who stalked mammoth across treeless plains in what’s now France. The environments don’t exist, and neither do the subsistence strategies that allowed people to survive in them.
Surviving in different environments requires different subsistence strategies. People choose different sources of food depending on what’s available. They move more or less often, longer or shorter distances, depending on the distribution of food and other resources in their environment. They use different technological toolkits depending on what’s necessary to procure and process their food, and what they can afford to carry with them.
People aren’t idiots. Hunter-gatherers, past and present, are no exception: they want to survive, and they tend to pick effective ways of doing just that.
This is the cornerstone of human behavioral ecology, an approach to understanding human behavior and survival that focuses on the choices people make about how to pursue resources and spend their time. Time spent making a fishing net is time not spent actually fishing; time spent tracking a deer is time not spent looking for berries or gathering acorns. Every activity has a potential return, but also a trade-off in terms of energy expenditure and time.
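One classic way behavioral ecologists formalize these trade-offs is the “diet breadth” model: rank each resource by its post-encounter return rate (energy gained per hour of handling), then add resources to the diet in rank order only as long as pursuing them raises the overall return rate. The sketch below is purely illustrative; the resource names, encounter rates, and calorie figures are invented for the example, not drawn from any real ethnographic data.

```python
# Illustrative sketch of the diet-breadth model from behavioral ecology.
# All numbers here are hypothetical, chosen only to show the logic:
# deer are high-yield but rarely encountered; acorns are everywhere
# but slow to process; salmon sit in between.

def optimal_diet(resources):
    """Greedily add resources in order of post-encounter return rate
    (energy / handling time), stopping once pursuing the next resource
    would lower the diet's overall return rate."""
    ranked = sorted(resources,
                    key=lambda r: r["energy"] / r["handling"],
                    reverse=True)
    diet, energy, time = [], 0.0, 1.0  # normalize to 1 hour of search time
    for r in ranked:
        # Expected return rate if this resource joins the diet:
        # encounters per search-hour * yield, over total time spent.
        rate_if_added = ((energy + r["rate"] * r["energy"]) /
                         (time + r["rate"] * r["handling"]))
        if diet and rate_if_added <= energy / time:
            break  # chasing this item costs more time than it's worth
        diet.append(r["name"])
        energy += r["rate"] * r["energy"]
        time += r["rate"] * r["handling"]
    return diet

resources = [
    # name, encounters per search-hour, kcal per item, handling hours
    {"name": "deer",   "rate": 0.05, "energy": 15000, "handling": 4.0},
    {"name": "salmon", "rate": 0.5,  "energy": 2000,  "handling": 0.5},
    {"name": "acorns", "rate": 3.0,  "energy": 300,   "handling": 1.0},
]
print(optimal_diet(resources))  # → ['salmon', 'deer']
```

With these made-up numbers, acorns drop out of the optimal diet: their handling time is so long relative to their yield that time spent shelling them would be better spent hunting. Change the encounter rates, as a changed environment would, and the optimal diet changes with them, which is exactly why subsistence strategies vary so much from place to place.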
This is the valuable insight we can glean from close observation of today’s hunter-gatherers: not a direct window onto the deep past, but a sense for how environments, subsistence strategies, technological toolkits, and the wild card of inherited culture all interact to drive behavior at both the individual and group levels.
When we systematically examine hunter-gatherers in the ethnographic record of the present and near past, what we see is variability. Some foragers, like groups living in Amazonia today, tend to move every few days; others, like the Indigenous groups of the Pacific Northwest, were almost entirely sedentary. Some foragers get 80 percent of their calories from plant foods; others get 80 percent from fish or hunted game. Some, like certain groups of Inuit, have elaborate technological packages, with multi-part harpoons, sophisticated boats, and ingenious clothing; others, like the Juǀʼhoansi of southern Africa, get by with bows, arrows, and digging sticks. Some are avowedly egalitarian, with no concept of property; others have complex social hierarchies and institutionalized inequality.
If we see that kind of variability in the present and very near past, we should expect to find it in the more distant past as well. In fact, we should anticipate even more variability, because the range of possible environments was greater. So too were the timescales, which allowed for even greater cultural drift. If we see that variability in the material record, and we do, we shouldn’t be surprised if it existed in things that are harder to see as well, in intangible ideas about gender, spirituality, fairness, and hierarchy. These things aren’t hard-wired into us by some primeval set of circumstances hundreds of thousands of years ago.
There are many ways of being human today, many ways of surviving, and we should expect that in the past as well. Beware the textbook explanation of small bands of archetypal hunter-gatherers who behaved in timeless ways; they never existed, and still don’t today.
I wrote a book! It's called The Verge: Reformation, Renaissance, and Forty Years that Shook the World. The book comes out in July, but you can pre-order it here.