But that’s not the case in the modern developed world. If you are really indifferent to status, you can easily get enough food, housing, and medical care to survive by sheer freeloading. This is true even in the U.S., let alone in more extensive welfare states.
I’m not sure this is true; I know little about welfare politics, but I was under the impression there was a major shift over the last ten years toward limiting the amount of welfare benefits available to people who are “abusing the system” by not looking for work.
One could probably remain alive for long periods just by begging and being homeless, but this raises the question of what, exactly, is a “life worth living”, such that we could rest content that people were working because they enjoy status competitions and not because they can’t get a life worth living without doing so.
This is probably way too subjective to have an answer, but one thing that “sounds right” to me is that the state of nature provides a baseline. Back during hunter-gatherer times we had food, companionship, freedom, et cetera without working too hard for them (the average hunter-gatherer only hunted-gathered a few hours a day). Civilization made that kind of lifestyle impossible by killing all the megafauna and paving over their old habitat, but my totally subjective meaningless too-late-at-night-to-think-straight opinion is that we can’t say that people can opt-out of society and still have a “life worth living” unless they have it about as good as the hunter-gatherers they would be if society hadn’t come around and taken away that option.
The average unemployed person in a developed country has a lot of things better than hunter-gatherers, but just the psychological factors are so much worse that it’s no contest.
The specific situation in the U.S. or any other individual country doesn’t really matter for my point. Even if I’m wrong about how easy freeloading is in the U.S., it’s enough that we can point to some countries whose welfare systems are (or even just were at some point) generous enough to enable easy freeloading.
Ironically, in my opinion, in places where there exists a large underclass living off the welfare state, it is precisely their reversion to the forager lifestyle that the mainstream society sees as rampant social pathology and terrible deprivation of the benefits of civilized life. I think you’re committing the common error of idealizing the foragers. You imagine them as if you and a bunch of other highly intelligent and civilized people had the opportunity to live well with minimal work. In reality, however, the living examples of the forager lifestyle correctly strike us as frightfully chaotic, violent, and intellectually dead.
(Of course, it’s easy to idealize foragers from remote corners of the world or the distant prehistory. One is likely to develop a much more accurate picture about those who live close enough that one has to beware not to cross their path.)
You are not wrong about “freeloading,” though that term is probably unnecessarily pejorative. The Developed world is so obscenely wasteful that it is not necessary to beg. You can get all the food you want, much of it very nice (often much nicer than you could afford to buy), simply by going out and picking it up. Of course, you don’t get to pick and choose exactly what you want when you want it.
Clothing, with the exception of jeans, is all freely available. The same is true of appliances, bedding, and consumer electronics of many kinds. The one commodity that is very, very difficult to get at no cost is lodging. You can get books, MP3 players, CDs, printers, scanners, and often gourmet meals, but lodging is tough. The reason housing is qualitatively different from the other things I’ve cited is that while dustbin diving is technically illegal, in practice it is easy to do and extremely low risk. It is incredibly easy in the UK, if you get a dustbin key (easy to do).
However, the authorities take a very dim view of vagrancy, and they will usually ticket or arrest a person who either “fails to account” for themselves or is clearly living in a vehicle or on the street. This is less true in the UK than the US. However, get caught on the street as a vagrant AND as a foreigner in the UK (or in the US, or in any Developed country) and you are in a world of hurt—typically you will be deported with prejudice and be unable to re-enter the country either “indefinitely” or for some fixed period of time.
If you can swing lodging, then the world is your oyster (for now). I travel with very little, and within 2 weeks of settling on a spot in a large city, I have cookware, flatware, clothing, a CD player, a large collection of classical CDs, and just about anything else I want to go looking for. There is an art to it, but the waste is so profligate that it is not hard to master, and absolutely no begging is required (except for lodging ;-))
Speaking from a lifetime of experience on welfare in the US (I’m disabled, and have gotten work from time to time but usually lost it due to factors stemming either from said disability, or the general life instability that poverty brings with it), your impressions are largely correct.
I’m not sure this is true; I know little about welfare politics, but I was under the impression there was a major shift over the last ten years toward limiting the amount of welfare benefits available to people who are “abusing the system” by not looking for work.
What I’d say is that the shift (and it’s been more like the last forty years, albeit the pace has picked up since Reagan) is towards “preventing abuse” as a generic goal of the system; the result has been that the ability to deliver the services that ostensibly form the terminal goal of welfare-granting organizations is significantly diminished—there’s a presumption of suspicion the moment you walk in the door. Right now, SSI applicants are auto-denied and have to appeal if they want to be considered at all, even if all their administrative ducks are otherwise in a row; this used to be common practice, but now it’s standard.
This also means that limits are fairly low. I can’t receive more than 40 dollars a month in food stamps right now because my apartment manager won’t fill out a form on my behalf stating the share of rent and other services I pay in my unit. He has an out; he’s not involved in the household finances. But without that in writing, from that person, the office presumes that since I have roommates declared, my share of the household expenses is zero, ergo I’m entitled to the minimum allowable (they can’t just deny me since I’m on SSDI).
And having been homeless for a little while (thankfully a friend helped me get the down payment on a place I could just barely afford), yeah...Vladimir_M’s comments are based more on rhetoric than substance. One thing I observe is that many people who are long-term impoverished or homeless (self included) will project a certain indifference to status as a way of securing ourselves some dignity in our interactions with others—but nobody in that situation could miss how deeply that status differential cuts whenever it’s used against us, even implicitly in the way people just ignore or dismiss us.
As luck would have it, I have some limited experience with living for periods of about a month at a time in a household where we gathered about 80 percent of the food we ate (no exaggeration). We were rich in what the land around us offered, rich in the basic assets needed to make use of it, and rich in the ability to keep ourselves entertained and occupied during our copious free time.
I could easily see the typical hunter-gatherer experience being very, very good. Certainly, I’d rather be financially and materially poor under the conditions I described above than in my present circumstances.
Then what limited the growth of forager peoples so substantially? There had to be a mechanism to prevent them from exceeding their region’s carrying capacity. If a tribe of 50 people grew at a rate of 1% a year for 2000 years, there would be about 22 billion people in it. Clearly that didn’t happen; in fact there have been massive die-offs from starvation due to cyclical climate change, or to resource warfare (sometimes fought to extinction) between neighboring tribes.
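(The compounding behind that figure is straightforward to check; a minimal sketch in Python, using the 1%-a-year rate and 2000-year horizon above:)

```python
# Compound growth of a 50-person tribe at 1% per year over 2,000 years.
initial_population = 50
annual_growth = 0.01
years = 2000

final_population = initial_population * (1 + annual_growth) ** years
print(f"{final_population:.2e}")  # ~2.20e+10, i.e. about 22 billion people
```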
I could easily see the typical hunter-gatherer experience being very, very good. Certainly, I’d rather be financially and materially poor under the conditions I described above than in my present circumstances.
You cannot be considered financially and materially impoverished if you have access to abundant natural resources. Never mind whether you own them or can enforce exclusive rights to them—if those resources are available to you, they at least count as cash flow, if not assets.
Limited access to limited resources is far more typical, and life is not so leisurely when you spend every hour of daylight working to procure food that still isn’t enough to provide for you and your family. That is also the state of nature, and it was a situation that a great many people have found themselves in for the brief time that they managed to survive it.
Limited access to limited resources is far more typical, and life is not so leisurely when you spend every hour of daylight working to procure food that still isn’t enough to provide for you and your family. That is also the state of nature, and it was a situation that a great many people have found themselves in for the brief time that they managed to survive it.
That...actually doesn’t represent the human condition for most of our ancestral history, nor the current state of surviving forager peoples for the most part.
Resources are limited, but you only need about 15 hours of work a week per hunter-gatherer individual devoted to food-producing activities. Overdo that and you may well tax your ecosystem past carrying capacity. This is why foragers wander a migratory circuit (although they tend to keep to a known, fixed route) or live in areas where there’s sufficient ecological abundance to allow for a sedentary lifestyle while still using hunter-gatherer strategies. It’s also why they tended to have small populations. Scarcity was something that could happen, but that’s why people developed food preservation technologies and techniques that you can assemble with nothing more than accumulated oral tradition and some common sense. Tie a haunch of meat down to some stones and toss it down to the bottom of a cold lake. That meat will keep for months, longer if the lake freezes over. It’ll be gamy as hell, but you won’t starve—and this is a pretty typical solution in the toolkit of prehistoric humans from Northern regions. Drying, salting (sometimes using methods that would squick you—one branch of my ancestors comes from a culture that used to preserve acorns by, kid you not, burying them in a corner of the home and urinating over the cache), chemical preservation, favoring foods that store well long-term in the first place, fermentation, and a flexible diet are all standard knowledge.
In the American Southwest (a hot, harsh, dry, and ecologically poor climate), Pueblo people and many others used to rely on the seasonal abundance of Mormon crickets for protein. You can gather eighteen pounds of them an hour when they pass through, basically just by walking around and picking up bugs. The nutritional profile beats the hell out of any mammal meat, and they can be preserved like anything else. Think about that for a second—one person, in one hour, can provide enough of these bugs to feed an entire village for a day, or their own household for weeks (and that’s without preservation). It’s not desperation; it’s a sound food-gathering strategy, and a lot more palatable when you don’t come from a culture used to thinking of insects as a culinary taboo.
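As a rough gut-check on that claim, here is a back-of-the-envelope sketch; the protein figures are assumptions of mine (fresh insects run very roughly 15-20 g protein per 100 g), not numbers from the comment or its sources:

```python
# Hypothetical check: how many people's daily protein does one hour
# of Mormon-cricket gathering cover? The densities below are assumed
# illustrative values, not measured figures.
pounds_per_hour = 18          # gathering rate quoted above
grams_per_pound = 454
protein_g_per_100g = 17       # ASSUMPTION: g protein per 100 g fresh insect
protein_need_g_per_day = 50   # rough adult daily protein requirement

protein_gathered = pounds_per_hour * grams_per_pound * protein_g_per_100g / 100
print(round(protein_gathered / protein_need_g_per_day))  # ~28 person-days of protein
```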
Starving to death is more of an issue for low-tech pastoralists and agriculturalists—people who use just a small fraction of the available edible resources to support populations that wouldn’t be able to subsist by foraging. The relationship of effort to output for them is linear: work your farm harder, get more food in proportion—and you need to run a surplus every year in most cases, because there is non-negotiable downtime during which it’s going to be hard to switch to another food source (and even if you do, you’ll be competing with your neighbors for it).
In my own case, I’ve been part of a family of five that supplied itself with only a few culturally specific dietary staples (powdered milk, spices, flour, rice, things that we could easily have done without had they not been available), doing most of its food production by just going out and getting it somewhere within a mile of home. Clams, squid, and oysters were for storing (done with a freezer, or by canning with water and salt) and cooking up into dishes we could eat for the rest of the month; small fish were gathered day by day, large fish stored (one salmon or sturgeon can feed five people for over a month when you have a freezer), crabs and similar gathered on a case-by-case basis. I personally wasn’t fond of frog legs, but a nearby pond kept up with a whole lot of demand for frogs in my family and others. We never bothered with anything like deer or bird hunting, but we’d gather berries, tree fruits (apple, plum, pear) and mushrooms, grow garden veggies, and basically just keep ourselves supplied.
I’m not saying everyone on Earth could switch back today—heck no. A whole lot of people would starve to death after destroying the ecosystems they need. But my ancestors lived in that place for thousands of years, and starving to death was not a common experience among them, because they never lived at the population densities that only come with intensive agriculture. And there are people descended from foragers of even more remote and desolate climes—some of them STILL living that way—who can say the same thing.
Then what limited the growth of forager peoples so substantially? There had to be a mechanism to prevent them from exceeding their region’s carrying capacity. If a tribe of 50 people grew at a rate of 1% a year for 2000 years, there would be about 22 billion people in it. Clearly that didn’t happen; in fact there have been massive die-offs from starvation due to cyclical climate change, or to resource warfare (sometimes fought to extinction) between neighboring tribes.
Then what limited the growth of forager peoples so substantially?
I am so glad you asked, because the answer to your question reveals a fundamental misapprehension you have about forager societies and indeed, the structure and values of ancestral human cultures.
The fact is that forager populations don’t grow as fast as you think in the first place, and that across human cultures still living at or near forager methods of organization, there are many ways to directly and indirectly control population.
It starts with biology. Forager women reach menarche later, meaning they’re not fertile until later in life. Why? Largely, it’s that they tend to have much lower body fat percentages due to diet and the constant exercise of being on the move, and that’s critical for sustaining a pregnancy, or even ovulating in the first place once you’ve reached the (much higher) age where you can do that. Spontaneous abortions or resorption of the fetus are rather common. Women in an industrial-farming culture attain menarche quite a bit earlier and are more likely to be fertile throughout their active years—it only looks normal to you because it’s what you’re close to. So right out of the gate, forager women are less likely to get pregnant, and less likely to stay that way if they do.
Next biological filter: breastfeeding. Forager women don’t wean their children onto bottles and then onto solid food the way you experienced growing up. Breastfeeding is the sole means for a much longer period, and it’s undertaken constantly throughout the day—sleeping with the baby, carrying them around during the daily routine. It goes on for years at a time even after the child is eating solid food. This causes the body to suppress ovulation—meaning that long after you’re technically able to get pregnant again, the body won’t devote resources to it. All the hormonal and resource-delivery cues in your body point to an active child still very much in need of milk! Not only that, but it’s routine in many such societies for women to trade off breastfeeding duty with one another’s children—the more kids there are, the more likely it is that every woman in the proximate social group will have moderately suppressed fertility. It’s a weak effect, but it’s enough to lengthen the birth interval considerably. In the US, a woman can have a baby just about every year—for modern-day foragers, the birth interval is often two to five years wide. It’s harder to get pregnant, and once you do, the kids come more slowly.
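To see how much the age at first birth and the birth interval matter on their own, here is a minimal sketch; all the parameters are illustrative assumptions for the sake of the comparison, not measurements from any particular society:

```python
# Illustrative comparison of completed family size under forager-like
# versus farmer-like reproductive parameters. All numbers are assumptions.
def births_per_mother(age_first_birth, age_last_birth, birth_interval_years):
    return (age_last_birth - age_first_birth) / birth_interval_years + 1

forager = births_per_mother(19, 38, 4.0)  # late menarche, wide birth spacing
farmer = births_per_mother(16, 38, 1.5)   # earlier fertility, close spacing

print(round(forager, 1), round(farmer, 1))  # ~5.8 vs ~15.7 births per mother
```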
The next layer is direct means of abortion. In the US that tends to be pretty traumatic if it’s not performed by a medical specialist. In some cases it still is for forager women—but the toolkit of abortifacients across human cultures is very wide. Midwives and herbalists often have access to minimally invasive methods, but they also often have painful or dangerous ones. What you won’t find is many that are truly ineffective. Methods range from the unpleasant (direct insertion of some substance to cause vaginal bleeding and fetal rejection), to the taxing or dangerous (do hard work, lift heavy objects, jump from a high place), to fasting and ingestible drugs that can induce an abortion or just raise the likelihood of miscarriage.
The last layer is infanticide (and yes, we have this too, though it’s a deprecated behavior). In all cultures that practice it, it’s considered a method of last resort, and it’s usually done by people other than the mother, quickly and quietly. Forager cultures are used to having to do this from time to time, but it’s still a rare event—certainly not a matter of routine expedience.
The point I’m making is that population growth unto itself is not a goal or a value of forager societies like those every human being on earth is descended from (and which some still occupy today). Growth, as an ideological goal, is a non-starter for people living this way. Too many mouths to feed means you undercut the abundance of your lifestyle (and yes, it truly is abundance most of the time, not a desperate Malthusian war of all against all), and forager lives tend to be pretty good on the whole, filled with communitas and leisure and recreation aplenty as long as everybody meets a modest commitment to generating food and the supporting activities of everyday life. I’m not making it out to be paradise; this is just really what it’s like, day to day, to live in a small band of mostly close relatives and friends gathering food from what’s available in the environment.
I’ve heard claims like these several times, but this situation where individuals voluntarily limit their reproduction for the common good can’t possibly be a stable equilibrium. It faces a coordination problem, more specifically a tragedy of the commons. As soon as even a small minority of the forager population starts cheating and reproducing above the replacement rate (by evolving either cultural memes or hereditary philoprogenitive behaviors that motivate them to do so), in a few generations their exponential growth will completely swamp everyone else. The time scales on which forager societies have existed are certainly more than enough for this process to have taken place.
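The force of this objection is easy to see numerically; a minimal sketch with illustrative numbers (a 5% minority growing just 0.5% per generation faster than a stationary majority):

```python
# How long until a slightly faster-growing minority outnumbers a
# stationary majority? Starting split and growth edge are illustrative.
majority = 950.0
minority = 50.0
edge_per_generation = 1.005  # 0.5% per generation above replacement

generations = 0
while minority < majority:
    minority *= edge_per_generation
    generations += 1

print(generations)  # ~591 generations, i.e. ~15,000 years at 25 years each
```

Fifteen thousand years is a blink compared to the hundreds of thousands of years foragers existed, which is the point about time scales above.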
In order for such equilibrium to be stable, there would have to exist some fantastically powerful group selection mechanism that operates on the level of the whole species. I find this strikingly implausible, and to my knowledge, nobody has ever proposed how something like that might work.
It happened in the real world, ergo the issue lies with your understanding of the system we’re talking about and not with its inability to conform to your model.
I’ve heard claims like these several times, but this situation where individuals voluntarily limit their reproduction for the common good can’t possibly be a stable equilibrium.
You’re looking at this backwards. This is the reproductive context in which humanity evolved, and the Malthus-driven upward spiral of population and competition is the result of comparatively recent cultural shifts brought on by changing lifestyles that made it viable to do that. You don’t need to invoke group selection in the form you’re thinking of—the cultural “mutations” you’re positing can’t gain a foothold until some branch of humanity has access to a lifestyle that makes it advantageous to defect like that. Forager societies don’t have that incentive because if they overtax their resource base here and now they have to move, and for most of human prehistory (and the modern history of hunter-gatherers) the population densities were low enough that this gave the affected area time to recover, so when someone came back, things were fine again. A long-term climatic shift alters the range of viable habitats near you, but it takes something pretty darn catastrophic (more than just a seasonal or decadal shift) to entirely render a region uninhabitable to a group of size n.
The biggest filters to population growth in this system are entirely passive ones dictated by biology and resources—the active ones are secondary measures, and they’re undertaken because in a system like this, the collective good and the individual good are inextricably linked. It was a stable equilibrium for most of our evolution, and it only broke when and where agriculture became a viable option that DIDN’T immediately overtax the environment.
That’s a state of affairs that took most of human existence to come into being.
It happened in the real world, ergo the issue lies with your understanding of the system we’re talking about and not with its inability to conform to your model.
You assert these things very confidently, but without any evidence. How exactly do we know that this state of affairs existed in human prehistory?
You say:
You don’t need to invoke group selection in the form you’re thinking of—the cultural “mutations” you’re positing can’t gain a foothold until some branch of humanity has access to a lifestyle that makes it advantageous to defect like that. Forager societies don’t have that incentive because if they overtax their resource base here and now they have to move, and for most of human prehistory (and the modern history of hunter-gatherers) the population densities were low enough that this gave the affected area time to recover, so when someone came back, things were fine again.
This, however, provides no answer to the question why individuals and small groups wouldn’t defect, regardless of the subsequent collective consequences of such defection. You deny that you postulate group selection, but you keep talking in the very strong language of group selection. Earlier you asserted that “population growth unto itself is not a goal or a value of forager societies,” and now you say that “[f]orager societies don’t have that incentive.” How can a society, i.e. a group, have “values” and “incentives,” if you’re not talking about group selection? And if you are, then you need to answer the standard objection to arguments from group selection, i.e. how such group “incentives” can stand against individual defection.
I have no problem with group selection in principle—if you think you have a valid group-selectionist argument that invalidates my objections, I’d be extremely curious to hear it. But you keep contradicting yourself when you deny that you’re making such an argument while at the same time making strong and explicit group-selectionist assertions.
You assert these things very confidently, but without any evidence. How exactly do we know that this state of affairs existed in human prehistory?
Archaeological evidence regarding the health and population density of human beings and their dietary habits. Inference from surviving examples. The null hypothesis, that we didn’t start with agriculture and therefore must have been hunter-gatherers for most of our existence as a species. The observation that the traits generally associated with the Malthusian trap are common experiences of agricultural societies and dependent upon conditions that don’t obtain in predominantly and purely hunter-gatherer societies.
This, however, provides no answer to the question why individuals and small groups wouldn’t defect, regardless of the subsequent collective consequences of such defection.
They might defect, but it’d gain them nothing. Their cultural toolkits and food-gathering strategies were dependent upon group work at a set quota which it was maladaptive to under- or overshoot. An individual can’t survive for long like this compared to a smallish group; a larger group will split when it gets too big for an area, and a big group can’t sustainably form.
How can a society, i.e. a group, have “values” and “incentives,” if you’re not postulating group selection?
The answer to this lies in refuting the following:
As soon as even a small minority of the forager population starts cheating and reproducing above the replacement rate (by evolving either cultural memes or hereditary philoprogenitive behaviors that motivate them to do so), in a few generations their exponential growth will completely swamp everyone else.
“A small minority of the forager population” has to be taken in terms of each population group, and those are small. A small percentage of a given group might be just one or two people every handful of generations, here. A social umbrella-group of 150 scattered into bands of 10-50 throughout an area, versus just one or two people? Where’s the exponential payoff? The absolute numbers are too low to support it, and the defectors are stuck with the cultural biases and methodologies they know. They can decide to get greedy, but they’re outnumbered by the whole tribe, who are more than willing to provide censure or other forms of costly social signalling as a means of punishing defectors. They don’t even have to kill the defectors or drive them out; the defectors are critically dependent on the group for their lifestyle. The alternative will be unappealing in all but a vanishing minority of cases.
You need the kind of population densities agriculture allows to start getting a really noticeable effect. It’s not to say people don’t ever become tempted to defect, but it’s seldom a beneficial decision. And many cultures, such as the San peoples of southern Africa, have cultural mechanisms for ensuring nobody’s ego gets too big for their britches, so to speak. Teasing and ribbing in place of praise when someone gets a big head about their accomplishments, passive reminders that they need the group more than they individually benefit it.
This isn’t so much about group selection, as it is about all the individuals having their raft tied to the same ship—a group big enough to provide the necessities of life, which also provides a lot of hedonic reinforcement for maintaining that state of affairs, and a lot of non-coercive negative signalling for noncompliance, coupled with the much more coercive but morally neutral threat presented by trying to make a living in this place all by yourself.
If you break a leg in a small group, the medical practitioner splints it and everyone keeps feeding you. If you do that by yourself, it probably never heals right and the next leopard to come along finds you easy pickings. That’s what defection buys you in the ancestral environment.
a larger group will split when it gets too big for an area
Say there are two kinds of forager groups, one which limits reproduction of its members by various means, and another that does not limit reproduction and instead constantly grows and splits and invades other groups’ territories if needed. Naively I would expect that the latter kind of group would tend to drive the former kind out of existence. Why didn’t this happen?
Archaeological evidence regarding the health and population density of human beings and their dietary habits. Inference from surviving examples.
This isn’t necessarily evidence against a Malthusian equilibrium. It could be that the subsequent farmer lifestyle enabled survival for people with much poorer health and physical fitness, thus lowering the average health and fitness of those who managed to survive in the Malthusian equilibrium.
Can you give a reference that specifically discusses how a non-Malthusian situation of the foragers can be inferred from the existing archaeological evidence?
The observation that the traits generally associated with the Malthusian trap are common experiences of agricultural societies and dependent upon conditions that don’t obtain in predominantly and purely hunter-gatherer societies.
This is not true. Humans are (more or less) the only species that practices agriculture, but the Malthusian trap happens to non-human animals too. As long as reproduction above the replacement rate is possible, it will happen until the resource limit is reached. (Admittedly, for animals that aren’t apex predators, the situation is more complicated due to the predator-prey dynamics.)
Regarding the foragers’ supposed cooperation on keeping the population stable, I honestly don’t see how what you write makes sense, for at least two reasons:
The defectors would not need to reproduce in blatantly extraordinary numbers. It would be enough to reproduce just slightly above the replacement rate, so slightly that it might be unnoticeable for all practical purposes. The exponential growth would nevertheless explode their population in not very many generations and lead to them overwhelming others. So even if we assume that blatantly excessive reproduction would be punished, it would still leave them more than enough leeway for “cheating.”
How did this punishment mechanism evolve, and how did it remain stable? You can postulate any group selection mechanism by assuming altruistic punishment against individuals who deviate from the supposed group-optimal behavior. But you can’t just assert that such a mechanism must have existed because otherwise there would have been defection.
Moreover, you are now talking about group selection with altruistic punishment. There’s nothing inherently impossible or absurd about that, but these are very strong and highly controversial claims, which you are asserting in a confident and authoritative manner as if they were well-known or obvious.
I’d like to remind you that the ancestral environment was not completely stable, and no one is disputing that exponentially-expansive Malthusian agriculture happened. The question is why it took as long as it did, not why it was possible at all.
For essentially the first 2 million years of human existence, worldwide population went from about 10,000 to about 4 million. Given that virtually all major models of long-run human population converge very closely, and they all assume a relatively steady growth rate, we’re talking a doubling period of roughly 250,000 years.
Malthus’ estimates assume a doubling period of 25 years, or a single human generation. The difference is a factor of 10,000. World population simply did not grow as fast as you’re assuming, and humanity did not start outstripping local carrying capacities in a major, systematic way until we’d developed technologies that allowed us to make those sorts of population growth leaps.
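For reference, the arithmetic behind those figures (the 250,000-year doubling period and the factor of 10,000 above are roundings of this):

```python
import math

# Doubling period implied by growth from ~10,000 to ~4,000,000 people
# over ~2 million years, assuming a steady exponential rate.
p0, p1, years = 1e4, 4e6, 2e6

doublings = math.log2(p1 / p0)      # ~8.6 doublings overall
doubling_period = years / doublings
print(round(doubling_period))       # ~231,000 years per doubling
print(round(doubling_period / 25))  # vs Malthus' 25 years: a factor of ~9,000
```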
According to Michael Kremer in “Population Growth and Technological Change: One Million B.C. to 1990”, the base rate of technological change in human societies scales in proportion to population—small population, slow technological change. This means very long inferential distances to the sorts of techniques and behaviors that make agriculture a viable prospect.
You need intermediate steps, in the form of settled horticulture or nomadic pastoralism, to really concentrate the population enough to have a chance at developing agriculture in the intensive sense. Those sorts of cultural developments took a long time to come into being, and it was a gradual process at that.
So, yes, it’s true that if you grow certain grasses and just harvest their seeds reliably, grinding them into a fine powder and mixing that with water and then heating the whole mixture somehow without actually burning it in your fire directly, you can produce a food source that will unlock access to population-doubling intervals closer to the Malthusian assumption of one doubling per generation.
But that is a series of nested behaviors, NONE of which is intuitively obvious by itself from the perspective of a forager in a world full of nothing but other foragers. Which is why the entire chain took a long, long time to develop, and why agriculture was invented just a few times throughout human history.
This is not true. Humans are (more or less) the only species that practices agriculture, but the Malthusian trap happens to non-human animals too. As long as reproduction above the replacement rate is possible, it will happen until the resource limit is reached.
Termites, leafcutter ants, certain damselfish, ambrosia beetles, and certain marsh snails all practice agriculture. But yes, it’s certainly an uncommon behavior.
What if reproduction above the replacement rate wasn’t possible for the period of human evolution we’re talking about? What if the human population simply wasn’t reproducing fast enough for most of prehistory to reach the resource limit? Those are the conditions I’m suggesting here—that reaching local resource limits was not the norm for much of our evolution, due to our inherently long gestation times and strong K-selection, the metabolic requirements for fertility taking a long time to satisfy compared to modern conditions, the birth interval being very wide compared to Malthusian assumptions, and the techniques of food acquisition being of necessity limited by the ease of satisfying everybody’s requirements (if everyone has a full tummy and all their kids do too, going out and gathering MORE food at the expense of one’s kinsmen won’t do you any good anyway).
What you get is abundance—there’s room to grow, but we can only do it so fast, and when we start to reach the point where we might overtax our resource base, we’ve moved on and there weren’t enough of us using it in the first place to compromise it.
The defectors would not need to reproduce in blatantly extraordinary numbers. It would be enough to reproduce just slightly above the replacement rate, so slightly that it might be unnoticeable for all practical purposes.
That kind of statistical hackery might work in a large population, but not very well in a small one. In a group of 100 humans, ANY population gain is noticeable.
The exponential growth would nevertheless explode their population in not very many generations and lead to them overwhelming others.
Except that all the evidence suggests no such population explosion happened, as there should have been if humans reproduced at the fastest allowable rate. Populations doubled in a quarter-million years, not 25.
How did this punishment mechanism evolve, and how did it remain stable?
It didn’t evolve genetically, it’s a cultural punishment I’m talking about. Ju/’hoansi hunters are taken down a notch whenever they make a kill. Certain Australian aboriginal groups have meat-sharing customs where one hunter goes out and gets a kangaroo (say), and his share of the meat is the intestines or penis—the choicer cuts get distributed according to a set of other rules. Except, then people invite the hunter over to dinner; he’s not forced to actually eat crow every time he succeeds, but he’s also socially aware that he depends upon the others for it (and he gets to receive a choicer share when some other hunter makes a kill).
World population simply did not grow as fast as you’re assuming, and humanity did not start outstripping local carrying capacities in a major, systematic way until we’d developed technologies that allowed us to make those sorts of population growth leaps.
I don’t understand your argument here at all. Earlier you said that growth to the Malthusian limit was prevented by a cooperative strategy of restraining reproduction. Now you say that lack of food production technology was limiting population growth. But if foragers did breed up to the limit where food became the limiting resource, that’s by definition a Malthusian equilibrium.
You are also presenting a strawman caricature of Malthus. His claim about a 25-year doubling period refers to agricultural societies with an ample supply of land, such as existed in North America of his day. He presents it as an empirical finding. When he discusses foragers, he notes that they’ll reproduce to the point where they run against the limited food supply available from foraging, which, given the low supply of food relative to farming, means a much less dense population.
Some of his discussions of foragers are actually quite interesting. He notes that among the North American hunter-gatherers, resource limitations lead to constant disputes and warfare. He also cites accounts of European explorers’ contacts with forager peoples that seem to have been on the Malthusian limit.
It didn’t evolve genetically, it’s a cultural punishment I’m talking about.
It doesn’t matter—it still needs to be explained. Humans don’t just magically develop cultural norms that solve collective action problems.
Earlier you said that growth to the Malthusian limit was prevented by a cooperative strategy of restraining reproduction.
What I said was that growth to the point of constant warfare, competition and struggle for enough food to subsist wasn’t an accurate picture of ancestral forager lifestyles.
Some of his discussions of foragers are actually quite interesting. He notes that among the North American hunter-gatherers, resource limitations lead to constant disputes and warfare.
He also says that smallpox was endemic among the Indians of all these cultures. Smallpox originated in Eurasia, thrived among farmers, and Native Americans had no immunity to it. The squalor and disease he describes these people living in are the conditions they were subjected to at the hands of an invading power carrying novel biological agents their immune systems simply weren’t adapted to handle. The nastiest conflicts, likewise, are a post-contact phenomenon.
Warfare among Northwest Coast Natives, prior to colonization, was usually over petty disputes (that is, interpersonal ones) between peoples who had long-standing trade and treaty relationships, and only occasionally over resources (usually slaves, and the institution of slavery as it was practiced here does not compare readily with slavery as it was practiced by agriculturalists in Eurasia and Africa). The bloodier wars of the inland northwest are similarly a historical novelty, unparalleled in scope or stakes until the ravages of introduced diseases and the dislocation of various tribes by white invaders into territories they’d never been in competition for caused clashes that simply hadn’t occurred at such a level of intensity prior to that point. The formation of reservations only exacerbated this—we’re talking about groups with age-old rivalries who had never seen fit to exterminate one another or conquer one another’s lands, but who would happily send a war canoe full of men to go steal things because of a petty vendetta between two people that started long ago.
This isn’t war of extermination. Don’t get me wrong, it’s violent, people die, the stakes are real, but it’s not a zero-sum, winner-take-all competition for survival. A direct translation out of Old Chinook from Franz Boas’ ethnography, regarding the rules of warfare, should make this clearer:
“Before the people go to war they sing. If one of them sees blood, he will be killed in battle. When two see blood, they will be killed. They finish their singing. When they sing, two long planks are put down parallel to each other. All the warriors sing. They kneel [on the planks]. Now they go to war and fight. When people of both parties have been killed, they stop. After some time the two parties exchange presents and make peace. When a feud has not yet been settled, they marry a woman to a man of the other town and they make peace.”
The fight ends when both sides have taken casualties. The opposing sides exchange gifts and make peace. They resolve outstanding feuds by diplomatic marriage. This is the Chinook idea of war, the way it was practiced with all but their very worst enemies (who lived rather a long way from Chinook territory—the Quileute weren’t exactly next door given the pace of travel in those days, and even then the wars between them were not genocidal in intent). This is completely different from war as most Eurasian-descended cultures knew it. And it was typical of forager warfare in North America before Columbus showed up.
Malthus, in looking at the conditions of North American natives during the 19th century, reports on the dire conditions of a people devastated by introduced diseases, direct conquest by white settlers, and the disruption of their social fabric and ways of life. These were whole culture groups pushed beyond the breaking point, very much outside their typical context, and most of their actual problems were direct effects of colonization.
Malthus, in looking at the conditions of North American natives during the 19th century, reports on the dire conditions of a people devastated by introduced diseases, direct conquest by white settlers, and the disruption of their social fabric and ways of life.
Some of the accounts presented by Malthus were given by very early explorers and adventurers who ended up deep in unexplored territory, far ahead of European conquest and colonization. For example, the one by Cabeza de Vaca would be circa 1530.
The only way these societies could have already been devastated is if epidemics had ravaged the whole continent immediately in the first decades after the first Europeans landed, ahead of any European contact with the inland peoples. I don’t know enough about the relevant history to know how plausible this is, but even if it happened, there are two problems with your claim:
1. Diseases wouldn’t cause famine, at least in the long run. These early explorers describe peoples who had problems making ends meet during bad seasons due to insufficient food, and who fought bitterly over the existing limited supply. If the population had already been thinned down by disease by the time they came, we’d expect, if anything, the per capita food supply from foraging to be greater than before.
2. If even the earliest accounts are of devastated societies, then how do we know anything about the better life they led before that? Where does this information come from? You cite an ethnography by Boas, who was born in 1858, as authoritative, but dismiss a compilation of far older accounts compiled by Malthus in the early 19th century.
Smallpox emerged in the Old World around 10,000 BC and is believed to have originated via cattle farming. It reached very high concentrations in Europe and became a common plague there; it was spread around the world to peoples who had never encountered it by European exploration and conquest. It and other Old World diseases spread very rapidly among American native populations, rendering whole cultures extinct and reducing others to scattered survivors often incapable of rebuilding. The total population of the Americas lost to European diseases after the arrival of Columbus and Cortez is estimated at 90 to 95 percent.
Given that many Native nations were at least modestly dependent on agriculture (the Iroquois, Navajo, Aztecs, Incas, Mississippians—indeed, most of the well-known groups), such population losses coming so quickly are nothing short of catastrophic. Most of your resource base collapses because one person is going to have to work MUCH harder to provide enough food for themselves—fields go unplanted, vegetables don’t get tended, wild game is much more dangerous to hunt by oneself, and one cannot expect any assistance with gathering. Even a small number of people used to an agriculture-enriched lifestyle are going to be hit much harder.
It’s also worth noting that Cabeza de Vaca actually described the Coahuiltecs as a healthy and prosperous people—and ant eggs, lizards and so on were just normal parts of their diet. Ant eggs in particular are STILL a cultural delicacy among the Latino groups descended from the Coahuiltecs (escamole tacos, anyone?). Diet adapts to local circumstances.
The only way these societies could have already been devastated is if epidemics had ravaged the whole continent immediately in the first decades after the first Europeans landed, ahead of any European contact with the inland peoples.
That is precisely what happened. One infected slave from Spanish-held Cuba is believed to be the Patient Zero that transmitted an infection which would go on to wipe out about fifty percent of the Aztec population. Hernando de Soto, exploring the southeast, encountered many towns and villages abandoned just two years prior when most of their inhabitants died of the plagues. Isolated survivors often just abandoned their homes outright, since in many cases a handful of people or even a single survivor were all that was left out of a village of hundreds or thousands. Neighbors who showed up, unaware of what happened, might contract disease from the corpses in some cases, or simply welcome in the survivors who’d start the cycle anew. North America had extensive trade routes linking all major regions, from coast to coast. Foot and boat traffic carried diseases quite far from their initial outbreak sites.
If even the earliest accounts are of devastated societies, then how do we know anything about the better life they led before that?
Because they’re not all dead, and they left their own records of what happened and there are records of contact with them in much better conditions*, and there are still plenty of Native people alive today, who often know rather more about said records of their lives before than the typical Euro-American? And because it’s generally acknowledged within anthropological, archaeological and historical fields now that modern research bears out a picture of generally healthy, sustainable populations for most of the foragers of the Americas? And quite large, complex societies that were generally not recognized as such by early Anglo scholars into the matter?
*(Malthus seriously misrepresents Cabeza de Vaca’s case—the Floridians were in a bad way, but they were also right next door to early Spanish conquest—his accounts of the Coahuiltecs of coastal and inland Texas describe them as a healthy and prosperous people...and their descendants STILL enjoy ant eggs as a dietary item; you don’t have to be desperate to eat insects, and many human groups actively enjoy it.)
Where does this information come from? You cite an ethnography by Boas, who was born in 1858, as authoritative, but dismiss a compilation of far older accounts compiled by Malthus in the early 19th century.
Boas actually travelled to the civilizations he wrote about, lived among them, recorded their oral traditions and analyzed their languages, investigated their history and their environmental circumstances. For many people, especially in the Northwest, far North and other relatively late-contacted areas, these events occurred within the living memory of their elders.
Malthus wasn’t an expert on Native American civilizations or history, and basically went with the prevailing account available at the time. He relied on a consensus that wasn’t yet understood to be false. So I reject Malthus’ picture of pre-Columbian America for the same reason I reject Lysenko’s account of evolution. The difference is that Malthus was an influential thinker in the development of Western thought, and his role means that a lot of people who agree with what insights he did make are unwittingly buying into cached arguments about related subjects (often ones that don’t support his case) that hadn’t yet been recognized as false when Malthus wrote.
Scholarship in the field since Malthus’ time has seriously changed the outlook—Charles C. Mann and Jared Diamond are good, accessible sources for a summary overview (“1491” and “Guns, Germs, and Steel”). If I seem to be vague, it’s mostly because this is domain-specific knowledge that’s not widely understood outside the domain, but as a domain insider it’s fairly basic stuff.
And because it’s generally acknowledged within anthropological, archaeological and historical fields now that modern research bears out a picture of generally healthy, sustainable populations for most of the foragers of the Americas?
How exactly does this modern research reconstruct the life of American foragers centuries ago, and based on what evidence? Could you cite some of this work? (I’d like to see the original work that presumably explains its methodology rigorously, not popular summaries.)
Malthus seriously misrepresents Cabeza de Vaca’s case—the Floridians were in a bad way, but they were also right next door to early Spanish conquest—his accounts of the Coahuiltecs of coastal and inland Texas describe them as a healthy and prosperous people...
On closer look, it turns out that de Vaca’s description cited by Malthus actually refers to a people from southeastern Texas, not Florida. So while Malthus apparently mixed up the location by accident, his summary is otherwise accurate. Your above claims are therefore completely incorrect—the description is in fact of a people from Texas, living very far from the boundary of Spanish conquest at the time.
For reference, I quote de Vaca’s account at length (all emphasis mine):
Castillo and Estevanico went inland to the Iguaces. [...] Their principal food are two or three kinds of roots, which they hunt for all over the land; they are very unhealthy, inflating, and it takes two days to roast them. Many are very bitter, and with all that they are gathered with difficulty. But those people are so much exposed to starvation that these roots are to them indispensable and they walk two and three leagues to obtain them. Now and then they kill deer and at times get a fish, but this is so little and their hunger so great that they eat spiders and ant eggs, worms, lizards and salamanders and serpents, also vipers the bite of which is deadly. They swallow earth and wood, and all they can get, the dung of deer and more things I do not mention; and I verily believe, from what I saw, that if there were any stones in the country they would eat them also. They preserve the bones of the fish they eat, of snakes and other animals, to pulverize them and eat the powder. [...] Their best times are when “tunas” (prickly pears) are ripe, because then they have plenty to eat and spend the time in dancing and eating day and night. [...] While with them it happened many times that we were three or four days without food. Then, in order to cheer us, they would tell us not to despair, since we would have tunas very soon and eat much and drink their juice and get big stomachs and be merry, contented and without hunger. But from the day they said it to the season of the tunas there would still elapse five or six months, and we had to wait that long.
Also, regarding this:
Boas actually travelled to the civilizations he wrote about, lived among them, recorded their oral traditions and analyzed their languages, investigated their history and their environmental circumstances. For many people, especially in the Northwest, far North and other relatively late-contacted areas, these events occurred within the living memory of their elders.
Earlier you claimed that the native population of the entire American continent was devastated by epidemics immediately after the first European contacts in the late 15th/early 16th century, so that even the accounts of very early European explorers who traveled deep into the continent ahead of European colonization do not present an accurate picture of the native foragers’ good life they had lived before that. But now you claim that in the late 19th century, this good life was still within living memory for some of them.
It seems like you’re accepting or discounting evidence selectively. I can’t believe that all those accounts cited by Malthus refer to societies devastated by epidemics ahead of European contact, but on the other hand, the pre-epidemic good times were still within living memory for the people studied by Boas centuries later.
I reject Malthus’ picture of pre-Columbian America for the same reason I reject Lysenko’s account of evolution.
Lysenko was motivated by politics. Boas was motivated by politics.
Physics improves, but history deteriorates. Those writers closest to events give us the most accurate picture, while later writers merely add political spin. Since 1830, history has suffered increasingly drastic, frequent, and outrageous politically motivated rewrites, and has become more and more subject to a single monolithic political view, uniformly applied to all history books written in a given period.
If you read old histories, they explain that they know such and such because of such and such. If you read later histories, then when they disagree with older histories and you check the evidence cited by the older histories, you usually find that the newer histories are making stuff up. The older history says X said Y, and quotes him. The newer history says that X said B, and fails to quote him, or fails to quote him in context, or just simply asserts B, without any explanation as to how they can possibly know B.
Most of your resource base collapses because one person is going to have to work MUCH harder to provide enough food for themselves—fields go unplanted, vegetables don’t get tended, wild game is much more dangerous to hunt by oneself, and one cannot expect any assistance with gathering. Even a small number of people used to an agriculture-enriched lifestyle are going to be hit much harder.
Both Clark and Tainter (The Collapse of Complex Societies) disagree with this claim as stated. A massive reduction in the population means that the survivors get increased per-capita returns, because the survivors move way back along the diminishing-marginal-returns curve and now have more low-hanging fruit (sometimes literally). In fact, Tainter argues that complexity often collapses because the collapse is the only way to increase per-capita wealth. Hunter-gatherers spend much less time per calorie than do advanced agriculturalists, e.g.:
The surprise here is that while there is wild variation across forager and shifting cultivation societies, many of them had food production systems which yielded much larger numbers of calories per hour of labor than English agriculture in 1800, at a time when labor productivity in English agriculture was probably the highest in Europe. In 1800 the total value of output per man-hour in English agriculture was 6.6 pence, which would buy 3,600 kilocalories of flour but only 1,800 kilocalories of fats and 1,300 kilocalories of meat. Assuming English farm output was then half grains, one-quarter fats, and one-quarter meat, this implies an output of 2,600 calories per worker-hour on average. Since the average person ate 2,300 kilocalories per day (table 3.6), each farm worker fed eleven people, so labor productivity was very high in England. Table 3.13 shows in comparison the energy yields of foraging and shifting cultivation societies per worker-hour. The range in labor productivities is huge, but the minimum average labor productivity, that for the Ache in Paraguay, is 1,985 kilocalories per hour, not much below England in 1800. The median yield per labor hour, 6,042 kilocalories, is more than double English labor productivity.
Or
...ranging from a modest 1,452 kilocalories per person per day for the Yanomamo of Brazil to a kingly 3,827 kilocalories per person per day for the Ache of Paraguay. Some of this is undoubtedly the result of errors in measuring food consumption. But the median is 2,340, implying that hunter-gatherers and subsistence agriculturalists ate as many calories as the median person in England or Belgium circa 1800. Primitive man ate well compared with one of the richest societies in the world in 1800. Indeed British farm laborers by 1863 had just reached the median consumption of these forager and subsistence societies.
(Quotes brought to you by my Evernote; it’s a pain in the ass to excerpt all the important bits from a book, but it certainly pays off later if you want to cite it for various assertions.)
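As a quick sanity check on the weighted average in the first quote (the output shares are the ones Clark states):

```python
# Clark's English farm output per worker-hour in 1800: 6.6 pence buys
# 3,600 kcal of flour, 1,800 of fats, or 1,300 of meat; output assumed
# half grains, one-quarter fats, one-quarter meat.
kcal_per_wage = {"flour": 3600, "fats": 1800, "meat": 1300}
shares = {"flour": 0.50, "fats": 0.25, "meat": 0.25}

kcal_per_worker_hour = sum(kcal_per_wage[k] * shares[k] for k in kcal_per_wage)
print(kcal_per_worker_hour)  # 2575.0, which Clark rounds to ~2,600 kcal
```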
Some quotes from Clark’s Farewell to Alms (he also covers the very high age of marriage in England as one way England held down population growth):
Fertility was also probably high among the precontact Polynesians. Sexual activity among women was early and universal. Why then was Tahiti such an apparent paradise to the visiting English sailors, rather than a society driven to the very subsistence margin of material income, as in Japan? The answer seems to be that infanticide was widely practiced...The estimates from the early nineteenth century are that between two-thirds and three-quarters of all children born were killed immediately.27...One sign of the practice of infanticide was the agreement by most visitors that there were more men than women on the islands. …In preindustrial China and Japan the gender ratio of the population shows that there was significant female infanticide. In these Malthusian economies infanticide did raise living standards.
An additional factor driving down birth rates (and also of course driving up death rates) was the Chinese practice of female infanticide. For example, based on the imbalance between recorded male and female births an estimated 20–25 percent of girls died from infanticide in Liaoning. Evidence that the cause was conscious female infanticide comes from the association between the gender imbalance of births and other factors. When grain prices were high, more girls are missing. First children were more likely to be female than later children. The chance of a female birth being recorded for later children also declined with the numbers of female births already recorded for the family. All this suggests female infanticide that was consciously and deliberately practiced.13
…
Female infanticide meant that, while nearly all women married, almost 20 percent of men never found brides. Thus the overall birth rate per person, which determines life expectancy, was reduced. The overall birth rate for the eighteenth century is unclear from the data given in this study, but by the 1860s, when the population was stationary, it was around 35 per thousand, about the same as in preindustrial Europe, and less than in many poor countries today. Earlier and more frequent marriage than in northwestern Europe was counteracted by lower marital fertility and by female infanticide, resulting in equivalent overall fertility rates.
However, the authorities take a very dim view of vagrancy, and they will usually ticket or arrest a person who either “fails to account” for themselves or is clearly living in a vehicle or on the street. This is less true in the UK than in the US. But get caught on the street as a vagrant AND as a foreigner in the UK (or in the US, or in any Developed country) and you are in a world of hurt—typically you will be deported with prejudice and be unable to re-enter the country either “indefinitely” or for some fixed period of time.
If you can swing lodging, then the world is your oyster (for now). I travel with very little, and within two weeks of settling on a spot in a large city I have cookware, flatware, clothing, a CD player, a large collection of classical CDs, and just about anything else I care to go looking for. There is an art to it, but the waste is so profligate that it is not hard to master, and absolutely no begging is required (except for lodging ;-))
Speaking from a lifetime of experience on welfare in the US (I’m disabled, and have gotten work from time to time but usually lost it due to factors stemming either from said disability, or the general life instability that poverty brings with it), your impressions are largely correct.
What I’d say is that the shift (and it’s been more like the last forty years, albeit the pace has picked up since Reagan) is towards “preventing abuse” as a generic goal of the system; the result has been that the ability to deliver the services that ostensibly form the terminal goal of welfare-granting organizations is significantly diminished—there’s a presumption of suspicion the moment you walk in the door. Right now, SSI applicants are auto-denied and have to appeal if they want to be considered at all, even if all their administrative ducks are otherwise in a row; this used to be common practice, but now it’s standard.
This also means that limits are fairly low. I can’t receive more than 40 dollars a month in food stamps right now because my apartment manager won’t fill out a form on my behalf stating the share of rent and other services I pay in my unit. He has an out; he’s not involved in the household finances. But without that in writing, from that person, the office presumes that since I have roommates declared, my share of the household expenses is zero, ergo I’m entitled to the minimum allowable (they can’t just deny me since I’m on SSDI).
And having been homeless for a little while (thankfully a friend helped me get the down payment on a place I could just barely afford), yeah...Vladimir_M’s comments are based more on rhetoric than substance. One thing I observe is that many people who are long-term impoverished or homeless (self included) will project a bit of being inured to status as a way of just securing ourselves some dignity in our interactions with others—but nobody in that situation could miss how deeply that status differential cuts whenever it’s used against us, even implicitly in the way people just ignore or dismiss us.
As luck would have it, I have some limited experience with living for periods of about a month at a time in a household where we gathered about 80 percent of the food we ate (no exaggeration). Rich in what the land around us offered, rich in the basic assets needed to make use of it, rich in the ability to keep ourselves entertained and occupied during our copious free time.
I could easily see the typical hunter-gatherer experience being very, very good. Certainly, I’d rather be financially and materially poor under the conditions I described above than in my present circumstances.
You cannot be considered financially and materially impoverished if you have access to abundant natural resources. Never mind whether you own them or can enforce exclusive rights to them—if you have those resources available to you, they at least count as cash flow, if not assets.
Limited access to limited resources is far more typical, and life is not so leisurely when you spend every hour of daylight working to procure food that still isn’t enough to provide for you and your family. That is also the state of nature, and it is a situation a great many people have found themselves in for the brief time they managed to survive it.
That...actually doesn’t represent the human condition for most of our ancestral history, nor the current state of surviving forager peoples for the most part.
Resources are limited, but you only need about 15 hours of work a week per hunter-gatherer individual devoted to food-producing activities. Overdo that and you may well tax your ecosystem past carrying capacity. This is why foragers wander a migratory circuit (although they tend to keep to a known, fixed route) or live in areas where there’s sufficient ecological abundance to allow for a sedentary lifestyle while still using hunter-gatherer strategies. It’s also why they tended to have small populations.
Scarcity was something that could happen, but that’s why people developed food preservation technologies and techniques that you can assemble with nothing more than accumulated oral tradition and some common sense. Tie a haunch of meat down to some stones and toss it down to the bottom of a cold lake. That meat will keep for months, longer if the lake freezes over. It’ll be gamy as hell, but you won’t starve—and this is a pretty typical solution in the toolkit of prehistoric humans from Northern regions. Drying, salting (sometimes using methods that would squick you—one branch of my ancestors comes from a culture that used to preserve acorns by, kid you not, burying them in a corner of the home and urinating over the cache), chemical preservation, favoring foods that store well long-term in the first place, fermentation, and a flexible diet are all standard knowledge.
In the American Southwest (a hot, harsh, dry and ecologically poor climate), Pueblo people and many others used to rely on the seasonal abundance of Mormon crickets for protein. You can gather eighteen pounds of them an hour when they pass through, basically just by walking around and picking up bugs. The nutritional profile beats the hell out of any mammal meat, and they can be preserved like anything else. Think about that for a second—one person, in one hour, can provide enough of these bugs to feed an entire village for a day, or their own household for weeks (and that’s without preservation). It’s not desperation; it’s a sound food-gathering strategy, and a lot more palatable when you don’t come from a culture that treats insects as a culinary taboo.
Starving to death is more of an issue for low-tech pastoralists and agriculturalists—people who use just a small fraction of the available edible resources to support populations that wouldn’t be able to forage on the available resources. The relationship of effort to output for them is linear: work your farm harder, get more food in proportion—and you need to run a surplus every year in most cases, because there is non-negotiable downtime during which it’s going to be hard to switch to another food source (and even if you do, you’ll be competing with your neighbors for it).
In my own case, I’ve been part of a family of five that supplied itself with only a few culturally specific dietary staples (powdered milk, spices, flour, rice—things we could easily have done without had they not been available), doing most of its food production by just going out and getting it somewhere within a mile of home. Clams, squid and oysters were for storing (done with a freezer or by canning with water and salt) and cooking up into dishes we could eat for the rest of the month; small fish were gathered day by day, large fish stored (one salmon or sturgeon can feed five people for over a month when you have a freezer), crabs and similar gathered on a case-by-case basis. I personally wasn’t fond of frog legs, but a nearby pond kept up with a whole lot of demand for frogs in my family and others. We never bothered with anything like deer or bird hunting, but we’d gather berries, tree fruits (apple, plum, pear) and mushrooms, grow garden veggies and basically just keep ourselves supplied.
I’m not saying everyone on Earth could switch back today—heck no. A whole lot of people would starve to death after destroying the ecosystems they need. But my ancestors lived in that place for thousands of years and starving to death was not a common experience among them, because they weren’t used to the population densities that only come with intensive agriculture. And there are people descended from foragers of even more remote and desolate climes—some of them STILL living that way—who can say the same thing.
Then what limited the growth of forager peoples so substantially? There had to be a mechanism to prevent them from exceeding their region’s carrying capacity. If a tribe of 50 people grew at a rate of 1% a year for 2000 years there would be about 22 billion people in it. Clearly that didn’t happen; in fact there have been massive die-offs from starvation due to cyclical climate change, or to resource warfare (sometimes fought to extinction) between neighboring tribes.
I am so glad you asked, because the answer to your question reveals a fundamental misapprehension you have about forager societies and indeed, the structure and values of ancestral human cultures.
The fact is that forager populations don’t grow as fast as you think in the first place, and that across human cultures still living at or near forager methods of organization, there are many ways to directly and indirectly control population.
It starts with biology. Forager women reach menarche later, meaning they’re not fertile until later in life. Why? Largely, it’s that they tend to have much lower body fat percentages due to diet and the constant exercise of being on the move, and that’s critical for sustaining a pregnancy, or even ovulating in the first place once you’ve reached the (much higher) age where you can do that. Spontaneous abortions or resorption of the fetus are rather common. Women in an industrial-farming culture attain menarche quite a bit earlier and are more likely to be fertile throughout their active years—it only looks normal to you because it’s what you’re close to. So right out of the gate, forager women are less likely to get pregnant, and less likely to stay that way if they do.
Next biological filter: breastfeeding. Forager women don’t wean their children onto bottles and then onto solid food the way you experienced growing up. Breastfeeding is the sole means for a much longer period, and it’s undertaken constantly throughout the day—sleeping with the baby, carrying them around during the daily routine. It goes on for years at a time even after the child is eating solid food. This causes the body to suppress ovulation—meaning that long after you’re technically able to get pregnant again, the body won’t devote resources to it. All the hormonal and resource-delivery cues in your body point to an active child still very much in need of milk! Not only that, but it’s routine in many such societies for women to trade off breastfeeding duty with one another’s children—the more kids there are, the more likely it is that every woman in the proximate social group will have moderately suppressed fertility. It’s a weak effect, but it’s enough to lengthen the birth interval considerably. In the US, a woman can have a baby just about every year—for modern-day foragers, the birth interval is often two to five years wide. It’s harder to get pregnant, and once you do, the kids come more slowly.
The next layer is direct means of abortion. In the US that tends to be pretty traumatic if it’s not performed by a medical specialist. In some cases it still is for forager women—the toolkit of abortives across all human cultures is very wide. Midwives and herbalists often have access to minimally-invasive methods, but they also often have painful or dangerous ones. What you won’t find is many that are truly ineffective. Methods range from the unpleasant (direct insertion of some substance to cause vaginal bleeding and fetal rejection), to the taxing or dangerous (do hard work, lift heavy objects, jump from a high place) to fasting and ingestible drugs that can induce an abortion or just raise the likelihood of miscarriage.
The last layer is infanticide (and yes, we have this too, though it’s a deprecated behavior). In all cultures that practice it, it’s considered a method of last resort, and it’s usually done by people other than the mother, quickly and quietly. Forager cultures are used to having to do this from time to time, but it’s still a rare event—certainly not a matter of routine expedience.
The point I’m making is that population growth unto itself is not a goal or a value of forager societies like those every human being on earth is descended from (and which some still occupy today). Growth, as an ideological goal, is a non-starter for people living this way. Too many mouths to feed means you undercut the abundance of your lifestyle (and yes, it truly is abundance most of the time, not a desperate Malthusian war of all against all). Forager lives tend to be pretty good on the whole, filled with communitas and leisure and recreation aplenty as long as everybody meets a modest commitment to generating food and the supporting activities of everyday life. I’m not making it out to be paradise; this is just really what it’s like, day to day, to live in a small band of mostly close relatives and friends gathering food from what’s available in the environment.
I’ve heard claims like these several times, but this situation where individuals voluntarily limit their reproduction for the common good can’t possibly be a stable equilibrium. It faces a coordination problem, more specifically a tragedy of the commons. As soon as even a small minority of the forager population starts cheating and reproducing above the replacement rate (by evolving either cultural memes or hereditary philoprogenitive behaviors that motivate them to do so), in a few generations their exponential growth will completely swamp everyone else. The time scales on which forager societies have existed are certainly more than enough for this process to have taken place with certainty.
In order for such an equilibrium to be stable, there would have to exist some fantastically powerful group selection mechanism that operates on the level of the whole species. I find this strikingly implausible, and to my knowledge, nobody has ever proposed how something like that might work.
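The compounding arithmetic behind this objection is easy to make concrete. Here is a minimal sketch in Python; the numbers are purely illustrative (a defector subgroup starting at 1% of a thousand-person population, reproducing half a percent per year above replacement—none of these figures come from the thread or any cited study):

```python
# Minimal sketch of the takeover argument: a tiny subgroup with even a
# slight reproductive edge eventually outnumbers everyone else.
defectors, others = 10, 990        # defectors start as 1% of the population
edge = 0.005                       # 0.5% per year above replacement
years = 0
while defectors < others:
    defectors *= 1 + edge          # defectors compound annually
    years += 1                     # everyone else stays at replacement
print(years)                       # ~920 years
```

Nine centuries is an eyeblink against the tens of millennia foragers are said to have maintained the equilibrium, which is exactly the force of the objection.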
It happened in the real world, ergo the issue lies with your understanding of the system we’re talking about and not with its inability to conform to your model.
You’re looking at this backwards. This is the reproductive context in which humanity evolved, and the Malthus-driven upward spiral of population and competition is the result of comparatively recent cultural shifts brought on by changing lifestyles that made it viable to do that. You don’t need to invoke group selection in the form you’re thinking of—the cultural “mutations” you’re positing can’t gain a foothold until some branch of humanity has access to a lifestyle that makes it advantageous to defect like that. Forager societies don’t have that incentive because if they overtax their resource base here and now they have to move, and for most of human prehistory (and the modern history of hunter-gatherers) the population densities were low enough that this gave the affected area time to recover, so when someone came back, things were fine again. A long-term climatic shift alters the range of viable habitats near you, but it takes something pretty darn catastrophic (more than just a seasonal or decadal shift) to entirely render a region uninhabitable to a group of size n.
The biggest filters to population growth in this system are entirely passive ones dictated by biology and resources—the active ones are secondary measures, and they’re undertaken because in a system like this, the collective good and the individual good are inextricably linked. It was a stable equilibrium for most of our evolution, and it only broke when and where agriculture became a viable option that DIDN’T immediately overtax the environment.
That’s a state of affairs that took most of human existence to come into being.
You assert these things very confidently, but without any evidence. How exactly do we know that this state of affairs existed in human prehistory?
You say:
This, however, provides no answer to the question why individuals and small groups wouldn’t defect, regardless of the subsequent collective consequences of such defection. You deny that you postulate group selection, but you keep talking in a very strong language of group selection. Earlier you asserted that “population growth unto itself is not a goal or a value of forager societies,” and now you say that “[f]orager societies don’t have that incentive.” How can a society, i.e. a group, have “values” and “incentives,” if you’re not talking about group selection? And if you are, then you need to answer the standard objection to arguments from group selection, i.e. how such group “incentives” can stand against individual defection.
I have no problem with group selection in principle—if you think you have a valid group-selectionist argument that invalidates my objections, I’d be extremely curious to hear it. But you keep contradicting yourself when you deny that you’re making such an argument while at the same time making strong and explicit group-selectionist assertions.
Archaeological evidence regarding the health and population density of human beings and their dietary habits. Inference from surviving examples. The null hypothesis, that we didn’t start with agriculture and therefore must have been hunter-gatherers for most of our existence as a species. The observation that the traits generally associated with the Malthusian trap are common experiences of agricultural societies and dependent upon conditions that don’t obtain in predominantly and purely hunter-gatherer societies.
They might defect, but it’d gain them nothing. Their cultural toolkits and food-gathering strategies were dependent upon group work at a set quota which it was maladaptive to undershoot or overshoot. An individual can’t survive for long like this compared to a smallish group; a larger group will split when it gets too big for an area, and a big group can’t sustainably form.
The answer to this lies in refuting the following:
“A small minority of the forager population” has to be taken in terms of each population group, and those are small. A small percentage of a given group might be just one or two people every handful of generations, here. A social umbrella-group of 150 scattered into bands of 10–50 throughout an area, versus just one or two people? Where’s the exponential payoff? The absolute numbers are too low to support it, and the defectors are stuck with the cultural biases and methodologies they know. They can decide to get greedy, but they’re outnumbered by the whole tribe, who are more than willing to provide censure or other forms of costly social signalling as a means of punishing defectors. They don’t even have to kill the defectors or drive them out; the defectors are critically dependent on the group for their lifestyle. The alternative will be unappealing in the vast majority of cases.
You need the kind of population densities agriculture allows to start getting a really noticeable effect. It’s not to say people don’t ever become tempted to defect, but it’s seldom a beneficial decision. And many cultures, such as the San ones in southern Africa, have cultural mechanisms for ensuring nobody’s ego gets too big for their britches, so to speak. Teasing and ribbing in place of praise when someone gets a big head about their accomplishments, passive reminders that they need the group more than they individually benefit it.
This isn’t so much about group selection as it is about all the individuals having their raft tied to the same ship—a group big enough to provide the necessities of life, which also provides a lot of hedonic reinforcement for maintaining that state of affairs, and a lot of non-coercive negative signalling for noncompliance, coupled with the much more coercive but morally neutral threat presented by trying to make a living in this place all by yourself.
If you break a leg in a small group, the medical practitioner splints it and everyone keeps feeding you. If you do that by yourself, it probably never heals right and the next leopard to come along finds you easy pickings. That’s what defection buys you in the ancestral environment.
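For what it’s worth, the small-numbers point above can also be made quantitative. A would-be defector lineage in a band-sized group behaves like a branching process, and a branching process that starts from one or two individuals with only a slight average edge usually just goes extinct. A Monte Carlo sketch, with invented numbers (the offspring distribution, the 5% edge, the band cap of 30, and the 40-generation horizon are all illustrative assumptions, not figures from the thread):

```python
import random

# Fate of a small "cheating" lineage with a slight average reproductive
# edge, inside a band capped at 30 people. All parameters are illustrative.
def lineage_survives(edge=0.05, start=2, generations=40):
    n = start
    for _ in range(generations):
        # each member leaves 0, 1, or 2 surviving offspring, plus a small
        # chance of one extra (an average just above replacement)
        n = sum(random.choice([0, 1, 2]) + (random.random() < edge)
                for _ in range(n))
        if n == 0:
            return False           # the lineage has died out
        n = min(n, 30)             # the band can't outgrow its territory
    return True

trials = 10_000
print(sum(lineage_survives() for _ in range(trials)) / trials)
# roughly a quarter of runs survive; most defector lineages simply vanish
```

In small absolute numbers, chance dominates the slight edge, which is one way to read the claim that the exponential payoff never gets going.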
Say there are two kinds of forager groups, one which limits reproduction of its members by various means, and another that does not limit reproduction and instead constantly grows and splits and invades other groups’ territories if needed. Naively I would expect that the latter kind of group would tend to drive the former kind out of existence. Why didn’t this happen?
This isn’t necessarily evidence against a Malthusian equilibrium. It could be that the subsequent farmer lifestyle enabled survival for people with much poorer health and physical fitness, thus lowering the average health and fitness of those who managed to survive in the Malthusian equilibrium.
Can you give a reference that specifically discusses how a non-Malthusian situation of the foragers can be inferred from the existing archaeological evidence?
This is not true. Humans are (more or less) the only species that practices agriculture, but the Malthusian trap happens to non-human animals too. As long as reproduction above the replacement rate is possible, it will happen until the resource limit is reached. (Admittedly, for animals that aren’t apex predators, the situation is more complicated due to the predator-prey dynamics.)
Regarding the foragers’ supposed cooperation on keeping the population stable, I honestly don’t see how what you write makes sense, for at least two reasons:
The defectors would not need to reproduce in blatantly extraordinary numbers. It would be enough to reproduce just slightly above the replacement rate, so slightly that it might be unnoticeable for all practical purposes. The exponential growth would nevertheless explode their population in not very many generations and lead to them overwhelming others. So even if we assume that blatantly excessive reproduction would be punished, it would still leave them more than enough leeway for “cheating.”
How did this punishment mechanism evolve, and how did it remain stable? You can postulate any group selection mechanism by assuming altruistic punishment against individuals who deviate from the supposed group-optimal behavior. But you can’t just assert that such a mechanism must have existed because otherwise there would have been defection.
Moreover, you are now talking about group selection with altruistic punishment. There’s nothing inherently impossible or absurd about that, but these are very strong and highly controversial claims, which you are asserting in a confident and authoritative manner as if they were well-known or obvious.
I’d like to remind you that the ancestral environment was not completely stable, and no one is disputing that exponentially-expansive Malthusian agriculture happened. The question is why it took as long as it did, not why it was possible at all.
Estimates of world population growth come from:
http://faculty.plattsburgh.edu/david.curry/worldpop.htm
Essentially, for our first 2 million years of existence, human population worldwide went from about 10,000 to 4 million. Given that virtually all major models of long-run human population converge very closely, and that they all assume a relatively steady growth rate, we’re talking a doubling period on the order of 250,000 years.
Malthus’ estimates assume a doubling period of 25 years, or a single human generation. The difference is a factor of roughly 10,000. World population simply did not grow as fast as you’re assuming, and humanity did not start outstripping local carrying capacities in a major, systematic way until we’d developed technologies that allowed us to make those sorts of population growth leaps.
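For anyone who wants to check these figures, a quick back-of-the-envelope in Python, using the rounded numbers quoted above:

```python
import math

# 1. Unchecked compounding: a band of 50 growing 1% a year for 2,000 years.
print(50 * 1.01 ** 2000)                     # ~2.2e10, about 22 billion

# 2. Observed long-run growth: ~10,000 -> ~4,000,000 over ~2,000,000 years.
doublings = math.log2(4_000_000 / 10_000)    # ~8.6 doublings
print(2_000_000 / doublings)                 # ~230,000 years per doubling

# 3. Ratio of that period to Malthus's 25-year doubling.
print(2_000_000 / doublings / 25)            # ~9,300, i.e. order 10,000
```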
According to Michael Kremer in “Population Growth and Technological Change: One Million BC to 1990”, the base rate of technological change in human societies scales proportional to population—small population, slow technological change. This equals very long inferential distances to the sorts of techniques and behaviors that make agriculture a viable prospect.
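Kremer’s feedback is easy to caricature in a few lines. This is only a toy under invented constants (the coupling g and the carrying-capacity rule are made up, not calibrated to his paper), but it shows the qualitative shape: ages of near-stasis at small populations, then a rapid turnover.

```python
# Toy Kremer-style loop: technical change proportional to population,
# population limited by current technology. All constants are invented.
tech = 1.0
g = 1e-9                          # per-person contribution to growth, per year
for year in range(2_000_000):
    pop = 10_000 * tech           # population the current tech level supports
    tech += g * pop * tech        # more people -> faster technical change
    if pop > 1_000_000_000:
        print(f"takeoff after ~{year:,} years")
        break
```

With these made-up numbers the loop crawls for roughly a hundred thousand years and then runs away in a comparatively short window—the point being made about small populations and long inferential distances.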
You need intermediate steps, in the form of settled horticulture or nomadic pastoralism, to really concentrate the population enough to have a chance at developing agriculture in the intensive sense. Those sorts of cultural developments took a long time to come into being, and it was a gradual process at that.
So, yes, it’s true that if you grow certain grasses and just harvest their seeds reliably, grinding them into a fine powder and mixing that with water and then heating the whole mixture somehow without actually burning it in your fire directly, you can produce a food source that will unlock access to population-doubling intervals closer to the Malthusian assumption of one doubling per generation.
But that is a series of nested behaviors, NONE of which is intuitively obvious by itself from the perspective of a forager in a world full of nothing but other foragers. Which is why the entire chain took a long, long time to develop, and why agriculture was invented just a few times throughout human history.
Termites, leafcutter ants, certain damselfish, ambrosia beetles, and certain marsh snails all practice agriculture. But yes, it’s certainly an uncommon behavior.
What if reproduction above the replacement rate isn’t possible for the period of human evolution we’re talking about? What if the human population simply isn’t reproducing fast enough for most of prehistory to reach the resource limit? Those are the conditions I’m suggesting here—that reaching local resource limits was not the norm for much of our evolution, due to our inherent long gestation times and strong K-selection, the inherent metabolic requirements for fertility taking a long time to satisfy compared to modern conditions, the birth interval being very wide compared to Malthusian assumptions, and the techniques of food acquisition being of necessity limited by the ease of satisfying everybody’s requirements (if everyone has a full tummy and all their kids do too, going out and gathering MORE food at the expense of one’s kinsmen won’t do you any good anyway).
What you get is abundance—there’s room to grow, but we can only do it so fast, and when we start to reach the point where we might overtax our resource base, we’ve moved on and there weren’t enough of us using it in the first place to compromise it.
That kind of statistical hackery might work in a large population, but not very well in a small one. In a group of 100 humans, ANY population gain is noticeable.
Except all the evidence suggests a population explosion simply wasn’t happening—the assumption that humans must have reproduced at the fastest allowable rate doesn’t survive the record. Populations doubled in a quarter-million years, not 25.
It didn’t evolve genetically, it’s a cultural punishment I’m talking about. Ju/’hoansi hunters are taken down a notch whenever they make a kill. Certain Australian aboriginal groups have meat-sharing customs where one hunter goes out and gets a kangaroo (say), and his share of the meat is the intestines or penis—the choicer cuts get distributed according to a set of other rules. Except, then people invite the hunter over to dinner; he’s not forced to actually eat crow every time he succeeds, but he’s also socially aware that he depends upon the others for it (and he gets to receive a choicer share when some other hunter makes a kill).
I don’t understand your argument here at all. Earlier you said that growth to the Malthusian limit was prevented by a cooperative strategy of restraining reproduction. Now you say that lack of food production technology was limiting population growth. But if foragers did breed up to the limit where food became the limiting resource, that’s by definition a Malthusian equilibrium.
You are also presenting a strawman caricature of Malthus. His claim about a 25-year doubling period refers to agricultural societies with an ample supply of land, such as existed in North America of his day. He presents it as an empirical finding. When he discusses foragers, he notes that they’ll reproduce to the point where they run against the limited food supply available from foraging, which given the low supply of food relative to farming, means a much less dense population.
Some of his discussions of foragers are actually quite interesting. He notes that among the North American hunter-gatherers, resource limitations lead to constant disputes and warfare. He also cites accounts of European explorers’ contacts with forager peoples that seem to have been on the Malthusian limit.
It doesn’t matter—it still needs to be explained. Humans don’t just magically develop cultural norms that solve collective action problems.
What I said was that growth to the point of constant warfare, competition and struggle for enough food to subsist wasn’t an accurate picture of ancestral forager lifestyles.
He also says that smallpox was endemic among the Indians of all these cultures. Smallpox originated in Eurasia, thrived among farmers, and Native Americans had no immunity to it. His picture of the squalor and disease these people lived in is a picture of the conditions they were subjected to at the hands of an invading power with novel biological agents their immune systems simply weren’t adapted to handle. The nastiest conflicts likewise came after contact.
Warfare among Northwest Coast Natives, prior to colonization, was usually over petty disputes (that is, interpersonal ones) between peoples who had long-standing trade and treaty relationships, and only occasionally over resources (usually slaves, and the institution of slavery as it was practiced here does not compare readily with slavery as it was practiced by agriculturalists in Eurasia and Africa). The bloodier wars of the inland northwest are similarly a historical novelty, unparalleled in scope or stakes until the ravages of introduced diseases and the dislocation of various tribes by white invaders into territories they’d never been in competition for caused clashes that simply hadn’t occurred at such a level of intensity prior to that point. The formation of reservations only exacerbated this—we’re talking about groups with age-old rivalries who had never seen fit to exterminate one another or conquer one another’s lands, but who would happily send a war canoe full of men to go steal things because of a petty vendetta between two people that started long ago.
This isn’t war of extermination. Don’t get me wrong, it’s violent, people die, the stakes are real, but it’s not a zero-sum, winner-take-all competition for survival. A direct translation out of Old Chinook from Franz Boas’ ethnography, regarding the rules of warfare, should make this clearer:
“Before the people go to war they sing. If one of them sees blood, he will be killed in battle. When two see blood, they will be killed. They finish their singing. When they sing, two long planks are put down parallel to each other. All the warriors sing. They kneel [on the planks]. Now they go to war and fight. When people of both parties have been killed, they stop. After some time the two parties exchange presents and make peace. When a feud has not yet been settled, they marry a woman to a man of the other town and they make peace.”
The fight ends when both sides have taken casualties. The opposing sides exchange gifts and make peace. They resolve outstanding feuds by diplomatic marriage. This is the Chinook idea of war, the way it was practiced with all but their very worst enemies (who lived rather a long way from Chinook territory—the Quileute weren’t exactly next door given the pace of travel in those days, and even then the wars between them were not genocidal in intent). This is completely different from war as most Eurasian-descended cultures knew it. And it was typical of forager warfare in North America before Columbus showed up.
Malthus, in looking at the conditions of North American natives during the early 19th century, reports on the dire conditions of a people devastated by introduced diseases, direct conquest by white settlers, and the disruption of their social fabric and ways of life: whole culture groups pushed beyond the breaking point and very much outside their typical context, with most of their actual problems the direct effects of colonization.
Some of the accounts presented by Malthus were given by very early explorers and adventurers who ended up deep in unexplored territory, far ahead of European conquest and colonization. For example, the one by Cabeza de Vaca would be circa 1530.
The only way these societies could have already been devastated is if epidemics had ravaged the whole continent immediately in the first decades after the first Europeans landed, ahead of any European contact with the inland peoples. I don’t know enough about the relevant history to know how plausible this is, but even if it happened, there are two problems with your claim:
Diseases wouldn’t cause famine, at least in the long run. These early explorers describe peoples who had problems making ends meet during bad seasons due to insufficient food, and who fought bitterly over the existing limited supply. If the population had already been thinned down by disease by the time they came, we’d expect, if anything, the per capita food supply from foraging to be greater than before.
If even the earliest accounts are of devastated societies, then how do we know anything about the better life they led before that? Where does this information come from? You cite an ethnography by Boas, who was born in 1858, as authoritative, but dismiss a compilation of far older accounts compiled by Malthus in the early 19th century.
Smallpox emerged in the Old World around 10,000 BC and is believed to have originated via cattle farming. It reached very high concentrations in Europe and became a common plague there; it was spread around the world to peoples who had never encountered it by European exploration and conquest. It and other Old World diseases spread very rapidly among American native populations, rendering whole cultures extinct and reducing others to scattered survivors often incapable of rebuilding. The total population of the Americas lost to European diseases after the arrival of Columbus and Cortez is estimated at 90 to 95 percent.
Given that many Native nations were at least modestly dependent on agriculture (the Iroquois, Navajo, Aztecs, Incas, Mississippians—indeed, most of the well-known groups), such population losses coming so quickly are nothing short of catastrophic. Most of your resource base collapses because one person is going to have to work MUCH harder to provide enough food for themselves—fields go unplanted, vegetables don’t get tended, wild game is much more dangerous to hunt by oneself, and one cannot expect any assistance with gathering. Even a small number of people used to an agriculture-enriched lifestyle are going to be hit much harder.
It’s also worth noting that Cabeza de Vaca actually described the Coahuiltecs as a healthy and prosperous people—and ant eggs, lizards and so on were just normal parts of their diet. Ant eggs in particular are STILL a cultural delicacy among the Latino groups descended from the Coahuiltecs (escamole tacos, anyone?). Diet adapts to local circumstances.
That is precisely what happened. One infected slave from Spanish-held Cuba is believed to be the Patient Zero that transmitted an infection which would go on to wipe out about fifty percent of the Aztec population. Hernando de Soto, exploring the southeast, encountered many towns and villages abandoned just two years prior when most of their inhabitants died of the plagues. Isolated survivors often just abandoned their homes outright, since in many cases a handful of people or even a single survivor were all that was left out of a village of hundreds or thousands. Neighbors who showed up, unaware of what happened, might contract disease from the corpses in some cases, or simply welcome in the survivors who’d start the cycle anew. North America had extensive trade routes linking all major regions, from coast to coast. Foot and boat traffic carried diseases quite far from their initial outbreak sites.
Because they’re not all dead, and they left their own records of what happened and there are records of contact with them in much better conditions, and there are still plenty of Native people alive today, who often know rather more about said records of their lives before than the typical Euro-American? And because it’s generally acknowledged within anthropological, archaeological and historical fields now that modern research bears out a picture of generally healthy, sustainable populations for most of the foragers of the Americas? And quite large, complex societies that were generally not recognized as such by early Anglo scholars into the matter?
(Malthus seriously misrepresents Cabeza de Vaca’s case—the Floridians were in a bad way, but they were also right next door to early Spanish conquest. His accounts of the Coahuiltecs of coastal and inland Texas describe them as a healthy and prosperous people...and their descendants STILL enjoy ant eggs as a dietary item; you don’t have to be desperate to eat insects, and many human groups actively enjoy them.)
Boas actually travelled to the civilizations he wrote about, lived among them, recorded their oral traditions and analyzed their languages, investigated their history and their environmental circumstances. For many people, especially in the Northwest, far North and other relatively late-contacted areas, these events occurred within the living memory of their elders.
Malthus wasn’t an expert on Native American civilizations or history, and basically went with the prevailing account available at the time. He relied on a consensus that wasn’t yet well-understood to be false. So I reject Malthus’ picture of pre-Columbian America for the same reason I reject Lysenko’s account of evolution. The difference is that Malthus was an influential thinker within the development of Western thought, and his role means that a lot of people who agree with what insights he did make are unwittingly buying into cached arguments about related subjects (often ones that don’t support his case) which hadn’t yet been discovered as such when Malthus wrote in the first place.
Scholarship in the field since Malthus’ time has seriously changed the outlook—Charles C. Mann and Jared Diamond are good, accessible sources for a summary overview (“1491” and “Guns, Germs and Steel”). If I seem to be vague, it’s mostly because this is domain-specific knowledge that’s not widely understood outside the domain, but as domain insider it’s fairly basic stuff.
How exactly does this modern research reconstruct the life of American foragers centuries ago, and based on what evidence? Could you cite some of this work? (I’d like to see the original work that presumably explains its methodology rigorously, not popular summaries.)
I also note that you haven’t answered Wei Dai’s question.
Regarding Malthus and de Vaca, you say:
Here is a translation of de Vaca’s original account:
http://www.pbs.org/weta/thewest/resources/archives/one/cabeza.htm
On closer look, it turns out that de Vaca’s description cited by Malthus actually refers to a people from southeastern Texas, not Florida. So while Malthus apparently mixed up the location by accident, his summary is otherwise accurate. Your above claims are therefore completely incorrect—the description is in fact of a people from Texas, living very far from the boundary of Spanish conquest at the time.
For reference, I quote de Vaca’s account at length (all emphasis mine):
Castillo and Estevanico went inland to the Iguaces. [...] Their principal food are two or three kinds of roots, which they hunt for all over the land; they are very unhealthy, inflating, and it takes two days to roast them. Many are very bitter, and with all that they are gathered with difficulty. But those people are so much exposed to starvation that these roots are to them indispensable and they walk two and three leagues to obtain them. Now and then they kill deer and at times get a fish, but this is so little and their hunger so great that they eat spiders and ant eggs, worms, lizards and salamanders and serpents, also vipers the bite of which is deadly. They swallow earth and wood, and all they can get, the dung of deer and more things I do not mention; and I verily believe, from what I saw, that if there were any stones in the country they would eat them also. They preserve the bones of the fish they eat, of snakes and other animals, to pulverize them and eat the powder. [...] Their best times are when “tunas” (prickly pears) are ripe, because then they have plenty to eat and spend the time in dancing and eating day and night. [...] While with them it happened many times that we were three or four days without food. Then, in order to cheer us, they would tell us not to despair, since we would have tunas very soon and eat much and drink their juice and get big stomachs and be merry, contented and without hunger. But from the day they said it to the season of the tunas there would still elapse five or six months, and we had to wait that long.
Also, regarding this:
Earlier you claimed that the native population of the entire American continent was devastated by epidemics immediately after the first European contacts in the late 15th/early 16th century, so that even the accounts of very early European explorers who traveled deep into the continent ahead of European colonization do not present an accurate picture of the native foragers’ good life they had lived before that. But now you claim that in the late 19th century, this good life was still within living memory for some of them.
It seems like you’re accepting or discounting evidence selectively. I can’t believe that all those accounts cited by Malthus refer to societies devastated by epidemics ahead of European contact, but on the other hand, the pre-epidemic good times were still within living memory for the people studied by Boas centuries later.
Lysenko was motivated by politics. Boas was motivated by politics.
Physics improves, but history deteriorates. Those writers closest to events give us the most accurate picture, while later writers merely add political spin. Since 1830, history has suffered increasingly drastic, frequent, and outrageous politically motivated rewrites, and has become more and more subject to a single monolithic political view, uniformly applied to all history books written in a particular period.
If you read old histories, they explain that they know such and such because of such and such. When later histories disagree with older ones and you check the evidence the older histories cite, you usually find that the newer histories are making stuff up. The older history says X said Y, and quotes him. The newer history says that X said B, and fails to quote him, or quotes him out of context, or simply asserts B without any explanation of how the author could possibly know B.
Both Clark and Tainter (The Collapse of Complex Societies) disagree with this claim as stated. A massive reduction in the population means that the survivors get increased per-capita wealth, because the survivors move way back along the diminishing-marginal-returns curve and now have more low-hanging fruit (sometimes literally). In fact, Tainter argues that complexity often collapses because the collapse is the only way to increase per-capita wealth. Hunter-gatherers spend much less time per calorie than do advanced agriculturalists, e.g.:
The surprise here is that while there is wild variation across forager and shifting cultivation societies, many of them had food production systems which yielded much larger numbers of calories per hour of labor than English agriculture in 1800, at a time when labor productivity in English agriculture was probably the highest in Europe. In 1800 the total value of output per man-hour in English agriculture was 6.6 pence, which would buy 3,600 kilocalories of flour but only 1,800 kilocalories of fats and 1,300 kilocalories of meat. Assuming English farm output was then half grains, one-quarter fats, and one-quarter meat, this implies an output of 2,600 calories per worker-hour on average. Since the average person ate 2,300 kilocalories per day (table 3.6), each farm worker fed eleven people, so labor productivity was very high in England. Table 3.13 shows in comparison the energy yields of foraging and shifting cultivation societies per worker-hour. The range in labor productivities is huge, but the minimum average labor productivity, that for the Ache in Paraguay, is 1,985 kilocalories per hour, not much below England in 1800. The median yield per labor hour, 6,042 kilocalories, is more than double English labor productivity.
Or:
...ranging from a modest 1,452 kilocalories per person per day for the Yanomamo of Brazil to a kingly 3,827 kilocalories per person per day for the Ache of Paraguay. Some of this is undoubtedly the result of errors in measuring food consumption. But the median is 2,340, implying that hunter-gatherers and subsistence agriculturalists ate as many calories as the median person in England or Belgium circa 1800. Primitive man ate well compared with one of the richest societies in the world in 1800. Indeed British farm laborers by 1863 had just reached the median consumption of these forager and subsistence societies.
(Quotes brought to you by my Evernote; it’s a pain in the ass to excerpt all the important bits from a book, but it certainly pays off later if you want to cite it for various assertions.)
Some quotes from Clark’s Farewell to Alms (he also covers the very high age of marriage in England as one way England held down population growth):
Fertility was also probably high among the precontact Polynesians. Sexual activity among women was early and universal. Why then was Tahiti such an apparent paradise to the visiting English sailors, rather than a society driven to the very subsistence margin of material income, as in Japan? The answer seems to be that infanticide was widely practiced...The estimates from the early nineteenth century are that between two-thirds and three-quarters of all children born were killed immediately...One sign of the practice of infanticide was the agreement by most visitors that there were more men than women on the islands. …In preindustrial China and Japan the gender ratio of the population shows that there was significant female infanticide. In these Malthusian economies infanticide did raise living standards.
An additional factor driving down birth rates (and also of course driving up death rates) was the Chinese practice of female infanticide. For example, based on the imbalance between recorded male and female births an estimated 20–25 percent of girls died from infanticide in Liaoning. Evidence that the cause was conscious female infanticide comes from the association between the gender imbalance of births and other factors. When grain prices were high, more girls are missing. First children were more likely to be female than later children. The chance of a female birth being recorded for later children also declined with the numbers of female births already recorded for the family. All this suggests female infanticide that was consciously and deliberately practiced.
…
Female infanticide meant that, while nearly all women married, almost 20 percent of men never found brides. Thus the overall birth rate per person, which determines life expectancy, was reduced. The overall birth rate for the eighteenth century is unclear from the data given in this study, but by the 1860s, when the population was stationary, it was around 35 per thousand, about the same as in preindustrial Europe, and less than in many poor countries today. Earlier and more frequent marriage than in northwestern Europe was counteracted by lower marital fertility and by female infanticide, resulting in equivalent overall fertility rates.
Just to be clear, and so everyone knows where the goalposts are: as per the definition here: http://en.wikipedia.org/wiki/Hunter-gatherer , a forager society relies principally or entirely on wild-gathered food sources. Modern examples include the Pila Nguru, the Sentinelese of the Andaman Islands, the Pirahã, the Nukak, the Inuit until the mid-20th century, the Hadza and San of southern Africa, and others.
To those not deeply familiar with anthropology this can lead to some counterintuitive cases. The Yanomamo, who depend mainly on domesticated bananas supplemented by hunting and fishing, aren’t foragers in the strict sense. The modern Maya, and many Native American groups in general weren’t pure foragers. The Salish and Chinook peoples of the Pacific Northwest of the United States were sedentary foragers.
The Polynesians and Chinese of those periods were not foragers—both societies practiced extensive agriculture supplemented by hunting and gathering, as in preindustrial Europe.
I never said they were foragers; I thought the quotes were interesting from the controlling population perspective.
My apologies—skimmed rather than read in detail and missed the purpose of your comment. Reply left up anyway since it may clarify terminology and definitions re: foragers for anyone who happens upon the thread later. Thank you for clarifying!
Well that is certainly a lot for me to learn more about. Sorry I missed this post. How much of this has been directly observed in modern forager societies versus inferences from archaeology?
There are a lot of other studies of passive fertility differences in forager groups that bear out the cross-cultural applicability of the San studies as well. Forgot to add that.
Studies of forager groups on several continents have come to the same basic conclusions around that. Some of those findings are summarized here: http://books.google.com/books?id=grrA421tRNkC&pg=PA431&lpg=PA431&dq=foragers+and+menarche&source=bl&ots=WNuoQO-gYV&sig=h1ahBo5ApBv4Q9uYxD47pM_whNM&hl=en&ei=NtBNTpzkFeOssALYip3rBg&sa=X&oi=book_result&ct=result&resnum=3&ved=0CDAQ6AEwAg#v=onepage&q=foragers%20and%20menarche&f=false
The bits about breastfeeding and the other biological limiting factors (the indirect controls, basically) came to light during Richard Lee’s fieldwork with the San and Ju/’hoansi peoples of southern Africa in the 1960s.
The bit about active measures is available if you peruse the anthropological literature on the subject (I don’t have a specific citation in mind); it’s the sort of thing covered in introductory classes to the field—it’s common knowledge within that domain.
As to resource warfare, it’s a non-starter for most foragers. You walk away, or you strike an agreement about the use of lands. There are conflicts anyway, but they’re infrequent—the incentive isn’t present to justify a bloody battle most of the time. And it doesn’t come up as often as you think, either, because as I’ve stated, forager populations don’t grow as quickly (they tend to stay around carrying capacity when different groups are summed over a given area) and indeed, devote active effort to keeping it that way, which supplements the tremendous passive biases in favor of slow growth.
Where it does come into prominence is with low-tech agriculturalists, pastoralists and horticulturalists. Those people have something to fight over (a stationary, vulnerable or scarce landbase, that rewards their effort with high population growth and gives incentive to expand or lock down an area for their exclusive use).
So in a forager society, population growth is managed how, specifically? Abstinence?
See my other reply, the long one, which goes into some detail answering that question.
Sorry, I don’t see where you do. Food preservation techniques, migratory habits, gathering crabs or berries doesn’t tell me anything at all about how people avoided population growth.
http://lesswrong.com/lw/6vq/on_the_unpopularity_of_cryonics_life_sucks_but_at/4ny0 Right here.
It turns out that homelessness, in and of itself, approximately quadruples one’s mortality risk: study pointer: