PALEODIET Archives

Paleolithic Diet Symposium List

PALEODIET@LISTSERV.ICORS.ORG


From: Loren Cordain <[log in to unmask]>
Date: Sat, 29 Mar 1997 16:04:00 -0700
A few comments from the last paleodigest:

1.      Dean writes: I wish we could get Boyd Eaton to join us, as he seems
to have done an awful lot of work in this department.

I am in regular communication with Boyd, and I hope you all will read
our most recent piece on ancestral exercise patterns (Cordain L,
Gotshall RW, Eaton SB.  Evolutionary aspects of exercise.  World Review
of Nutrition and Dietetics 1997;81:in press).   As far as I know, Boyd
is not yet online, but I am hopeful that this will change shortly, as I
know he now has a computer.

2.      David Ross writes: In light of some of the upwardly revised estimates
of the percentage of protein in the paleo-diet (50-60%), could someone
comment on the possibility that early man (or contemporary hunter-gatherers,
for that matter) was a victim of acidosis problems? I believe that there is a
fair amount of evidence that the acid ash of excess protein (roughly, anything
more than 50 g/day) cannot, in the long run, be neutralized by the buffering
systems that maintain fluid pH at beneficial levels.

Speth has written extensively on this topic.   A good starting point is
Speth JD.  Early hominid hunting and scavenging: the role of meat as an
energy source.  J Hum Evol 1989;18:329-43.  Dr. Speth suggests that 300
g/day, or roughly 50% of one's normal total daily caloric intake, would be
the upper limit of protein that could be safely consumed on a regular
daily basis without impairing health.   Speth points out that the liver
apparently has difficulty metabolizing excessive dietary amino acids and
that the kidneys may be unable to adequately excrete the urea and purine
by-products of excessive dietary protein intake.   However, there is
scant experimental evidence in humans to critically confirm or deny
this.   The classic study of Stefansson's all-meat diet indicated that
when carbohydrates were excluded from the diet, the ad libitum
macronutrient intake was ~80% fat and 20% protein.   Protein levels above
20% produced feelings of nausea and unease (Stefansson V.  The Fat of the
Land. Macmillan, New York, 1960, pp. 60-89).   Note that this experiment
was conducted in Bellevue Hospital under metabolic ward conditions, and
the results of this prolonged (1-year) dietary trial were published in
most of the major scientific journals of that era, including JAMA, J Am
Dietetic Assn, and J Biol Chem.
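Speth's ceiling can be checked with back-of-the-envelope arithmetic.  A
minimal sketch follows; the 2,400 kcal/day total intake is an assumed
round figure for illustration, not a number from Speth:

```python
# Check that 300 g/day of protein is roughly 50% of daily calories.
KCAL_PER_G_PROTEIN = 4        # standard Atwater factor for protein

protein_g = 300               # Speth's suggested upper limit, g/day
total_kcal = 2400             # assumed daily intake (illustrative only)

protein_kcal = protein_g * KCAL_PER_G_PROTEIN
fraction = protein_kcal / total_kcal

print(f"{protein_kcal} kcal from protein = {fraction:.0%} of {total_kcal} kcal")
# 1200 kcal from protein = 50% of 2400 kcal
```

On a larger assumed energy budget (say 3,000 kcal/day, plausible for an
active forager), the same 300 g works out to 40% of calories, so the
"roughly 50%" phrasing depends on the total intake assumed.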

3.      Staffan writes:    As far as I can figure out we don't even know for
sure that there was savanna at the places in question and if so what kind
of savanna.

Obviously, we can never know with absolute certainty the climatic
conditions and the vegetation associated with certain geographic
locations 2-4 million years ago (MYA).   However, recent studies of
marine eolian dust records corroborate marine oxygen isotope records
and clearly show that more arid conditions occurred in East Africa near
2.8 MYA and 1.7 MYA (deMenocal PB.  Plio-Pleistocene African climate.
Science 1995;270:53-59).   These time frames coincide with the first
appearance of H. habilis (2.4-2.2 MYA) and H. erectus/ergaster (1.9-1.7
MYA).   The marine climatic data correlate well with core pollen data
from East Africa from this time, as well as with fossils of herbivores
known to inhabit savanna grassland.   All of these pieces of the puzzle
point to a reduction of tropical forest and woodland and an increase in
open savanna areas dominated by Gramineae (grass) species.

4.      Dean writes: Earliest evidence for use of fire for cooking among
humans seems to be 25,000 years (I have no reference for that handy, let
me know if that's in dispute).

I refer Dean to the classic paper: James SR.  Hominid use of fire in the
lower and middle Pleistocene. Current Anthropology 1989;30:1-26.
There is no single date at which fire was mastered by all hominids at
all locations on the earth.   Obviously, dated evidence for early fire
use varies by geographic locale.   Certainly, the ability to make fire
(flint stones or drills) came much later in man's evolution in some
locales and never in others.   The Efe still rely upon collecting
naturally occurring lightning fires and transporting fire from hearth to
hearth, and the Tasmanian aborigines at first contact did not even use
fire.   This method (collecting lightning fires) of fire control was
likely the procedure first used by our ancestors.   Fire probably was
not used initially to cook meat, let alone plant foods, but was
initially a strategy for overwintering in more northern latitudes
(keeping warm, thawing frozen scavenged meat) or perhaps utilized in
hunting.   James (1989) suggests that fire was being used by hominids
between 230,000 and 400,000 years ago in Europe and that a date of
400,000-500,000 years would have been too early for the evidence at
Zhoukoudian in China. The first appearance of hearths (burned stones
arranged in a circle or semi-circle) represents unequivocal use of fire,
and no actual hearths have been found until the appearance of the
Neanderthals at the end of the middle Pleistocene (~200,000 years ago).
The Lehringen wooden spear (dated to 125,000 years ago in Germany),
recovered from between the ribs of an extinct straight-tusked elephant,
has been reported to be fire hardened (Movius HL.  A wooden spear of
third interglacial age from Lower Saxony. Southwestern J Anthropol 1950;
6:139-42).   Actually, it is thought that burning of wooden spear tips
was done to ease their carving with stone shavers as well as to harden
them (Oakley KP et al.  A reappraisal of the Clacton spearpoint.
Proc Prehistoric Soc 1977;43:13-30).   Fire, then, was certainly part of
most of our species' technological repertoire by the appearance of
behaviorally modern humans 35,000-40,000 years ago.   The manner in which
it was controlled varied by geographic locale over time.

Dean further comments: Although it's hard to imagine them making up the
majority of the diet, if people can comfortably eat wild grains without
technology then there's not much reason to think they wouldn't, is
there?

People can put many plant items as well as non-edible items (stones,
bones, feathers, cartilage, etc.) into their gastrointestinal tracts by
way of putting them into their mouths.   The key here is the ability of
the GI tract to extract the nutrients (calories, protein, carbohydrate,
fat, vitamins and minerals).  Bi-gastric herbivores have evolved an
efficient second gut with bacteria that can ferment the fiber found in
leaves, shrubs, forbs and grasses and thereby extract nutrients in an
energetically efficient manner (that is, there is more energy in the
food than is required to digest it).    Humans can clearly put grasses
and grass seeds into their mouths; however, we do not have a GI tract
which can efficiently extract the energy and nutrients.   The starch,
and hence the carbohydrate and protein calories, in cereal grains
occurs inside the cell walls of the grain.  Because the cell walls of
cereal grains are almost completely resistant to the mechanical and
chemical action of the human GI tract, whole cereal grains have been
shown to pass through the entire GI tract and appear intact in the feces
(Stephen A.  Whole grains - impact of consuming whole grains on
physiological effects of dietary fiber and starch. Crit Rev Food Sci
Nutr 1994;34:499-511).    In order to make the nutrients in cereal
grains available for digestion, the cell walls must first be broken (by
milling) to liberate their contents, and then the resultant flour must
be cooked.    Cooking causes the starch granules in the flour to swell
and be disrupted by a process called gelatinization, which renders the
starch much more accessible to digestion by pancreatic amylase (Stephen
A, 1994).   It has been shown that the protein digestibility of raw rice
is only 25%, whereas cooking increases it to 65% (Bradbury JH et al.
Digestibility of proteins of the histological components of cooked and
raw rice. Brit J Nutr 1984;52:507-513).
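The practical size of that digestibility gap is easy to work out.  A
minimal sketch; the 100 g serving and ~7 g protein content are
illustrative assumptions, not figures from Bradbury et al.:

```python
# Absorbable protein from a serving of rice, raw vs. cooked.
# Digestibility fractions are from Bradbury et al. (1984); the
# serving size and protein content are assumed for illustration.
protein_per_serving_g = 7.0   # g protein in ~100 g rice (assumed)
raw_digestibility = 0.25      # 25% of protein digestible when raw
cooked_digestibility = 0.65   # 65% digestible after cooking

raw_absorbed = protein_per_serving_g * raw_digestibility
cooked_absorbed = protein_per_serving_g * cooked_digestibility

print(f"raw: {raw_absorbed:.2f} g absorbed; cooked: {cooked_absorbed:.2f} g")
# raw: 1.75 g absorbed; cooked: 4.55 g
```

Whatever the true serving size, the ratio 0.65/0.25 means cooking makes
about 2.6 times as much of the rice protein available for digestion.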
        The main cereal grains that humans now eat (wheat, rice, corn, barley,
rye, oats, millet, sorghum) are quite different from their wild,
ancestral counterparts, from which all were derived in the past 10,000
years.   We have deliberately selected for large grains with minimal
chaff that are easily harvestable.  The wild counterparts of these
grains were smaller and difficult to harvest.  Further, separation of
the chaff from the grain was time consuming and required fine baskets
for the winnowing process.   Once the chaff is separated from the grain,
the grains have to be milled and the resultant flour cooked.   This
process is time consuming and obviously could have only come about in
very recent geologic times.   Further, the 8 cereal grains now commonly
eaten are endemic to very narrow geographic locations and consequently,
by their geographic isolation, would have been unavailable to all but a
select few populations of hominids.
        Now Dean, I haven't even touched upon the issue of antinutrients in raw
cereal grains, and believe me this is an issue.   There are components
in raw cereal grains which wreak absolute havoc with human health and
well being.   The primary storage form of phosphorus in cereal grains
is phytate, and phytates bind virtually all divalent ions.   Excessive
(50-60% of total calories) consumption of whole-grain, unleavened breads
commonly results in rickets, hypogonadal dwarfism, and iron-deficiency
anemia (I will provide the references upon request).   The main
lectin in wheat (wheat germ agglutinin) has catastrophic effects upon
the gastrointestinal tract (Pusztai A et al.  Antinutritive effects of
wheat germ agglutinin and other N-acetylglucosamine-specific lectins.
Brit J Nutr 1993;70:313-21).   Additionally, the alkylresorcinols of
cereals influence prostanoid tone and induce a more inflammatory
profile (Hengtrakul P et al.  Effects of cereal alkylresorcinols on
human platelet thromboxane production. J Nutr Biochem 1991;2:20-24) as
well as depressing growth (Sedlet K et al.  Growth depressing effects of
5-n-pentadecylresorcinol: a model for cereal alkylresorcinols. Cereal
Chem 1984;61:239-41).
        So, Dean, if you choose to eat raw cereal grains, please perform this
experiment.   Go out and buy some whole-grain wheat seeds and swallow a
handful of them.   Monitor your fecal contents over the next couple of
days and report to this forum what you have observed.   If you choose to
eat raw legumes (beans), you can probably smell the results of this
experiment before you see them.

Once in a while you get shown the light/
 In the strangest of places if you look at it right   ---Robert Hunter

        Sometimes you get shown the smell before you see the light, and it
oftentimes occurs in the strangest of places.   What a long strange trip
it's been.
