Fasting from Bread: My 24th Day Milestone
Tuesday, the 24th day of the Bread Fast
These are some of the notes I've taken, as I continue on this fast from bread (and all grains, and cheese, and dairy, and meat, and caffeine, and alcohol, and all cooked foods), eating only a raw food diet of fruits and vegetables for a 30 day detox. This is going to ramble a bit.
Recap: Why I'm doing this
As a self-confessed exorphin junkie, I have temporarily adopted a low-fat diet of raw fruits and vegetables in order to detox from bread. In the beginning of this fast, it was to test myself, to see if I could go a mere 3 full days without eating my favourite food -- a food that I love, that has never given me any problems that I know of, a food that I believe in, and a food that has kept me nourished and amused for over 5 years, as I have taught myself how to bake it in this blog. And yet, from the very beginning I called myself an "exorphin junkie," wondering whether bread was a real addiction for me. I certainly ate a lot of it.
As a form of detox, I pledged to eat only a raw food diet of fruits and vegetables for 3 days. At the end of those first 3 days, I extended this fast to 1 week; after that week, to 10 days; and it was only after 10 days that I felt a month (30 days) might be possible.
This is still a day-to-day trial for me. I have witnessed my brain telling me different things, on different days -- that I should stop this, that I should finish this, that I should eat this thing, that I should eat that thing. Some of the brain's messages are simply habit. Some are pure willpower. Some are pure craving. Some are the result of my research or my attempt to come up with some sort of an archetypal ideal foodstuff for humans. It is difficult to sort it all out sometimes.
As I have continued to research the detox and the all-raw diet, I believe I've found evidence that I ought to stop at the end of 30 days, and reintroduce some cooked foods. As I indicated in my last blog entry, it is now my belief that humans require cooked starches in their diet. However, I would like to continue to eschew (rather than chew) grains, to see if I can extend this fast from bread to one full year. The cooked starches I intend to reintroduce to my diet will be things like potatoes, sweet potatoes, legumes, etc., rather than the starches of rice, barley, oats, corn and wheat. At least for now.
If I manage to go without bread for a whole year, perhaps I can say with certainty that this exorphin junkie has kicked the habit. I might reintroduce bread at that time. At this time, I feel that it is not the bread that has ever given me any trouble, but what I have put on it that has seen my weight creep up gently over the years.
But it is all just an experiment, and I've got a lot to learn.
Learning from my Patients
Working on the palliative care ward of a tertiary hospital as I do, you get to observe something very curious first hand. Many patients come to us from the acute care wards, after a prolonged battle with their illness or disease. When medical science has exhausted its repertoire of drugs or therapies, or when patients have become exhausted with the endless battery of tests and treatments, and when there has been no improvement, or even a worsening of the condition, the palliative care team may be called in to transition patients to comfort-based care and attend to symptom control. Patients only come to us from the acute care settings when they agree that comfort-based care is the next step. Many patients and families have to be convinced, before that step is taken, that the care will not end when treatment of the acute illness is eased out of the equation.

And so they come to us -- and that's when we begin to notice something odd. Dealing with symptoms only, and providing some pain relief, we palliators often see a dramatic change in our patients: before they die, many of them actually improve somewhat. Taken off most medications, and often no longer eating anything, some patients start getting better. Inexplicably, the body will begin to heal, provided it receives good nursing care. We see wounds knit, and close. The body does not know that this patient has come here to die; it continues to do what it has always tried to do -- heal itself. Yes, oftentimes the illness or disease is unstoppable, and death is inevitable. But it is not inevitably so. Some people will get better enough to make a further transition, either back home, or to a long term care facility.
It is not my place to discuss nosocomial illnesses or iatrogenic causes of health problems, those disasters of medicine and the hospital environment. Instead, I want to talk about diet -- especially about the lack of diet, or what we call fasting. I want to examine what is happening to people who come to us, too sick to eat, on death's door, who actually do improve their condition somewhat, even before death. Let's talk about what happens to the body when we fast.
Fasting
We are all familiar with breakfast. It is what we eat first thing in the morning, after we have fasted all night. So most of us can go without food from 8 pm (after supper) until 8 am (breakfast) -- 12 hours -- with little discomfort. And the body uses this time of sleep to readjust itself. When we take away our continual food source, the body has to live on its reserves. What are those reserves? The things you have eaten during the day, and perhaps the weeks, months, or years, before falling asleep last night. Many of our patients' families are astonished and worried that their loved ones are going so long without eating. With just a little bit of water, but no food, one cannot live indefinitely, but one can live a lot longer than you'd think. Recently the world was amazed when a living person was pulled out of the rubble of a collapsed factory building in Bangladesh, 17 days after the structure came down. She had access to rainwater and a few biscuits, and rationed them sparingly. News reports said that her ordeal was a testimony to the human spirit of survival, but then went on to list several other similar examples of seemingly miraculous survival for unlikely numbers of days on little more than water.
So what do the body and brain use for fuel, when we aren't eating?
To answer that, first a reminder of what food is.
Eating
When we eat, we take in plants or animal products and digest them into nutrients that the cells of our bodies require for energy. The main fuel that cells run on is glucose, and this is provided through the breakdown of macronutrients. The macronutrients we need are water, carbohydrates, proteins and fats -- and only the last three provide energy, which is measured in calories. There are many micronutrients that we also need, the most famous being vitamins, minerals and amino acids. When you eat an entire diet of nothing but fruits and vegetables, as I have been while on this month-long fast from bread, you are taking in a lot of vitamins and minerals, but very little fat and protein. And the body does quite well on fruits and vegetables, because these provide carbohydrates, vitamins (especially the fruits) and minerals (especially the veggies).

With less than a week to go on this experiment, I can tell you that you would not want to live on a raw diet of fruits and vegetables for an extended period of time, because on this diet (along with my usual complete fasting of 2 days a week), I am losing about 1 pound a day. The raw fruit and veggies do provide some minimal protein and fat, but these, along with the carbs, do not provide enough calories to maintain my weight. I really have to push myself to eat more fruit during the day to get my calorie levels higher. But I'm finding it very difficult to eat that much, and so my total calories for the day are likely to be deficient -- hence the weight loss.
I even see a small amount of weight loss each night when I fast during sleep (as much as 2 pounds, during the sleep that follows a total fast of 24 hours). My body, during the resting phase, is breaking down stored fat for fuel. Let me reiterate: I am virtually unable to push myself to eat enough raw fruits and veggies during the day to obtain enough carbohydrate to maintain my current energy level (and I don't consider myself particularly active); and at night, when I stop eating, my body quickly looks for other sources of energy, and breaks down my fat stores. In other words, carbs from fruit and veggies are used almost immediately by the body, and surprisingly little of that energy is stored. The body can store some glucose from carbs in the form of glycogen, in the liver and in muscle tissue, for use when the constant supply of fruit is not there. But not a lot is stored, and some of the rest of those carb calories are going to be burned off as heat (more on this later). It is my understanding that fat cells get no more glucose than any other cells of the body. Provided you are eating good carbs, and not empty-calorie carbs (more on this later, too), you are not going to be storing all of your carbs as fat.
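As a rough sanity check on that rate of loss, here is a back-of-the-envelope calculation. This is only a sketch: the 3,500 kcal per pound of body fat is the commonly quoted approximation, and the 2,500 kcal/day maintenance requirement is my assumption, not a measurement.

```python
# Back-of-the-envelope: what daily intake would explain losing
# about 1 pound per day, if it were all fat? (Rough assumptions.)

KCAL_PER_POUND_FAT = 3500   # commonly quoted approximation
MAINTENANCE_KCAL = 2500     # assumed daily needs for a moderately active man

loss_lb_per_day = 1.0
implied_deficit = loss_lb_per_day * KCAL_PER_POUND_FAT   # 3500 kcal
implied_intake = MAINTENANCE_KCAL - implied_deficit      # -1000 kcal

print(f"Implied daily deficit: {implied_deficit:.0f} kcal")
print(f"Implied daily intake:  {implied_intake:.0f} kcal")
```

The negative "implied intake" shows that fat alone cannot account for a full pound a day; water and glycogen losses must make up part of it, which fits with McDonald's point below about what is actually lost during a fast.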
The body also has the ability to break down protein as well as fat, and my body might eventually do that if I continue too long on this diet of raw fruit and veg, but for now I'm pretty confident that I still have a bit more fat on this body of mine before I get into muscle wasting. However, Lyle McDonald, author of "The Ketogenic Diet," claims that during a total fast, "up to one half of the total weight lost during a complete fast is muscle and water." It is unclear to me whether McDonald is citing studies that examined starvation fasting ("although protein losses decrease rapidly as starvation continues"), or whether this "unacceptable ratio" is also found in intermittent fasting, as I have been doing. That will have to be my research topic next time.
Sidebar: Ideal weight
Incidentally, the last couple of days I've been wondering what my ideal weight might be. We all know of the BMI scale -- the index that shows, based on your height and weight, whether you are underweight, normal weight, overweight, or obese. There are several calculators online, like this one at the NHLBI. But the BMI does not tell you what the optimal weight for your height and gender might be -- it gives a range of normal. My weight is now in the middle of this range; by the 21st day of this experiment I was down to 165 pounds, for a man who is 6 feet tall. I am skinnier than I have been for years. The new jeans that I bought recently -- when I had lost twenty pounds after fasting 2 days a week for about 6 months -- now fall off me unless I wear a belt. But none of my belts fit anymore. I lost my ass.
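For the curious, the arithmetic behind that "middle of the range" claim is simple. Here is a quick sketch using the standard BMI formula, with 18.5 and 24.9 as the usual cutoffs for the "normal" band:

```python
# BMI = weight(kg) / height(m)^2, or 703 * weight(lb) / height(in)^2

def bmi(weight_lb: float, height_in: float) -> float:
    return 703 * weight_lb / height_in ** 2

def weight_for_bmi(target_bmi: float, height_in: float) -> float:
    return target_bmi * height_in ** 2 / 703

height = 72  # 6 feet, in inches
print(f"BMI at 165 lb: {bmi(165, height):.1f}")        # ~22.4
print(f"Normal range: {weight_for_bmi(18.5, height):.0f}-"
      f"{weight_for_bmi(24.9, height):.0f} lb")        # ~136-184 lb
```

So 165 pounds at 6 feet sits almost exactly in the middle of the normal band.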
And more and more people are becoming concerned for me. They can see I am losing weight, and openly ask me if I am sick. Which leads me to think that I'm starting to look not so good. I don't want to look sick. So why do I keep doing this?
I have decided to see if I can go without bread for an entire year. A test of my endurance. Because everyone knows that I love bread. But am I addicted to it? If I can go a year without grain, I will know for sure I am not. Even if I am the prototypical exorphin junkie.
But will I reintroduce bread into my diet after going without it for a whole year? At this point, I don't know. Some days I say no; other days I say, yes, certainly.
Sidebar 2: A Moment of Weakness
I've been taking a ton of fruit with me to work in my lunchbag/knapsack, when I'm on the nightshift. Since I'm sleeping during the day, I don't have a chance to eat enough at home. So I cart several pounds of fruits and veg with me to work, and while many of my patients sleep, I work at shovelling it into my maw. It takes a lot of time to eat that much fruit and leafy greens, just to get your calorie content a bit higher.
I know that I can, through pure willpower, make it to the end of these 30 days -- or even go beyond that, if necessary. But I also know that I still occasionally think about bread, and how nice it will be when I can eat some again. I don't usually have cravings, but every once in a while, it will strike me.
Like it did on the morning of my 18th day of this fast, on the drive home after a nightshift. I was a couple of blocks from home and I turned the corner and saw a crow, in the middle of the road, picking at a substantial crust of bread. Irrationally, I thought about stopping the car, getting out, shooing the bird away, and wresting that prize away from it for myself.
That's what I mean when I say I'm still an exorphin junkie.
Ketosis
Before the sidebar interlude, I was talking about how the body uses carbs as fuel when we eat. We also can eat fats, and get fuel from them; and even protein can be used as fuel. But what happens when we aren't eating anything? First, the stores of glucose as glycogen are used up; and then the body turns to stored fats.
When the body uses body fat as fuel, it breaks it down into free fatty acids, which can be used by almost all body tissues except the brain and nervous system. Does that mean that our brains don't run on any fuel when we sleep, or when we fast? No. If there are not enough carbs in the diet to provide glucose (and you only have to reduce carbs to about 10% of your caloric intake/energy needs), the brain lives on ketones. What are ketones? Well, when free fatty acids are broken down in the liver, they leave behind metabolites called ketones. If there are enough ketones in the bloodstream, glucose is no longer needed, and neither is protein. The brain uses ketones first, to bring the levels of ketones in the blood down -- because too many ketones will push you into a state of metabolic ketoacidosis. Most commonly, we see this in undiagnosed type 1 diabetics.
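To put that 10% figure into grams, here is a quick conversion (a sketch only, assuming a 2,000 kcal/day intake and the standard 4 kcal per gram of carbohydrate):

```python
# Converting "about 10% of caloric intake from carbs" into grams per day.
daily_kcal = 2000        # assumed total intake
carb_fraction = 0.10     # the threshold mentioned above
kcal_per_gram_carb = 4   # standard conversion factor

carb_grams = daily_kcal * carb_fraction / kcal_per_gram_carb
print(f"{carb_grams:.0f} g of carbs per day")   # 50 g
```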
And this is how the high-protein, low-carb diets (like the ketogenic diet, or Atkins, or Zone, or Paleo) all work: when you eat fewer carbs, your ketone levels rise, and those ketones, along with the free fatty acids, give you energy. The details and mechanism of the diet were first worked out by Dr. Russell Wilder at the Mayo Clinic in the 1920s, after extensive work with diabetic (and then epileptic) children.
Dr. Russell Wilder, 1920s: from the Journal of Nutrition
There are several well-known complications of ketogenic diets. Some of these are also symptoms of starvation ketosis from extended fasts; others are also symptoms of diabetes-induced metabolic ketoacidosis.
- hypoglycaemia -- causing sleepiness, vomiting, nervousness, trembling, sweating
- acidosis (from ketones in the bloodstream) -- panting, irritability, increased HR, facial flushing, fatigue, vomiting
- dehydration -- causing constipation (also caused by the low fiber)
- hyperlipidemia -- cholesterol and triglycerides are both elevated on this diet
- nutritional deficiency -- lacking calcium, magnesium, phosphorus, vitamins A, D and E, zinc, selenium, and carnitine, causing decreased bone density and cardiomyopathy
- kidney stones
I don't think enough studies have been done to show exactly what happens, but something slightly different goes on in a brain fuelled primarily on ketone bodies as opposed to glucose. We should expect this: at night when we fast, our brain dreams, and our experiences are quite different from those in waking states. Also, we have the example of many mystics who have had strange experiences while fasting for extended periods of time; the most famous might be Jesus, who after fasting 40 days in the desert reported an exchange with The Tempter. Today, the ketogenic diet is still sometimes used to treat epileptic children. These unfortunate children need food enough to grow, but when you give them glucose in the form of too many carbs, sometimes they have more seizures. Some of them do all right on a restrictive ketogenic diet. Extreme bodybuilders often use some form of the ketogenic diet as well, to make their huge muscles stand out on lean bodies during a competition. Most endurance athletes, however, cannot use a ketogenic diet, because it is felt that you cannot sustain high aerobic levels of exercise on a low level of carbohydrate. The muscles of a marathon runner or a triathlete need those glycogen stores, and so they traditionally "carbo-load" before an event.
de novo lipogenesis
So where does fat come from in the first place, if the carbs we eat aren't making it? As I quoted John McDougall in a recent blog (day 15 of my fast), making fat out of glucose is called de novo lipogenesis. McDougall said that this is done by pigs and cows, but he didn't say whether humans could do it. Well, we can. But it isn't our preferred use of carbs, as Hellerstein showed (Hellerstein, M. (1999) De novo lipogenesis in humans: metabolic and regulatory aspects. Eur J Clin Nutr. 53(suppl 1) pp. S53-65). It is only when carb input is greater than total energy expended that fat is created. It is very difficult to do this on whole foods, as I hope I have shown (on day 10 of my fast, when I totalled the calories, carbs, protein and fats of a single day of raw food eating). You will be sated before you can do it. But if you eat processed foods -- anything with extra sugar in it, like cake, ice cream, soft drinks, and white bread -- suddenly you can build fat just like pigs and cows. Furthermore, Hellerstein showed that although we can make fats out of carbs (in times of excess carbs), we can't make carbs out of fats. We are missing that metabolic pathway. And there has to be a seasonal and an evolutionary reason for this. I suppose it is because proto-humans ate lots of fruit when it was ripe, so that they could live off stored fat when times were lean. When they had to fast, they lived a little like hibernating bears -- with a slower metabolism, burning their own fat.
Lis Olesen Larsen from the August Krogh Institute at the University of Copenhagen provides a quick overview of where the science is taking us in our understanding of de novo lipogenesis and how it contributes to human obesity (Larsen, L. (2002) Nutrition Discussion Forum: the role of de novo lipogenesis in development of obesity in man. Brit J of Nutr. 88 pp. 331-332). From her own research in 2001, she determined that de novo lipogenesis is more likely to occur when we eat more than enough carbs for our total energy expenditure AND we also take in 30% of our calories from fats. The fatty animal products (cheese, butter) and fatty plant oils (margarine, olive oil) that we consume with our carbs (bread) are as much to blame as the carbs themselves. She also cites a study she did with Lammert, which found that during sleep, our temperature does not rise high enough to burn off those extra calories provided by the glucose in the carbs. That study did not look at possible increased heat loss while study participants were awake and active, but it does suggest that John McDougall might have oversimplified things for us.
Luxus consumption
Do we burn off all of the calories we eat from starch (carbs) as metabolic heat, as I reported McDougall saying the other day? This idea was first proposed by Neumann in 1902, and called, in German, Luxuskonsumption. We do burn off some of these "abundant" (luxus) calories, but it is still unclear to what degree, and a consensus is beginning to emerge in science that the effect is negligible. We still need to be diligent not to overeat: luxus consumption has also more recently been defined as eating more food than you need (thus wasting it), and it has serious health and environmental effects (see, for example: Blair, D. and Sobal, J. (2006) Luxus consumption: wasting food resources through overeating. Agr and Human Values. 23 pp. 63-74).
Fasting Detox
What my patients are doing, when they come to our palliative care ward and stop eating, is beginning to detox. When nothing is going in any longer, the body begins to break down its stored fats for energy, and it is in these fat cells that even more toxins are stored. The toxins in their bodies -- the metabolites from the drugs they have endured, the byproducts of the disease process that has overloaded their systems, and the simple excretions of each cell as they continue to live and use energy -- all begin to leave the tissue, provided the patients remain hydrated and there are no blockages. Kidneys work harder, livers work harder, bowels work harder -- if they can -- to get this stuff out. And in palliative care, we nurses help this to happen.
And as so often happens, some people get a little bit better. It might just be a burst of energy before death; or it might be something else, a turn in the road. We never give up on anyone. I have seen people last far longer than the medical community's prognosis, when they stop eating food.
Wavering Resolve
Let's talk about willpower for a moment, since apparently it is the one thing that keeps my wavering resolve -- to eat no bread, no grains, no meat, no dairy, etc. for 30 days -- from succumbing to habit.
If you read over my last few blog entries, you can see the preparation for, and determination to, trial a few days without bread; the happenstance that I then read something by someone who dared others to go a week without bread; then I found someone suggesting 10 days without bread, then another who advocated a month without bread, and finally someone who challenged people to go a year raw. But while attempting to meet each next goal, I also find those who say that bread is okay, and that it is the other things often consumed with it that are bad. And then my resolve to take the next challenge wavers. If it were not for the fact that I have committed to finishing the 30 day fast from bread, I would certainly have already gone back to my old habits. But I like to test myself, I like to experiment, I like to see what might happen.
The only thing that stops me from eating bread right now is my willpower to reach an arbitrary goal that I've set for myself. And I have set this goal to find out if indeed I'm addicted to bread.
You might think that you could not go an entire day without eating anything. But it is not really all that much longer than a single night without eating. When you are on your deathbed, you may be surprised by how long you are living, without eating. You can do it. You can.
The body is built to withstand short fasts. The body improves and detoxes on short fasts. But if you do it for too long, like anything else done in excess, it will harm you.
My own Fast from Bread
Since my last blog posting, I've been wavering over the question: should I continue beyond 30 days on the raw food diet, or should I now admit some cooked starches into my diet? After three weeks on the raw diet/fast from bread and dairy, my current feeling is that the raw diet would be unhealthy in the long run and exorbitantly expensive for me, this far north of the equator. I suspect that I would be able to live for some time eating only raw foods using sheer willpower, but I would ultimately not be happy. But is that because I want to take the easy road (the McDougall diet of mostly cooked starch -- which includes whole grain bread), or because it really is the healthier alternative? Is my ambivalence toward the raw diet a symptom of my wavering resolve, or is it because I'm certain that McDougall's diet is actually healthier?
I've decided to continue to omit eggs and dairy from my diet after this 30 day fast from bread is up; the only question that now remains is, which is the better vegan diet for me? Raw or cooked?
Enzymes
Ever since I learned about it -- in the context of John McDougall's citation of it in "The Starch Solution" -- I have been intrigued by the work of Nathaniel J. Dominy, who claims that humans evolved through their exploitation of starches as a food source (see, for example, this publication).
When you compare our amylase production to that of other primates, humans have about 3 times more copies of the AMY1 gene than chimps, and "~6-8 times higher" salivary amylase protein levels, while "bonobos may not have salivary amylase at all." Other primates, such as cercopithecines (a subfamily of Old World monkeys including macaques and mangabeys), have relatively high salivary amylase expression, even compared to humans, which "evolved to facilitate the digestion of starchy foods (such as the seeds of unripe fruits) stowed in the cheek pouch."
"it is hypothesized that starch-rich plant underground storage organs (USOs) were a critical food resource for early hominids. Changes in USO consumption may even have facilitated the initial emergence and spread of Homo erectus out of Africa."
Are Humans Milk Eaters?
Dietary enzymes are highly specific. I'd like to see this studied far more, because in my opinion, the dietary enzymes that the human body produces will point to the ideal human diet from which we have evolved. For example, I recently read this summary about lactase, the enzyme that enables (some of) us to metabolize milk, in the book "Everyone Eats: understanding food and culture" (Anderson, E. 2005):
Human babies are born with this enzyme, which performs this cleavage. However, most humans stop producing this enzyme around age of six to ten. Thus most adult humans cannot digest lactose (Patterson 2000). Like other undigested sugars, it causes diarrhea and flatulence, and, in large quantities, outright sickness. Small amounts of milk are tolerated; more leads to indigestion. However, Europeans (especially north Europeans) and East Africans have depended on fresh milk so long that they have evolved the ability to keep producing lactase throughout life. Presumably, children without lactase did not thrive, as fresh dairy products became more and more vital as staple foods—though at least some humans can also adapt to high-milk diets by continuing to produce lactase when they would not otherwise have done so.
Outside of Europe and East Africa, most humans cannot eat fresh dairy foods. Even in Mediterranean Europe, most cannot; in East and Southeast Asia, virtually all cannot, even after long exposure. But they have learned to make microorganisms do the enzyme work. Fermenting milk into yogurt, cheese, and the like involves breakdown of lactose by Lactobacillus bacteria. Yogurt is generally made by L. bulgaricus. (Other Lactobacillus species give us salami, sauerkraut, and San Francisco sourdough bread.) Thanks to yogurt making and other processing, peoples in West and Central Asia and the Indian subcontinent depend on dairy foods, though only 10–20 percent of them can digest lactose (see Patterson 2000:1060). Some Arctic-dwelling humans—as well as some birds, such as starlings—have lost the ability to produce sucrase, and thus cannot digest ordinary sugar (sucrose; see Draper 2000).
There are longer-chain sugars, mostly indigestible. Stachyose and raffinose, in beans, cause the indigestion and flatulence associated with beans, because we can’t digest them. Still longer chain carbohydrates (polysaccharides) are starches, and these we can digest, breaking them into glucose. Potato starch is particularly easy to digest, and thus can cause a “sugar rush.”
Still longer chains include things like lignin and cellulose, indigestible to higher animals. Ruminant mammals, termites, and other such creatures have symbiotic microorganisms that do the digestive work.
The specificity of enzymes in the adaptive human digestion system leads me to suspect that each of us, depending on our genetics, will have an individual and perhaps cultural metabolic phenotype.
This may be true especially when it comes to starch. Which starch are we adapted to? Are all starches the same, or do we require different starch enzymes, for different starch sources? Which fibers do we metabolize, and which ones do we not digest? Which ones do we need, and which ones are harmful (if any)? These are the things I'd like to know.
It disturbed me to learn that when I ate that raw, starchy sweet potato the other day, I found it largely indigestible (even though my tongue indicated to me that it would be good to eat). By indigestible, I mean it caused a gut ache and made me gassy; it slowed the passage of foodstuff through my bowels. But look at what happened: the gas is a result of the fermenting work of my bowel flora. They had more time to work on it, because the GI tract slowed. Were the fermentative metabolites good for me, or bad for me? I don't know. But obviously, we live in symbiosis with the bacteria in our guts. They can digest some things that we can't, and they can give us some benefits from being fed; it is possible that vitamin B12 might be one such reward (or is that conjecture true? Could it be that absorption of B12 must happen in the small intestine, and not in the large intestine, where we'd be more likely to find the B12-producing bacteria? More questions...).
I suspect strongly that my bowel flora has changed drastically since starting this fast. And indeed, it didn't take me long to find a scientific article which showed precisely that.
Bowel Flora
Ling and Hanninen (Ling, W. and Hanninen, O. (1992) Shifting from a conventional diet to an uncooked vegan diet reversibly alters fecal hydrolytic activities in humans. J Nutr 122(April) pp. 924-930) took 18 people and put them on a raw diet for a month (note that the raw diet that was trialled contained some pre-fermented foods, so the food, although raw, was also rich in lactobacilli), followed by a conventional diet, and checked some of the metabolites of the faecal bacteria to see how they changed.
Depending on what you eat, the bacteria in your GI tract will produce various enzymes, some of which will then cleave substances that you ingest, causing them to travel through your bloodstream and be scooped up by the kidney and excreted (or they can also be excreted in stool). In particular, the raw diet caused faecal urease to drop by 66%, and there were also significant drops in the enzymes cholylglycine hydrolase, beta-glucuronidase, and beta-glucosidase within 7 days of the raw vegan diet. These enzymes have been implicated in generating toxins and carcinogens that the liver has trouble filtering; and urease increases ammonia content, which has been implicated in systemic toxicity, colon inflammation, genetic mutations, and GI tumour genesis. Furthermore, on the raw diet, concentrations of the metabolites phenol and p-cresol were lowered. The major species of gut bacteria is the anaerobe Bacteroides fragilis, which produces p-cresol; other gut bacteria (e.g. E. coli) produce phenol.
According to this study, within 2 weeks of resuming a conventional diet, most of the benefits of the raw diet were obliterated; after 1 month, it was as if nothing had ever happened. Ling and Hanninen note that there are specific changes that will occur depending on the type of fiber passing through the colon, and they give an interesting comparison of some fibers (pectin, carrageenan, agar-agar, wheat bran, carrot fiber) on the levels of the enzyme metabolites, but they indicate much more work needs to be done in this area. The current thinking is that an increase in fiber, as that which naturally occurs in a diet rich in fruits and vegetables, will change the gut flora in such a way that toxins and mutagens are minimized -- lowering your chances of contracting cancer and other diseases.
Incidentally, this article says that pure "wheat bran and carrot fiber have an increasing effect on Beta-glucosidase activity and no effect on Beta-glucuronidase," but that a diet in varied mixed vegetables with wheat bran or carrot fiber would have quite a different effect entirely.
Genetic adaptations of humans to diets
The idea that some humans can metabolize milk and some can't made me curious. I attempted to find the references cited by Anderson in his work on dietary enzymes (see the section "Are Humans Milk Eaters?" above), and while browsing the scientific literature I found yet another article, one which critiques Cordain's view of humans as mostly paleolithic hunters.
Milton (Milton, K. (2000) Hunter-gatherer diets -- a different perspective. Amer J Clin Nutr. 71(3) pp. 665-667) maintains that humans evolved on plant foods, just as the other primates did. As soon as the human brain developed in size and stone tools were invented, animals as a food source became part of the human diet, but not to the extent Cordain suggested. She says typical contemporary hunter-gatherers get 33% of their calories from animal sources, and the rest comes from plant foods (virtually the reverse of what Cordain believes). Tubers, seeds of millet, nuts, and wild fruit seem to constitute the main source of their food -- and these cultures only thrived when these plant species could be adequately relied upon to provide food year round. The proper designation for these early tribes of humans ought to be "hunter-gatherer-agriculturalists," since some sort of cultivation of a "single starchy carbohydrate" was tied to their very existence.
Milton says that true genetic adaptations of humans to diet are few. The fact that some individuals of European descent continue to produce lactase in adulthood is merely a regulatory mutation, from a period in European human history when such a trait was selected for. But we do not have many other adaptations to flesh diets, such as we see in carnivores. We cannot synthesize vitamin A or niacin, for example. Certainly there are metabolic phenotypes which characterize humans from different regions of the world. For example, circumpolar people may have in some cases lost their intestinal sucrase -- but they are still unable to synthesize their own vitamin C. They have adapted, but they have not fully evolved to a complete carnivore diet.
In another article, Milton (Milton, K. (1999) Nutritional characteristics of wild primate foods: do our closest living relatives have lessons for us? Nutrition 15(6) pp. 488-498) says that there is a general consensus arising that "humans come from a strongly herbivorous ancestry." But is that true? At some time, humans ate meat. This became a regular part of the diet -- actually more regular, once agriculture started in earnest, some 12,000 years ago, and a domesticated animal food source became more easily available than chasing wild game. While there may be a consensus that we came from herbivores (although insectivores have also been proposed by some authorities), there is no consensus about the amount of meat in the earliest human diet.
At stake here is no less than the question of what caused the increased brain size of our human ancestors, and when we began cooking: was it before hunted animals became part of our diet, or after?
Segue to Cooking
Milton indicates that "the proportion of the human gut appears to reflect the fact that many foods are 'pre-digested' by technology in one way or another before they ever enter the human digestive tract." In other words, cooking or fermenting.
Raw foodists frequently say, "no other animal on earth cooks its food," and that is given as a sort of proof that we have stepped away from our natural raw food (whatever it originally was). And it seems absurd on the face of it that humans could have evolved into cooked food users. But according to Wrangham and Conklin-Brittain (Wrangham, R. and Conklin-Brittain, N. (2003) Cooking as a biological trait. Comparative Biochemistry and Physiology Part A: Molecular & Integrative Physiology 136(1) pp. 35-46), this is precisely what happened: humans have had time to evolve the ability to exist primarily on cooked foods. According to them, cooking almost certainly predates meat eating. Furthermore, Wrangham (a Harvard University primatologist) makes a strong argument that humans have largely lost the ability to subsist on raw food in the wild, whether it be a raw diet of fruits and greens or one that includes raw meat.
Here are some highlights I enjoyed from this peer-reviewed article:
- "Other than … deliberate raw-foodists, we have not found any current or historical examples of individuals or small groups living for more than a few days without access to cooked foods."
- The Inuit ("one of the most recently adopted human lifestyles, approximately 4000 years old") sometimes eat meat raw ("providing vitamin C"), "but meat, blubber and even blood were sometimes cooked," even among the earliest studied unacculturated Inuit. No humans are fully adapted to a raw meat diet.
- "56% of 48 plant roots eaten by African foragers were sometimes eaten raw. But such items tend to provide snacks rather than meals."
- "no human populations are known to have lived without regular access to cooked food."
- "The typical duration of a speciation event is considered to be 15 000-25 000 years, and mammalian species can evolve in as little as 5000 years." It is estimated human LA, or lactase producing genes that afforded humans the ability to metabolize milk in adulthood, took a mere 5000 years to increase from 5% to 70% of the population.
- Evidence for cooking is older than 5000 years. "It is necessary for the processing of cereal grains, which were being harvested 20,000 years ago by people skilled in fire management and grinding."
- Earlier evidence of cooking by humans and hominids:
- Kebara Cave, Israel, 60,000-48,000 BP (bones) -- (Speth and Tchernov, 2001)
- various European and Middle Eastern sites, >250,000 BP (earth ovens) -- (Brace, 1987, 1999; Ragir, 2000)
- Vertesszolos, Hungary, 600,000-400,000 BP (control of fire) -- (Kretzoi and Dobosi, 1990)
- Swartkrans, South Africa, >1 million BP -- (Brain, 1993)
- Koobi Fora, Kenya, 1.6 million BP -- (Rowlett, 2000)
- Homo ergaster, east and south Africa, 1.9 million BP (oldest date suggested for the adoption of cooking, based on biological evidence; "ergaster" is derived from the ancient Greek word for 'workman') -- (Wrangham, 1999; Leonard/Robertson, 1997; Aiello/Key, 2002; O'Connell, 2002)
- "a strict raw food diet cannot guarantee an adequate energy supply" (citing Koebnick et al, 1999); almost 1/3 of the urban raw foodists Koebnick studied had Chronic Energy Deficiency, and half the women had menstrual disturbances. This "raises the question of whether people could survive on a raw food diet in the wild."
- "Most types of cooking tend to increase the digestibility of starch" (Holm, 1988; Kataria/Chauhan, 1988; Ayankubi 19991; Muir and O'Dea, 1992; Yiu, 1993; Kngman/Englyst, 1994; Ruales/Nair, 1994; Urooj/Puttaraj, 1994; Barampama/Simard, 1995; Periago 1996; Bravo, 1998; Marconi, 2000; Sagam/Arcot, 2000; Slavin, 2001; Smith, 2001). "The same is true of plant protein digestibility" (Rao, 1996; Chtra, 1996; Khalil, 2001)
- Cooking improves "the rate at which the teeth can process a given food." It takes less time to chew foods that have have been softened or gelatinized by cooking, so less expenditure of energy per intake of food.
- "human molar size started falling approximately 100 000 years ago" (citing Brace, 1991), probably due to a new type of cooking technology, i.e. boiling.
- Homo ergaster 1.9 million years ago already had a reduced tooth and jaw size, indicative of earlier cooking practices.
Raw Diet: Possible?
If anyone doubts Wrangham's conclusion that a raw diet cannot provide adequate calories in a timely way, or thinks that this is the way humans evolved, without cooking tubers and other veg, I challenge you to try a 30 day fast of eating only raw fruits and veggies; for any random day, total up the amounts you eat and calculate the caloric intake (as I did on the 10th day of this fast, see here), and how long it takes to eat it, without the addition of modern knives and blenders and juicers. (Okay, I'll allow you to use any bone or stone knife you have made yourself. And you can also eat raw any wild animal that you hunt and kill yourself with nothing more than that same knife.) At the end of those 30 days, tell me if you want to continue spending that much time eating. Tell me you have sustained your weight. Tell me you think that this is a healthy diet and that you could live on it indefinitely. Oh, you might see some benefits to doing it: you might lose some weight, and also lose some of the modern health issues that run parallel with weight gain. But I think that most people who do not live at the equator, and have not planted trees on their farm that provide them with year-round fruit, will discover this diet is unsustainable in terms of cost and long-term health benefits. Or, in place of performing that month-long experiment, you can read Wrangham's article and see his analysis of what it takes for a 120-pound woman to eat enough raw food to live indefinitely.
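To give a feel for the arithmetic of that challenge, here is a sketch of a hypothetical day of raw eating. The kcal-per-100-g figures are rough USDA-style approximations, and the quantities are invented for illustration; they are not my actual day-10 numbers:

```python
# A hypothetical (and already generous) day of raw fruit and veg.
KCAL_PER_100G = {   # rough approximations
    "bananas": 89, "apples": 52, "oranges": 47,
    "carrots": 41, "romaine": 17, "strawberries": 32,
}
GRAMS_EATEN = {     # invented quantities, for illustration only
    "bananas": 1200, "apples": 800, "oranges": 600,
    "carrots": 400, "romaine": 300, "strawberries": 400,
}

total_kcal = sum(KCAL_PER_100G[f] * g / 100 for f, g in GRAMS_EATEN.items())
total_kg = sum(GRAMS_EATEN.values()) / 1000

print(f"Total mass eaten: {total_kg:.1f} kg")      # 3.7 kg
print(f"Total energy:     {total_kcal:.0f} kcal")  # ~2100 kcal
```

Even with nearly four kilograms of produce -- hours of chewing -- the total barely reaches maintenance levels for most adults.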
The Modern Raw Food Ideal
Wrangham cites the work of Koebnick to show that raw food diets are not an acceptable model for early humans. I looked closely at Koebnick's work. Corinna Koebnick is an epidemiologist at Maastricht University in the Netherlands, and she is one of the few researchers who has been involved in the scientific study of several vegetarian diets. Among the 85 published scientific reports that she has authored or co-authored, I looked at these:
- In 1999, Koebnick examined long-term raw food diets and discovered that they were strongly associated with a high loss of body weight. Almost 15% of males and 25% of females following this diet had a BMI that showed them to be underweight, and almost 1/3 of all women on a >90% raw diet had amenorrhea (Koebnick, C. et al. (1999) Consequences of a Long-Term Raw Food Diet on Body Weight and Menstruation: Results of a Questionnaire Survey. Ann Nutr Metab 43 pp. 69-79). Also known as "The Giessen Raw Food Study," this is one of Koebnick's most oft-cited studies.
- In 2001, Koebnick and her team assessed pregnant women's folate levels on long-term high-vegetable diets. Lacto-ovo vegetarians and low-meat eaters had the lowest risk of folate deficiency. High vegetable intake ensured adequate folate only if intake of vitamin B-12 was also assured (Koebnick, C. et al. (2001) Folate Status during Pregnancy in Women Is Improved by Long-term High Vegetable Intake Compared with the Average Western Diet. J. Nutr. 131(3) pp. 733-739).
- In 2004, Koebnick's team examined lacto-ovo vegetarian diets, and found 22% of pregnant women on this diet to be deficient in vitamin B-12, with increased homocysteine (whereas 3-10% of women who included differing amounts of meat in their diet were deficient in B-12) (Koebnick, C. (2004) Long-Term Ovo-Lacto Vegetarian Diet Impairs Vitamin B-12 Status in Pregnant Women. J. Nutr. 134(12) pp. 3319-3326).
- Also in 2004, her team sampled the magnesium status of pregnant women on plant-based diets, and found the amount of magnesium in vegetarians significantly higher, reducing the frequency of calf cramps in the final trimester (Koebnick, C. et al. (2005) Long-term effect of a plant-based diet on magnesium status during pregnancy. European Journal of Clinical Nutrition 59 pp. 219-225).
- In 2005, she published another report on raw food diets. Health benefits included some reduced risk for cardiovascular disease, but 38% of raw foodists were vitamin B-12 deficient, and 12% had increased mean corpuscular volume (MCV). Blood concentrations of homocysteine were higher, presumably due to deficient levels of B-12, and triglycerides were lower; total cholesterol levels were lower too, including the good HDL (Koebnick, C. et al. (2005) Long-Term Consumption of a Raw Food Diet Is Associated with Favorable Serum LDL Cholesterol and Triglycerides but Also with Elevated Plasma Homocysteine and Low Serum HDL Cholesterol in Humans. J of Nutr. 135(10) pp. 2372-2378).
- In 2007, she worked with Garcia and others to examine the levels of dietary carotenoids in those on a raw diet. Although increased carotenoid levels (e.g. beta-carotene, lycopene) are associated with a reduced risk of chronic disease, "it is difficult to achieve a high carotenoid intake from mixed Western diets." Raw foodists do manage to obtain levels of >0.88 micromoles/l through diet alone. Vitamin A levels were normal in 82% of those studied, but 77% of the subjects had low lycopene levels. Those with the lowest fat and oil consumption had lower carotenoid levels, and were likely at risk of vitamin A deficiency in the long term. Fat came primarily from nuts and seeds (25% of total fat intake) and fruits (20%). "Among fruits and vegetables, avocados were the main sources of fat." Cooking vegetables does increase the bioavailability of lycopene (Garcia, A. et al. (2008) Long-term strict raw food diet is associated with favourable plasma beta-carotene and low plasma lycopene concentrations in Germans. Brit J of Nutr 99 pp. 1293-1300).
Boutenko's Story
Sure, I too have been astonished by the books of Victoria Boutenko (e.g. "Raw Family: a true story of awakening," 2000; "12 Steps to Raw Foods: how to end your dependency on cooked food," 2000; "Raw Family Signature Dishes," 2009), credited with inventing the green smoothie, and by the amazing health recovery she and her family experienced after switching to an all-raw diet. It seems likely that it saved their lives. As wonderful and inspiring as her story is, one only needs to look at her recipes to note that many involve blended veggies, dehydrated seed mixtures, processed oils, and occasionally Braggs liquid aminos, none of which were available to our hominid ancestors when they evolved and differentiated themselves by their diet from their primate cousins. Many of her recipes are not low in fat. She is not afraid to liberally use nuts or cacao butter. That's not a criticism, just an observation. After all, her raw diet is not the diet taught by Douglas Graham, of 80-10-10 fame. Could any sustainable raw human diet, without a high-powered blender, truly approximate 80-10-10, I wonder?
A few days after writing about Boutenko, I found yet another, later, book by her, co-written by a couple of other raw food gurus (Elaina Love and Chad Sarno) who were coming to the same conclusion: a raw food diet is wonderful as a detox from other unhealthy eating patterns, but it is ultimately missing something, and unsustainable. "Raw & Beyond: how Omega-3 nutrition is transforming the Raw Food Paradigm" (2012) contains the personal stories of the authors, along with some new raw recipes that attempt to incorporate more omega-3 fats. Lots of fats, indeed: in the form of oils, nuts and seeds, coconut and avocado, and also more sweeteners, like agave. And it even includes some lightly cooked foods. This is one example of the so-called "High Raw" diet: one that is mostly raw, but also includes some cooked foods.
Foods like starchy veggies. The very foods from which humans evolved -- or so claim people like Dr. John McDougall and Richard Wrangham.
Wrangham's References on Archaeology
There were so many references in that article by Wrangham that it kept me busy on Thursday, the 19th day of my fast from bread, checking up on them. After examining the Koebnick references, I knew that there would be some value in reading more of Wrangham's source material:
- While Speth and Tchernov (Speth, J. and Tchernov, E. (2001) Neandertal Hunting and Meat-Processing in the Near East: evidence from Kebara Cave (Israel). In Meat-Eating and Human Evolution. ed. Bunn. Oxford Univ. Press) have done a lot of cataloguing of the bones found in Kebara Cave on Mt. Carmel, and have determined that the ungulates found in the midden heap there were from cooked meat, according to Madella's team the Neanderthals in the Amud Cave in Israel also used plants for many different purposes -- including fuel, bedding and food. "There is clear and repetitive evidence for the exploitation of mature grass panicles, inferred to have been collected for their seeds" (from the abstract of Madella, M. et al. (2002) The exploitation of plant resources by Neanderthals in Amud Cave (Israel): the evidence from phytolith studies. J Arch Sci. 29(7) pp. 703-719).
- Despite the fact that Wrangham cites Brace's work, Brace apparently had no sympathy for the view that early hominids had mastered cooking. See, for example, Brace, C.L. (2000) The raw and the cooked: a Plio-Pleistocene Just So Story, or sex, food, and the origin of the pair bond. Soc Sci Inf 39 pp. 17-28. One of his criticisms is that Wrangham's team had extrapolated a great deal of speculation about early hominid social and psychological demeanour based on the shape of a few bone fragments, in their earlier work (Wrangham, R. et al. (1999) The Raw and the Stolen: cooking and the ecology of human origins. Current Anthropology 40(5) pp. 567-). Wrangham, in another work, claims that Brace's position is an intermediate one, in that Brace agrees that cooking has led to the evolutionary adaptation of smaller teeth in humans.
- Ragir (Ragir, S. (2000) Diet and Food Preparation: rethinking early hominid behaviour. Evolutionary Anthropology pp. 153-155) follows the traditional assumption that fire technology followed the hunting stage in the cultural adaptation of the human diet, and from that he is able to deduce some far-reaching social and behavioural adaptations of early humans, based on little more than bone fragments. Compelling reading: but what if that basic assumption were wrong -- what if cooking preceded hunting? Ragir notes that tubers required processing before they could be used as a food source: digging, crushing, and soaking at a minimum (all performed by the female of the species, he assumed; but he also assumed that the invention of fire did not take place before the evidence of barbecues). Still, he draws some rather interesting conclusions based on the reduction of size dimorphism in humans from late Homo erectus to archaic Homo sapiens. He suggests that this is indicative of the sharing of food between males and females -- the assumption being that males would hunt meat, and females would put the work in at base camp to make the tubers edible by cooking or other processing. Once the protein in meat was shared, the dimorphism disappeared.
- Among the many interesting problems of archaeology is ascertaining when the use of fire became a human achievement, and when the migration out of Africa into the landmass of northern Europe could have been achieved. These things are related, as it has always been assumed that even a northern hunter on the retreating glacial edge must thaw the meat from the previous day's kill to eat it. M. Kretzoi, of Budapest University, has been unravelling the clues for decades, with his careful study of the animal bones and hominid bones at the site of Vertesszolos in Hungary. I've read several of Kretzoi's articles online, but have yet to see the one that is most often cited, in which Kretzoi and Dobosi concluded that the middle Pleistocene -- a time when the cranial capacity of hominins rapidly expanded -- was also a time when evidence is found of hearths (control of fire). In some detailed catalogues, Kretzoi seems somewhat baffled by bones which suggest that the climate of Europe was quite a bit more temperate than it is currently, or than has been presumed for it at various times. Meanwhile, the scarcity of sites, due to the erosion of glaciers, means that we must draw conclusions from very little evidence indeed. Among the questions that remain controversial: were there two parallel hominid species in Europe for several hundred thousand years -- Neanderthals and Homo erectus -- or were they related?
- The earliest finds of bones that have been burnt are inconclusive and contentious, as James showed in James, S. (1989). Hominid Use of Fire in the Lower and Middle Pleistocene: a review of the evidence. Current Anthropology. 30(1) pp 1-
- I was not able to access the oft-cited article by Brain (Brain, C.K. (1993) The occurrence of burnt bones at Swartkrans and their implications for the control of fire by early hominids. In: Brain, C.K. (ed.), Swartkrans: A Cave's Chronicle of Early Man. Transvaal Museum Monograph No. 8, Transvaal, pp. 229-242), although one can find an early report here, with Brain part of the 'et al' team: Susman, R. et al. (2001) Recently identified postcranial remains of Paranthropus and Early Homo from Swartkrans Cave, South Africa. Brain's suggestion that Australopithecus robustus used bone tools to dig for tubers was immediately challenged by Backwell, L. and d'Errico, F. (2001) Evidence of termite foraging by Swartkrans early hominids. PNAS 98(4) pp. 1358-1363. If they were digging for tubers, they needed fire to process the food; if they were eating termites, they could eat raw. So much depends on why they were digging -- brain size, tool making, and control of fire have often been considered tandem evolutionary events.
- Richard Wrangham also wrote "The Cooking Enigma," chapter 12 of Pasternak's book "What Makes us Human?" (Pasternak, C. (ed.) (2007) What Makes us Human? One World Publications). In this chapter, Wrangham raises the cooking enigma: if, as conventional archaeologists believe, cooking arose in the Middle Paleolithic, why have there been no major evolutionary changes in bone structures since then? There are sites prior to the Middle Paleolithic that suggest cooking, and some that suggest no cooking, but these have not convinced the skeptics. The "Basal Solution," which Wrangham supports and expands upon, is the hypothesis that cooking originated around the same time as Homo erectus, and was directly responsible for the evolutionary changes seen in erectus, who arose from australopithecines (smaller jaw and teeth, smaller gut, higher energy expenditure). But the Basal Solution must explain why "evidence of control of fire is scarce before about 400,000 years ago," and "it must also be reconciled with the traditional idea that meat eating was the prime dietary mover of the evolution of the genus Homo."
- Rowlett's work at Koobi Fora, Kenya, suggests that H. erectus "had the technological capability of cooking foodstuffs." At a site 1.6 million years old, the only traces of fire can now be found using "archaeomagnetic and thermoluminescent analysis" (Rowlett, R.M. (2000) Fire control by Homo erectus in East Africa and Asia. Acta Anthropol. Sin. 19 pp. 198-208).
- Two hypotheses of quite different purport are found in Park's interesting review of the evolution of the human brain (Park, M. et al. (2007) Evolution of the Human Brain: changing brain size and the fossil record. Neurosurgery 60(3) p. 555-). Either we adapted to the higher nutrient density of meat by evolving smaller colons and larger small intestines (compared to gorillas, whose plant-based diet shows larger colons), or these physiological changes were the result of a diet of cooked foods -- whether tubers or meat. But was it the extra protein of meat that caused the increase in brain size, or the extra starch in tubers, released by cooking, that fuelled the brain?
- Ulijaszek doesn't appear to be leaning toward any single hypothesis, but instead argues that cooked food -- both tubers and meats -- likely explains the dominance of Homo erectus and the migration out of Africa and throughout Asia with control of a food source (Ulijaszek, S. (2002). Human eating behaviour in an evolutionary ecological context. Proceedings of the Nutrition Society. 61. pp. 517-526).
Although the conventional view is that fire making must have come after the introduction of stone tools, there could be an alternative hypothesis that fits the facts. It may be that the development of fire was a far earlier technology than the development of stone spears and other implements. And it makes sense, if you consider how humans may have adapted:
- Like their cousins, the great apes, proto-humans evolved in tropical forests rich in fruits and leafy greens. They could eat tubers, but only in times of little fruit, as raw tubers would be largely indigestible to them. The only meat they ate was insects, and perhaps the odd bird or other small animal that they could catch by chance. All food was eaten raw. Like other primates, they had a disgust for carrion left by carnivores.
- Up to this point, they had not differentiated their diet. But as they banded together for protection, they began to make opportunistic use of fire. As a sacred and social core of the tribe, hearths allowed individuals to experiment with different food sources.
- Over the course of time they learned how to make fire and control it. Fire allowed them extra protection, the ability to make better tools of sticks, and to expand their food source into starch (the tubers, and perhaps some grain endosperms), as well as meat. Brain size expanded as food density and digestibility increased.
- On a cooked starch-based diet they were no longer tied to the forests, so Homo erectus left Africa and migrated throughout Asia. As they moved, they began to learn how to make stone implements and bring down large game.
- Finally, agriculture led to a more sedentary lifestyle, and also to the development of human culture.
This is my current understanding, after reading several of Wrangham's sources and his analysis. Curious to find out what others in his field think of his work, I read Liesl Driver's analysis of it. Driver is from the Dept of Anthropology at the University of Pennsylvania. This (Driver, L. (2010) What made us human: analysis of Richard Wrangham's Cooking Hypothesis. Lambda Alpha Journal 40. p 21-) is her review of Wrangham's book, "Catching Fire: how cooking made us human;" it contains a quick synopsis of it, hitting the main points I've already discovered in his peer-reviewed studies. Her final conclusion is that Wrangham has successfully argued the thesis that "the behavioural adaptation of cooking food and the consistent use of controlled fire led to the transformation of modern humans."
It took me a couple of days to obtain Wrangham's book. By then, I had read most of his primary sources, and from my own experience eating a raw food fast for almost 30 days, I'd have to agree with him. Humans evolved on cooked food.
Catching Fire
Right now, I believe that cooked food is our most natural food, not raw fruits and veggies. Ever since we stood erect, we have also scrubbed around in the dirt for tubers, and banded together to hunt wild game. I don't know which came first, but it makes sense to me that we learned how to cook before we learned how to bring down big animals, and we learned how to eat starchy tubers after learning how to cook, and because of that food source our brain size increased, and we were enabled to communicate and hunt in groups better.
I'm enjoying Wrangham's book "Catching Fire: how cooking made us human." This is a book for everyone, not just scientists, and it is quite fun to read, whereas the scientific articles he wrote can be a bit of a challenge at times. For example, I chuckled when I read of the new pet food, "Biologically Appropriate Raw Food" (BARF), which is advertised as beneficial for dogs. And this paragraph thrilled me:
Although the australopithecines were far different from us, in the big scheme of things they lived not so long ago. Imagine going to a sporting event with sixty thousand seats around the stadium. You arrive early with your grandmother, and the two of you take the first seats. Next to your grandmother sits her grandmother, your great-great-grandmother. Next to her is your great-great-great-great-grandmother. The stadium fills with the ghosts of preceding grandmothers. An hour later the seat next to you is occupied by the last to sit down, the ancestor of you all. She nudges your elbow, and you turn to find a strange nonhuman face. Beneath a low forehead and big brow-ridge, bright dark eyes surmount a massive jaw. Her long, muscular arms and short legs intimate her gymnastic climbing ability. She is your ancestor and an australopithecine, hardly a companion your grandmother can be expected to enjoy. She grabs an overhead beam and swings away over the crowd to steal some peanuts from a vendor.
So, since my last post, I've decided: I will not continue eating a 100% raw diet after my 30 day experiment. I found the information on Raw Foodism at Vegan Health well balanced and complete, and I want to avoid orthorexia (see the videos linked at the bottom of their Raw Foodism page for an explanation). I've also had some fun lurking on 30bananasadaysucks.com. Beyond thirty days, though, I still plan to eschew, rather than chew, bread and other grains. I'll reintroduce other cooked starches (potatoes, sweet potatoes, legumes, beans) into my diet when the month is up, but remain as vegan as possible, trying to follow some of Dr. John McDougall's guidelines to reduce dietary fat.
That's where I'm at.
Notes to Myself
- I am working with a girl who has lost 35 pounds on the paleolithic diet, and wants to lose more. That is approximately the same amount of weight loss I've experienced on fasting, high carbs and a raw diet, in roughly the same amount of time. She is doing it to lose weight; I'm doing it to gain health: to detox from bread (and high fats) as an experiment. If it weren't for the threat of heart disease that ketosis-based high-protein/low-carb diets like paleo pose, both diets might be equally effective for weight loss. But I'll still have to research how much protein is lost on intermittent fasting before I make any claims about the better efficacy of my own experiment.