COVID-19—Lessons from History and The Power of “The Terrain”
By Dr. Alan Palmer, Contributing Writer
[CHD Note: This is Part Two of a four-part series. In Part One I showed that we are bathing in a sea of microorganisms and we will be exposed to them no matter what we do to avoid it. I also laid out the basis for the terrain as the most important consideration when it comes to protecting oneself from infectious disease. Page numbers referenced throughout the article are from 1200 Studies-Truth Will Prevail, Dr. Palmer’s free eBook. You will find the download link in the bio at the end of the article.]
The COVID-19 pandemic and a lesson from history
We are currently seeing the practical side of this historical debate play out with the COVID-19 pandemic. I am not saying that the virus (germ) is not real and to be taken seriously, BUT more important than the pathogen is the resistance of the host. Some people who get COVID-19 never develop symptoms (by some estimates, 50% or more) or have only mild ones. On the other hand, some people are overcome by the virus. It is no wonder that large, densely populated cities whose residents have higher rates of predisposing chronic illnesses are hit harder. History is repeating itself, albeit to a lesser degree. Our modern cities are nowhere near as bad environmentally as they were early in the last century, but they do share certain characteristics.
In the late 1800s and early 1900s, infectious disease ravaged the large cities here and in Europe. Smallpox, dysentery, cholera, diphtheria, the Spanish Flu and even measles were very deadly. During that period, livestock, horses, dogs and other animals walked down the streets, defecating and urinating as they went. The cities were greatly overcrowded. There was no efficient way to dispose of human and animal waste, the air was putrid, the water unclean, the food supply met only basic needs, and most people smoked or chewed tobacco. People’s bodies were living petri dishes, crawling with pathogenic (harmful) organisms: disease-causing bacteria, parasites, viruses and fungi. Their “terrain” was fertile soil, ripe for infection, because their immune systems were weak and overwhelmed. Interestingly, many third-world countries where people live in similar conditions still have high mortality rates from the same infectious diseases. That is why an illness like measles, typically mild and self-limiting in modern-day America, could be deadly under those conditions.
In 1918, the Spanish Flu swept over much of the civilized world. Many believe that its transmission to the U.S. was due in part to American service personnel returning from the battlefields of World War I. Think about the conditions across Europe at the time. Supply lines were cut off all over the continent, and healthy food and clean water were very difficult to come by. The war placed a tremendous burden of stress and poverty on the entire populace. G.I.s fought shoulder to shoulder, under tremendous stress, in rat-infested trenches filled with filth, human waste and blood, breathing terrible air. Sleep was a rare commodity. Food rations, a low-nutrient canned mash, were barely enough to provide the energy to fight, and water was in very limited supply. Talk about a breeding ground for disease! The TERRAIN of their bodies was certainly ripe for infection.
By the time the lucky G.I.s who survived returned from war, most were sickly and riddled with infections, and they unwittingly helped spread the disease throughout a homeland that was itself suffering extreme economic, societal and nutritional despair. The dirty, overcrowded cities described earlier added to the spread of the disease. It’s no wonder that when the Spanish Flu broke out here and abroad, it spread like wildfire and the death rates were near apocalyptic! It is estimated that somewhere between 500,000 and 850,000 Americans died of the Spanish Flu. That same flu virus circulating in modern-day America would never have the catastrophic consequences it did 100 years ago under those conditions. Although certain groups, such as residents of large overcrowded cities, the elderly and people suffering from chronic disease, would see higher morbidity and mortality, it would still be to a lesser degree.
Why the steep decline of complications and death from infectious diseases between 1900 and 1963?
How much credit should vaccines get for the decline in mortality from infectious diseases? The blunt answer is practically zero. U.S. government public health statistics show, without a shadow of a doubt, that vaccines did not contribute to any significant degree to the decline in mortality. As an example, the rate of deaths attributed to measles declined over 98% between 1900 and 1963 (some government statistics put the decrease at 99.4%) and was still trending downward when the measles vaccine was introduced in 1963. You can see that clearly on the far right of the graph below. The other infectious diseases followed the very same pattern.
The graph shows the mortality rates of five of the most common infectious diseases and their decline between 1900 and 1963. The mortality rate for measles was 1 in 10,000 cases in 1962, before introduction of the vaccine: approximately 400 deaths annually in the U.S. out of more than 4 million cases a year. Importantly, studies showed that the death rate was 10 times higher in impoverished areas, which means the death rate for children in areas of average and higher economic and nutritional status was far lower still. There is another reason the mortality from measles would not be as bad today as it was before the introduction of the vaccine: according to U.S. Census data, the percentage of people living near or below the poverty line in 1959 was double what it was in 2017. Less poverty means better outcomes with infectious disease. This just underscores the points made above regarding the lethality of infectious disease when the TERRAIN is compromised.
Most health experts, epidemiologists and historians attribute the drop in deaths from infectious diseases over the 20th century to:
- Better sanitation and waste disposal (plumbing, sewer and garbage pick-up);
- Water treatment;
- Improved education on personal hygiene and public health;
- Improved supply chains (trucking, railroads and interstate commerce) providing an improved supply and quality of food;
- And fortification of food with vitamins and minerals.
All are measures that improve the terrain (p. 475-483). The CDC’s MMWR Weekly of July 30, 1999, ran an article titled “Achievements in Public Health, 1900-1999: Control of Infectious Diseases,” which credited public health initiatives for the reduction in infectious diseases.
A 1977 study by authors from Boston University, Massachusetts General Hospital and Harvard estimated “that at most 3.5 percent of the total decline in mortality since 1900 could be ascribed to medical measures introduced for the diseases considered here” (“diseases” being infectious diseases over the first three quarters of the 20th century). Their graphs show the decline of each disease and the point at which medical intervention, in the form of vaccines and drugs to treat those diseases, was first introduced (p. 482-484).
Vitamin and mineral fortified foods decreased vitamin deficiency diseases and infectious diseases
This graph shows the decline of deaths from both pellagra and influenza/pneumonia. Note the sharp decline of both beginning in 1938, when bread became fortified with vitamins and minerals. Pellagra is caused by a lack of niacin (a B vitamin).
The arrows show the change in trajectory of deaths due to influenza (flu)/pneumonia on the top half of the graphic and the change in the death rate due to pellagra on the bottom half. What a HUGE change in the outcomes of an infectious disease AND a nutritional deficiency disease. This demonstrates the immense power of proper nutrition on health.
Another example is the decline of deaths from scurvy (a vitamin C deficiency disease) alongside deaths from pertussis, or whooping cough. (*Note that the pertussis vaccine was not in widespread use until the mid-1940s and was not used routinely until it was combined with the diphtheria and tetanus vaccines as the DPT vaccine.)
Just like with measles, to attribute the decline in death rates for pertussis or any of the other infectious diseases to the use of vaccines is a complete fallacy.
And speaking of measles, vitamin A has long been used by the World Health Organization worldwide in the fight to reduce deaths from measles. It has been touted as one of the most significant and cost-effective contributors to the decline in measles death rates in third-world countries. It is interesting to note that between 1950 and 1968, vitamin A fortification in the U.S. went from 3 percent to 12 percent, a 300 percent increase. This also correlates with a continued drop in death rates shortly before the vaccine was released and widely used.
A 2011 study published in the prestigious British Medical Journal, titled “Vitamin A supplements for preventing mortality, illness, and blindness in children aged under 5: systematic review and meta-analysis,” found vitamin A to be a low-cost and effective way to improve health outcomes in children in low- and middle-income countries, especially where access to quality nutrition is limited. The study was a meta-analysis of 43 trials covering 215,633 children aged 6 months to 5 years. Incredibly, it found that vitamin A supplementation could reduce the incidence of measles infection by 50% and the death rate from measles by 20%. Implications for policy from the article:
“Vitamin A deficiency is a common condition that contributes to illness, blindness, and death; supplements can reduce these problems for children aged under 5 in low- and middle-income countries. National and regional supplementation programmes could be among the world’s most cost-effective public health interventions. If the risk of death for 190 million children deficient in vitamin A were reduced by 24%, estimates from 2008 suggest that over 600,000 lives could be saved each year and 20 million disability adjusted life years would be gained.”
These powerful examples demonstrate that improving the terrain, and thus the immunocompetency of the individual, is a highly effective strategy for preventing and treating infectious disease. And in the case of COVID-19, as with many other infectious diseases, it really is all about the terrain! The bias in reporting that leads the public to believe it is all about the virus, and that new medical interventions are necessary to save us, is disingenuous at best and an intentional lie at worst.
Dr. Palmer’s free eBook 1200 Studies – Truth Will Prevail, now 730 pages long, includes over 1400 published studies – authored by thousands of scientists and researchers – that contradict what officials are telling the public about vaccine safety and efficacy. It has easy search and navigation features including links to article abstracts and studies on PubMed or the source journal that make it an invaluable research and reference tool. Download it free at www.1200studies.com
© 28 Jul 2020 Children’s Health Defense, Inc. This work is reproduced and distributed with the permission of Children’s Health Defense, Inc.