Healthcare and Electricity - Let There Be Life and Light

In the first edition of this blog, I noted that “the electric sector underpins every other essential industry sector, and it also relies on many of them. I…think of the overlaps like the Olympic rings – all interlinked, with some overlapping more than others.”

For the next several editions, I’ll continue to focus on each critical infrastructure sector in relation to the electric sector because electricity – which began to be deployed as a service close to 150 years ago – has enabled the progress, convenience and abundance that are hallmarks of modern life. Thereafter, I’ll get into the overlapping policy issues in more detail.  

According to the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency (CISA):

The Healthcare and Public Health Sector protects all sectors of the economy from hazards such as terrorism, infectious disease outbreaks, and natural disasters. Because the vast majority of the sector's assets are privately owned and operated, collaboration and information sharing between the public and private sectors is essential to increasing resilience of the nation's Healthcare and Public Health critical infrastructure.  

The Healthcare and Public Health Sector is highly dependent on fellow sectors for continuity of operations and service delivery, including communications, emergency services, energy, food and agriculture, information technology, transportation systems, and water/wastewater systems. 

I mentioned in the last edition that I handled numerous policy issues when I worked on Capitol Hill, but healthcare was not one of them.  I actually avoided it, like the plague…(too much?), probably because the policy issues were not appealing to me.  But, of course, I recognize the extreme importance of a modern and thriving healthcare system.  This week, in particular, I’m reminded of the need for such a system – my mother-in-law just passed away unexpectedly of cardiac arrest on Wednesday.  We’re still absorbing the loss and sadness of her passing, but her diagnosis of Alzheimer’s a few years ago was the first, and possibly strongest, blow.  Leading up to her diagnosis and since, my husband has had a front-row seat to our healthcare system and has seen some incredibly positive elements and some not so positive. But the fact of the matter is that every time his mother had a healthcare emergency (and there were several), the ambulance showed up and she was helped.  This week, her heart could no longer sustain her body, but the healthcare providers tried everything they could to keep her alive and we are grateful for their efforts.

So, how did we arrive at our modern healthcare system?  In early human history, everyday maladies like intestinal aches or colds were treated with herbal remedies. They were considered part of the human experience that could be aided by such treatments. According to Britannica, and aligning with a vague recollection of my college course on the history of science and medicine, in contrast to these treatments (and their presumption of physical causes that could be affected by physical treatments), major illnesses, accidental maiming, and death were, until relatively recently, considered to be the result of magic, curses, the displeasure of deities, or other supernatural causes. Therefore, the treatments for those were themselves supernatural in nature – appeasement of the gods via sacrifices, counter-curses, potions, suctions, or other similar methods. The use of charms and talismans persists to this day.

Fascinatingly and logically, the placebo effect dates back to ancient times.  That phenomenon is described well by Britannica, so I will not try to paraphrase:  

Apart from the treatment of wounds and broken bones the folklore of medicine is probably the most ancient aspect of the art of healing, for primitive physicians showed their wisdom by treating the whole person, soul as well as body. Treatments and medicines that produced no physical effects on the body could nevertheless make a patient feel better when both healer and patient believed in their efficacy. This so-called placebo effect is applicable even in modern clinical medicine.

Several of the other critical infrastructure sectors I’ve discussed previously made breakthroughs about 5,000-6,000 years ago.  In the case of medicine and healthcare, that breakthrough came in Egypt and India about 3,000 B.C., or 5,000 years ago.  While spells, incantations and other magical/spiritual remedies were still in vogue (and would remain so into modern times), the Egyptians also developed detailed lists and descriptions of treatments that relied on observed patterns of positive results and some level of analysis. Surgical treatments of wounds were identified and used during this time. Egyptians’ embalming methods resulted from religious motivations, but such preservation has enabled us to better understand some of the chronic diseases that did and did not exist then – scourges like syphilis and rickets were later developments, for example.

The overlap between religious practices and healthcare is evident in the Bible’s Old Testament (mirroring timelines of ancient Egypt), with God’s commandments and expectations of the Jews setting the bar high for public health (which is the prevention of disease through practices such as food safety, hygiene, etc.). 

Indian leaps in medicine and healthcare were significant.  In the seminal Ayurveda, composed about 5,000 years ago, the authors identified three main bodily elements – air, bile, and phlegm – which they thought of as divine, universal forces and from which all primary substances of the body derived.  Similar thinking developed in other parts of the world, with slight variations, but much later.  In the first millennium B.C., Pythagoras and then Hippocrates postulated that the body had four “humors” – blood, phlegm, yellow bile, and black bile. The Chinese saw the human body as having the same essential makeup as other natural objects, composed of wood, fire, earth, metal, and water. They also developed the unique concept of yin and yang, which was tied to the duality of the male and female. The Chinese homed in on blood circulation as critical to health, resulting in frequent pulse checking during diagnostic interactions with patients – a practice that continues today.  Acupuncture was developed exclusively by the Chinese about 4,500 years ago to increase blood flow and, in turn, support healthy organs and overall health.

In the several centuries before Hippocrates and his famous oath, surgical methods were developed on the battlefield and progress was made to apply greater observational techniques. A medical school was formed on the Greek island of Cos in the eighth century B.C. that Hippocrates may have attended several centuries later. Hippocrates developed and/or solidified several methods that we now hold as foundational to modern medicine -- the idea of prognosis, or predicting the development of disease and how treatment(s) might impact such development, rigorous and recorded observational methods, and categorizations of diseases, to name a few. The modern, abbreviated version of the Hippocratic oath – “first, do no harm” -- was likely coined by the English surgeon Thomas Inman in the 19th century, but a phrase from the original oath reads: “Practice two things in your dealings with disease: either help or do not harm the patient.” I encourage you to read the entirety of the original Hippocratic oath as it discusses the effective use of diet in treatment and, very interestingly given the historical context, it does not differentiate between men and women, free people or slaves, in terms of its focus on the patients’ wellbeing. 

In the several centuries following Hippocrates who, like Shakespeare, could have been one person or multiple people using the same name, the Greeks and Romans combined their knowledge of medicine and made significant progress on understanding physiology. Breakthroughs in childbirth and contraception were made by Soranus of Ephesus. He discovered how to rotate the fetus to aid in delivery, but that knowledge was subsequently, and unfortunately, lost for several centuries. Around 200 A.D., Galen built upon the Hippocratic tradition by creating a comprehensive system of observational practices that were adopted and used until the modern era.  During the early Christian era, circa 360 A.D., Bishop Basil of Caesarea created a hospital with an emphasis on helping the poor.  He also expounded on treating men and women equally and noted that no person is “a slave by nature.”

In the Middle Ages, several significant developments occurred in Europe and in Arabic cultures, including discoveries about changes in eyesight and how to correct them – in 1300, concave lenses were used for myopia in Italy. Veterinary medicine was first identified as an area of practice and scholarship in the mid-1200s. Surgeries became more prevalent with related surgical instruments during this time. 

Moving into the Renaissance in the 1500s, methods to treat gunshot wounds and other battle-related wounds became prevalent.  But this was also a time when unconventional thinking could be life-threatening – decades before Galileo was condemned for supporting the Copernican theory that the sun, not the earth, was the center of the universe, a Spanish physician and humanist named Miguel Servet was declared a heretic and burned at the stake; he was also the first European to describe the circulation of blood through the lungs. As an aside, this reminds me of one of the most profound books I have read in my life, called “The Structure of Scientific Revolutions,” by Thomas S. Kuhn.  It describes paradigms in science (and medicine) that develop from theories that could be partially correct at the time, but that, when subsequently called into question by additional research and observation, are fiercely protected by anyone who has benefitted from them.  Such attachments to disproven theories and failed concepts stymie innovation and limit needed course corrections. I have long argued that these hegemonic paradigms are prevalent throughout society – not just in science or medicine.

But I digress…as with our other critical infrastructure sectors, things started to get interesting in the late 1700s, leading into the 1800s.  The first smallpox vaccine was developed by Edward Jenner in 1796 – while inoculations had been used in Europe, Asia and Africa for centuries, Jenner’s major breakthrough was first in identifying smallpox’s relationship to the less virulent cowpox, which could more safely be used for inoculation, and, second, in subsequently proving the efficacy of the vaccination. His basic framework has been used ever since.  Smallpox was responsible for an estimated 10-20 percent of all deaths in Europe at that time -- his discovery eventually led to the disease’s eradication, with the last naturally occurring case recorded in the late 1970s.

In the 1800s, developments in medicine happened rapidly, many coinciding with the expansion of other critical infrastructure sectors (note the major overlaps at the beginning of this blog).  Anesthetics such as ether were introduced, enabling painless surgery – the first successful public demonstration took place in 1846. In the 1870s, Louis Pasteur and Robert Koch developed the germ theory of disease. Pasteur then developed rabies and anthrax vaccines. With the advent of electricity and its widespread deployment underway, the first electrocardiogram was recorded in 1887.  In 1901, the major blood types were identified by Karl Landsteiner, paving the way for safe blood transfusions.  

The last 120 years have seen leaps in medicine, surgery and technology enabled by sub-microscopic analysis of cells, the discovery of DNA, the ability to clone via stem cells, and the list goes on and on.  The discovery and use of mitochondrial DNA for applications that could eradicate horrendous genetic diseases and even cancers have at the same time raised existential ethical questions for the human race – a classic dichotomy. Putting aside those ethical questions for now (and maybe forever – not yet sure I want to tackle those in this forum), these leaps were enabled by sterilization techniques, widespread refrigeration, and ample water supplies that could have only happened with electricity and other overlapping critical sectors.  Hospitals themselves use life-saving machines that were unheard of a century ago, but that require continuous and reliable electricity to function. Electric utilities often work with hospitals to ensure adequate redundancy and to prioritize their restoration in an unexpected power outage. Manufacturers of intricate medical devices and machinery also themselves need highly reliable power lest a flicker of the lights cause them to have to start over.

During the COVID-19 pandemic, many policy makers simply did not understand these inter-dependencies even though DHS’s CISA so clearly lays them out for everyone to see on its website. Our brethren in the electric, water, transportation, IT, telecommunications, and critical manufacturing sectors do such a good job of keeping the lights on, the water flowing, and the materials being delivered that policy makers and the general populace do not think about how essential they are.  It was disturbing that we had to make that case during the pandemic, when it seemed so obvious to us in those very industries.  This is one reason I am motivated to write about “The Essentials.”

Here are some other ways that the healthcare sector and the electric sector overlap:

  • Reliance on transportation. The pandemic showed us how important personal protective equipment (PPE) is, but other machines, medicines, and devices used for healthcare need to be transported as well. Electric utilities rely on transportation for coal deliveries, but must also ensure deliveries of critical grid components, bucket trucks, copper wire, poles, and the list goes on…

  • Reliance on critical manufacturing.  See above – modern healthcare requires a lot of moving parts.  If the parts are not available, people’s lives are on the line.  Electricity powers both the manufacturers and the hospitals, so there is a layering effect here. 

  • Environmental regulation/climate change.  This is less direct for healthcare, but certainly environmental factors can contribute to disease and medical waste is regulated by the EPA.  This area may be more tangential, but there should at least be situational awareness between the two sectors. 

  • The use of natural gas.  Natural gas accounts for approximately 40% of domestic electricity generation, and it is used for heating in many parts of the country, including hospitals. Natural gas is also a raw input for materials used in medical products like intravenous lines and bags, gloves, masks and catheters. There is pressure to eliminate natural gas overall, and there should be awareness and collaboration on this important topic between the two sectors.

  • Reliance on water.  Clean water, refrigeration and freezing are all essential for hygiene and for preservation of essential drugs. Electric utilities also use traditional hydropower and new water-power technologies to produce emissions-free electricity. They also use water to cool nuclear and fossil-fueled power plants. But the resource can be constrained in drought conditions, especially out West. 

  • Workforce challenges and the knowledge drain that has resulted from retirements in recent years. 

  • Supply chain constraints that impact every aspect of infrastructure deployment and maintenance.

  • How to best use technology to create efficiencies and minimize expenses.  

  • How to manage the cybersecurity risk that comes with those technology deployments.  

I now kind of regret my lack of interest in healthcare policy during my Capitol Hill days – researching this blog has been fascinating.  I hope it piqued your interest, and perhaps prompted some further research of your own, as well.

