This is part one of a three-part blog series that will take us on a journey from the 1850s through to the 2020s, concluding with a vision of what post-2030 healthcare could look like as new ideas and technologies evolve.
Roman Medical Theory
What we call “healthcare” has evolved in an accelerating manner over the past two centuries. Developing industrial societies progressed from an opinionated, superstition-driven past to an empirical, science-based approach.
Going back to the Roman Empire and the time of Galen (Aelius Galenus or Claudius Galenus; Greek: Κλαύδιος Γαληνός; September 129 – c. 200/c. 216), people have trusted wise people to help them regain their health, reduce their pain, and settle their minds. In Galen’s day there was very little understanding of how our bodies worked. Galen’s understanding of anatomy and medicine was principally influenced by the then-current theory of humorism (also known as the theory of the four humors: black bile, yellow bile, blood, and phlegm).
Uninformed, Vulnerable Patients
Patients weren’t expected to understand anything about their illness (most of Galen’s patients would have been high-ranking soldiers and gladiators). And, with limited understanding of human physiology and disease, the ideas around causation had no scientific validity. Thus, even the richest people in the world could die from a scratch that turned into an infection.
Not a great deal changed for hundreds of years, as much of the western world inched its way through the dark ages, filled with plagues, wars, and pestilence. This part of our journey takes us to a world with no firm understanding of how diseases spread. There were no microscopes to see germs - in fact, no idea that germs existed - and, perhaps most deadly, the arrogance of many “experts” who often pointed to a transgression (or perceived transgression) on the part of the sick person as the ultimate cause of their suffering. Additionally, the medical instruments and techniques of the day were sometimes distinguishable only by intent from those used in torture.
Germ Theory Reduces Mortality Rates
As we move into the mid-nineteenth century, surgical wards were dark, dirty, and unhygienic places. The lack of anesthetic likely kept away all but the most pain-tolerant and desperately ill patients. There were sparks of light, however: during this time Doctor Ignaz Semmelweis discovered through experimentation that disease could be passed from one person to another simply by touching them during an examination. Simple hand washing was sufficient to reduce maternal mortality from childbed fever from nearly 20% to less than 2%. Joseph Lister and others made antisepsis mainstream practice as the medical community began to understand and accept the germ theory of disease. Poor Semmelweis didn’t go on to the same acclaim; he went mad with frustration that his idea wasn’t immediately accepted by the profession. The people in power couldn’t believe that the most accomplished practitioners could actually be making conditions worse!
Lack of Data
Even with germ theory firmly accepted, society was still limited by the lack of health information across the country. There were some early efforts to provide healthcare to certain groups of Americans in U.S. history. On July 16, 1798, President John Adams signed the first federal public health law, "An act for the relief of sick and disabled Seamen," which assessed every seaman at American ports 20 cents a month. This was the first prepaid medical care plan in the United States.
It wasn’t until 1929 that any national economic healthcare data was collected and made available. Adding to this limited health information, there was little consensus on the economic models best suited to delivering care across the country. The “value” of making someone well was poorly understood.
In part two of our three-part series, we will look at the changes from 1950 to 2000, a time of incredible discovery and innovation in both the science of medicine and the economic models that support the delivery of treatments.