Friday, August 20, 2021

NOWHERE MAN IN NOWHERE LAND - TWENTY-TWO



Medicine Empowered!

 

 JAMES RAYMOND FISHER, JR., Ph.D.

© September 14, 2016 / © August 20, 2021

 

This is the reason for so much bitterness, shortness of temper, and recriminations in our daily lives.  We get back a reflection from our loved objects that is less than the grandeur and perfection that we need to nourish ourselves.  We feel diminished by their human shortcomings.  Our interiors feel empty or anguished, our lives valueless, when we see the inevitable pettiness of the world expressed through the human beings in it.  For this reason, we often attack loved ones and try to bring them down to our size.  We see our gods have clay feet, and so we must hack away at them to save ourselves, to deflate the unreal over investment that we have made in them to secure our own apotheosis. 

Ernest Becker (1924 – 1974), Jewish-American cultural anthropologist, The Denial of Death (1973)

LEVERAGING FOR A LEGACY

We have dissected the world into bits and pieces and its inhabitants into appetites and lifestyles with propensities for certain diseases and maladies.  We have manipulated the economy politically into interchangeable units including demographics, preferences, and profiles to satisfy an insatiable hunger for the new, the different, and the rare in the name of progress.

In doing so, we have cut the individual up into pieces in the same way we have reduced the planet in “Cut & Control” fashion into divisive fragments with the focus always on the part, never the whole.  Is it any wonder that technology and medical science would follow this same threadbare approach? 

Technology, since the invention of the wheel, has been a boon to the human spirit, equally seductive and utilitarian, as much a departure from what was experienced as a promise of what could now be expected.  We have no problem calculating the gain but difficulty in assaying the loss, mesmerized by progress and delirious with optimism until reality clashes with our fixation.

We are caught up in the newly discovered, congratulating ourselves on how clever we are, but negligent when it comes to computing the cost.  How could anyone imagine something was lost when something as powerful as the wheel was gained?  And so it has been from the beginning.

We count our blessings in the heat of the moment, caught up in the discovery, the new convenience, taking it as evidence, once again, of how much more advanced we are than the other animals on the planet.  Indeed, we see ourselves as godlike.  We even have great books that confirm that we were made in the image and likeness of God.  What’s to worry about?

When it comes to medicine, it is intriguing how this self-regard blinds us to our taste for suffering.  Medical science was ready with palliatives to the epidemics that technology often unwittingly helped to generate, diseases such as typhoid fever, tuberculosis, diphtheria, cholera, scarlet fever, and whooping cough, among others.  These are cultural, social, or societal diseases, diseases that the people of the New World not only did not have but lacked the immune systems to fight.

Continental epidemics brought by Europeans to the New World nearly wiped out the indigenous population of Native Americans in the 16th and 17th centuries, diseases carried by European explorers, soldiers, and missionaries.

Using the reductionist methodology, medical science was placed at the forefront of epidemiology to deal with these socially transmitted diseases with the inchoate pharmaceutical industry right behind to make itself equally indispensable.

EARLIER – THE BLACK DEATH! – AND WHAT IT TELLS US!

In the late Middle Ages (1340 – 1400), Europe experienced the deadliest disease outbreak in history, with the infamous pandemic of the Bubonic Plague, or “Black Death,” hitting Europe in 1347 after ravaging Asia and killing 50 million on its way west.  A third of the European population would vanish with the disease.

Something else happened as well.  Society became more violent and less orderly as the mass mortality appeared to cheapen life, leading to increased internecine violence, warfare, and bloodshed, along with popular mass movements of revolt accompanied by waves of senseless killing and persecution.

The Black Death originated in the region of Mongolia and China and spread westward to Italy.  When trade on the Silk Route between China and Europe was cut off, it appeared to halt the spread of the disease, which was eventually traced to infected Oriental rats carrying a particular flea that transmitted the disease to humans.  The sad irony is that Italian merchants, fleeing the Orient on their ships, brought those rats and that disease to Europe.

The Bubonic Plague was quite tangible and punishingly disruptive, but what of the less apparent or discernible disease disorders since the Industrial Revolution?

It was communicable contagions that followed Europeans into the New World, diseases that had acted as constant disruptions to the daily life of Europeans since the transition from an agrarian to an industrial society.

Diseases can have an impact on the soul of a people, as the “Black Death” did, with people believing God was punishing them for their transgressions as ordinary human beings.  It was as if they were paying for Original Sin on their souls, which tended to immobilize society from confronting and dispatching the disease.  Slowly, this barrier of inertia was penetrated.

That said, doctors were no better than people in general at escaping these impediments and embracing the challenges of the day, as they suffered the same self-negating preoccupations and vanities.

Yet, societal disturbances ultimately provoke psychological change.  Nothing happens in a vacuum until it is occupied by ideas.  French social psychologist Gustave Le Bon (1841 – 1931) writes in “The Psychology of Peoples” (1899):

The history of the genesis of ideas, of their domination, of their transformation, and their disappearance . . . we would show that the element of civilization is subject to a very small number of leading ideas whose evolution is exceedingly slow.  The sciences do not escape this law.  The whole of modern physics is derived from the idea of the indestructibility of force, the whole of biology from the idea of evolution, the whole of medicine from the idea of the action of the infinitely small . . .

We can add:  The ideas generated by an industrial civilization invariably dictate the approach to the problem-solving of all its components.

Western society is in its fourth “Industrial Revolution,” which, while changing many things, has failed to change the undercurrent of a mindset that remains essentially the same in problem-solving.  Moreover, despite the progress realized by the “Cut & Control” phenomenon, all societal and cultural missteps can be traced to this singular aspect.

With industrialization, we went from living and working in the fresh air and wide-open spaces of an agrarian society to being compacted into urban squalor without adequate sanitation, housing, or recreational space to luxuriate our souls as well as renew our bodies.  Instead, we were exposed to putrid gases and smells in the air from smokestacks and chimneys rich in soot and carbon emissions, with no choice but to breathe these pollutants.  Moreover, the majority were forced to make a living by being imprisoned in windowless factories with poor ventilation, lighting, and intellectual stimulation.

Fast forward, and these windowless factories metamorphosed into glass cathedrals with workers confined to blind cubicles and artificial air conditioning, fighting the smog and carbon emissions once they left these confinements for home.

The “factory of the mind” in the postmodern era is already obsolete.

Rather than address core problems at their source, the medical profession, like all others, is famous for taking the industrial approach of treating symptoms of societal maladies, while the pharmaceutical industry, in cahoots with medicine, looks for “miraculous drugs” to deal with individual self-indulgences: from obesity-associated diseases; to such addictions as cigarette smoking, alcohol, and drug abuse leading to lung, esophageal, and colon cancer and cardiovascular and kidney diseases; to promiscuity leading to sexually transmitted diseases (STDs) such as AIDS, gonorrhea, and syphilis.

So monstrous are these self-inflicted pandemics that people are no longer treated as individuals but as specimens under the microscope, reduced to pie charts and graphs, statistical correlations, and computerized demographics.

Scientists and philosophers of the 17th through the 20th centuries opened the door to this by cutting the mind from the body, the subjective from the objective, in “value-free” analysis.  Of one thing you can be certain: whatever the nature of the inquiry, whether scientific, artistic, philosophical, religious, or industrial, the mechanism of its propagation is likely to be identical.

Citizens of Tournai bury plague victims: The Chronicles of Gilles Li Muisis (1272 – 1352), Bibliothèque Royale de Belgique.

Transitions are adopted slowly. Going from an agrarian to an industrial society, for instance, was first started by a small self-interested group; in this case, it was the European merchants who were interested in trading beyond Europe. Once the action is seeded, it eventually takes on a life of its own and exerts pressure on the masses; then, once established, everyone acts as if it has always been so.

Medicine is now empowered as a prestigious discipline, no longer in need of a bedside manner. Astronomers had removed man from a God-centered universe; medicine replaced God and moved into the void as controller of “life and death” situations in a man-centered universe.

Medical doctors were now engineers of the body, viewing that body as a mechanistic machine. The whole vocabulary of medicine now hinged on reducing that body to fragments and medicine to a complex modality, with research condensed to a reductionist discipline creating an obscure Latinized nomenclature to which only medical professionals were privy. The best prerequisite for a medical career was to have the aptitude of a plumber.

Earlier men had gazed into the skies and discovered the earth rotating around the sun. Now, the factory manager gazed out at his workers and saw them as inanimate cogs in a machine to be used as interchangeable parts in the interest of efficiency.

Once the factory mentality took hold, it no longer needed justification and quickly spread to everything, including the medical profession.

It had the effect of contagion with everything obedient to the same machinations. We see this today with the fascination everyone has for their handheld devices, which share the same contagion of the factory mentality, only now as “Toys of the Mind” (see The Worker, Alone! 2016). Once the factory mentality acquired the penetrating force of contagion, it simultaneously created the paradigm in problem-solving. Like a speck of fine dust, it has come to penetrate everything everywhere.

Before medical science separated the body from the mind, the two were assumed to be one. Personality and emotional states were not yet dissected by Sigmund Freud, Alfred Adler, and Carl Gustav Jung. The paradox, given the primitive nature of medicine at the time, is that the patient’s confidence and trust in the doctor were as important as the diagnosis and remedy, if not more so, for often these were wrong. Yet, the patient got well because of such trust and belief. This is because the body, itself, is the best physician.

Doctors in the 18th century attempted to control diseases with crude methods of bloodletting and the use of leeches, practices that had been common to medicine from antiquity over some 2,000 years.

Bloodletting was based on an ancient system of medicine in which blood and other bodily fluids were regarded as “humors” that had to remain in proper balance to maintain health. The application of a living leech to the skin was used to initiate blood flow or deplete blood from localized areas of the body. Up through the 19th century the practice was still used in Europe, Asia, and America.

THE WAY WE MUDDLE FORWARD

Medical practice is no different than any other human enterprise. It imitates the prevailing cultural norm, in this case, the factory mentality. This reductionist strategy goes back to the early days of the “First Industrial Revolution” around 1760 which then merged with the “Second Industrial Revolution” in 1850 (before the American Civil War). We have never escaped this prevailing norm.

When a new way of doing something is adopted, everything else being done falls into place, consistent with that pattern or paradigm.

The only difference is that such transitions or transformations once took about 300 years or more; now they can be practically realized overnight, but with one annoying consequence: the collective mindset of society fails to keep up with swift change, as most people linger in the nostalgic past while robotically moving inappropriately forward.

Western society arose from the fertile plains of Northern and Central Europe, the plains that nourished the men and huge horses that invaded and successfully sacked Rome from 410 to 527. These plains profiled the culture of agriculture that was to dominate the West for the next thousand years, a culture that Rome had neglected, obsessed as it was with empire and its military machine while leaving its fields fallow.

The agrarian nature of society is reflected in its Christian religion and Enlightened Philosophy, in small industrial guilds of craftsmen, in the dominance of the ecclesiastical authority of the Church and the submissiveness of secular authority to the church’s demands, in monastery and convent retreats where the archives of civilization were being preserved, and in shipbuilding looking to the newly discovered New World, followed by Catholic missionaries. There was an orderliness to the age that hid the destructiveness that was imminent in this new enterprise.



Sacking of Rome in 410 by King Alaric of the Visigoths

Utopian ideas are prominent when society’s victims are lost in the moment and can’t seem to see the forest for the trees. Nowhere Man in Nowhere Land is lurking in the mist as boundaries dissipate and victors and victims become interchangeable.

After the sacking of Rome and the “Dark Ages” that followed, Western society was in a free fall for nearly a thousand years, only to come to an abrupt halt when a modest but temperamental cleric, Martin Luther, posted his 95 theses (complaints) on the Wittenberg church door in his small German community.

Pope Leo X, who was operating a corrupt pontificate, attempted to crush Luther, only to be crushed by Luther’s followers, providing the catalyst for the Protestant Reformation, the Industrial Revolution, and the American and French Revolutions.

This would lead to an industrial capitalistic society and modernity, then post-modernity and post-industrial society, but without forsaking the collective factory mindset that rose out of this muddle.

Today, we are shackled with the dominant mindset of a mechanized universe, a universe that Luther launched when he posted his grievances on October 31, 1517.

Over the past 500 years in our “Cut & Control” manner, we have gone from a holistic agrarian society to a particulate, mechanized, by-the-numbers society that Martin Luther set in motion.

Despite all the wonders of our time, we live in a factory society and cannot shake that mindset or its factory mentality:

· In the nuclear family, which has splintered to nonexistence;

· In the Christian religion, which has forgotten its mission, being obsessed with its political survival;

· In the education system, which is obsessed with process efficiency (grades) at the expense of product effectiveness (skills);

· In the government, which has forgotten that its role is to serve and lead, not to promote itself;

· In the liberal arts of literature, music, art, and architecture, which have been demoted to the periphery while the obsession is with science;

· In medicine, which is controlled by the pharmaceutical and insurance industries and has become a pill dispensary;

· In work, which has gone from brawn to brains, while leaving tens of millions behind without the wherewithal to make a living.

Early in the 16th century, while Martin Luther was in Rome, he observed the blasphemous work of the Dominican friar Johann Tetzel selling indulgences under the auspices of Pope Leo and was horrified. Tetzel’s selling pitch was that Catholics could bypass Purgatory and go straight to Heaven with this purchase. This corrupt practice angered Luther, touching his soul and, by extension, the collective subliminal soul of the German people. The rest is history.

The reason for this preamble to “Medicine Empowered” is that in times of transition reality trumps the known and understood at every step. Much as pundits and gatekeepers form comprehensive theories of what is going on and where everything is going, the playground of change is not the mind but human emotions. Consequently, it is easier to imagine conditions in the time of Luther and his impact on the last 500 years than to gauge contemporary life and where it is going.

Meanwhile, despite this, those in positions of power vacillate between means and ends in a rhapsody of logic or retreat from the fray altogether in acquiescence, hoping for the best. Polar oscillation results in forward inertia with Nowhere Land on the horizon.

HISTORICAL PERSPECTIVE

The factory mentality was much in evidence with the French mathematician the Marquis de Condorcet (1743 – 1794) who saw the advantage of reducing a large number of people to numbers. He found a substitute for theology in science. Reason was the spirit of the time, and religion was now seen as its enemy. He subscribed to the Aristotelian dictum that man is a rational animal. His vision was that of a free rational man emerging from centuries of ignorance, fear, and superstition into the bright new dawn of Reason, Science, and unlimited progress.

Man, the moving force of society, was to be guided by rational knowledge in a conscious attempt to remake the world in man’s image and thereby shape his planetary destiny.

The Marquis was joined in this social philosophy by Hobbes, Locke, Montesquieu, Rousseau, and Turgot. The consensus among them was that once man had conquered the natural forces, had successfully put aside the worship of his God, had cleared his mind of pesky illusions, and had come to rely on science and reason, the earth would turn into a utopian earthly paradise, and everyone would bask in the calm, cool light of reason. Indeed, science envisioned its illusionary utopia:

That with knowledge applied to society, there would be no poverty, no war, and no misery.

Man has used his reason to question ecclesiastical authority, to rise above the barriers of false notions, and to generate the utopian ideal, but somehow, in attempting to do so, society has sunk into depressing anxiety with a mania for taking, grabbing whatever it can while the getting is good, with no one seemingly in charge, personifying Nowhere Man in Nowhere Land.

In the 18th century, Condorcet attempted to bridge this void by developing a new science of society. To him, history was above all a science to foresee the progress of the human race. He believed this would make it possible to tame the future. In his Sketch for a Historical Picture of the Progress of the Human Mind (1795), he traced human development through nine epochs to the French Revolution, and predicted that in the tenth epoch would come the ultimate perfection of man.

The agents of this perfection would be scientists. The more the state provided education to train scientists, the more likely the certainty of this perfection. This idea has driven Western planners ever since.

Condorcet thought that the manipulation of masses of people by the numbers needed two kinds of data collection: general observations of entire populations through the application of mathematics, and an intense examination of limited numbers of specimens through the application of medicine. These studies would make possible what he called “the limitless perfection of human faculties and the social order.”

The reduction of the individual to a manipulable numerical unit was given further impetus around 1802 by renewed interest in basic physical units called “monads.” These are the ultimate invisible units of existence first postulated by German philosopher Gottfried Wilhelm Leibniz (1646 – 1716). Monads themselves are independent of each other and all have some power of action and direction towards a goal or end.

Romanticism or the Romantic Movement was an artistic, literary, musical, and intellectual movement that originated in Europe towards the end of the 18th century and in most areas peaked between 1800 and 1850. It was during this period that monads were most popular as theoretical units or the common substrate of life, and the ultimate link between man and nature.

It was felt that these monads might be observed and measured as a means of establishing mathematical laws for life forms. Human organs were described as “little machines in great machines.” Tissue was thought to form the basic structure of these machines. This marked the invention of pathological anatomy.

With the aid of statistics and pathology, medicine was now able to do to disease what classification had accomplished with botany, what logic had done to argument, and what printing had done to language.

The study of disease rather than the study of people with the disease distanced the doctor from the patient. Medicine found ways to reduce data on the human body to many subcategories via the microscope.

The microscope, its invention variously credited to Zacharias Janssen, a Dutch spectacle maker, around 1590, and to Galileo, who announced his instrument in 1610, became the tool of choice in the “Cut & Control” study of patients. In 1841, using the microscope, blood was analyzed for its properties in terms of globules, fibrous material, solids, and water, comparing the sick with the healthy and averaging out the differences.

In this way, a numerical description of blood was used in diagnosis. Medicine was becoming a science like chemistry and physics. And medical technology was making strides by making disease tangible and reducing data on millions of individuals to comprehensive charts, graphs, pictures, and statistical correlations.
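
As a rough illustration of the kind of comparative averaging just described, here is a minimal Python sketch; the component names follow the text, but every number is invented for illustration and is not taken from the 1841 studies.

    # Minimal sketch of comparing averaged blood measurements between sick and
    # healthy groups, in the spirit of the numerical diagnosis described above.
    # All figures are invented for illustration; they are not the 1841 data.

    from statistics import mean

    # Hypothetical measurements (arbitrary units) from individual samples.
    healthy = {"globules": [4.9, 5.1, 5.0], "solids": [21.0, 20.5, 21.3], "water": [74.1, 74.4, 73.7]}
    sick    = {"globules": [3.6, 3.9, 3.7], "solids": [24.1, 23.8, 24.5], "water": [72.3, 72.4, 71.8]}

    for component in healthy:
        h_avg, s_avg = mean(healthy[component]), mean(sick[component])
        print(f"{component:>8}: healthy avg {h_avg:.2f}, sick avg {s_avg:.2f}, "
              f"difference {s_avg - h_avg:+.2f}")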

This new medical science came into prominence with the British cholera epidemic of 1831. Some 32,000 people in Great Britain died of the disease, which had killed 50 million on its way west from India over some 14 years.


Caspar David Friedrich, Wanderer above the Sea of Fog (1818), shown here to illustrate the idealized Romanticism of the period.

To discover the source of this epidemic, and of the subsequent outbreaks of yet other diseases, a statistically inclined public servant, Edwin Chadwick (1801 – 1890), launched an investigation in 1838 into the possible relationship between sanitary conditions and the disease. He conducted this study in five underprivileged areas of London.

The report, published later that year, documented the appallingly filthy conditions under which poor people were forced to live: surrounded by cesspools, stagnant water from plugged drains, mounds of garbage, and sewage heaped carelessly about to become infested with vermin and insects, with the sweet-smelling putrefaction of rot and decay being carried away with the rains to pollute the water supply available for drinking, cooking, and bathing.
 

Sir Edwin Chadwick.

When this report stalled, a more ambitious report was published nationwide in 1842, documenting the conditions with pictures and statistics. London society was shocked to the core.

Then, when a cholera epidemic struck the city again in 1842, killing 70,000 citizens, panic set in. This led to the Public Health Act of 1848, the development of public health measures, and the establishment of a Board of Health. The Nuisances Removal and Diseases Prevention Act followed. These public health initiatives brought the government into the private sector as never before, involving it in the daily lives of its citizens.

The 1846 government intervention was designed as a temporary measure to help stem the spread of cholera. The act set out procedures for the removal of “nuisances” and increased the powers of the Privy Council to make regulations for the prevention of infectious disease.

The act was updated in 1848 alongside the introduction of the Public Health Act. The Nuisances Removal and Diseases Prevention Act 1848 made provisions regarding nuisances and the prevention of diseases in places where the Public Health Act was not being enforced. Parallel to this, the Sewerage and Drainage of Liverpool Act was passed in 1846 that allowed for a Medical Officer of Health to be appointed to inspect and report on the sanitary state of the area and to point out “nuisances” that might contribute to disease.

The health propaganda machine went into high gear, and hygiene now became a separate discipline within medicine. The use of soap and water and washing regularly was encouraged, as were open-air activities. In the 1860s, athletics became more fashionable, especially in public schools, which in Britain were in fact private schools.

Even so, another epidemic of cholera hit Great Britain in 1853, calling for further investigation into the root causes of the disease. John Snow (1813 – 1858), an English anesthetist and epidemiologist, was also a meticulous statistician. His study a year later showed that water from the fecally contaminated London reaches of the River Thames was nine times more likely to cause cholera than water drawn from an uncontaminated source upstream from the city.
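
The arithmetic behind a claim like “nine times more likely” is a simple ratio of attack rates between the two water sources. Here is a minimal Python sketch; the case counts and populations are invented so that the ratio comes out near nine, and are not Snow's actual figures.

    # Minimal sketch of a relative-risk comparison between two water supplies.
    # The counts below are invented for illustration, not Snow's actual data.

    def attack_rate(cases: int, population: int) -> float:
        """Cholera cases per person served by a given water supply."""
        return cases / population

    contaminated   = attack_rate(cases=315, population=10_000)  # downstream, fecally contaminated
    uncontaminated = attack_rate(cases=35,  population=10_000)  # upstream, uncontaminated

    relative_risk = contaminated / uncontaminated
    print(f"Attack rate, contaminated supply:   {contaminated:.4f}")
    print(f"Attack rate, uncontaminated supply: {uncontaminated:.4f}")
    print(f"Relative risk: {relative_risk:.1f}x")  # about 9x with these invented counts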

Physician John Snow was a skeptic of the then-prevailing theory that cholera and the bubonic plague were caused by “bad air.” The germ theory of disease had not yet been developed, so Snow did not understand, at this point, the mechanism by which the disease was transmitted. His observations nevertheless led him to discount the foul-air theory. He published his theory in an 1849 essay, On the Mode of Communication of Cholera, followed by a more detailed treatise in 1855 incorporating the results of his studies and the role the water supply played in the Soho epidemic of 1854.

By talking to residents, he identified the source of the outbreak as the public water pump on Broad Street. Although chemical and microscopic examinations of the water sample from the Broad Street pump did not conclusively prove its danger, his studies did show a pattern of the disease that was convincing enough to persuade the local council to disable the water pump by removing its handle.

This dramatic action has been commonly credited as ending the outbreak, but Snow observed that the epidemic may have already been in a state of decline:

“There is no doubt that the mortality was much diminished, as I said before, by the flight of the population, which commenced soon after the outbreak; but the attacks had so far diminished before the use of the water was stopped, that it is impossible to decide whether the well still contained the cholera poison in an active state, or whether, from some cause, the water had become free from it.”

His modesty notwithstanding, he continued to argue that death rates would drop dramatically where water supplies were filtered. In 1866, during the final major cholera epidemic in the city, 14,000 were killed. In support of his theory, more than a third of those deaths were found to have occurred in the London areas that still lacked filtered water supplies.

Still, at this stage, no one knew specifically what the disease was, or what caused it. What they did know was that control through statistical quantification seemed to have worked. The success set the pattern for future government interventions when the situation was deemed a matter of public concern. These epidemics, and the social conditions that caused them, served to strengthen the grip of government on the community at large. Now, institutionalized public health and medical technology would strengthen that grip further.
 

Dr. John Snow determined the source of a cholera outbreak was fetid water.



The irony is that the drive to find the source of cholera would further reduce the status of the individual. The “Cut & Control” tool that made this possible was the microscope, used with a series of improved lenses designed for higher magnification that had been perfected by Joseph Jackson Lister (1786 – 1869), a wine merchant and amateur optician and physicist who developed a method of reducing chromatic and spherical aberrations.

His son, Joseph Lister (1827 – 1912), would distinguish himself as the “father of modern antiseptic surgery” and would introduce antiseptic techniques (such as the use of carbolic acid, or phenol) to sterilize the wound before surgery, thus reducing postoperative infections and improving the survival rate of patients. His antiseptic system revolutionized modern surgery across the globe and reduced surgical mortality.

In 1826, James Smith, working closely with Joseph Jackson Lister, built the achromatic microscope to Lister’s design, a great improvement over earlier instruments. It was used in 1827 for the first professional article on histology. The Smith microscope revealed that blood corpuscles were not globular, as previously thought, but biconcave.

Scientific discoveries in medicine followed thick and fast. The key advance of research was the discovery of cellular pathology. This led to the expression: “Physicians are the natural attorneys of the poor; social problems should be solved by them.”

Medicine was now deemed a social science and politics nothing but medicine on a larger scale. Some still think medicine has not escaped this identity since the American Medical Association is one of the most powerful politically active lobbying groups today in Washington, D.C.

Danish scholar S. E. Stybe, in Copenhagen University: Five Hundred Years of Science and Scholarship (1979), pays a special tribute to Louis Pasteur (1822 – 1895) for discovering the source of rabies, and to Edward Jenner (1749 – 1823) for doing the same for smallpox.

Stybe acknowledges Danish pathologist Peter Panum (1820 – 1885) as the founder of modern physiology. He fails, however, to mention that Panum’s study of the 1846 epidemic of measles on the Faroe Islands led to the discovery of the source of this contagion as well as other diseases.



Amateur scientist Joseph Jackson Lister with his microscope.

Panum paid particular attention to the hygiene conditions of the islanders. In the course of his work, he performed systematic studies of endotoxin, a substance once referred to as a “putrid poison” that was responsible for symptoms observed in individuals with sepsis. Not only did his investigation set the standard for all subsequent epidemiological studies, but it settled once and for all that measles is a specific contagion.

At the same time, he demonstrated how hygiene was a factor in the spread of the disease, while showing that the infection, once it had run its course after an incubation period, conferred on the sufferer a lifelong immunity to the disease.

Panum found that the 98 survivors of the previous Faroe epidemic were still immune to measles even though they had been infected with the contagion in 1781, and that, of the 5,000 inhabitants of the islands exposed to the infection, 99.5 percent caught the disease.
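
Taking the round figures quoted above at face value, the scale of the outbreak and of the immunity can be laid out in a few lines of Python; the breakdown is illustrative, not Panum's published table.

    # Rough sketch of the two findings the Faroe survey turns on: the attack rate
    # among exposed islanders and the immunity of the 1781 survivors. Uses the
    # round numbers quoted above; the breakdown is illustrative, not Panum's table.

    exposed_population = 5_000    # islanders exposed to the 1846 infection
    attack_rate = 0.995           # share reported to have caught measles
    survivors_1781 = 98           # infected in 1781, all found still immune in 1846

    cases_1846 = round(exposed_population * attack_rate)
    escaped = exposed_population - cases_1846

    print(f"Estimated 1846 cases among the exposed: {cases_1846}")      # 4975
    print(f"Exposed islanders who escaped the disease: {escaped}")      # 25
    print(f"1781 survivors reinfected in 1846: 0 of {survivors_1781} (lifelong immunity)")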

He deduced from this that vaccination, or being infected with a low dosage of the contagion, would become an active antidote to the disease. This was counterintuitive thinking and went against the popular perception of the time, which was “You don’t treat a disease with a disease.”

William Simpson, the medical health officer of Aberdeen, Scotland, writing in 1882 about smallpox, diphtheria, cholera, measles, whooping cough, and scarlet fever, noted:

“It comes out, as a peculiar fact, that the most dreaded diseases are the least fatal, and the least dreaded diseases are the most fatal. Measles, whooping cough, and scarlet fever are the most serious, although it is usually considered they do little harm, their very frequency makes them less dreaded. The disease that comes unexpectedly, and passes over quickly, is looked upon with a greater feeling of terror than the disease which may be more fatal, but more common.”

Scottish bacteriologist Thomas Hugh Pennington (born 1938), a former President of the Society for General Microbiology, places this in context:

“New hypotheses – guesses – drive scientific progress. Most never become public. The majority are stillborn in the scientist’s mind. Only a few survive to get discussed in the laboratory.

“The tiny minority that gets published are nearly always those that are supported by enough evidence to make them worth testing. Even at this stage, it is unusual for them to come to public attention.

“Two things are needed for this: an enthusiastic hypothesis and an interesting subject. In the hard sciences, it is therefore very unusual for wrong ideas unsupported by evidence to have much of a life outside the private world of the scientist. But flaky evidence can be overcome by fervor, at least for a time.

“Medicine is different. It is the natural home of the untested hypothesis where acupuncture, homeopathy, osteopathy, and chiropractic flourish. And it is a happy hunting ground for the enthusiast; health stores are hot news. So a heavy responsibility falls on those who publicly promulgate hypotheses whose consequences may cause alarm.”

Given this assessment, it is clear that deviation from the scientific inquiry is considered suspect and to lack legitimacy. This has the implicit ring of dogmatic certainty once associated with religious institutions.

At the same time that scientific inquiry was finding its legs, the mechanistic mindset – or by the numbers – was being reified with certainty. German pathologist/physiologist and cellular biologist Rudolf Virchow (1821 – 1902) presented a paper in 1845 with the title, “The Need for and Correctness of Medicine based on a Mechanistic Approach.”

The paper held that life was nothing more than the mechanics of cellular activity. Cells were the basic unit of existence. Life was the sum of cellular phenomena that could be submitted to chemical and physical laws. If all life was cells, the paper reasoned, then disease, being alterations in the cells, was nothing more than life under altered conditions. If these conditions could be avoided, public health would be dramatically improved. The microscope was now the tool of choice for social control.

German physician and pioneering microbiologist Robert Heinrich Koch (1843 – 1910) took the final step in reducing the patient to a statistical construct. He is acclaimed as the founder of modern bacteriology and is known for his role in identifying the specific causative agents of tuberculosis, cholera, and anthrax. It was in a small town in East Prussia in 1876 that he isolated the anthrax bacillus, cultured it, and watched the bacillus produce spores in the tissues of animals.



Book cover of my source for these wonders when I was a boy.

Koch’s experiments provided the first clear evidence that a specific microbe organism causes a specific disease. His further researches in microscopy and bacteriology led to the discovery in 1882 of the tubercle bacillus. In 1883, heading a German expedition to Egypt and India in quest of the cholera germ, he discovered the cholera bacillus.

Koch also developed new techniques for growing bacteria in the laboratory that gave future medical researchers the ability to manipulate nature at the microscopic level. This did away with the need for the presence of the patient. Now, culturing the disease and controlling its development required no more than a single drop of the patient’s blood.

By the end of the century, Koch’s new science of bacteriology had identified, among other microbes, those of tuberculosis, typhoid fever, tetanus, gonorrhea, cholera, leprosy, and malaria. He won the Nobel Prize in Physiology or Medicine in 1905.

It was soon discovered that some bacteria-staining chemicals could selectively kill certain bacteria. Thus developed the first “magic bullet” of medicine in the form of a drug: “Salvarsan,” or arsphenamine, used to treat syphilis. It was introduced at the beginning of the 1910s and was also used to treat trypanosomiasis, known as “sleeping sickness,” which is carried by a parasitic biting insect. Salvarsan was an organoarsenic compound and the first modern chemotherapeutic agent.

Chemotherapy made it clear that the focus of attention in medicine should be on the disease and not on the patient. People as patients became little more than material for study in the laboratory.

Over the past 100 years, despite these dramatic discoveries in medical research, along with the development of “miraculous drugs” and procedures, the factors having the most spectacular impact on general health through disease control have been in the area of public health and public education.

In 1965, 42 percent of American adults smoked cigarettes. In 2014, this was down to 17 percent. Moreover, those who continued to smoke were more likely to be in the lower socioeconomic strata of American society.



Robert Koch, German Nobel Laureate in Physiology or Medicine, 1905



MEDICINE AS SOCIAL CONTROLLER

In 1964, the Surgeon General’s report indicated that smoking was bad for your health and was correlated with lung cancer, high blood pressure, and other debilitating illnesses. The American Cancer Society launched a concerted effort to lower smoking among adults with blanket advertising of this fact on national television. This campaign was dramatically successful, so much so that smoking is no longer permitted across the United States in most public places, industrial and commercial workplaces, shopping malls, schools, and hospitals, or even in enclosed sports arenas.

In the early 20th century, one of the key social effects of this focus on disease control was to divert attention from the larger municipal problems of water supply, street cleaning, mosquito control, housing reform, and living conditions. The social pathology of communicable diseases was cheaper and easier to control than massive public health initiatives.

For example, to control tuberculosis, it was not necessary to improve the living conditions of 100 million Americans, the population at the time, but merely to supervise and control the 200,000 active cases in the pipeline and limit the potential to infect others by isolating those with the contagion in sanatoriums.

The first handbook on public health (1915) was devoted to contagious diseases, with a much smaller section on industrial hygiene, housing, water supply, and public education. This narrow bacteriological view of disease remains, with some exceptions.



Cover of guide to Public Health Professionals, 1915

Today, society has been medicalized as the language of medical and hospital jargon has entered the daily vernacular. Indeed, with the Internet, many are surfing the net to make their diagnosis of what ails them.

Meanwhile, hospitals have often been cited as disease-infected incubators of contagions, especially with staph infections. Staphylococcus is a group of bacteria that can cause a number of diseases by infecting various tissues of the body. Patients with skin problems such as burns or eczema may be especially likely to get staph skin infections. Such infections can also spread from contaminated objects, as staph bacteria often spread when someone touches the infected area. It is an uphill battle for hospitals to reduce this possibility.

Hospitals are often associated with a very difficult-to-control staph infection called “methicillin-resistant Staphylococcus aureus,” or MRSA. It is a group of bacteria resistant to most antibiotics that has long been associated with outbreaks in hospital environments and can even prove fatal if not treated quickly and effectively.

The doctor has acted over the past century as the new shaman, closely associated with society’s self-conscious propensity to burn the candle at both ends and not suffer the consequences. Medical science has increased longevity, the quality of life, and the standard of living. As such, the doctor has been placed in the role of the priest over the body rather than the soul, guardian of material well-being rather than arbiter of spiritual contentment, apologist for lifestyle excesses of self-indulgence. In other words, something of a miracle man with the support of the pharmacist in the wings.

As a mediator, in a medical sense, of social behavior and as an assessor of a person’s state of health, the doctor has, strangely, come to carry the scepter of the existential philosopher, with the focus on the eternal quest for hedonistic deliverance.

Thanks to the bacteriological revolution of the last 200 years or so with reductionism validated with the microscope, the human dimension has been reduced to data tracks. The self has been divided between public health (macro) and disease control (micro).

With Western medicine, the soul has been cut from the body, and the mind is made homeless as the healer (doctor) is disconnected from the sick (patient). This factory mentality is evident in the workplace where the worker is separated from the manager, the technologist from the user, the priest from the community, the community from the nation, and the people from the land.

In the 21st century, medicine has become truly long-distance. Crozer-Chester Medical Center in Philadelphia has the CAT scans taken at that center on the emergency room midnight shift electronically transferred and diagnosed by radiologists 6,000 miles away in Jerusalem. The patient has been reduced to a negative image from a computerized axial tomography scan, studied as a thing to be analyzed, not a person to be counseled and consoled.

NEXT - Nowhere Man in Nowhere Land – Part Twenty-Three – The Knowledge Keepers & Destiny!
