
Saturday, August 28, 2021

NOWHERE MAN IN NOWHERE LAND - TWENTY-FOUR



Nowhere Man in Nowhere Land – Twenty-Four




PSYCHIC ENTROPY



IS THERE A Way Out OF NOWHERE LAND?



JAMES RAYMOND FISHER, JR., Ph.D.

Originally © October 17, 2016/August 28, 2021


A LOOK BACK TO SEE AHEAD

In this final segment and the Afterword, we complete the commentary on the PAST IMPERFECT, PRESENT RIDICULOUS, and FUTURE PERFECT. The possibility that human beings may soon be marginalized in the name of progress is no longer a farfetched consideration. The narrative of these twenty-four episodes has been essentially a peripatetic walk through time, culminating in the clumsy attempt of the United States to redress a catastrophic wrong (i.e., Osama bin Laden's 9/11 "Aikido" attack on the Twin Towers in New York City and the Pentagon outside Washington, DC). That redress took the form of a preemptive attack on Iraq to punish that nation for its alleged association with bin Laden and for possessing weapons of mass destruction, a claim that proved a fairy tale of the CIA, compounded in the process by an attempt to convert Iraq into a "kinder, gentler" society mirroring the West, particularly the United States.

The long shadow of Christianity and its utopian zeal is seen in the presumptive strategy that has clung to the United States since 1945, when America emerged serendipitously as the lone superpower on the planet, seeing the utopian dream of progress through a cloudy ethnocentric lens.

America's ignorance of the world has manifested itself in psychic entropy to confirm the status of NOWHERE MAN IN NOWHERE LAND. British physicist Stephen Hawking writes:

Just like a computer, we must remember things in the order in which entropy increases. This makes the Second Law of Thermodynamics almost trivial. Disorder increases with time because we measure time in the direction in which disorder increases. You can't have a safer bet than that!

Hungarian American psychologist Mihaly Csikszentmihalyi writes in "FLOW: The Psychology of Optimal Experience" (1990):

To overcome the anxieties and depressions of contemporary life, individuals must become independent of the social environment to the degree that they no longer respond exclusively in terms of its rewards and punishments. To achieve such autonomy a person has to learn to provide rewards to herself. She has to develop the ability to find enjoyment and purpose regardless of external circumstances.

This is an invitation to turn Psychic Entropy on its head, to make it serve us rather than continue to enslave us to the reckless dissonance of the times. That dissonance is clearly in evidence as semiotics and semantics, the signs, meanings, and codes that guide people's lives, have been corrupted to justify radical behavioral change in terms of sex-role identity. Turning Psychic Entropy inside out:

To wit, such designations as male and female, boy and girl, man and woman, and husband and wife of different genders, with Homo sapiens being of one species. Instead, we find boys wanting to be girls and girls wanting to be boys, using hormones to change their appearance if not changing their sexual anatomy by undergoing sex-change surgery. Boys want to play girls' sports, and girls want to play traditional boys' sports. Boys want to dress like girls, and girls want to dress like boys. Blacks want to be white, and whites want to be black, or some other Homo sapiens identity. Happiness has turned into pill dependency while self-regard has transmogrified into self-hate. Nobel Laureate for Literature (1946), the German-born Swiss novelist, poet, and painter Hermann Hesse, writes:

“If you hate a person, you hate something in him that is part of yourself. What isn’t part of ourselves doesn’t disturb us.”

English philosopher and early semiotician John Locke (1632 – 1704) anticipated a time when the general public could go off the rails, penning "An Essay Concerning Human Understanding" (1689), which takes up personal identity. Locke asked what makes me the same person over time, despite all the differences made to my appearance and character by the passage of the years, and, rejecting the answer that it is an immortal soul which remains invariant through the accidental and oftentimes misguided circumstances of our self-perception, located it instead in a continuity of consciousness.

Accordingly, it is consciousness, not the body or an immaterial soul, that carries our personhood. This implies that "a person" and "a human being" are not the same thing. Indeed, Locke treated the concept of a person as a forensic concept, that is, one bound up with law and accountability, with the praise and blame we assign for actions. To the question of personal identity, he claimed it consists in "consciousness of being the same person over time." This consciousness is constituted by self-awareness, memory, and self-regarding interests in the future. Locke introduced the word "consciousness" into English with this theory.

In the final book of the Essay, Locke gives his account of the nature of knowledge, which in turn guides behavior. This, he says, consists in "the perception of the connection and agreement, or disagreement and repugnancy, of any of our ideas. In this alone it consists."

When the connection or disagreement among ideas is immediately obvious, Locke calls it "intuitive knowledge." When reasoning is required to see the connection or opposition, he calls it "demonstrative knowledge." His third category, sensitive knowledge, is almost exactly what we commonly think of as empirical knowledge, the knowledge that arises in us because of the causal interactions between the world and our sense organs. Generally, perception is a mostly reliable source of information about our world and beyond (The Fisher Paradigm©™, 2018, looks at this concept in pragmatic empirical terms of everyday life).

Personal identity and the common good have been misappropriated from the individual by identity politics and political correctness, where the politics of resentment clouds free choice, with persons wanting to abort their DNA and the landmarks of their culture for the surreal.

Francis Fukuyama reminds us in "Identity" (2018) that identity and identity politics are of fairly recent provenance, popularized by the psychologist Erik Erikson in the 1950s, a man who left Europe without academic credentials, assumed a new name and a new identity, and found a home in our most prestigious academic institutions while becoming a spokesman for the multitude who felt their identity had been stolen.

No longer are maturity and aging gracefully badges of honor. Now, age is denied outright by those who submit to elective plastic surgery, often leaving the so obsessed looking more like zombies. Meanwhile, work has been given a bad name as the State promises to protect its citizens from work, as well as from pain, failure, and the necessity of making choices and experiencing the consequences of their actions.

If we accept the State's promise, all we have to give up is our freedom, individualism, and independence. Political correctness has been on steroids for decades, but it now appears more resolute to change our lexicon by removing such terms as Merry Christmas, Happy New Year, Happy Easter, and the designations of male and female, man and woman from the language. At the same time, it would eliminate the celebration of Washington's and Lincoln's birthdays, Columbus Day, and many other such cultural-historical days from the calendar. Given this retreat from history, it is only a matter of time before the busts of Lincoln, Washington, Jefferson, and Theodore Roosevelt are dynamited into oblivion on Mount Rushmore.

Spanish philosopher George Santayana once said, “Those who cannot remember the past are condemned to repeat it.” History is part of our personal identity.

Psychic Entropy goes from high disorder to the point of fanatical order. It can be reversed by negative entropy, which is a matter of creative renewal. We are on the cusp of that reversal as we live in an age where no one is in charge. Our cell phones and our apps, with scores if not thousands of friends, have become a societal pacifier, leaving little time or inclination to read books, certainly not such difficult books and writing as those of Hermann Hesse. I write:

The West is in decline as Oswald Spengler suggested more than one hundred years ago, but what he may not have envisioned is that self-indulgent man, a product of this psychic entropy run amok, may one day find himself near his journey's end (see Can the Planet Earth Survive Self-Indulgent Man, 2015). The earth is becoming a very different place than it has been since the arrival of man. While humans are unlikely to become extinct, the world in which they have evolved is rapidly vanishing.

Psychic Entropy is real. We have seen it in modern times, going from high disorder to fixated low disorder in an attempt to establish a sense of being in control when everything is out of control. What once constituted leadership is now simply a struggle for power and dominance at any cost. Decision-making has become chimerical, its conclusions often inconsistent with situational demands. We see this in corpocracy, be the institution government, academia, the military, business and industry, or public health, as each has lost its way in becoming essentially political.

While position power has now been replaced by knowledge power, the possessors of this power are the majority of the citizens and not those at the top of what is now an anachronistic organizational structure. The dominant virtue of enterprise is artificial in the sense of being dependent on prevailing social convention. Artificial virtue, or the appearance of advocating virtue when vice, iniquity, faintheartedness, fearfulness, and timidity prevail, points, according to Scottish philosopher David Hume (1711 – 1776), to consistent conformity to socially espoused norms that often no longer apply, leaving society to slip from the Past Imperfect and Present Ridiculous to envisioning a miraculous Future Perfect, a climate that couldn't be more suitable to Nowhere Man in Nowhere Land.

While this appears to be the general climate of the United States at the moment, quietly and unobtrusively one industry has managed to thrive no matter the prevailing cultural climate and that is the farm industry, which is the subject of the AFTERWORD in terms of how it has always dealt with Psychic Entropy.

THE COMFORT OF RHETORIC

Dante’s decision seven hundred years ago to write his great poem not in Latin but in what he called the vulgar eloquence – Italian, the language of the people – and the innovation in the following century of printing from movable type represent landmarks in the secularization of society, as well as an affront to the hegemony of priests and oligarchy tyrants. The impact of today’s emerging technologies promises to be no less revolutionary, perhaps even more so. The technology of printing enhanced the value of literacy, encouraged widespread learning, and became the sine qua non of modern civilization. New technologies will have an even greater effect, narrowing the notorious gap between the educated rich and the unlettered poor and distributing the benefits as well as the hazards of our civilization to everyone on earth.

American cultural critic John Leonard (1939 – 2008), “Cri de Coeur,” The New York Review (February 8, 2001)

“There are ample opportunities to help create a more humane and decent world if we choose to act upon them. A democratic communications policy would seek to develop means of expression and interaction that reflect the interests of and concerns of the general population, and to encourage their self-education and their individual and collective action.”

American journalist and political commentator Bill Moyers (born 1934), A World of Ideas (from a 1989 interview with Noam Chomsky).

WHY NO WAY OUT?

We have overwhelming data to solve our problems, but instead of using this providential abundance to solve them, we mask our difficulties and compound our problems with disinformation.

Novelist D. H. Lawrence (1885 – 1930) was prescient long before the computer age:

“Is it true that mankind demands, and will always demand a miracle, mystery, and authority?” He answers: “Surely it is true. Today man gets his sense of the miraculous from science and machinery, radio, airplanes, vast ships, zeppelins, poison gas, artificial silk: these things nourish man’s sense of the miraculous as magic did the past.”

John Gray in Straw Dogs (2002) concurs:

“Today for the mass of humanity, science and technology embody ‘miracle, mystery, and authority.’ Science promises that the most ancient human fantasies will, at last, be realized. Sickness and aging will be abolished, scarcity and poverty will be no more; the species will become immortal. Like Christianity in the past, the modern cult of science lives on the hope of miracles.

“But to think that science can transform the human lot is to believe in magic. Time retorts to the illusions of humanism with the reality: frail, deranged, undelivered humanity. Even as it enables poverty to be diminished and sickness to be alleviated, science will be used to refine tyranny and perfect the art of war.”


German economist, sociologist, philosopher, and political scientist Karl Marx (1818 - 1883) scored utopianism as unscientific, yet his "scientific socialism" resembles alchemy rather than chemistry, magic rather than science, and is itself utopian, for he believed that technology could transmute the base metal of human nature into metaphorical gold. Moreover, he saw no danger in the expansion of human numbers, for he saw no limit to productivity with the abolition of scarcity, private property, the family, the state, and the division of labor. Paradoxically, while he saw religion as the "opium of the masses," his ideology was consistent with that of Christianity and its deism, which he rejected.

It would appear there is no way out because man does not seem willing to accept that he is an animal, no different from a cow or a horse or even a tree when it comes to nature. Naturalist E. O. Wilson (born 1929), envisioning the next century, saw a closing of the "Age of Mammals" and a new age characterized by biological impoverishment, an "Age of Loneliness." How so?

According to John Gray, humans co-opt over 40 percent of the Earth's living tissue. We see this in dried-up lakes, treeless mountains, polluted oceans, toxic atmospheres, and melting ice caps. We are seven billion souls on earth today; in the not too distant future, we will be twelve billion. When there is little available potable water or food, energy sources will cease to be the critical mass driving societies to war. It will be potable water.

The synthetic world that humans have created will mirror their collective solitude, as there are no exits from the creative fiction of Nowhere Land. New mystics will surely appear suggesting empty spaces in which those who survive can open themselves to something outside their solipsism, possibly a colony in outer space. John Gray suggests these mystics should be ignored, for peace will not be found in a monastery of scientists but by citizens of this small planet communing with nature in the company of other animals, seeing these creatures not as inferior but as beings who have never been inclined to separate themselves from nature or its rhythm of renewal.

PSYCHIC ENTROPY

Entropy: a thermodynamic property that measures the portion of a system's thermal energy per unit of temperature that is unavailable for doing useful work. The Second Law of Thermodynamics states that the entropy of an isolated system never decreases: in all energy exchanges, if no energy enters or leaves the system, the potential energy of the final state will always be less than that of the initial state. This is another way of stating the same law.
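For readers who want the relation in symbols, here is a minimal sketch in standard thermodynamic notation (an illustration added for clarity, not part of the original commentary): the Clausius definition of entropy change and the Second Law for an isolated system.

```latex
% Clausius definition of entropy change and the Second Law (standard notation)
% dS: change in entropy, \delta Q_{rev}: heat absorbed reversibly, T: absolute temperature
\[
  dS = \frac{\delta Q_{\mathrm{rev}}}{T},
  \qquad
  \Delta S_{\text{isolated system}} \ge 0
\]
```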

Swiss psychiatrist Carl Jung (1875 – 1961) created the concept of "psychic energy," and with it psychic entropy, claiming that since our experience is confined to relatively closed systems, we are never in a position to observe absolute psychological entropy; but the more the psychological system is closed, the more the phenomenon of entropy is manifest. A system is closed when no energy from outside can be fed into it; only in such a system can entropy occur. An example would be the mental disturbances characteristic of intense seclusion from the environment, or the dulling of psychological affect in schizophrenia.

Psychologist Mihaly Csikszentmihalyi (born 1934) incorporated Jung’s psychic entropy theory into his “flow theory” of optimal experience in positive psychology where psychic entropy is interpreted as a disorder in mental consciousness that produces a conflict with an individual’s goals. He writes:

“Emotions refer to the internal states of consciousness. Negative emotions like sadness, fear, anxiety, or boredom produce psychic entropy in the mind, that is, a state in which we cannot use attention effectively to deal with external tasks because we need it to restore an inner subjective order.”

We see elements of this in American psychologist Leon Festinger's (1919 – 1989) A Theory of Cognitive Dissonance (1957), where he argues that we make new information fit with the information already residing in our closed minds. This incompatibility (dissonance) can arise when you do something that goes against a value that is important to you, or when you learn a new piece of information that disagrees with a long-standing belief or opinion. As humans, we generally prefer our world to make sense, so cognitive dissonance can be distressing. That is why we often respond to it by doing mental gymnastics to feel that things make sense again when they don't.

We see it also in American philosopher Allan Bloom's (1930 – 1992) The Closing of the American Mind (1987), where he shows how American higher education has produced psychic entropy by reinforcing cultural biases, myths, and barriers to reality instead of opening young minds to a world beyond their cultural conditioning.

Then there is American academic psychiatrist Nassir Ghaemi's (born 1966) A First-Rate Madness (2011), in which he contends that great leaders such as Lincoln, General Sherman, Churchill, Gandhi, FDR, and JFK, among others, demonstrated palpable evidence of mental illness, the singularity of their minds blocking out reality to pursue a closed-minded course that epitomizes psychic entropy.



Hungarian psychologist Mihaly Csikszentmihalyi


We saw psychic entropy on display when the George W. Bush administration relied on Central Intelligence Agency director George Tenet (born 1953) and accepted his "slam dunk" metaphor that Iraq had weapons of mass destruction (WMDs). It was wishful thinking, culminating in General Colin Powell (born 1937) taking the fall when asked to give a speech of assurance to the United Nations Security Council on February 5, 2003, that Iraq did indeed have WMDs. A tragic comedy of errors followed.

HEGEMONY/IMPERIALISM/PSYCHIC ENTROPY

The missionary zeal of neoconservatives in the Bush administration had been building to a frenzy since 9/11. Spanish philosopher George Santayana (1863 – 1952) writes in The Birth of Reason (1968):

The humanitarian, like the missionary, is often an irreducible enemy of the people he seeks to befriend, because he has not imagination enough to sympathize with their proper needs nor humility enough to respect them as if they were his own. Arrogance, fanaticism, meddlesomeness, and imperialism may then masquerade as philanthropy.

In this "Information Age," John Leonard and Noam Chomsky would suggest a way of getting beyond the madness of total reliance on technology, which, in the case of the CIA, still proves to be just this side of guesswork and can be catastrophic in consequence. American conservative cultural and political commentator David Brooks (born 1961) writes:

"For decades, the US intelligence community has propagated the myth that it possesses analytical methods that must be insulated pristinely from the hurly-burly world of politics. Rather than rely on a conference-load of game theorists or risk assessment officers, when it comes to understanding the world of things and menaces . . . I'd trust anyone who has read a Dostoyevsky novel over the past five years."

Inside this critique is the fact that those in government, especially in the 21st century, have been irredeemably bad when it comes to prescience. This went from bad to worse with the preemptive invasion of Iraq on faulty intelligence and then escalated into the insanity that has become the "War on Terror."

English philosopher and historian of ideas John Gray sees the Iraq War as serving an economic system that he deems "casino capitalism," in which long-term investment has been replaced by ubiquitous gambling. The evidence is that the war, which has already cost many billions of dollars, will ineludibly be written off as a bad debt. He writes:

“If there is a symbol that captures America in Iraq, it is not the colonial institutions of former times, it is Enron, which vanished leaving nothing behind.”


Workers at “ground zero” after the 9/11 destruction of the Twin Towers in New York City


[Enron was a small company formed by Kenneth Lay in 1985 merging Houston Natural Gas and InterNorth, two relatively modest regional companies. Kenneth Lay then hired Jeffrey Skilling as a consultant in 1990. Soon after, he was made chairman and chief executive officer of Enron Finance Corp. In 1991, Skilling became chairman of Enron Gas Services Company, the result of the merger of Enron Gas Marketing and Enron Finance Corp. Skilling developed a staff of executives who were able to use accounting loopholes, special purpose entities, and poor financial reporting to hide billions of dollars in debt from failed deals and projects.

The esteemed accounting firm, Arthur Andersen, became complicit in this scandal, which resulted in the bankruptcy of Enron Corporation and the dissolution of Arthur Andersen. Enron shareholders filed a $40 billion lawsuit after Enron stock fell from US$90.75 in June 2000 to $1 by the end of November 2001.


The monumental collapse of ENRON


Several Enron employees (24 in all) were convicted of fraudulent practices, including Kenneth Lay and Jeffrey Skilling. Skilling went to prison, while Lay, who could have been sentenced to a prison term of 45 years, died in 2006 before sentencing. Enron's utopian underbelly of deceit led to a giant corporation vanishing without a trace, leaving only the nightmare of swindled stockholders facing the future a lot poorer.]

Enron was a financial fraud in which rational deception was apparent. What of the fiasco that has become the Iraq War, and by extension the aftermath of that preemptive invasion on the faulty CIA claim that Iraq had weapons of mass destruction? Let us peruse this utopian scenario of psychic entropy.

IRAQ: WHERE PRUDENCE FAILED

Once Islamist terrorists destroyed the Twin Towers in New York City and severely damaged the Pentagon on September 11, 2001, claiming three thousand lives, the temperament of the nation's leadership demonstrated a lack of maturity and perspective. Moreover, technology, which had risen to the order of the high church, failed the CIA, the Bush Administration, and the US Congress. The majority in Congress, with a closed mind, was on board to teach the terrorists a lesson by invading Iraq. It mattered little that the terrorists came out of Saudi Arabia, an ally, and not from Iraq. A tragedy of errors followed that continued through President Barack Obama's administration. It is as if the American blind spot is the inclination to treat its myths as reality and to act on them on a whim, in a state of being unhinged.

From America's first Puritan settlement in 1628 to the present, the United States has seen itself with missionary zeal as the "redeemer nation." American novelist Herman Melville (1819 – 1891) echoed this sentiment in White-Jacket (1850):

“We Americans are the peculiar, chosen people – the Israel of our time, we bear the ark of the liberties of the world.”

English Puritan lawyer John Winthrop (1587 - 1649), founder of the Puritan colony on Massachusetts Bay, stated, "We must consider that we shall be as a city on a hill." United States President Woodrow Wilson (1856 – 1924) later echoed these sentiments on November 20, 1914, in his speech to the American people on neutrality: "America is the house of the Lord and the city on a hill." He saw the eventual entrance of the United States into WWI as "salvation" for the Allies.

This mindset of Christian rectitude drove "born again Christian" President George W. Bush to invade Iraq. This passion to right a wrong has fallen apart like a house of cards. It was a knee-jerk reaction to a terrible wrong (9/11) based on shoddy information. Just as the computer is dependent upon the information submitted to it, faulty thinking is bound to produce suspect solutions.

America’s ruinous invasion of Iraq is seen by philosopher John Gray as more than a fusion of the utopianism of Bush, but as an “exotic and highly toxic blend of beliefs, none of them grounded in any observable reality . . . a type of ‘liberal imperialism’ based on human rights.”

Where this gets murky is when a government applies its liberal ideal of self-determination and democracy as justification for questioning the legitimacy of a government for human rights violations. Incredibly, once this assessment is made, it goes forward with a good conscience to override that government’s claim to sovereign authority.

Even with all the negative consequences of the Iraq War, the Bush administration with the support of most Americans defended the rightness of its military intervention to overthrow the tyranny of Saddam Hussein. Canadian political author Michael Ignatieff (born 1947) wrote in The New York Times three months before this preemptive invasion of Iraq:

"America's empire is not like empires of the past, built on colonies, conquest, and the white man's burden . . . The 21st-century imperium is a new invention in the annals of political science, an empire lite, a global hegemony whose grace notes are free markets, human rights, and democracy, enforced by the most awesome power the world has ever known . . . Regime change is an imperial task par excellence since it assumes that the empire's interest has a right to trump the sovereignty of a state. The Bush administration would ask, what moral authority rests with a sovereign who murders and ethnically cleanses his own people, has twice invaded neighboring countries, and usurps his people's wealth to build palaces and lethal weapons?"

Given this scenario, who can question the righteousness of the good and the tyranny of the bad? The "War on Terror" was launched with this righteous sentiment, and a score of years later few would disagree that things are worse today than they were at the beginning. There is a word in medicine that describes this propensity: iatrogenic, when the cure is worse than the disease.

It is hard for the West to accept that democracy is not for everyone or that the world is not one contiguous common civilization. Iraq had something approaching stability between its three major groups: Kurds, Shia, and Sunni. This tenuous bond was decimated by the Iraq War. The West then compounded the situation by prematurely withdrawing mainly American and British forces from the country, inadvertently creating a vacuum and leading to the metastasis of ISIS and al-Qaeda as a gestation center for worldwide jihadism.

Australian political columnist Tom Switzer (born 1971) writes:

“The unintended consequences of the war were not just the dear costs in blood, treasure, and prestige for the United States. Nor were they the hundreds of thousands of deaths of Iraqi civilians. They were what amounted to a strategic victory for the Islamic Republic of Iran.


[An aside: the human brain is finite in processing information, so human coping is limited. There are two limiting conditions in our cranial cellular congress for coping with reality: the brain's capacity to process information is finite, and the machinery with which it does so is not a conscious unity. When the space requirements of a problem fit the network there, things go well; when they do not, things go awry.]

“Tehran’s presence in the Shia South, moreover, was felt well before the withdrawal of U.S. forces. Recall that General Qasem Soleimani, the commander of the Revolutionary Guards’ Quds Brigade, spearheaded Iran’s political and military involvement in Iraq a decade ago. Recall, too, al-Maliki’s debt to Iran in helping secure his presidency in 2006.

“By toppling a Sunni regime and bringing democracy to Iraq, the U.S.-led coalition brought to an end the sectarian imbalance that had been in place for generations. The Baathists, like the Hashemites, British, and Ottomans ­before them had kept in place minority rule, giving Sunni Arabs a disproportionate share of power and resources while brutally suppressing Shia and Kurds.

“The invasion in 2003 meant that the majority Shia became the new winners in post-Saddam Iraq, and the minority Sunnis the new losers. The former turned to their Shia brethren in Tehran for support; the latter turned to a Sunni insurgency that has morphed into a plethora of Sunni jihadists, including the Islamic State.”

President George W. Bush sent L. Paul Bremer to manage the transition of Iraq after the initial victory in 2003 that resulted in the overthrow of Saddam Hussein. Bremer was a career diplomat and neoconservative technocrat who knew nothing about the discipline of Organizational Development (OD) in social/industrial psychology, with its assessment and implementation tools that could have proved useful. OD would not have acted until:

· A thorough understanding of the apocalyptic Islamic religious currents of Iraq and the region had been developed, including the historical relationship of the Shia, Sunni, and Kurds to one another, along with the sources of the cultural/religious/political conflicts among them in terms of secular and theocratic Islam;

· Unobtrusive data and measurements of the situation had been collected, involving extensive personal interviews of the principals, including members of the Saddam Hussein government, the military, and the industrial/commercial establishment. This would also include Process Flow Analysis of the demographics, history, culture, languages, and ethnicities, and a long-term/short-term assessment of the Sunni, Shia, and Kurd populations at every level of society;

· Surveys, interviews, and direct observation of key elements of the previous political structure had been conducted, including the military and the administration, as well as Iraqi allies and adversaries of the government;

· Personnel among the nationals of the three dominant groups, as well as the military and political operatives, had been properly vetted;

· A reasonable transitional period had been established before making any firm commitments to final designations;

· A strategic and tactical format and process had been established to bring on board critical components of the former administration, including the military;

· Cooperation among the warring factions had been fostered, which means seeing through surface issues to the natural barriers between the factions;

· The major concerns needing addressing and redressing immediately had been determined, and a conflict resolution strategy developed in which all the factions have adequate representation.

OD aims to make the client self-sufficient. The prescription for establishing that objective cannot be adequately determined until the situation is sufficiently defined and the readiness of the factions involved appropriately assessed.

OD serves – in this case, the Iraqi people and nation – with no interest in promoting one faction at the expense of another. Politics are involved, but the psychological climate that is created will be the final arbiter of self-sufficiency.

This did not happen. Instead, the Bush administration launched its regime change of Iraq as if on a holy crusade. The banner of the "War on Terror" provocatively promoted human rights and women's rights across the Muslim world when the United States had entered a hostile environment, first as invader and now as occupier, with the solipsistic hubris that America was a "savior nation" and knew best. Again John Gray writes in Black Mass: Apocalyptic Religion and the Death of Utopia (2007):

Paul Berman gave vent to this sublime vision in 2003. It contained no inkling that the result of the overthrow of secular despotism in Iraq would be a mix of anarchy and theocracy. The impossibility of liberalism in Afghanistan – which has only ever had something resembling a modern state when Soviet forces imposed, with enormous cruelty, a version of Enlightenment despotism on parts of the country – was too disturbing to contemplate. All the liberal causes that were wrapped up in the ‘war on terror’ were inherently desirable, and so – it seemed to follow – practically realizable. In their attitudes to regime change, neo-conservatives have been at one with many liberals. Regime change was an instrument of progress …

Iranian-American academic Vali Reza Nasr observes in The Shia Revival (2006):

The Shia still fear Sunni rule, and Sunnis rue their loss of power and dream of climbing back to the top . . . Each has a different vision of the past and a different dream for the future. There are still scores to settle, decades of them.

Gray goes on to point out that overthrowing tyranny may bring democracy temporarily without advancing liberty. No constitution can impose freedom where it is not wanted or where it is no longer valued. Nor can the secular tyranny that was destroyed be reinvented.

Several blunders followed Bremer's management of the interim government in transition. The army was disbanded with all its weapons, and soon it became a force with which to reckon. He disbanded some religious groups and alienated the Sunni minority, which had maintained a stable if draconian balance among these three Islamic groups -- Sunnis, Shias, and Kurds -- for decades. In an instant, this balance was summarily destroyed.

American analysts have suggested that perhaps a three-way partition of Iraq is the solution. The idea of a split of Iraq into three states has seemingly only added fuel to the fire as Shia, Sunni, and Kurds are already vicious rivals for power and natural resources, mainly oil.

In any case, the Sunni minority would be likely to lose everything. To wit, the United States is powerless in the face of the anarchy and the religious and tribal tension among the three groups that the invasion has created. But perhaps the greatest faux pas was to create a vacuum by dismantling the state and the military of Iraq, a vacuum that Iraq's bitter enemy, Iran, could now exploit for its own purposes. President Barack Obama added icing to this Iranian cake when he withdrew all American troops from Iraq without convincing Iraqi Prime Minister Nouri al-Maliki (born 1950), a Shia Muslim, that a continued American presence would thwart Iran's attempt to export its terrorism to Iraq and beyond.

GLOBAL TERRORISM AND THE DECLINE OF THE WEST?

There have always been terrorists, but terrorism seems to have spiked and gone global since the Iraq War. The United States reacted to 9/11 precipitously and emphatically with the preemptive invasion of that country. Does this mean that sovereign states are no longer relevant?

In this new system of justification, the clear suggestion is that market zones have more relevance than the value of nation-states. Does this mean that states that do not behave consistently with the order prescribed by the West can be attacked? This may sound facetious, but the American system, its democracy, and its capitalistic economy have been spreading throughout the world since WWII, imposing their will sometimes subtly and sometimes not.

There is a pervasive component of utopia in nostalgia for a world without religious and ethnic divisions, a world in which new technologies erase differences. Roman Emperor Constantine converted to Christianity in 312, and the Roman Empire adopted the Christian faith in 380. A century later Rome was sacked, the "Dark Ages" followed, and Charlemagne was crowned Holy Roman Emperor in 800.

In operational terms, as the sovereignty of many states has diminished, including Great Britain, France, Germany, and many others, the burden on the United States has grown. A similar scenario accelerated the decline of Rome; now, with American power integral to the globalization of society and other states heavily dependent on the US for security, subsistence, and development, could this be the formula for the decline of the United States?

China, India, and Russia are behaving as the US has, using global markets to enhance their authority and power. The world outside the United States wants what it has, and it wants it now.

The "War on Terror" goes nowhere; that is as inevitable as the sun rising tomorrow. Yet disruptive and tragic as suicide bombers and other bombings are, the most catastrophic risk of the future is nuclear terrorism. Using "suitcase bombs" or "dirty bombs" (conventional bombs laced with radioactive waste), terrorists could kill hundreds of thousands of people and paralyze social and economic life in an instant.

There is no suitable strategy, no high-tech surveillance methodology, and no concerted effort that can annihilate the terrorists or lessen the threat, other than to understand, from their perspective, those who would go to such extremes; it appears they believe they have nothing to lose.

It would be pure insanity "to take Iran out," for Iran, unlike other countries in the region, is a fairly cohesive state. It is the home of an ancient and rich Persian culture and civilization. Should the next President of the United States attempt to destroy Iran, it could trigger an upheaval in the Islamic states of the Middle East and beyond, including Pakistan and Afghanistan, that would make the current situation seem like nothing at all.

It has become the discourse of the West to link terrorism with Arab culture and an Islamic cult of martyrdom. Gray writes:

Islam is a religion, not a culture, and most of the people who live in the “Islamic world” are not Arabs . . . Suicide terrorism is not a pathology that afflicts any particular culture nor has it any close connections with religion . . . Much terrorism is like other types of warfare. Nearly always, wars are fought within or across cultural boundaries.

We make no progress in the dialogue by calling Iran an “evil empire” or looking for simplistic formulae to deal with adversaries who have agendas that differ from our own. This, too, is OD.

Gray adds:

The “war on terror” is a symptom of a mentality that anticipates an unprecedented change in human affairs – the end of history, the passing of the sovereign state, the universal acceptance of democracy, and the defeat of evil. This is the central myth of apocalyptic religion framed in political terms, and the common factor underlying the failed utopian projects of the past decades . . . Apocalypse failed to arrive, and history went on as before but with an added dash of blood.

In this commentary of 24 related essays, we have been going from the "Past Imperfect" through the "Present Ridiculous," and despite the fanfare of a secular society, religions persist, often in all their violence but even more frequently in religious misunderstanding.

Yet it is science and not religion that has come to be viewed as the vehicle of revelation. Science is prominent now and will be increasingly so in the "Future Perfect" world. Science is the instrument for forming reliable beliefs, beliefs that are meant to mirror the world, not as it is but as it ought to be.

Religion is a human instrument. While science solves mysteries of nature, religion embraces the mystical to nourish the life of the spirit. Science deals with certainties while religion entertains doubts.

Doubts are part of the garment of everyday life. Science and religion serve different needs, which may pull in different directions but are equally necessary. The irreducible reality of religion is apparent in the apocalyptic shadow of Christianity. Neither science nor religion can escape this shadow, as is evident in the common belief in utopian progress and the "Cut & Control" commitment to soaring beyond the norm whatever the costs.

Secular mythmakers differ little from those of religion; all ultimately resort to violence, even terrorism. Over the past two centuries, the storyline has been human progress amid the clash of dark forces destined for destruction, from German Nazism to communist Marxism to ISIS and al-Qaeda terrorism.

Followers of Marxism believed humanity could only advance through a series of catastrophic revolutions; followers of Nazism conspired to eliminate the Jewish race and establish a semi-divine immortal state of Aryans; followers of ISIS and al-Qaeda believe the entire Western culture must be destroyed and a new Islamic state and culture must become the dominant force in the world.

The aims of each of these secular or semi-religious movements reflect the powerful influence of "The Age of Enlightenment," an intellectual movement of 18th century Europe characterized by a rational and scientific approach to religious, social, political, and economic issues. The belief was that once poverty was eradicated and education was universal, social inequality and political repression would vanish, and formal religion would be no more important than a personal hobby. The shadow of Christianity is equally apparent here as well.

Alas, in the future as in the past, there will be authoritarian states and liberal republics, theocratic democracies and secular tyrannies, imperialistic empires and city-states in the mix. But there will also be something quite different, which is the subject of The Afterword. People close to the earth have always managed to subsist in whatever insane political climate of their times because they know that in being close to nature and the land they are close to God, however He is defined.

THE AFTERWORD




ROBOTIC FARMER


WHY THE FARM INDUSTRY IS SPECIAL IN TERMS OF PSYCHIC ENTROPY

The farm industry has never had a series of international trade unions to support and defend its interests. Over the past centuries, farmers across the continent have had to deal with fickle nature: from droughts to torrential rains, from soil erosion to insect infestations, from fluctuating climates too hot and too cold, from forest fires to crop failures, from total dependence on animal power to managing the expense and complex intricacies of machine farming, from mysteriously small crop yields to an inadequate understanding of the genetic vulnerabilities of plants.

Farmers have always been associated with a quiet industry of individualism, independence, and freedom, along with the stoic demands of farm labor, as they chose farming as vocation and avocation with little or no inclination to bring attention to themselves or to this disposition.

Farmers are one with the land and have remained so over time, adjusting to the dependence of the general public on their ability to meet its ever-increasing demands.

Farmers found the advantage of rotating crops and allowing land to lie fallow to regain its vigor; they discovered the benefit of irrigation and of showering crops with water overnight when the temperature drops to freezing. They invented and/or invested in new machinery to make farming more efficient and sent their sons and daughters off to college to study agriculture and return with new tools to enhance farming technology. They used pesticides and herbicides economically to ward off possible crop devastation. While suspicious of agribusiness and the corporate industry, they realized that this was reality and something they needed to embrace rather than fight in order to survive and still be able to husband their crops and operate their farms.

Farmers have demonstrated that they follow the empirical, pragmatic wisdom of psychologist William James (1842 – 1910), who posited that the human experience of emotion arises from physiological changes in response to external events. While not drawing attention to this inherent characteristic, farmers alone have mastered Psychic Entropy over time to their advantage.

THE GENESIS OF AMERICAN FARMING

In 1900, just under 40 percent of the total US population of 76 million lived on farms, and 60 percent of Americans lived in rural areas. Today, only about 1.3 percent of Americans are farmers, supporting a population of 330 million, while some 20 percent of Americans still live in what are deemed rural areas.

The United States had between six and seven million farms from 1910 to 1940. Today, there are fewer than 2 million.
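As a rough illustration of what these figures imply in headcounts (approximate arithmetic based on the numbers cited above, not exact census data):

```latex
% Approximate headcounts implied by the cited percentages
% 1900: ~40% of a 76 million population lived on farms
% Today: ~1.3% of a 330 million population are farmers
\[
  0.40 \times 76\ \text{million} \approx 30\ \text{million on farms (1900)},
  \qquad
  0.013 \times 330\ \text{million} \approx 4.3\ \text{million farmers (today)}
\]
```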

The predominant economic sectors in which most Americans now work are industry, at about 20 percent, and social and governmental services, at about 79 percent.

Farmers are rich in the United States, but this too is deceiving. Corporate farming now dominates in what is known as "agribusiness." The largest private owner of farmland, with the biggest holdings, is none other than Bill Gates of Microsoft, with 269,000 acres of farmland growing potatoes and carrots. He has 70,000 acres of farmland in northern Louisiana where he grows soybeans, corn, and cotton.

Another possible surprise about the contemporary American workforce is that, while women represent only about half the population, they hold more jobs than men, occupying 50.04 percent of working positions. Stated another way, there are now about 109,000 more women working than men.

Today, the education and health services sector employs the largest number of people in the United States, some 34.1 million.

Agriculture, with its scientific management and well-educated farmers, has become more proficient and, quietly, the most taken-for-granted industry in American culture. When I was a chemist at Standard Brands, Inc. in Clinton, Iowa, a farm community, I found myself one Saturday in DeWitt, a small farming community in Clinton County, having breakfast in a country-kitchen-type restaurant. In conversation with four other men eating breakfast, I learned they were all farmers and had received agricultural degrees from Iowa State University, one of the esteemed agricultural science centers of the country. When I shared this with my colleagues back at work, they were surprised that I was surprised. "What did you think," one said, "that it was a miracle that we eat so well?"

Agriculture serves as the backbone of the economy of any country, without which it would collapse and face a serious financial breakdown. The export of cars and other factory equipment by United States producers is a distant second to agricultural products sold to the world. The United States is rich in highly cultivated land that sufficiently supports a strong economy.

Agriculture is not simply the cultivation of crops but also the farming of plants and animals, cattle and fish. Americans see their country as "the breadbasket of the world," even though China has more arable land. Agriculture is also a brewing industry and a source of income for the millions of Americans so employed.

Agriculture was chosen to profile the other side of Psychic Entropy, where negative entropy has quietly been employed to improve the efficiency, productivity, and yield of the land through technology, scientific management, rotation of crops, and the harnessing of new methodologies, including robotics, automation, and hybridization, to attain the best quality of crop yield and profit. At the University of Iowa, I had classmates, children of farmers, who were becoming medical doctors, dentists, and other professionals at their parents' insistence when all they wanted to do was be farmers. They confessed to me that being close to the land was like being close to God, and nothing made them happier.

And as an aside from this nervous dance to oblivion that seems to consume so many Americans gripped by Psychic Entropy, African American George Washington Carver (1865 – 1943) came to Iowa State in 1891 and earned a bachelor's degree in 1894. Because of his excellence in botany and horticulture, he was appointed to the Iowa State faculty, becoming the university's first African American faculty member. He went on to an illustrious career in agricultural science, far ahead of his time.


Nowhere Man is forever lost between the a posteriori, with its observations and experiments, and the a priori, with its methods of reasoning.


Monday, August 23, 2021

NOWHERE MAN IN NOWHERE LAND - TWENTY-ONE

 

 Nowhere Man in Nowhere Land - TWENTY ONE

 

Christian Missionaries on the New Frontier!

James Raymond Fisher, Jr., Ph.D.
Originally published © September 8, 2016/© August 19, 2021


GOING ETHNOCENTRICALLY INTO THE FUTURE 

Beginning with Columbus in 1492 and continuing for nearly 350 years, Spain conquered and settled most of South America, the Caribbean, and the American Southwest. After an initial wave of conquistadors, aided by military advantages and infectious diseases that decimated the native populations, defeated the pre-Columbian civilizations of the Aztecs, Mayans, and Incas, Spain organized a huge imperial system to exploit the land, labor, and mineral wealth of the New World. The Spanish Empire became the largest European empire since ancient Rome, and Spain used the wealth of the Americas to finance nearly endless warfare in Europe, protecting the Americas with a vast navy and powerful army and bringing Roman Catholicism through the missionaries to the New World. While the conquistadors destroyed the institutions and artifacts of societies in the Americas more than a thousand years old, the Soldiers of Christ, or Christian missionaries, took possession of their souls.

WHEN CULTURAL HUBRIS RUNS AWRY

The Spanish Conquistador Francisco Pizarro’s soldiers battling the Incas

The West, in this case the Spanish Armada, with its cultural sophistication and military prowess, and armed with the stanchion of Christianity, the only true faith, felt itself ordained by God's will to vanquish the infidels of the New World and rid them of their profane customs, primitive culture, and pagan gods. Indeed, it was Spain's manifest destiny to colonize the New World for crown, country, and God. Enter the Christian missionaries.

The arrival of Europeans in the New World in 1492 changed the Americas forever. Over the course of the next 350 years, Spain ruled a vast empire based on the labor and exploitation of the Indian population. Conquistadors descended on America with hopes of bringing Catholicism to new lands while extracting great riches.

Religion and self-interest combined to create a potent mixture that drew hundreds of thousands of Spaniards across the ocean with hopes of finding riches and winning souls for God. Along with the Spaniards came diseases to which the New World natives had no immunities.

What followed was one of the greatest tragedies in human history as smallpox, influenza and other communicable diseases ravaged the native populations, killing millions.

The Spanish never set out to destroy the people of the New World as their goal was to use Indian labor for their own ends. Almost immediately, an ethical debate arose in Spain on the righteousness of this action. This was the first time any European nation had consciously considered the status of non-Christians.

The traffic of Europeans to the Americas was not a one-way street. The so-called Columbian Exchange brought European goods and ideas to the New World, including the horse, which was not native to the Western Hemisphere, and carried new plants and animals back to the Old World, including potatoes, corn, tomatoes, and other crops. The world was forever changed by the new horizons opened by Spain's intrepid explorers, despite the misdeeds of Spanish rule in America.

Spanish Conquistadors were primarily poor nobles from the impoverished west and south of Spain and were able to conquer the huge empires of the New World with the help of superior military technology, disease (which weakened Indian resistance), and military tactics including surprise attacks and powerful alliances with local tribes.

Once an area had been conquered, it was partitioned into land grant parcels. More importantly, the Indian people themselves were appropriated along with the grants to the conquistadors. These nobles were given title to the land and its people in return for a promise to teach the natives Christianity. This system was heavily abused, and Indians throughout the Americas were reduced to a condition of virtual slavery.

However, due to natural attrition and harsh misrule, the population of native laborers soon became too small for the voracious Spanish, so they began to import African slaves to work the sugar plantations and silver mines. The introduction of African traditions to those of the Native Americans made for a social mixture richer than in almost any other part of the world, although racism continued to play a dark role in the New World.

Colonial society was hierarchical, based upon the amount of non-Spanish blood a person possessed. A complicated system, called the casta, delineated over 100 separate names for groups containing certain levels of Indian and African blood. Jobs, government positions, titles to land, and almost everything else in the Americas functioned according to this system with those at the top getting preference over those lower on the list. Discrimination and repression were features of Spanish colonial rule throughout its history.

Spain's government in Madrid tried hard to govern the New World, despite its distance from Europe. Using a system of viceroyalties and royal courts of appeals, the Spanish Monarchy was able to exercise control over Spanish settlers, even when they didn't want any sort of interference from the central government. The Crown was entitled to one-fifth of all mining profits, and this huge income helped Spain to become the largest and most powerful empire in Europe by 1600.

Religion was mixed with politics to create a hybrid system in what would become the American Southwest: Dominican, Franciscan, and Jesuit missionaries were often left in charge of large areas in what is now Texas, Arizona, New Mexico, and, later, California. With its goal of bringing the Catholic religion to the New World, Spain was also able to use the existing church governments for its political uses. Today, religion and politics continue to mix in Latin America.

The often heavy-handed rule from Madrid, and the new ideas of liberty and freedom coming out of the American and French Revolutions, brought about the wars of independence in the early nineteenth century.

Simón Bolívar (1783 – 1830), the "Great Liberator," and José de San Martín (1778 – 1850) led the fight for independence, although this was not a fight for Indian rights or on behalf of the poor. Those who fought for South American independence were called criollos, American-born descendants of Spaniards. They continued to rule the many new nations of Spanish America for generations. The Spanish left a legacy of cruelty and exploitation in their wake, but they also managed to open the world and increase cultural exchange to a level never before seen in human history.

INCA EMPIRE – BEFORE & AFTER THE SPANISH CONQUEST

The Incas formed an empire in pre-Columbian America centered in what is now Peru; the period from 1438 to 1533 C.E. represented the height of the Inca Civilization, which was known as the "Kingdom of Cusco" before 1438.

Throughout the empire, the Incas used conquest and peaceful assimilation to incorporate large portions of western South America, centered on the Andean mountain range.

Shortly after the Inca Civil War (1529 – 1532), fought between rival claimants to the throne, the Spanish Conquistadors arrived; the Inca emperor was captured and killed on the orders of Conquistador Francisco Pizarro, marking the beginning of Spanish rule.

The remnants of the empire retreated to the remote jungles of Vilcabamba and established the small Neo-Inca State, which held out until 1572. At Písac in the Sacred Valley, the Inca built the Intihuatana, a complex that included a Temple of the Sun; the angles of its base suggest that it served to define the changes of the seasons. Qalla Q'asa, which is built onto a natural spur and overlooks the valley, is known as the citadel.

View from Qalla Q'asa to Andenes

The Inca constructed agricultural terraces on the steep hillside, which are still in use today. They created the terraces by hauling richer topsoil by hand from the lower lands. The terraces enabled the production of surplus food, more than would normally be possible at altitudes as high as 11,000 feet.

With military, religious, and agricultural structures, the site served at least a triple purpose. Researchers believe that Písac defended the southern entrance to the Sacred Valley, while Choquequirao defended the western entrance, and the fortress at Ollantaytambo the northern. Inca Pisac controlled a route that connected the Inca Empire with the border of the rain forest (Wikipedia).

In a sketchy way, this gives you a sense of one of the sophisticated civilizations the Spanish (and later Portuguese) Conquistadors encountered as they raced through the Americas. They conquered and plundered in quest of this prize, but it was Old World diseases such as smallpox, typhus, measles, and influenza that wiped out 90 percent of the indigenous populations in the New World, making disease the key factor in the European conquest of the Americas.

But the Incas were not alone in this devastation. They had begun as a city-state and formed a powerful and wealthy empire in Peru, Bolivia, and Ecuador around the 12th century, with Cuzco the capital and major city. They ruled the empire through a centralized government divided into four provincial sectors. Spanish Conquistadors, led by Francisco Pizarro, conquered most of the Inca Empire in 1533.

The Maya Civilization was located in Central America. It included southern Mexico, Belize, Guatemala, El Salvador, and Honduras and had existed without interruption from 2500 B.C.E. to 1500 C.E.

Independent of Egypt and the Middle East, the Maya Civilization developed a hieroglyphic writing system. The Maya studied astronomy and mathematics, calculated highly accurate calendars, predicted eclipses and other astronomical events, and built elaborate temples and pyramids that are still impressive in their ruins today. The Maya people also had a complex social order.

The Maya people were a religious society and held festivals throughout the year to honor their favorite gods. They sacrificed to the gods and made ritual offerings.

The great cities of this classical period were Tikal and Palenque, which were religious centers. The Maya loved sports and ball games and left behind elaborate ball courts. Most of the people were farmers and lived in small communities. Conquistadors did not destroy this civilization; it mysteriously collapsed around 900 C.E., leaving abandoned cities. The Maya people, however, survived, continuing to live in Mexico and Central America.


ENTER CHRISTIAN MISSIONARIES IN SOUTH AMERICA



There are humane and inhumane interpretations of the second wave of Spanish colonization in the Americas, the wave led by the missionaries. After conquest and brutal occupation, along with the diseases the Conquistadors brought from Europe to a native population without the immunity to fight them, the people who were still alive were receptive to any kind of kindness.

The missionaries brought such kindness with them, along with medicines and medical personnel to tend to the sick and dying, but with a definitive agenda: that these natives would convert to Roman Catholicism.

Pope Alexander VI (1431 – 1503), one of the most controversial popes of the Renaissance, having fathered several children with several mistresses, issued a Papal Bull (Inter Caetera) awarding colonial rights over most of the newly discovered lands to Spain and Portugal, with the proclamation to Christianize the indigenous populations of the Americas such as the Aztecs, Incas, and Mayas. The state controlled clerical appointments with no interference from the Vatican.

Spreading Christianity was a top priority, but it went far beyond that. The influence of the Franciscans, sometimes seen as a tool of imperialism, served other objectives as well, imparting the Spanish language, culture, and value system of the conquerors with the idea of exercising total political control over the indigenous peoples of the New World.

The goal was to remake the agricultural or nomadic Indians into a model mirroring Spanish people and society. Basically, the aim was to urbanize this pastoral society by “offering gifts and persuasion and safety from their enemies.” This protection system was also meant to ensure security for the Spanish military operation, the belief being that if the missionaries could make the people more docile and passive they would be less inclined to war.

The top agenda of the missionaries was converting these natives to Christianity quickly and efficiently, purging them of their native cultural practices while giving the impression they were blending traditional beliefs with common Catholic practices, which of course was a ruse. However, the Spaniards did not impose their language to the degree that the missionaries did their religion.

In fact, the missionary work of the Roman Catholic Church in Quechua, Nahuatl, and Guarani actually contributed to the expansion of these American languages, equipping them with writing systems.

The 1510 Requerimiento, issued in relation to the Spanish invasion of South America, demanded that the local populations accept Spanish rule, and allow preaching to them by Catholic missionaries, on pain of war, slavery or death, although it did not demand conversion. Slavery was part of the local population's culture before the arrival of the conquistadors. Christian missionaries provided existing slaves with an opportunity to escape their situation by seeking out the protection of the missions which was a spur to their conversion.


Antonio de Montesinos, the Dominican friar, preached against the slavery of indigenous people.

In December 1511, the Dominican friar Antonio de Montesinos openly rebuked the Spanish authorities governing Hispaniola for their mistreatment of the American natives, telling them "... you are in a state of mortal sin ... for the cruelty and tyranny you use in dealing with these innocent people".

King Ferdinand (1452 – 1516) enacted the Laws of Burgos, but enforcement soon relaxed, and he blamed the Church for not doing enough to liberate the Indians when it was only the Church that had raised its voice in defense of the indigenous peoples.

Nevertheless, Amerindian populations suffered serious decline due to new diseases, inadvertently introduced through contact with Europeans, which created a labor vacuum in the New World.

THE FRANCISCANS

In 1522, the first Franciscan missionaries arrived in Mexico. They established schools, model farms, and hospitals. When some Europeans questioned whether the Indians were truly human and worthy of baptism, the matter was settled by Pope Paul III (1468 – 1549), who came to the papal throne in 1534 after the “Sacking of Rome” in 1527, a very iffy period for the Church.

In his bull of 1537 (Sublimis Deus), Pope Paul emphatically declared that "their souls were as immortal as those of Europeans" and that they should neither be robbed nor turned into slaves. The practice, however, persisted.

Over the next 150 years, missions expanded into southwestern North America. Native people were often legally defined as children, and priests took on a paternalistic role, sometimes enforced with corporal punishment.

In America, Junipero Serra, a Franciscan priest, founded a series of missions that became important economic, political, and religious institutions. These missions brought grain, cattle, and a new way of living to the Indian tribes of California. Overland routes were established from New Mexico that resulted in the colonization of San Francisco in 1776 and Los Angeles in 1781. Here too, as in South America, in bringing Western civilization to the area, these missions and the Spanish government were responsible for wiping out nearly a third of the native population, primarily through disease.

Only in the 19th century, after most of the Spanish and Portuguese colonies had broken away, was the Vatican able to take charge of Catholic missionary activities through its Propaganda Fide organization and challenge Spanish and Portuguese draconian policies. Pope Gregory XVI (1765 – 1846) took on the “mad monarchs” of his time, and for that is considered one of the most effective popes in Catholic history. He entered the monastery as a boy and was a staunch defender of Catholic orthodoxy in his early life as a cleric, but became, perhaps because the times demanded it, a political conservative and an adept politician.

Shaped by the upheavals of the French Revolution and its aftermath, he saw the devastation inflicted on the Church in the New World by colonizing governments and, while himself a rising cleric in the Catholic hierarchy, was haunted by Napoleon's kidnapping of Pope Pius VII (1742 – 1823), held captive from 1809 to 1814. He proved a different kind of pope when he came to the papal throne in 1831.

But before that time, there was Pope Leo XII (1760 – 1829), who came to the papal throne in 1823 and was pontiff until his death in 1829. The Church had not recovered from the shock of Napoleon and the emperor's violation of the sanctity of Rome and the Vatican. Perhaps that explains why Pope Leo was so harsh and insensitive to the laity. In any case, he was a very unpopular pope, and at his death he was succeeded by Pope Pius VIII, who reigned only from 1829 to 1830, accomplishing very little and being remembered for nothing.

Pope Gregory XVI, a priest and a Camaldolese monk, was unusual in the way he took control from the first, but also in being the last pope to rise to the papal throne who was not already a bishop.

From the beginning Pope Gregory challenged Spanish and Portuguese policy in the New World with his Propaganda Fide organization, appointing his own bishops to the colonies, condemning slavery and the slave trade in the Papal Bull In Supremo Apostolatus, and approving the ordination of native clergy in the face of government racism. Despite these advances, the populations of the Americas continued to suffer sharp declines from exposure to European diseases.

It was a time of upheaval, with a “war of terror” on the streets of Paris from 1830 to 1848, and Pope Gregory was in the middle of this mix with his own “war of terror.” He quite famously opposed gas lights and railways, fearing the rise of a liberal middle-class elite in the Papal States, and opposed political concessions, making him one of the most hated men in Europe in leftist circles. He infuriated Irish Catholics by urging them to be loyal to their Protestant British monarch. His bottom line was absolute opposition to revolution, given its history in his lifetime. Rare (in my experience as a Roman Catholic) is a statement on liberalism as telling as his:


Pope Gregory XVI condemned slavery and backed that condemnation with an aggressive pontificate.

“This shameful font of indifferentism gives rise to that absurd and erroneous proposition which claims that liberty of conscience must be maintained for everyone. It spreads ruin in sacred and civic affairs, though some repeat over and over again with the greatest impudence that some advantage accrues to religion from it. ‘But the death of the soul is worse than freedom of error,’ so says Augustine. When all restraints are removed by which men are kept on the narrow path of truth, their nature, which is already inclined to evil, propels them to ruin.”

One can imagine how this threw the Church Fathers into moaning apoplexy.

THE JESUITS

Jesuit missionaries in the Americas were controversial in Europe, especially in Spain and Portugal, inasmuch as Jesuits were disinclined to take orders from governments, and those governments saw such cavalier disregard as interference with their colonial enterprises.

From the beginning, Jesuits were often the only force standing between Native Americans and slavery. Throughout South America, but especially in present-day Brazil and Paraguay, they formed Christian Native American city-states called “reductions,” societies set up according to an idealized theocratic model.

This was partly because Jesuits such as Antonio Ruiz de Montoya (1585 – 1652) were willing to sacrifice life and limb to prevent Spanish and Portuguese colonizers from enslaving the natives. Other Jesuits, such as Manuel da Nobrega (1517 – 1570) and Jose de Anchieta (1534 – 1597), were critical in Indian pacification, leading to the establishment of stable colonial settlements. They founded several towns in Brazil in the 16th century, long before the Franciscans arrived, among them Sao Paulo and Rio de Janeiro. Using a feather rather than a sword, their efforts were influential in the pacification, religious conversion, and education of Indian nations.

The Jesuit Reductions were a particular version of the general Catholic strategy of building reductions, used in the 17th and 18th centuries to Christianize the indigenous populations of the Americas, but without the hard club of dogma and more efficiently than the absolutism practiced later.

[When I was a young chemical engineer in the field for Nalco Chemical Company, my next-door neighbor was a civil engineer with a large national firm building highways across the nation. He was from Bolivia, and remembered fondly the Jesuits who taught him, but had little affection for Roman Catholicism and clerics that tried to force dogma down his constitution.]

The Jesuit order created its reductions in South America in areas inhabited by the Tupi-Guarani, which correspond to modern-day Paraguay. Later, reductions were extended to Argentina, Brazil, Bolivia, and Uruguay. Jesuits have a popular history of fairness, consistency, and follow-through in their practices, especially among the disenfranchised. The current Pope of the Roman Catholic Church, Pope Francis (born 1936), is the first Jesuit to rise to the papal throne and is as controversial with the establishment as were his predecessors those centuries ago, and truly a breath of fresh air in the tradition of Pope Gregory XVI.

Where these Jesuit reductions differed from the reductions in other regions was consistent with Jesuit philosophy: Indian natives were expected to adopt Christianity but not European culture. Under Jesuit leadership, the Indians of the reductions achieved a high degree of autonomy within the Spanish and Portuguese empires.

With the use of Indian labor, the reductions became economically successful. When their existence was threatened by the incursions of the Bandeirante (Portuguese for “those who carry the flag”) slave traders, Indian militias were created that fought effectively against these fortune-hunting Portuguese settlers in Brazil.

The resistance by the Jesuit reductions to slave raids, as well as their high degree of autonomy and economic success, have been cited as contributing factors to the expulsion of the Jesuits from the Americas in 1767. Jesuit reductions present a controversial chapter of the evangelical history of the Americas and are variously described as jungle utopias or as theocratic regimes of terror.

The suppression of the Jesuits in the Portuguese Empire (1759) and the Spanish Empire (1767) was precipitated by a series of political moves rather than a theological dispute. Monarchs were attempting to solidify their power and centralize their authority in the secular realm, viewing Jesuits as being too liberal, international, and too allied to the papacy. What’s more, monarchs found Jesuits too autonomous and territorial.

That said, the papacy threw the Jesuit order under the bus. Pope Clement XIV issued the papal brief Dominus ac Redemptor on July 21, 1773, suppressing the Society of Jesus. The Jesuits took refuge in non-Catholic nations, particularly in Prussia and Russia, where the suppression was either ignored or formally rejected. The Jesuits were allowed to return to many places starting in the late 19th century.

ASSIMILATION

The conquest of the conquistadors was immediately accompanied by the evangelization of the indigenous peoples, and a distinctly Catholic phenomenon soon mystically materialized. It helped to solidify the conquest of the minds and hearts of the natives.

The Virgin of Guadalupe is one of Mexico’s oldest religious images. The Virgin Mother of Jesus is said to have appeared to Juan Diego (1474 – 1548) in 1531. News of the apparition on Tepeyac Hill spread quickly through Mexico, and in the seven years that followed, 1532 through 1538, the Indian people accepted the Spaniards, and 8 million people were converted to the Catholic faith. Thereafter, the Aztecs no longer practiced human sacrifice or native forms of worship.

In 2001 the Italian “Movement of Love” was created, an evangelization project launched in 32 states. Juan Diego, beatified in 1990, was canonized a saint by Pope John Paul II (1920 – 2005) in 2002, the first Roman Catholic indigenous saint from the Americas.

Saint Juan Diego is said to have been granted an apparition of the Virgin Mary on four separate occasions in December 1531 at the hill of Tepeyac, then outside but now well within metropolitan Mexico City.

The Basilica of Guadalupe, located at the foot of the hill of Tepeyac, claims to possess Juan Diego's mantle or cloak on which an image of the Virgin is said to have been impressed by a miracle as a pledge of the authenticity of the apparitions. These apparitions and the imparting of the miraculous images (together known as the Guadalupe event) are the basis of the cult of Our Lady of Guadalupe. This cult is ubiquitous in Mexico and prevalent throughout the Spanish-speaking Americas, and increasingly worldwide. As a result, the Basilica of Guadalupe is now the world's major center of pilgrimage for Roman Catholics, receiving 22 million visitors in 2010.
 

Virgin of Guadalupe

Guadalupe is often considered a mixture of the cultures which blend to form Mexico, both racially and religiously. Guadalupe is sometimes called the “first Mexican,” as Guadalupe brings together people of distinct cultural heritages, while at the same time affirming their distinctness.

One theory is that the Virgin of Guadalupe was presented to the Aztecs as a Christianized Tonantzin, necessary for the clergymen to convert the indigenous people to the Catholic faith. The missionaries literally built their first churches with the rubble and the columns of the ancient pagan temples, so they often borrowed pagan customs for their own cult purposes.

Such Virgins appeared in most of the other evangelized countries, mixing Catholicism with the local customs. The Basilica of Our Lady of Copacabana was built in Bolivia in the 16th century, near the Isla del Sol where the Sun God was believed to have been born, to commemorate the apparition of the Virgin of Copacabana. In Cuba, the Virgin named Caridad del Cobre was allegedly seen at the beginning of the 16th century, a case consigned in the Archivo General de Indias. In Brazil, Our Lady of Aparecida was declared the official Patron Saint of the country in 1929 by Pope Pius XI (1857 – 1939). In Argentina, there is Our Lady of Lujan. In other cases, the appearance of the Virgin was reported by an indigenous person, for example, the Virgen de Los Angeles in Costa Rica.

ASSESSMENT OF MISSIONARIES

For most of the history of post-colonial Latin America, religious rights have been regularly violated, and even now, tensions and conflict in the area of religion remain. Religious human rights, in the sense of freedom to exercise and practice one's religion, are almost universally guaranteed in the laws and constitutions of Latin America today, although they are not universally observed in practice. Moreover, it has taken Latin America much longer than other parts of the West to adopt religious freedom in theory and in practice, and the habit of respect for those rights is only gradually being developed.

The slowness to embrace religious freedom in Latin America is related to its colonial heritage and to its post-colonial history. The Aztec and the Inca both made substantial use of religion to support their authority and power. This pre-existing role of religion in pre-Columbian culture made it relatively easy for the Spanish conquistadors to replace native religious structures with those of a Catholicism that was closely linked to the Spanish throne.

Anti-clericalism was an integral feature of 19th-century liberalism in Latin America. This anti-clericalism was based on the idea that the clergy (especially the prelates who ran the administrative offices of the Church) were hindering social progress in areas such as public education and economic development. The Catholic Church was one of the largest land-owning groups in most of Latin America's countries. As a result, the Church tended to be rather conservative politically.

Beginning in the 1820s, a succession of liberal regimes came to power in Latin America. Some members of these liberal regimes sought to imitate the Spain of the 1830s (and revolutionary France of a half-century earlier) in expropriating the wealth of the Catholic Church, and in imitating the 18th-century benevolent despots in restricting or prohibiting the religious institutes.

As a result, a number of these liberal regimes expropriated Church property and tried to bring education, marriage, and burial under secular authority. The confiscation of Church properties and changes in the scope of religious liberties (in general, increasing the rights of non-Catholics and non-observant Catholics, while licensing or prohibiting the religious institutes) generally accompanied these secularist reforms, a pattern that has continued to this day.

NEXT – NWL - PART TWENTY-TWO - MEDICINE EMPOWERED 

Saturday, August 21, 2021

NOWHERE MAN IN NOWHERE LAND - TWENTY-THREE

 Nowhere Man in Nowhere Land – TWENTY-THREE



 

The Knowledge Keepers

&

  Utopia!

 

JAMES RAYMOND FISHER, JR., Ph.D.

Originally published © September 29, 2016/©August 21, 2021

 

“As a patterning system, the brain can only see what it is prepared to see.  The analysis of information will not produce new ideas, merely a selection from existing ideas . . . To deal with the future we have to deal with possibilities (not certainties).  The analysis will only tell us what is already known.”

Edward de Bono (born 1933), Maltese physician and psychologist, Lateral Thinking (1970)

THINKING, FEELING, SPECULATING, BEING & BECOMING IN A NERVOUS DANCE THROUGH TIME

Over the past 120,000 years, thinking man has made the great journey to colonize the planet.  Today, it would seem at times that we are near the journey’s end.  Along the way, thinking man has had many occasions to face insurmountable odds always coming up with a “Cut & Control” tool to master the situation in the short term.

Our leaders and institutions have always accepted the challenges and used the new tool for its short-term quick fix value and advantage.  Likewise, these leaders have consistently ignored the tool’s long-term consequences not only in a material sense but in an immaterial sense as well. 

Austrian psychoanalyst Otto Rank (1884 – 1939) insists:

“Man is a theological being and not a biological one.”

It is as though German American Lutheran theologian and existential philosopher Paul Tillich (1886 – 1965) were speaking.  Tillich claims:

“Faith consists in being vitally concerned with that ultimate reality to which I give the symbolical name of God. Whoever reflects earnestly on the meaning of life is on the verge of an act of faith.” And elsewhere, “Doubt is not the opposite of faith; it is one element of faith."

These two thinkers were of the same generation, a generation now gone and for that reason discounted as no longer relevant. Yet, what they say resonates with Saint Augustine of Hippo (354 – 430) and Danish existential philosopher Soren Kierkegaard (1813 – 1855).

Augustine’s “City of God” (426) and Kierkegaard’s “Purity of Heart” (1938) are complements to man’s character. Rank’s statement is uncanny in that he echoes that sentiment as a psychoanalyst in the world of science and not as a theologian.

THE MIGRAINE HEADACHE OF NOWHERE LAND

Carol Gentry suggests the impact of self-generating ambivalence in this “Age of Doubt” (The Tampa Tribune, May 16, 2006) is considerable:

“Migraine headaches emerged as the No. 1 cause of lost production time in a 2005 study of national companies by the Employers Health Coalition, Inc. For the diseases they studied, the monetary loss in productive time averaged two to three times the health care cost. But for migraine headaches the cost was huge. ‘You lose $33 in productivity for every $1 spent on medical care with migraines,’ said Frank Brocato, President and Chief Executive Officer.

“The health coalition survey found that migraines were only the seventh most prevalent illness among respondents after allergy, high blood pressure, arthritis, sciatica, depression, and spine problems. However, the time wasted on unproductive suffering – mostly at the office rather than at home – was greater for migraines than for the others.”


Modern workers are not happy campers, and many health problems are psychosomatic, which doesn’t make them any less painful or debilitating. It does indicate that medical science has yet to be effective as a social palliative. The dictum that social problems should be solved by physicians has not proven reliable.

It gives credence to the declaration of Otto Rank that man is a theological being, not a biological one. Attention is given to psychosomatic problems when employees’ health affects the bottom line, but seldom before. What is ironic is that the migraine is a stealth syndrome, that is, mild and infrequent in the early stages, building to temporary debility under stress and distress, like “a shadow that stays with me,” as one employee described the malady.

The migraine puts the sufferer in Nowhere Land looking for utopian relief from the chronic pain with various over-the-counter headache remedies or with prescription drugs known as “triptans,” a family of tryptamine-based drugs used as abortive medication in the treatment of migraines and cluster headaches. First introduced in the 1990s, triptans, while effective, are neither a preventive treatment nor a cure, as they do nothing about the root causes of the condition.

Triptans disguise the various triggers such as sleep deprivation, stress, overwork, eating irregularities, lack of exercise, drinking and smoking too much, too much texting, television or computer exposure, poor lighting or ventilation, or other life exigencies such as being obsessive with success and failure, disappointment, personal sorrow, or trying to be perfect, which relates to utopian expectations of oneself or others.

The dream of utopian splendor finds few far from Nowhere Land as revealed by the chronic stress patterns that course through Western society. This short excursion endeavors to show that it is part of our cultural DNA going back to the early Christians. While we may make short shrift of the simple everyday migraine headache, it is a systemic indicator of a far greater societal impediment.

IS A SELF-INDULGENT MAN NEAR JOURNEY’S END?

As wondrous as these “Cut & Control” tools we pride ourselves on having may be, they may also be accelerating our collective doom. While our environment is being reduced to rubble, makeshift demands pull man hither and yon in constant ambiguity, for what man has mastered has, in turn, mastered him.

Man feels torn precisely because he has no central gyroscope, no centering moral compass, no intuitive directional device to tell him what is best. He keeps doing what he is doing until there is nothing left in him for the enterprise. To get beyond the things of the world, man must find a way to reconnect the body with the soul, leavening the material world with its immaterial counterpart.

The “Cut & Control” tools themselves did nothing to create immediate alarm or harm. They did change the way man saw his relationship to himself, others, and Nature. Each time society developed and used a new tool, it displaced old values with new ones. Each time the new tool was embraced, the focus was on the benefit of the new, not the cost of losing the old. Cumulatively, however, the cost would prove prohibitive.

For over 100,000 years, thinking man was content to be a hunter and gatherer, to be a member of nature, not separate from the natural world. This changed 12,000 years ago when agriculture gave man a quick fix on a short-term problem, a sustainable food supply. He now banded together into tribes, with the population remaining relatively stable as deaths closely matched births.

Today, global agriculture is in a precarious state thanks to the spectacular success of scientific farming. It has resulted in an unmatched population explosion that is on its way to 12 billion souls in the second half of the 21st century. Should that happen, not only must the environment be treated with more loving care but the people of the planet as well. The challenge then will be for the planet to be one people of great diversity and orientation, where common humanity will recognize people are more alike than different, more common in spiritual roots while uncommon in ethnicities. Nationalities and territorial imperatives, founded in mythology, will dissolve and diverse peoples will join the ranks, at least that is what utopian thinkers envision. As we shall attempt to show, there is more to Utopia than meets the eye.

At the end of the 19th century, more than four out of every five Americans worked in agriculture or supported that industry, in a population of 100 million. Today, with a population of more than 330 million, less than 3 percent of Americans are involved in agriculture.

Even then, agriculture is a very different business as it has progressed to corporate agri-business with few family farms any longer extant. Conglomerates now dictate what has become largely a synthetic hybrid industry of scientifically enhanced products that are likely to reach the dinner table.

Moreover, 90 percent of the world’s food supply is now produced from only eight species of livestock and fifteen species of plants. Genetic engineering has reached its apogee in agri-business and this, in turn, has quite remarkably changed the physical appearance of the 21st-century person.

When most people were close to the land, when they had an intimate understanding of the seasons of the year, of the life and death cycle of everything, they had roots and looked to a hierarchy beyond themselves to their God. They had a sense in their bones that everything was connected to everything else; that everything had to go somewhere; that Nature knew best and, in any case, could not be changed; that there was no free lunch and that in life every act had consequences whether intended or not. There was a maturity that has escaped postmodern man.

With people of the land, religion and science did not operate in different universes and therefore were not a threat to each other. That division was spurred by the “Industrial Revolutions” (1760 – 1820) and has widened ever since.

Peripatetic Hungarian savant Arthur Koestler writes in The Sleepwalkers: A History of Man’s Changing Vision of the Universe (1959):

“The Philosophy of Nature became ethically neutral, and ‘blind’ became the favorite adjective for the working of natural law . . . As a result, man’s destiny was no longer determined from ‘above’ by superhuman wisdom and will, but from ‘below’ by the subhuman agencies of glands, genes, atoms, or waves of probability. This shift of the locus of destiny was decisive. So long as destiny had operated from a level of the hierarchy higher than man’s own, it had not only shaped his fate but also guided his conscience and imbued his world with meaning and value. The new masters of destiny were placed lower in this scale than the being they controlled; they could determine his fate, but could provide him with no moral guidance, no values, and no meaning.”

Five thousand years ago, we started our steady progression away from that idea. A lifesaving tool was invented that changed our attitude about infertile land. The tool was irrigation. Engineers have since ensured increasing water supplies even in desert communities. Today, wells go deeper than ever before to retrieve water from aquifers filtered through limestone beds that reach deep into the earth. Enormous reservoirs are available for hydroelectric power generation and irrigation. Major rivers have been diverted to supply entire countries.

Yet, every time this “Cut & Control” tool has been employed to materially improve the lives of people, the need for it has accelerated as the birthrate keeps climbing. The demand for water in the 20th century was so great that many reliable perennial water sources simply dried up.

Now, in the second decade of the 21st century, the forecast is not good. Global industrial and domestic water demand has quadrupled since 1950 without abatement. Scarcity is now common in 26 countries including Russia, Middle Eastern countries, and parts of India, Africa, and the Southwestern United States.

Today, fisheries are failing and oceans are dying. According to the United Nations Food and Agricultural Organization, four of the world’s seventeen fishing zones are already over-exploited. Between 1950 and 1990, the catch in these zones increased fourfold. In 2000, many warned the world catch had reached the limits of sustainability. Meanwhile, saltwater contamination is posing a new problem for freshwater.

Koestler sounds the alarm in the last pages of The Sleepwalkers:

“Man had now acquired the means to destroy the planet. Evolution had granted him a technological capacity far above his spiritual capabilities. Thus within the foreseeable future, man will either destroy himself or take off for the stars.”

The message was clear for everyman as well as the scientist: spirituality and science need each other as survival is predicated on their acknowledged interdependence.

To give a sense of how a “Cut & Control” tool can misfire, take the Aswan Dam of Egypt constructed in 1965. With this magnificent structure, the natural discharge from the Nile River virtually ceased with a catastrophic impact on the southeastern Mediterranean fisheries. Many species disappeared completely; catches were often small and more often contaminated. An entire industry in this part of the world has essentially disappeared.

Elsewhere, human activities involving deforestation, mining, dredging, and erosion have created sedimentation that now fills reservoirs, lakes, and rivers. Runoff from industrial plants and from home use of fertilizers has put nutrients into the discharge, producing algae blooms that pollute lakes, rivers, and even oceans, resulting in massive fish kills.

On the west coast of Florida in the United States, the “red tide” is a seasonal occurrence in this tourist-sensitive milieu with its unmistakable mind-wrenching stench of dead fish hugging the shore. This is not only an economic disaster but an embarrassment to the multimillion-dollar homes and condominiums that grace the waterfront.

A century ago, forests covered 90 percent of the Dominican Republic and 80 percent of Haiti. Today, the forests of the Dominican Republic have been reduced to 20 percent and Haiti to between 1 and 5 percent due to corporate lumbering and domestic use of wood for cheap fuel. The consequence is that these countries are subject to devastating mudslides from the mountains after torrential rains, destroying homes, businesses, entire communities while killing hundreds of people in their wake.
 
The famous Aswan Dam in Egypt was constructed in 1965

Sedimentation is also fatal to coral reefs, contributing to what is called “bleaching.” This is now a world problem. Although coral reefs cover only 0.17 percent of the ocean floor, they are crucial to biological diversity and stability. The food they supply sustains a quarter of all fish types in the developing world. At the current rate of coral reef deterioration, it is estimated that another two-thirds of the world’s coral reefs will be lost in the next several decades.

One of the “Cut & Control” tools of technology is energy. Today, the effect of uncontrolled energy sourcing and exploitation has been globally distressing. Consumption of fossil fuels has grown in flagrant disregard for the consequences. Whether you believe in global warming or not, few can deny that in many cities and industrial areas the air is difficult to breathe.

The United States is only 1/20th of the world’s population, but it consumes 25 percent of its fossil fuel. Oilman T. Boone Pickens believes we’ll be out of the hydrocarbon era before the end of the 21st century. At the present rate of global consumption, oil reserves will be depleted in 50 years and gas reserves in 200.

These estimates were generated before the explosive industrial boom in China and India in 2005 and 2006. The fossil fuel consumption of these two countries has resulted in wide fluctuations in the price of gasoline at the pump, as political instability continues in the new century in the Middle East, while OPEC and the United States, as well as Argentina and Mexico, fluctuate in their production quotas.
 

An American marine biologist who quietly took on the agri-business

China, with its population of 1.4 billion, has put aside its bicycles and discovered a mania for the automobile. The same is occurring in India, which is growing even faster than China and will soon overtake it in population. Likewise, China and India have the fastest-growing markets for energy-demanding technology. With an electronically connected world of the Internet and the iPhone, people in China and India, as well as in many other developing Third World nations, seeing how the other half lives on their modems, want a higher standard of living and the automobile as middle-class symbols of “making it.”

Not surprisingly, a vicious Third World cycle has developed. With insufficient gains in agricultural productivity due to inequitable land distribution, the high cost of Western farming methods, and constant warring factions within these countries, increased land clearance has become necessary to feed the swelling, roaming populations. This reduces forest and agricultural productivity, which brings about the need for more forest clearance, with long-term effects in erosion, mudslides, stream pollution, and depopulation of marine life, and so the vicious cycle continues.

American marine biologist Rachel Carson (1907 – 1964) alerted the public to these dangers in The Sea Around Us (1951) where she identified the effects of pollution on large-scale marine life.

Then she shocked the world with her international bestseller Silent Spring (1962) in which she focused on the disturbing impact of the use of synthetic pesticides such as DDT on agriculture and plant life as well as on humans. In her short life, she framed the ecological problems that have plagued us for the past 50 years. Chemists in her day failed to heed her warning while many in the agri-business disparaged her research methods and conclusions. This was not true of the former United States Vice President Al Gore. He writes:

“For me, Silent Spring had a profound impact. It was one of the books we read at home at my mother’s insistence and then discussed around the dinner table. . . . Rachel Carson was one of the reasons why I became so conscious of the environment and so involved with environmental issues. Her example inspired me to write Earth in the Balance. . . . Her picture hangs on my office wall among those of political leaders. . . . Carson has had as much or more effect on me than any of them, and perhaps than all of them together.”

Two journals of the period, Chemical & Engineering News and Chemical Engineering, both with proprietary interests, faulted Carson for her gaffes in chemistry, paying little attention to her environmental warnings. Frank Graham, Jr. in Since Silent Spring (1970) points out that while several states did ban DDT, a wide range of other pesticides more toxic than DDT continued to be used in high quantities. As late as 2004, the use of pesticides continued unabated, with the US Government exercising little control over them.

Then in 21st Century Science & Technology magazine, entomologist J. Gordon Edwards penned an article, “The Lies of Rachel Carson,” indicating once again that scientists are human, and as ready to take a red pencil to someone else’s original and remarkable work as people in other endeavors.

The massive use of fossil fuels as fuel, in creating pesticides, and in generating electricity has changed the earth’s atmosphere. This was ignored until a quarter-century ago. As the accumulation of carbon emissions in the atmosphere grew exponentially, a new phenomenon was noted: the Greenhouse Effect. This occurs when carbon emissions trap heat in the atmosphere and prevent it from radiating back into space. It produces what many scientists, but certainly not all, insist leads to global warming.

At this point, the actual level of the Greenhouse Effect is conjecture. Some suggest, based on studies, that the polar ice cap is melting, which would cause a significant rise in sea levels. Others see major forests dying due to the inability of plant species to adapt to rising temperatures. The same problem would be true of crop yields. Meteorologists estimate a small increase in sea temperature would cause a greater frequency of major storms in South East Asia and Australia, as well as in the South Eastern region of the United States, the states along the Gulf of Mexico, the Caribbean, and Western Mexico.

Tsunamis have occurred often throughout history. So frequently in Japan that they invented the word specifically for the phenomenon: “tsu” meaning harbor and “nami” meaning wave. In 2004, there was the devastating tsunami in South East Asia, and in 2005, Hurricane Katrina in the United States wiped out a good portion of the Gulf Coast of Alabama, Mississippi, and Louisiana, turning New Orleans into a ghost town with 80 percent of its occupants forced to evacuate.

Scientists claim a sea-level rise would likely cause major population displacement and death among two-thirds of the world’s population that chooses to live on low coastal lands. What is clear is that even scientists don’t know for certain how bad things could be as such dramatic catastrophes go back hundreds if not thousands of years and conditions previously were quite different than they are today.

The world is rapidly moving towards a mass-produced uniform culture that takes the word of science as gospel but heeds it no better than the religious tenets that once guided that same world. This characterization of Nowhere Man has failed to be noted as the attention has been on utopia or Nowhere Land.

Mankind’s compulsion for self-destruction seemingly has not lessened in this age of science and political religion. Progress is the new mantra and everyone dances to its tempo. German social psychologist Erich Fromm (1900 – 1980) sees this as evidence of the pathology of normalcy. You ask, how so?

Today, of the more than seven billion people now living on the planet, 100 million are homeless, 500 million suffer from severe malnutrition, 800 million are illiterate and 400 million have no jobs. These figures do not include the constant ethnic wars in Africa, Syria, Iraq, Libya, and Afghanistan or the threatening noises coming out of China and the Korean peninsula.

The 1992 standard of absolute poverty as an annual income of under $500 per year fit nearly one billion souls, most of them in Africa. In 2006, that figure passed the one billion mark. Meanwhile, the United States spends more than $5 billion a year on self-indulgent diets, and more than $2 billion on food and care of house pets, while American women spend $30 billion a year on clothes.

Insulated in the unconscious swirl of the moment, most Americans and Western Europeans, comfortably ensconced in passable distractions, experience little disruption to their lives even with the sporadic attacks of ISIS and al-Qaeda terrorists in their midst. The evidence? No one changes the focus or tempo of their lives except those who have lost loved ones.

Instead, they retreat behind the façade of seeing terrorists as simply evil scoundrels, failing to appreciate their possible motivation. Terror may be the Court of Last Resort for impressionable, angry people who feel abused and slighted. They are then manipulated by truly evil people into a twisted religious frenzy with the cry of “jihad!”

French Islam scholar Olivier Roy (born 1949) in Globalized Islam (2004) sees the seeds of the current terrorism having Christian roots:

“The figure of the lonely metaphysical terrorist who blew himself up with his bomb appeared in Russia at the end of the 19th century . . . The real genesis of al-Qaeda violence has more to do with a Western tradition of individual and pessimistic revolt for an elusive ideal world than with the Koranic conception of martyrdom.”

British philosopher John Gray adds, “Nazism and communism are products of the modern West. So, too, is radical Islam.” The radical Islam leader and Egyptian intellectual Sayyid Qutb (1906 – 1966) lifted many of his ideas from European thinkers, especially Nietzsche. He was executed by Nasser in 1966.

Nowhere Man prefers to differentiate the “good guys” from the “bad guys,” as if in a vacuum, separated from common roots shared in the radical history of the United States, France, Israel, India, and elsewhere. Nowhere Man is everyman when the cry of the tortured soul is the feeling of being left out and in the wilderness.

According to the World Resources Institute, by 2050 eighty-four (84) percent of humanity will be living in what is now the Third World, half of them in only five countries. These shifts in population distribution and density will have serious implications for food, resource distribution, and the political make-up of the planet.

There are no secrets anymore, for with the Internet and clever technicians there is no way to hide greed and corruption, as we already know.

The institute predicts that in another score of years industrialized democracies will be, by comparison, small in population. The United States’ population will be less than Nigeria’s. Iran will be twice the size of Japan. The population of Canada will be smaller than that of Madagascar or Syria.

Industrialization in the electronic age has tipped the scale in favor of population growth. We are seven billion today and, by these projections, will be twelve billion by 2050.

It took 12,000 years for the population to reach the level of 5 million. Now more than 5 million are born worldwide every two weeks. Throughout most of history, high birth rates were sustained by religious practices, moral codes, political laws against birth control, marriage habits, and family structures. High birth rates were countered by equally high mortality rates due to disease, famine, wars, and epidemics. Improved public health, more hygienic lifestyles, more reliable food and water supplies, and better medical care have resulted in people living much longer and better.

Many Third World countries in the 21st century are succumbing to mass starvation because of tribal wars, droughts, and the lack of modern tools and techniques for self-subsistence. The common answer is to industrialize this world, democratize it, and make it into the image and likeness of the West.

Well-meaning Western democracies, as well as utopian-minded philanthropists, are crippling these nations by their largesse because they fail to appreciate the cultures, histories, and motivations of the people they would aid. Consequently, rather than showing these people how to take charge of their destiny and carve out a new chapter in their existence, too frequently they unintentionally make these Third World countries counter-dependent on the West for their security, sustenance, and survival.

Industrialization, per se, is not the answer. Nor are handouts. Too often corrupt middlemen take the major portion of the aid and little changes. Education and enlightened involvement and commitment to a viable process are fundamental to any chance of success.

This is not simply a Third World problem. Over the centuries, workers in emerging industrial societies have been excluded from the knowledge and information possessed by specialists. Moreover, they have failed to be made privy to the dramatic and subtle nuances implicit in new technology. Instead, they have been used up and discarded as no longer relevant, and then blamed for lacking the readiness to step into the new world and function competently.

The industrial workplace should function at all times as a university of emerging technology with workers schooled long before the new skills are needed.

Workers, American workers especially, have complacently accepted policies and procedures that treat them as if obedient children, and then management wonders why they lack the initiative to embrace challenges. Treat workers as children and they will behave as children into their fifties and sixties; treat them as adults in their twenties and they will remain so the rest of their careers. When people have been kept in ignorance and programmed in passivity, it becomes a terminal state of suspended adolescence and dependence in perpetuity.

We see this in the United States today. Modern technology is finding ways to bypass the need for factory workers, store clerks, fast-food servers, truck drivers, and hospital employees as well as managers, administrators, engineers, and scientists. This “Cut & Control” practice, which technologists and entrepreneurs congratulate themselves on creating, leaves tens of millions of people without some kind of work, and work is necessary to have a sense of pride, dignity, and identity. To remove this sense from a person is to remove that person from himself, with whatever consequences might follow.

To make the majority of people dependent on corporate welfare, to make them passive occupants in their bodies, to treat them as collectives in advertising and political campaigns, reducing them to demographic data is to rob them of the self-satisfaction that is provided by some kind of work, which as Lebanese poet Kahlil Gibran says is “love made visible.”

Alas, even if people have jobs, if they do not accept ownership of the job but treat it as a renter would, then society has failed. Some might see this as moving into a bankrupt world in which the rich get richer but are only poorer for the failure to harness people’s power. Billionaires don’t change the world; the common worker does.

We are currently captivated by information technology as a commodity with all its useful and synthetic purposes. When primitive man fashioned the first stone implement and used it to change his existence, he took a step out of the cold and away from the hostility of his environment. He didn’t realize, immediately, what he had done. Yet, with this tool, he soon had a special power. While weak in comparison to nature’s giants in the trees, the bush, or on the veld, he no longer saw himself quite so handicapped. What separated him from other species was his specialness. We call this “knowledge.”

THE KNOWLEDGE KEEPERS

Knowledge is an aspect of the perennial utopian dream of serving the world and controlling it positively. Utopia is always about the Future Perfect after stumbling through the Past Imperfect and the Present Ridiculous.

Knowledge keepers control the process of change while being vulnerable to the new instruments of change, smitten with their versatility, complexity, and promise, failing to see such attachments can lead to becoming their prisoner. We refuse to acknowledge that utopia is dead, keeping that thought far from our minds while looking for the “magic bullet,” the ultimate tool that will rescue us from our retreat and advance us beyond our calamity.

How do we change the way we think in time to stop short of ultimate catastrophe? One answer may be with this new tool, Information Technology; the other is an old one, our brain. The only problem is that the new tool comes out of the old tool’s hardwiring, which is the way we think and address our difficulties.

So, while our “Cut & Control” dance into the future may not be an act of regression, it isn’t exactly a demonstration of progression. At the moment, assessing what we have lost against what we have gained, it would appear we are stymied in forward inertia, which is to say, not moving at all, only unconscious of or seemingly untroubled by the fact.

British historian Norman Cohn (1915 – 2007), in The Pursuit of the Millennium (1957), anticipates the 21st century by revisiting the Middle Ages. He sees the violence and terror now being experienced as rooted in the fanaticism that accompanied a series of radical changes in Western society:

The Protestant Reformation;

The Industrial Revolution;

The American Revolution; and

The French Revolution.

The Roman Catholic Church, at the time, was in decline and science was on the rise with its special knowledge and insight into the workings of the universe. Western Civilization believed it had stepped out of the chaos and savagery of the past and was moving into the light of utopian rationalism, failing to see it was exchanging one absolute authority for another. Cohn shows that science and the “Age of Enlightenment” failed to escape the apocalyptical shadow of Early Christianity.

The historian traces the chiliastic upheavals that marred the revolutionary movements of the 20th century back to these earlier convulsions (chiliasm is the theological doctrine that Christ is expected to return to earth to reign for 1,000 years, thus the reference to millennialism). These 20th-century disruptions marked the belief that God had failed! The Inner Demons were on display in WWI and WWII, with Soviet Communism and German Nazism being responsible for the deaths of tens of millions of minorities within Christendom and the Jewish ghettos.

The Nazi Holocaust is known for killing six million Jews, while the Soviet Union, which Cohn shows annihilated millions more, is seldom in the conversation. This carnage was not unlike the Inquisition and the wars that rocked the late Middle Ages. The Inquisition was established as an ecclesiastical tribunal by Pope Gregory IX in 1232 for the suppression of heresy. It was active chiefly in northern Italy and southern France, becoming notorious for the use of torture. In 1542 the Papal Inquisition was re-established to combat Protestantism, eventually becoming an organ of papal government. It became even more brutal and consequential in Spain.

The Spanish Inquisition was independent of the Papal Inquisition. It was established (1478) by King Ferdinand and Queen Isabella with the reluctant approval of Pope Sixtus IV. One of the first and most notorious heads was Tomas de Torquemada, who could rival Adolf Hitler for his demonic mania. The Spanish Inquisition was entirely controlled by the Spanish kings with the Roman Pontiff only naming the Inquisitor General. The popes were never reconciled to the institution, which they regarded as usurping church prerogatives.

The purpose of The Spanish Inquisition was to discover and punish converted Jews (and later Muslims) who were insincere. However, soon no Spaniard could feel safe from it; thus, St. Ignatius of Loyola, founder of the Society of Jesus (Jesuits), and St. Theresa of Ávila were investigated for heresy.

Under the Spanish censorship policy, even books approved by the Holy See were condemned. The Spanish Inquisition was much harsher, more highly organized, and far freer with the death penalty than the Papal Inquisition; its autos-da-fé became notorious. Hitler had his gas chamber. The Spanish Inquisition had its burning at the stake.

The Spanish government tried to establish The Inquisition in all its dominions, but in the Spanish Netherlands the local officials did not cooperate, and the inquisitors were chased (1510) out of Naples, apparently with the pope's involvement. The Spanish Inquisition was finally abolished in 1834.

The point is that the utopian desire for perfection of behavior according to some authoritative script is not new, but part of Western history.

Norman Cohn identifies distinctive features of a utopian millennial movement:

· It is a collective in that it is enjoyed by the community of the faithful;

· It is terrestrial in that it is realized on earth rather than in heaven or an afterlife;

· It is imminent in that it is bound to come soon and suddenly;

· It is total in that it will not just improve life on earth but transform and perfect life;

· It is miraculous in that its coming is achieved by, or with the help of, divine intervention.

Millennial utopianism is as farfetched as anything believed in medieval times, but it holds a mesmerizing appeal in our day, carrying the imprimatur of science and the belief that man can be delivered from evil by the power of knowledge. In its most radical form, this mirrors the revolutionary utopianism that defined the past two centuries.

This underscores the insatiable desire to find redemption in our newest tools, instruments meant to reshape the world as it is into a kinder, gentler, safer, and more satisfying place for humans to be. The evidence thus far is that these tools often prove otherwise. Why is that so?

THE BRAIN AS A KNOWLEDGE MACHINE

For starters, it is clear the “Knowledge Revolution” could benefit from a better understanding of the brain as a knowledge machine. The maxim, “Garbage in, garbage out!” applies here.

The German-born Gestalt psychiatrist Frederick (Fritz) Perls (1893 – 1970) suggests as much in his book In and Out the Garbage Pail (1969), promoting a more holistic view of our dilemma. If we are what we eat, he argues, we are what we think as well. The core of Gestalt Therapy is enhanced awareness of sensation, perception, bodily feelings, emotions, and behavior in the present moment.

Perls claims he stumbled on this insight when he considered relationships with oneself, others, and the environment, seeing them not as separate or conflicting entities but as parts of the same whole. He died in 1970 and therefore had little sense that those relationships would one day be reduced to texting, voice mail, and e-mail, or that learning would be reduced to tutorials and surfing the Internet.

Knowledge workers are products of this new technology, workers largely engaged in non-routine activities in which the intellectual capital is knowledge. We see them working as software engineers, physicians, pharmacists, architects, scientists, public accountants, lawyers, and academics whose primary job is to think for a living.

Perls believed that if we didn’t like the biases programmed into us we could choose to think and behave differently. He did not envision people’s brains being freeze-framed in the moment as they are today.

That said, modifications to our brains happen frequently and naturally. The process started when we learned to read and write, a major alteration to the natural processes of our mental development. Thinking with written words is different from thinking orally in language.

We experience the difference when we attempt to write down what we think and find it easier to express orally, failing to realize that what we think, what we say, and what we finally commit to written words may differ widely. Likewise, a novel converted to the stage or film often has little to do with the story between the book’s covers.

Yet thinking involves using “words,” which are arbitrary inventions that only approximate what we see, hear, feel, taste, smell, or do. Consider this as a possible test: imagine you have an important paper to write, a speech to give, or an interview to prepare for.

Chances are you go to bed and think about it, with thoughts tumbling out of your head and lining up so naturally and fluently that you think you have a handle on your problem.

If you are like me, you go immediately to your study and attempt to transcribe what was so clear in your mind. And if you are equally like me, you become somewhat disconcerted when what you thought does not surface on paper or on your computer screen as you thought it, but only as a vague and unconvincing version.

Now, if you are equally like me, as I have been on occasion, you think, “I’ve got it,” and you go back to sleep confident it will come back to you in the morning. For me, at least, it never does.

Then, too, we condition our minds to read from left to right instead of right to left, or vertically. We craft a certain world with this propensity, a world that differs from that of those who read differently.

A more jaundiced view of knowledge workers might see them as serving those who would control people by mastering control over what they do.

Nowhere Man in Nowhere Land, which is the theme of this series, attempts to show how knowledge has been used in “Cut & Control” fashion to dispatch or nullify “what was” in favor of what is purported to be “what is,” separating the mind from the reality of that experience. We find comfort in the illusion that “what was” no longer contaminates “what is,” feeling superior in the new, oblivious to what may have been lost.

The “cult of the exile” is the consequence of the brain on automatic pilot as the knowledge machine. Take commentators who promote globalization, citing its connectedness and economic advantage and declaring, “The world is flat!” They fail to mention that if this is true, then space and place are superfluous, turning individuals across the globe into nomads. Permit me to explain.

People no longer need to leave their homes to feel connected. They have the world in their hands in their electronic gadgets. It brings information to them instantly from every corner of the world on the wings of electronic waves bouncing off satellites in outer space, allowing them to cross borders, invade heretofore secret spaces, and cultivate contrived relationships while remaining essentially stationary and unconnected except to an inanimate machine.


Knowledge workers combine convergence, divergence, and creative thinking.

With all this knowledge at our fingertips, the questions arise: who are we, where are we going, and where do we belong? We are heady with the apocalyptic sense of being on the threshold of utopia through the progress of our quest for perfection, but do we know what or where that is?

Our cities across the globe are multicultural, multiethnic, and cosmopolitan with the sense of space and place retreating from our minds as home has become everywhere and nowhere. If home once denoted a certain kind of identity, we are displaced from that comfort zone as everything is now fluid.

Countries are now transnational configurations of economic networks of competing centers of commerce over which we have no control and no sovereignty.

Meanwhile, we have been distracted from reality to become pampered information junkies, bullied, seduced, and manipulated into voting blocs of contention by 24/7 cable news networks. There is no grand narrative coming out of churches or academic institutions, only white noise from special interests competing for our attention, subliminally bombarding our senses and reducing us to automatons.

We are not sure whom to believe or what to believe, what we stand for, or even why we exist. We no longer have faith in some unassailable truth. It is for this reason that utopia is intoxicating as orchestrated by the knowledge keepers in the language of destiny.

Many if not most of us are crazed victims of the utopian promise of this “Knowledge Revolution.” We live in the climate of intellectual homelessness with the unspoken dread of nuclear holocaust. There is no peace when the mind can find no shelter.

Life has been reduced to a metamorphic “house of mirrors” in which nothing is what it seems, with fear and anxiety the custodians of these distortions. A composite of these disguises blankets aggressors who look and seem like everyone else until they transcend that norm and commence to kill and maim the innocent on the street, in shopping malls, schools, churches, synagogues, mosques, and workplaces.

We ask why, when these are not terrible people but people doing terrible things, and for a cause they are unlikely to understand but still gravitate to for its identity and sense of belonging.

Indian scholar Homi Bhabha (born 1949) of Harvard University has studied mass migrations, cultural displacement, and the barbarism of colonizers who have uprooted and made millions homeless, only for the perpetrators to become equally homeless, wanderers themselves across languages and continents. He writes:

“It is the trope of our times to locate the question of culture in the realm of the beyond . . . Our existence today is marked by a tenebrous sense of survival, living on the borderline of the present.”

Mass confusion has reduced classifications to the shiftiness of prefixes: postmodern, postindustrial, postcolonial, post-Christian, post-cultural, and so on. Borders are collapsing, states are becoming fluid, and people are living in an ambivalent sponginess, constantly metamorphosing.

Formerly sacrosanct boundaries are now the place from which something begins, as religious institutions and nation-states are no longer reliable reference points. It is in this climate that utopianism, heady with the “Knowledge Revolution,” has once more come to the fore.

BIRTH & DEATH OF UTOPIA

British philosopher John Gray states, “Utopianism was a movement of withdrawal from the world before it was an attempt to remake the world by force.”

We have seen what the utopian idea proved to be when the United States attempted to effect regime change in Iraq and Afghanistan with Western-style democracies after disposing of Saddam Hussein in Iraq and the Taliban in Afghanistan. Likewise, we have seen what Vladimir Putin has done with our Western market economy and democracy in post-Soviet Russia. Nothing has turned out as planned, but this is the common outcome of the birth and death of utopia.

British philosopher Isaiah Berlin (1909 – 1997) writes in The Crooked Timber of Humanity (1990):

“(Confidence) rests on three pillars of social optimism in the West . . . that the central problems of men are, in the end, the same throughout history; that they are in principle soluble; and that the solutions form a harmonious whole . . . this is common ground to many varieties of reformist and revolutionary optimism, from Bacon to Condorcet, from the Communist Manifesto to the modern technocrats, communists, anarchists, and seekers after alternative societies.”

The East as well as the West thinks it has a better idea, if only the misguided and ill-informed would soften their resistance, adopt the proposed formula, and realize the benefits and harmony promised.

The core feature of all utopias is a dream of ultimate harmony. Plato believed human ends were unchanging; Marx saw them as evolving through the scientific discovery of natural law; Christianity has seen harmony as a matter of faith, with human conflict left behind. All these utopian attempts have failed.

People of the Third World, where the focus has been, are not looking for handouts. They are unlikely to envy Americans or want to be like Americans. They want a more equitable share of their own wealth to ensure the national safety, security, and sustenance they have been denied by colonial expansionism.

Utopians believe the Third World is looking to the West to provide solutions and ameliorate its conflicts. This is delusional. The Third World has been exploited since time immemorial. These people would prefer the colonizers leave, but not before showing them how to harness Western technology.

Conflict is a universal feature of human existence. Managed conflict, not harmony, is the glue that holds a society on task. The United States, a naturally divisive society of a diverse immigrant population, experienced this in WWII. The American people mobilized in less than a year to become the greatest war machine in the history of man. Once a people embraces conflict, it moves forward as one.

The idea of utopia is like the dream of winning the lottery. Once the excitement of winning fades and reality moves in, the winner is left with who and what he is and where he is.

Winning is unlikely to provide a quiet life in freedom and security because that is the reality of the dream and not of his existence. When the dream is achieved, the winner is introduced to his cruelest nightmare, as wealth and its demands are likely to be foreign to his experience. Luckily, few win the lottery and therefore do not have to deal with its challenges and anxieties.

Utopian dreams in our waking hours are fantasies happily avoided because utopian projects are, by nature, unachievable. Scottish philosopher David Hume (1711 – 1776) notes:

“All plans of government which suppose great reformation in the manners of mankind are imaginary.”

So, too, such plans are equally imaginary for the individual. Yet in this age of exploding technology, where everything seems possible, there is little sense of the dangers of utopianism. Nothing appears to be stopping humans from remaking themselves from the outside in, and the world with them.

It is because of this fabulist climate of unreality that dystopian novelists have come to the fore with such works as Aldous Huxley’s Brave New World (1932), George Orwell’s Nineteen Eighty-Four (1949), H. G. Wells’s The Island of Dr. Moreau (1896), Yevgeny Zamyatin’s We (1924), Vladimir Nabokov’s Bend Sinister (1947), William S. Burroughs’s Naked Lunch (1959), Philip K. Dick’s Do Androids Dream of Electric Sheep? (1968), and J. G. Ballard’s Super-Cannes (2000), to name a few.

These authors take a prescient glimpse into the ugly reality that infuses this unrealistic utopian dream. Even so, it doesn’t seem to stop the motor of utopian visionaries. Invariably, utopianism is associated with violence. We saw this with “The Reign of Terror” that followed the French Revolution, when Maximilien Robespierre (1758 – 1794), himself a casualty of the guillotine in 1794, advocated violence as a necessary form of social engineering to realize human perfection.

UTOPIA’S DANGEROUS MISAPPREHENSION

Early Christians promised salvation with an apocalyptic utopian life in the hereafter, while many modern political and religious sects offer salvation sometime in the future.

We have witnessed a decline in Christianity alongside a rise in revolutionary utopianism (e.g., Islamic jihadism, Soviet Communism, and German Nazism) and its associated terror and violence. Revolutionary utopianism, incidentally, has shifted from the far left in the 20th century to the far right in the 21st century.

Towards the end of the 20th century, utopianism entered the political mainstream. In this reconfiguration, the regime that came to be most prominent was that of America’s democratic capitalism.

Heady with the triumph of its systems in WWII, the United States committed itself with utopian zeal to installing its form of democracy across the world. Then, after the 9/11 destruction of the Twin Towers in New York City, where nearly 3,000 innocent working people lost their lives, the nation launched its “War on Terror.”

Now, at the end of the second decade of the 21st century, the United States and the world are reeling from this grandiose utopian scheme orchestrated by President George W. Bush, his neoconservative Vice President Dick Cheney, and Secretary of Defense Donald Rumsfeld.

After seven years of work, The Report of the Iraq Inquiry (2016), chaired by Sir John Chilcot, has been published. Chilcot writes:

“(The planning and preparation for Iraq after Saddam) were wholly inadequate and the people have suffered greatly.”

British journalist Geoffrey Wheatcroft (born 1945), who has read the 6,275-page report adds:

“Those might seem like statements of the blindingly obvious, as does the solemn verdict that the invasion ‘failed to achieve the goals it had set for a new Iraq.’ It did more than merely fail, and not only was every reason we were given for the war falsified; every one of them has been stood on its head. Extreme violence in Iraq precipitated by the invasion metastasized into the hideous conflict in neighboring Syria and the implosion of the wider region, the exact opposite of that birth of peaceable pro-Western democracy that opponents of the invasion insisted would come about.” (Source: The New York Review, October 13, 2016)

Once the 9/11 attack became a messianic mission for President Bush, his presidency behaved like a revolutionary regime, personified by the preemptive invasion of Iraq in 2003 to topple Saddam Hussein. Now, more than a decade later, the “War on Terror” has led to endless wars in Iraq, Syria, Libya, and Afghanistan.

Despite good intentions, countries and cultures have their own histories and ways of dealing with problems. Consequently, initial utopian achievements are invariably interrupted by the reality of history.

Scientists look for a unifying theory in physics and mathematics; Western politicians look for a universal economic system in global capitalism. Utopian thinkers in both disciplines use reductive tools to leverage their visions. Meanwhile, Syria, Iraq, Iran, Libya, Saudi Arabia, and Afghanistan remain part of the same tinderbox. No one seems to have the answer, which is brutally apparent in The Report of the Iraq Inquiry.

The world seems to be suffering something akin to a nervous breakdown that only time and patience will ultimately resolve.

The shifting allegiance of utopia from the far left to the far right is supported by the power of faith, not of religion but of politics. Politics has become a way of coping with human imperfection. The Right understands that human nature cannot be overcome; that it must be dealt with as it is. So why has the Right abandoned this philosophy of imperfection and embraced the pursuit of Utopia?

The right-wing utopian movement started as a secular movement; then Christian evangelicals jumped on board to make it a militant faith of neoconservatives. As the Utopian Right has become more militant, it has become less secular and more religious, with powerful alliances with Christian fundamentalism.

Knowledge workers are caught in this madness without recognizing they are in the middle of the maelstrom. Ironically, President Bush appeared first obsessed with evil only to no longer believe in evil. He was followed by President Barack Obama, who was equally obsessed with good, as if evil were of no significance. Both presidents would have benefited from St. Augustine of Hippo, a onetime Manichaean, who in the City of God (426) differentiated the “Kingdom of Heaven” from the “Kingdom of Man,” where evil thrives. He confirmed the existence of evil as a constant conflict with free will in the Doctrine of Original Sin.

Doubt is the essence of civilization while perfection through the violence of force or terror to remake history has led to the demise of utopia.

WE MUST TEACH OUR BRAIN NEW TRICKS

We don’t have to be stuck with our brains as they are. Our biases can be changed by the same means by which they were created. Modifications to our brains happen frequently and naturally. The process starts when we learn to read and write, a major alteration to the natural processes of mental development.

Some features of the brain are laid down even before we learn to read. If we do not develop binocular vision, using our two eyes as one, perhaps it is because we have some eye problem. Even then, we are already programmed to a certain extent with biases such as “girls are not good at math” or “Americans are not good at language skills.”

We must learn to teach our old brains new tricks. The key to overcoming the problems created by our technological environment consists in putting things together in new ways. The brain seems to acquire data the same way we serendipitously have life experiences.

We are sponges, quickly forced to see and think in terms of hierarchies: parents/children; teachers/students; leaders/followers; gifted/slow-witted; highbrow/lowbrow; good/bad; superior/inferior; boss/subordinate; new/old; valuable/worthless; rich/poor; saved/damned; strong/weak; brave/cowardly; known/unknown; common/uncommon; right/wrong; white/black; God/godless; objective/subjective. What we are not told at an early age is that “the other” is always part of the former.

Management guru Peter Drucker (1909 – 2005) recognized the absurdity, stating: “Strong people always have equally strong weaknesses.”

People are not a matter of “either/or” but a combination of both. We need not put ourselves down if we find ourselves incompetent in something, as everyone is. This should motivate us to find something we like doing, and chances are in doing so we will find competence.

We need each other because we are not equally competent in the same things but complement one another. We are one people. We are social animals. Hierarchical thinking inclines us to disparage our own talent and exaggerate that of others. Likewise, it inclines us toward celebrity worship, envying people who can do the things we value far better than we can. We tend to imitate them, moving away from our strengths in imitation of theirs, then hating ourselves for not being as gifted. This drives a wedge between our “real self” and “ideal self” at the cost of self-understanding, producing self-estrangement instead.

It can result in an obsession with comparing and competing. Psychiatrists Willard and Marguerite Beecher write in Beyond Success and Failure: Ways to Self-Reliance and Maturity (1966):

“Competition enslaves and degrades the mind. It is one of the most prevalent and certainly the most destructive of all the many forms of psychological dependence. Eventually, if not overcome, it produces a dull, imitative, insensitive, mediocre, burned-out, stereotyped individual who is devoid of initiative, imagination, originality, and spontaneity. He is humanly dead. Competition produces zombies! Nonentities!”

What if we thought in a way in which such apparent distinctions were treated as part of the same whole? What if we thought with our mind’s eye (imagination) as naturally as with our minds (data collectors), intuitively and routinely as we do cognitively? What if we used our right brain (conceptual synthesizer) fully as a complement to our left brain (information analyzer), what then?

Perhaps we would be less obsessed with control and more inclined to accommodate the requirements of the situation. Perhaps we would be more comfortable with uncertainty and the world of limits, a world that necessitates sharing.

Edward de Bono advocates lateral thinking to bridge our obsession with barriers:

“As a patterning system, the brain can only see what it is prepared to see. The analysis of information will not produce new ideas, merely a selection from existing ideas . . . To deal with the future we have to deal with the possibilities (not certainties). The analysis will only tell us what is already known.”

We are in a “Knowledge Revolution,” with science priding itself on near-absolute certainty that mathematics and physics are successfully tracking the physical laws of nature to the benefit of man. But are they?

Mathematics is a language invented by essentially left-brain thinkers. Technology runs with the discoveries of science, translating them into new products and new technologies, seldom if ever stopping to wonder about the impact or consequences. Technology companies are heady with the products that will accrue from successfully filling the mainly imaginary longings of the buying public for something new, something that will allow them to escape the misery of their mainly humdrum existence.

It is no accident that the popular personal computer and the wonder of handheld electronic devices came out of the workshops of toy designers. We are so enamored of these new gadgets that science has become the new god and technology creators the new high priests of the enterprise. Consequently, there is pressure for manufacturers to come out with new machines, or seemingly new machines, every year, with the public standing in lines around the block for days to be first to make the purchase.

Albert Einstein (1879 – 1955), who ranks with Galileo and Newton as one of the great conceptual revisionists, put mathematics and science in perspective:

“As far as the laws of mathematics refer to reality, they are not certain; and as far as they are certain, they do not refer to reality.”

Yet, we seem fixated on science and mathematics with utopian zeal. To overcome this all-embracing technological neuroticism, we need to discover ways to teach our old brains new tricks, as this software craze only reifies what is already known and therefore warps the brain into frigid conformity. The paradox is that a society committed to progress is retrogressing at Mach speed.
 
Possible Matrix of the Future

The brain is hungry for data but disinclined to innovation. The mind can get equally as fat as the body, but no one seems to notice, or if they do, only to sense an inevitable kinship.

Innovation consists of putting old things together in new ways. It is why chess, doing puzzles, or playing thought games are stimulating. Unlike sequential, technologically generated knowledge, imaginative thinking jumps the linear barrier in a non-linear, conceptual manner.

There is a place for step-by-step Aristotelian logic, for reducing problems to their smallest parts in the manner of Descartes, even for making exhaustive lists in an instant electronically, but this is not thinking. This is playing Ping-Pong with the cerebral cortex.

De Bono labels this obsession with problem solving “critical thinking,” the limitation of thinking about a problem with the same thinking that caused it. As a consequence, most problems are reduced to their symptoms, as the cause gets lost in circular logic. He states emphatically:

“If you are seeking to discover the truth, then you are not interested in creating truths.”

That takes creative thinking, which involves pushing the barrier beyond smug beliefs to what is not known but can be found out.

As uncomfortable as it may seem, there appears to be an irrational component to problem solving: the non-logical, counterintuitive brain that is not limited by the complexity of data and is comfortable with contradictory information. This requires using both sides of an argument and both sides of the brain. Insight, or intuitive understanding, comes in a cloudburst that is not available on the computer.
 
We love the precision of the Matrix whatever the discipline.

The creative mind is not put off by the irrational, the inexact, or fuzzy concepts such as “almost two,” or by insupportable quantum leaps. The irony is that Einstein had trouble with quantum mechanics, which he helped to develop, a kind of physics and mathematics that wanders beyond the bounds of the conventional and is not limited to an automated universe.

We are in a culture locked in the rational, the verifiable, and the logical, while being intimidated by the “arational,” or thinking that is not sequential and replicable. De Bono attempts to address this gap with lateral and parallel thinking as complements to vertical and hierarchical thinking. Groundless thinking began to be devalued after the “Scientific Revolution” (1550 – 1700) took hold and science ultimately became the new religion.

Today, philosophers and religious thinkers, who jumped on the scientific bandwagon to mimic that community by shedding their mystical and metaphysical propensities, are resurfacing as the material world finds it desperately needs to rediscover its spiritual moorings.


Caricature of the Knowledge Keeper?

No one knows precisely what thinking is or how the brain works. Radioactive tracers, magnetic resonance imaging (MRI), and computed tomography let us look at the brain when it is injured or diseased, and researchers are now scanning the brain to see how it looks when people are solving problems. Alas, the biggest, fastest computer cannot match the insights of the gigantic system we have between our ears, of which we still know so little.

NOWHERE LAND IN THE AGE OF INFORMATION

Utopian social engineers have found a powerful new tool in the Internet and in the diversity of software probing the darkness of human limitations. This will continue. Accompanying this utopian intensity is terror and violence. This is not the fault of the new technology but of users with opposing utopian views. All societies contain divergent ideas of life. When a utopian regime becomes dominant in a society, philosopher John Gray warns, it collides with these divergent ideals, which can only lead to repression and defeat.

Utopian ideals have reproduced religious myths and inflamed mass movements since the Middle Ages. The secular terror of modern times echoes the terror that has accompanied Christianity throughout its history. Christianity believed that Utopia could be achieved by human action.

Clothed in science, the early Christian myth of end-times has given rise to a new kind of faith-based violence. We saw this in the preemptive invasion of Iraq, where the United States, with utopian messianic zeal, deposed the totalitarian dictator Saddam Hussein and attempted to install American-style democracy in a mixed tribal society of Shiites, Kurds, and Sunnis, with disastrous results.

Missing in this intervention was the understanding that people everywhere remain attached to familiar things – religion, nationality, family, and tribe – attachments the interveners dismissed as atavistic. Iraq has been Nowhere Land ever since.

Utopianism tracks with the retreat and decline of Christian beliefs, as human nature, at the moment, doesn’t seem interested in being improved, much less perfected.

Equally alarming is that efforts at perfectibility were clearly apparent in the Soviet Union and Nazi Germany, and now in ISIS and al-Qaeda, but with a radical difference from that of Christianity.

What these ideologies share is a commitment to totally change the social order by first destroying what exists and then replacing it with an assumed mythical ideal. This differs from Christianity’s mission, which is to improve human nature. Yet in its efforts to do so, Christianity has engendered terror and violence as well.

It is never the flaws of human nature that stand in the way of Utopia. It is the workings of evil forces. Ultimately, these dark forces will fail, but only after they have blocked human advance by every kind of terror and violence.

It is for this reason that it would be well for knowledge keepers to get inside the habits of radicalism. Terror and violence are as indigenous to the West as elsewhere, and its utopian inclination, although different, is equally destructive. Knowledge keepers, secure in their cubicles, may think harmony is the key to stability when it is managed conflict that is.

NEXT – TWENTY-FOUR – A WAY OUT MAY ALREADY BE HERE – PSYCHIC ENTROPY