Sunday, August 01, 2021

NOWHERE MAN IN NOWHERE LAND - SIX

  

AS THE DYING SENSATE INDICATES, WE ARE VERBALIZING OURSELVES OUT OF EXISTENCE.

 

Nowhere Man in the Dying Sensate Culture

 

James R. Fisher, Jr., Ph.D.

Originally published © April 7, 2016/August 7, 2021

 

“The mind hath no other immediate object but its own ideas.  It is evident that our knowledge is only conversant about them.”

 John Locke (1632 – 1704), English Philosopher


 Ideas have consumed man from the beginning.  His mind will not rest, and things cannot remain as they are, for man is ever in a mood to change them.  This is the story of Future Perfect after struggling through Past Imperfect and Present Ridiculous, as no man is content with his existing time, yet he is ever nostalgic in reflecting on his stumbling past.

The “waist-high” culture of the 20th century was consumed with the Sensate Culture.  It is now straining to move beyond that self-indulgent preoccupation of cosmetic beauty, erogenous zones, the “G-spot” and penis envy to embrace more essential challenges in the Ideational Culture of the Creative Tomorrow.

The aftereffect of rejecting rather than embracing what is feared has fashioned an uptight society terrified of being found dysfunctional or irrelevant, which has inadvertently generated the pathology of the normal.   

Men and women mirror each other in their withdrawal from life and meaning, pursuing careers they hate because the money is good, collapsing into each other’s arms without rhyme or reason, taking refuge in the moment, surrendering to meaningless sensation as if the body has no mind and the mind no need of a body. 

Small wonder people can’t wait to escape to the Ideational Culture of the magnificent tomorrow.  Before that can happen, however, they have to acknowledge they wear the façade of Nowhere Man and exist for the most part in Nowhere Land.

This is a continuation of the story of the “cut and control” phenomenon of progress.  The mind that believes all problems are cognitive is a resident of Nowhere Land.  The stories of God and godlessness are distractions from the fact that man has converted this earthly Paradise into a veritable garbage dump.

Conscious man has been mainly unconscious and unconscionable in treating this small planet as a hostile environment that must be subdued, not realizing that the process meant his own destruction.  Now, as emperor of all that he surveys, man looks out at a wasteland. 

With all man’s material gains, he has misplaced his immaterial anchor, leaving him in Plato’s cave chasing shadows, blinded by the light of his neglected sun.


 THE SOCIAL & CULTURAL CRISIS OF OUR TIME

The reader has been fed a confection of bromides to deal with his cultural indigestion most of his life.  It has been the case with this writer who has lived for eight decades, being born during the Great Depression, growing up as an adolescent during WWII, then the Korean War followed by the Vietnam War, and in the interlude between these two last wars, completing a college education, his military service, getting married, and starting a family by his early twenties.

Between military service and his career as an international executive, a good part of the world from Europe through the Middle East, from South America to South Africa became part of his beat. 

During the 1960s and 1970s, while many Americans were dropping out (“Hell, no!  I won’t go to Vietnam”), running off to a commune in San Francisco’s Haight-Ashbury district, or slipping into Canada in the dead of night to avoid the U.S. Selective Service draft and possible duty in Vietnam, Harvard professor Timothy Leary (1920 – 1996) was exhorting everyone to “turn on, tune in, drop out,” dispensing the therapeutic wisdom of human potential through mind-expanding psychedelic drugs such as LSD, failing to mention that many were dying from such self-indulgence.

Leary was also into space migration and playing house with multiple couples without the benefit of marriage or the responsibility of being a parent and breadwinner.  It was all about living in the moment.

At the time, I was in my early thirties, the father of four prepubescent children, traveling the world as a chemical company executive.  My source for what was going on in the United States was The International Herald Tribune, where it appeared the country was in the throes of a nervous breakdown. 

Then there was my own mental disruption.  It was 1968.  We were living in South Africa, where I was facilitating the formation of a new chemical company.  Reading American journalist Allen Drury’s A Very Strange Society (1967) while embedded in that country, I concluded no place on earth was quite like South Africa.  Its people were energetic, the country vibrantly beautiful, the government an Afrikaner representative democracy.  The problem was that the white minority (20 percent) of Afrikaners and British ruled over the Bantu (black) majority (80 percent) with draconian authority.    

Finding this offensive, I would escape into books.  That is how I came to read Social and Cultural Dynamics by sociologist Pitirim Sorokin (1889 – 1968).  Written during the Great Depression (1937), the decade of my birth, the book spoke to me.  It was about The Crisis of Our Time, with me finding little solace in my own crisis.  Sorokin writes:

 I have stated clearly, based on a vast body of evidence, that every important aspect of the life, organization, and culture of Western society is in the throes of an extraordinary crisis.

Its body and mind are sick and there is hardly a spot on its body that is not sore, nor any nervous fiber that functions soundly.

We are seemingly between two epochs: the dying Sensate Culture of our magnificent yesterday and the coming Ideational Culture of the creative tomorrow.

We are living, thinking, and acting at the end of a brilliant six-hundred-year-long Sensate day.

The oblique rays of the sun still illumine the glory of the passing epoch.  But the light is fading, and in the deepening shadows, it becomes more and more difficult to see clearly and to orient ourselves safely in the confusions of the twilight.

The night of the transitory period begins to loom before us, with its nightmares, frightening shadows, and heartrending horrors.  Beyond it, however, the dawn of a new Ideational Culture is probably waiting to greet the men of the future.

The third cycle in Sorokin’s typology, the Idealistic Culture, paved the way into the Sensate Culture.  That culture included the Renaissance, the American and French Revolutions, the Protestant Reformation, the collapse of feudalism, and the rise of capitalism, to mention a few aspects of this 600-year-long day.


*  *  *

Sorokin did not attempt to emulate physical scientists in his scholarship, as did Talcott Parsons (1902 – 1979) and other social scientists, but followed the social data of human behavior over time to render his diagnosis. Consequently, the reader need not apologize if this Russian émigré is unfamiliar. He quietly let the social data show him the way.

Fast forward to the late trendy popular author Wayne Dyer (1940 – 2015), who broke through the consciousness of the young and disenchanted in the 1970s with Pulling Your Own Strings (1978), taking a page from Timothy Leary’s “think for yourself and question authority.”

Whereas Leary provides passage to Nirvana through a psychedelic high, Dyer urges people to practice a veritable religion of self without the need for the approval of others. He validates his thesis by tapping into psychologist Abraham Maslow’s “Hierarchy of Needs” and his concept of self-actualization, as well as the teachings of Swami Muktananda, Saint Francis of Assisi, and the Chinese philosopher Lao Tzu. Dyer has a way of turning a phrase to make the reader feel in charge:

If you change the way you look at things, the things you look at change. How people treat you is their karma; how you react is yours. When you judge another, you do not define them, you define yourself.

Dyer addressed his philosophy to an ever-increasing audience, his followers deeply loyal to his message. PBS television used him religiously for its annual pledge drives and still does, even though he has passed on.

Man as an authority unto himself is, however, not new. It has been the metaphor for Western man as Nowhere Man during most of this 600-year Sensate day, as he aspires to escape his cage while only driving himself deeper into it. Now, the affluent of the Far East, imitating the West, are pulling their own strings with similar puppeteer fascination, marching in sync as hedonistic Nowhere Man.

The self as described in the last forty years is not the thinking self of the Hellenistic tradition, or the just and loving self of the Hebraic tradition, nor is it the feeling self of the European Romantic period. It is the self as will, the cynical and loathing self described by Schopenhauer (1788 – 1860) and later Nietzsche (1844 – 1900).

The will is broken up into specific wishes, wants, appetites, and desires. For Nietzsche, it is the will-to-power; for Schopenhauer, the will-to-live. These two perspectives have dogged Western man since the beginning of his “cut & control” trek across the planet. Now, the self, as perceived by Western man, finds him at home in Nowhere Land. It is here the self, with utopian bravado, claims to be equal to any encounter. Enabler Dyer would concur, while the more cautious Sorokin would infer that history suggests otherwise.

*  *  *

The “war of wills” can best be illustrated in a story of how the men of ideas have attempted to carve out a better world, but seldom with a sense of the price.

With the will-to-power, the inference is that the world was created in man’s image and likeness. Nietzsche dealt with cultural obstacles by declaring the demise of the Christian God while driving man into a new wilderness with a collection of secular materialistic gods.

Less ceremoniously, Schopenhauer rejected Christianity from the beginning, preferring Buddhism and Hinduism in his efforts to get beyond the will-to-power to the roots of truth and harmony in life’s eternal struggle to find love and purpose.

Since Nowhere Man revolves around the choices we make, the self revolves around the triangular self: the intelligent self, the empirical self, and the acquired self.

They are all part of the “single self” that would rather live in an ideal technological world than acknowledge the real world of scarce resources and tangible consequences.

Nowhere Man prefers the utopian optimistic point of view taking comfort in the belief that innovative technology supported by continuing scientific breakthroughs will compensate for man’s excesses. Science is thus the whore of rapacious technology.

The first self (the intelligent self) relates to the insatiable will, the rational drive spurred on by appetite, hunger, desire, and thirst to make the world conform to man’s needs as his mind conceives them. Need is confused with want, so that want becomes need, with no sense of proportion in the equation.

The second self (the empirical self) is the individual expressing the will of the self through experience in space and time. Distance has collapsed in the Information Age, so that Nowhere Man can jump from one place to another with the touch of his mouse, or use FaceTime on a smartphone to talk to anyone anywhere in the world and see that person as if sitting directly across from him.

The third self (the acquired self) is the human personality, or the many masks we wear throughout the day to get through its demands. It is the artifice that comes to maturity over the years, only to get lost in its many faces, the self finding a permanent residence in Nowhere Land with little awareness of the fact.

The remarkable thing about the self is that there is little real psychological change once its programming has been solidified in cultural verities. What you do and how you behave is a consequence of what you are, or what you have been made to believe you are, and it inherently resists change by any amount of doing.

We are essentially stuck with ourselves as we are, yet inclined to resist admitting this to ourselves. For this lack of attention, we are vulnerable to manipulation and exploitation, as the person we purport to be does not exist, yet is the only person we think we know.

It would be well for us to get acquainted with that person we are, and should we do so, and be able to accept that person, three things can happen:

· We can become self-accepting of ourselves as we are, warts and all;

· We can come to accept others as we find them;

· We can, in the process, discover that we can change.

This is not a drill. The drill would be to “pull your own strings,” which would only reinforce “us” against “them.” The challenge is always “us” against our reluctance to engage in self-understanding.

Nor is it yet relevant to echo the sentiment “become all that you could become” when you have little idea who or what you are. Typically, anxiety fills the void when we fall into the counterproductive behavior of comparing ourselves with and competing against others. Obsession with this mindset promotes imitation of another at the expense of our authentic self. Nowhere Man triumphs!

Through an assortment of selves that are temporary expressions of the deeper self, there is one principle that reveals us to ourselves, and that is consistency. If we stop to think about it, we will see that there is a definitive pattern to our behavior, performed as if written in concrete. We are the judge of whether that consistency serves or fails to serve us.

We have to constantly introduce ourselves to ourselves, as if we were another person, to know if this consistency is a favorable gauge of what we purport to be.

We must encounter the will-to-power in ourselves before we can deal with the will-to-power of others. A strange thing happens when we engage in this process: we not only see ourselves more clearly, but we better understand the behavior of others as well. Everyone is mastered by a level of consistency, as if operating on automatic pilot.

Meanwhile, there is an insatiable lusting of the ego for its self-aggrandizement. As John Locke states, “The mind hath no other immediate object but its own ideas. It is evident that our knowledge is only conversant about them.”

Schopenhauer concurs in the opening lines of The World as Will and Idea (1818):

“The world is my idea: this is a truth which holds good for everything that lives and knows, though man alone can bring it into reflective and abstract consciousness.”

This initial line of justification is acceptable to almost everyone. Schopenhauer argues that we do not know ourselves and the earth in any direct way but only concerning our own experience. All that we ever know is known utilizing the ideas we have about objects. Schopenhauer continues:

“All that in any way belongs or can belong to the world is inevitably conditioned through the subject and exists only for the subject. The world is idea.”

The mind, then, is not in direct contact with things but only with images as if a motion picture that dances before our eyes. It is supposed that these images resemble objects in the external world. We have no way of confirming this, or the true degree of the resemblance because looking involves intervening optical ideas. Consequently, what I see is not necessarily what you see looking at the same thing.

As Oxford scholar A. D. Nuttall (1937 – 2007) puts it, “Suddenly, loam-footed empiricism, that most earthy of philosophies, is invaded by solipsistic fear.” The mind has striven for material and physical security through science and technology and believes it has achieved it. What could be more solipsistic?

Yet, if the mind (the self) believes in only its own existence, how do we explain the anxiety, insecurity, and uncertainty that dog society, chasing man in his self-indulgent desperation into Nowhere Land? The mind, at the moment, it would seem, has fallen into moral, intellectual, and spiritual decline. This is consistent with Sorokin’s dying Sensate Culture of our magnificent yesterday. Why is this?

Italian political philosopher Giambattista Vico (1688 – 1744) writes in his magnum opus, New Science (1725):

Men first feel necessity, then look for utility, next attend to comfort, still later amuse themselves with pleasure, then grow dissolute in luxury, and finally go mad and waste their substance.

The end of the process, Vico discerns, is when each man is thinking only of his own private matters. Nearly three centuries later, this is reflected in hedonistic Nowhere Man, obsessed with instant gratification, consequences be damned!

In that sense, Vico is a forerunner of the systemic thinking popular today: the process of understanding how things, or systems, influence one another within a complete entity, or larger system.

In nature, systemic thinking includes ecosystems in which air, water, movement, plants, and animals work together to survive or perish. In organizations, systemic thinking consists of people, structures, functions, and processes that work together or against each other to produce functional or dysfunctional operations.

Culture is such a system, controlling these factors while not appearing to do so. In contrast, Cartesian analysis breaks everything down (reductionism) and reassembles the pieces to determine what works and what doesn’t. It is a mathematical methodology reliant on logical analysis and mechanical interpretation of physical nature. Management guru Peter Drucker’s (1909 – 2005) Management by Objectives (MBO) is of this school; although a faulty performer in the complex organization, it is still employed by many corporations.

Vico was also the first to explore the fundamental aspects of social science from a purely empirical standpoint rather than through a rigorous scientific methodology, claiming “truth itself is a fact.” He argued that the reading of observations was a valid form of epistemology. He also inaugurated the modern philosophy of history, as practiced by Isaiah Berlin (see John Gray, Isaiah Berlin: An Interpretation of His Thought, 1996), and it is relevant here as another form of systemic interpretation.


THE DYING SENSATE CULTURE - IN THE CLASSROOM AND BEYOND

Once you leave the safety of the philosophical, the mind and will encounter the reality of the moment. For example, is addiction a disease or simply an expression of self-indulgence?

In this postmodern world of the affluent West, boredom appears to be the Devil’s Workshop. People suffer from too much, too many, too soon, with little sense of pain or delayed gratification. We mollycoddle our children through 12 years of compulsory education, from which they leave feeling as if they have completed a prison sentence, with little understanding, comprehension, or appreciation of the privilege they have experienced.

Lincoln J. Tamayo, head of school, Academy Prep Center of Tampa, has this to say about the Florida Comprehensive Assessment Test (FCAT):

“Our teachers are becoming widget-makers and our students the widgets . . . So is it surprising that they (public schools) are becoming dull factories, where students are constantly drilled on how to succeed on FCAT? How can teachers instruct creatively when they are forced to prepare for FCAT as if it were the invasion of Normandy?”

How indeed. This is compounded when high school teachers drill students for the Scholastic Aptitude Test (SAT) for college. Then university professors do the same with the Graduate Record Examination (GRE) to qualify students for graduate school.

Do students attain no significant cumulative knowledge along the way to deal with such tests, or are such tests wholly irrelevant to that experience? We are seemingly a test-mad society with little confidence in the “eye test” of student progress. We have to have progress confirmed by a test, as if a test were an infallible construct.

Intelligence is not a score on a test; intelligence is what it does.

In our utopian Nowhere Land society, we have transformed young people into proficient testing machines, but not into original free-thinking individuals. Still, at the end of the process – high school, college, and graduate school – we profess them to be educated and deem them professionals and equal to the challenges of life.

The boring K-12 educational process often translates into bad behavior, as the dumbed-down curriculum is meant not to embarrass student or teacher while frustrating both.

An Ivy League degree today, as syndicated columnist Thomas Sowell points out (see Inside American Education: The Decline, the Deception, the Dogmas, 1993), is not unlike purchasing an imperial title: once you get it, the rest is the gravy train. This deferment of struggle or potential failure can translate into suspension in adolescence, with the possibility of its becoming a permanent state. Indeed, the most taxing part of a career is for the student to win acceptance into an Ivy League school. Lauren A. Rivera writes in her provocative book, Pedigree: How Elite Students Get Elite Jobs (2015), that it is all about connections. This isn’t missed by aspiring millennials, who are not anxious to become adults.

University of Pennsylvania sociology professor emeritus Frank F. Furstenberg, Jr. has determined that the twenty-something college crowd are gifted loafers holding maturity in abeyance. The common age for leaving the nest (home) was once 18; it is now 27 and creeping toward 30 for millennials.

College costs are punitive, yet considered necessary in the present cultural climate, in which everyone has to be a college graduate. College is believed to be the only ticket to a better life, while poorly paying jobs with little or no benefits are plentiful (e.g., working in a restaurant, bar, supermarket, or department store, or being a security guard or night watchman). Consequently, college, like health care insurance, is considered a necessary evil and a begrudging investment in a somewhat better life. The only problem with this rationale is that mom and pop are expected, gratis, to serve as motel keepers, restaurateurs, food and laundry services, and primary caregivers for their aging children.

The college crowd plans to marry later, if at all, and to have children even later, while living at home with their girlfriends or wives as long as they can, contributing little if anything to the cost of their care or stay.

Furstenberg’s team at the University of Pennsylvania’s Network on Transition to Adulthood identified five benchmarks of aspiring young college students:

· Completing their education;

· Getting married;

· Having a child;

· Becoming financially independent;

· Leaving home.

The problem with these benchmarks is that they have little to do with growing up and assuming the role of adults. They are all about “me!”

Anyone can have a child, obviating the other accomplishments, while a disproportionate number of this twenty-something crowd (in this study) were still living at home when this list was compiled. Indeed, if they do have a child, given their circumstances, they would undoubtedly expect their parents to act as permanent babysitters, as they could not afford to work and pay for such services.

Therefore, it is not uncommon for this crowd to be pushing thirty, still living at home, still dependent on their parents for their total emotional well-being, while living – if they have a job – with insouciant glee, not saving any money (after all, you’re only young once!), but still, as Nowhere Man, incredibly optimistic, ensconced in Nowhere Land, confident that Paradise is just over the horizon, and with it, winning the lottery.

Adulthood, as Philadelphia Inquirer columnist Karen Heller described in a 2014 feature piece, is much less about being self-centered and much more about demonstrating an inclination to help others, even financially; giving freely of one’s time to charitable causes along with financial support; picking up the check in restaurants; purchasing one’s own staples for work; and not expecting anyone else to pay for one’s furniture, bedding, and linens, household supplies, food, utilities, car insurance, car maintenance, gasoline, or any other incidentals. These are precisely the items the twenty-something crowd dodges with deftness.

Unfortunately, the twenty-something crowd, if forced to live at home, is not apt to be good at keeping appointments or being on time for them, at returning calls outside their narrow band of self-interest, or at sending thank-you notes to relatives who send them gifts.

For someone born during the Great Depression, performing Heller’s list does not signify adulthood but survival. That generation expected no kudos or accolades for being civil and responsible; with the little they had, they depended on each other.

It is no accident that this was the perfect generation to create the dynamics of the magnificent war machine that the United States became in WWII. David Halberstam writes in The Next Century (1991):

With our great assembly lines and our ever-expanding industrial core (and protected as we were by two great oceans in an age when weaponry could not yet cross an ocean), we became the industrial arsenal for the mightiest war efforts. In 1942 and 1943 (after the United States declared war on Japan and Germany after the Japanese bombing of the U.S. Navy and Army facilities at Pearl Harbor in Honolulu, Hawaii on December 7, 1941), America alone produced almost twice as many airplanes as the entire Axis. In 1943 and 1944, we were producing one ship a day and an airplane every five minutes.

· Adults know how to say “no” to social pressure and “yes” to a common cause. The Great Depression generation could say “no” to its children; paradoxically, its children, the boomer generation, could not as parents say “no” to theirs.


· Adults know how to behave as adults in public such as at social parties, church and school functions, athletic contests, and in other public places.

· Adults listen to their moral center and are guided by their moral compass. They avoid going into debt or living beyond their means to impress others. They steer clear of people who irritate them or find them irritating. They don’t expect to be waited on but to share the load when appropriate.

· Adults are not prisoners of smartphones, FaceTime, or other social media as are those who treat cyberspace as reality.

· Adults avoid Nowhere Man status by steering clear of a utopian existence and living responsible lives in the present.


There is little evidence that Ms. Heller’s benchmarks carry much weight with millennials or this twenty-something college crowd. Nor do their parents display the temerity to kick them out the door to be on their own and find their way.


THE ENTREPRENEUR IN THE DYING SENSATE CULTURE

To give you a sense of the dichotomy between yesterday, when reality dictated behavior, and today, consider the case of Ronald Gerald Wayne (born 1934), one of the original owners of Apple Computer along with Stephen Wozniak and Steve Jobs.

Wayne was an electronics industry worker and something of a father figure to Wozniak and Jobs. He was the pragmatic voice providing administrative oversight for the new venture.

Wayne, in an interview with BBC television (April 1, 2016), explained that it was his idea to get Wozniak to go into business with Jobs. "Jobs thought that I was somewhat more diplomatic than he was," Wayne recalled.

"Jobs was anxious to get this thing into production. But Wozniak, being the whimsical character that he was, everything he did was for the pure fun of it.” Wayne laughs, “Woz had no concept of business or the rules of the game.”

It took 45 minutes of persuasion for agreement to be reached, with Wayne even typing the agreement out on his IBM electric typewriter. So, on April 1, 1976, the three signed a contract as owners of Apple Computer (now Apple, Inc.): 45 percent to Wozniak, 45 percent to Jobs, and 10 percent to Wayne, part of his role being to act as umpire should the partners have a falling out.

But Wayne was uneasy from the start; the embryonic company’s first deal was a contract to supply 50 machines to a local computer chain, which required Apple to borrow $1,500 in working capital to pay for the parts needed to build the computers.

“If something went wrong with the deal,” Wayne asserts, “the three of us would be responsible for the debts.

"Jobs and Wozniak didn't have two nickels to rub together. I had a house, and a bank account, and a car… I was reachable!"

Rather than run the risk, Wayne sold his stake 12 days later. His other contribution was designing Apple’s first logo. He later sold the original partnership contract to an autograph seeker for $500. It sold at auction in 2011 for $1.6 million.

In defense of what he did, he told his BBC interviewer, "If money was the only thing that I wanted, there are many ways I could've done that, but it was much more important to do what appealed to me.

"My advice to young people is always this - find something you enjoy doing so much that you'd be willing to do it for nothing… and you'll never work a day in your life."

So, Wayne sold his interest in Apple Computer for $800 and later accepted $1,500 to forfeit any claims against Apple. Today, that 10 percent interest would be worth $60 billion.
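The $60 billion figure is the naive product of Wayne’s original 10 percent share and Apple’s market capitalization, which was roughly $600 billion around April 2016, when this piece was first published. That capitalization is my assumption, not the author’s, and the calculation ignores four decades of share dilution, so it overstates what the stake would actually be worth. Still, the arithmetic of the contrast is stark:

```python
# Back-of-envelope check of the article's figures.
# Assumption (not from the article): Apple's market capitalization
# of roughly $600 billion, its approximate value in April 2016.
stake = 0.10                  # Wayne's original 10 percent share
assumed_market_cap = 600e9    # assumed market cap, in US dollars
proceeds = 800 + 1_500        # $800 for the stake, $1,500 to waive claims

naive_value = stake * assumed_market_cap
print(f"Naive value of the stake: ${naive_value / 1e9:.0f} billion")  # $60 billion
print(f"What Wayne received:      ${proceeds:,}")                     # $2,300
```

The point of the sketch is only that the article’s round number follows directly from the 10 percent share, not that Wayne literally forfeited that sum.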

*   *   *

A reader born after WWII, as was the case with Steve Jobs (b. 1955), Steve Wozniak (b. 1950), and Bill Gates (b. 1955), might have trouble understanding the actions of Ronald Wayne, born at the height of the Great Depression (1934).

Wayne would never think of doing what Jobs and Gates did – drop out of college and start a business in a garage, becoming billionaires in the process, Gates eventually the richest man in the world. They dropped out of school in the 1970s, bored with the process, and got away from the crowd.

Jobs and Gates pursued the laboratory of their minds, which came to be associated with entrepreneurship, something that had been around for a few hundred years, but they gave the process style.

Earlier, Nobel Laureates in Literature Ernest Hemingway and William Faulkner – both indifferent students, Faulkner a high school dropout and Hemingway forgoing college – did the same thing.

Hemingway, as an 18-year-old, became an ambulance driver for the American Red Cross during WWI on the Italian front. He used that experience as an entrée to a series of internationally successful novels and short stories, creating the mystique of a man of boundless bravado and courage.

Faulkner engaged in a footloose life, basking in the counterfeit glory of a WWI veteran. He had gone on a bizarre sojourn to Canada during that war and came home to Oxford, Mississippi, creating the mystique of a wounded warrior, purchasing a Canadian RAF uniform and posing for photographs with the rank of lieutenant, a rank he never attained. He even feigned a limp to add credence to his fabrication.

In 1919, Faulkner enrolled at the University of Mississippi in Oxford under a special provision for war veterans, even though he had never graduated from high school. He lasted a little over a semester.

F. Scott Fitzgerald, partying too much, flunked out of Princeton but managed to finagle a U.S. Army First Lieutenant commission in WWI, then spent the duration of the war in the South wooing the debutante southern belle Zelda Sayre, whose father was a Justice of the Supreme Court of Alabama.

Academia was not a desirable creative climate for either group: the “Lost Generation” of the 1920s – Hemingway, Faulkner, and Fitzgerald – or the “Boomer Generation” of the 1970s – Jobs, Wozniak, and Gates. The irony is that most college students today are required to read these authors as part of their curriculum, while Apple’s smartphone and laptop, like Microsoft’s software, are necessary accessories for completing their academic studies successfully.

Strangely, this shows the prescience of Sorokin's envisioned Ideational Culture: on the way out of Nowhere Man and Nowhere Land, implacable institutions and industries will of necessity be abandoned for the creative tomorrow.

In a high-risk society, far different from that of the Great Depression, a disproportionate segment of the social order turns risk behavior inside out and upside down to live on the edge. It is the addict's retreat from reality into Nowhere Land while walking on acres of diamonds. The addiction may be alcoholism, drugs, gambling, promiscuity, pornography, or any of several other obsessive-compulsive behaviors, behaviors Sorokin claims are the last grudging signs of the dying Sensate Culture.

The failure of most people to grow up and act like adults is the elephant in the room.

Descriptive psychology, psychiatry, and medicine label these behaviors "diseases" when they are more likely preferred lifestyles that emanate from boredom and the failure to find a cause worth acting on responsibly. In an excuse-driven society, a safety net is expected to materialize out of thin air to cushion the inevitable fall from grace, whenever it occurs.

We are earthly fumblers who stumble forward guided by the empiricism of science without a theological counterbalance. Roy Porter (1946 – 2002) calls this situation a metaphorical chimera in his Flesh in the Age of Reason: The Modern Foundations of Body and Soul (2003).

Porter died shortly after completing this book, his magnum opus, at the age of 55. In it he explores ideas about health and disease, the soul and what awaits us after death, the decline of religion, and the relationship of the mind to the body in which it resides.

He romps from early Greek medicine through the Glorious Revolution of 1688 to the rise of Romanticism in the 1800s. Along the way, he gets inside the unwritten ideas implied in autobiographies and novels with the express purpose of showing the relationship between the body and consciousness, and why we have such a problem with it today.

The turbulent world of modernity and postmodernity has given birth to the justification of unconscious self-gratification, the body carried forward with the mind on automatic pilot for company along the way.

A LOOK BACK TO SEE AHEAD

Modernity can trace its intellectual origins to the Greek sophist philosopher Protagoras (c. 490 – c. 420 BC), who remarked, "Man is the measure of all things," which is to say not answerable to a higher authority.

More directly, modernity begins with René Descartes (1596 – 1650), whose famous statement, "I think, therefore I am," is the fountainhead of the modernist claim that the self on its own can detect the truth.

Descartes’ ideas led to the notion that reality is verified by humanity alone, while his twofold conception of human nature (mind and matter) was not just mysterious – "the subtle knot that makes us man," as John Donne wrote – but impossible. Descartes insisted that the part of man that thinks is immaterial (spiritual), while the material part that acts (the body) is a mere machine operated by the mind from a separate dimension.

Human autonomy was further advanced by the German philosopher Immanuel Kant (1724 – 1804). Like Descartes, he argued that the individual can discern the truth. Kant made a valiant attempt to give religion and morality a rational foundation in his skeptical age.

Kant thought that if individuals were freed from the control of church and state, each person would discover the same moral, spiritual, and scientific truths, and therefore be the final arbiter of truth. This Enlightenment philosophy encouraged the notion that human feelings and emotions carry a moral and spiritual authority equal to, if not greater than, reason. Consequently, the individual, not God or society, became the primary interpreter of human experience, and the rationale for world-weariness.

We will meet these and other men of ideas as this story unfolds, among them Isaac Newton (1642 – 1727), David Hume (1711 – 1776), Jean-Jacques Rousseau (1712 – 1778), Friedrich Nietzsche (1844 – 1900), and Ludwig Wittgenstein (1889 – 1951). Their ideas have been at the forefront of Western thought to the present age, contributing to a persuasive argument for the divided self, not the holistic man of traditional Eastern philosophy. We find the divided self of:

· Descartes, between behaving (material world) and believing (spiritual world);

· Newton, between thinking and feeling, seeing man as essentially a well-tuned mechanical clock;

· Hume, attacking reason’s certainties with reason itself (sensing is believing, and what is not sensed is nonsense);

· Rousseau, ejecting God from government;

· Nietzsche, expelling God from human consciousness;

· Wittgenstein, dividing man between subjective and objective reference; and

· Kant, claiming through the categorical imperative that the human mind is wired to perceive what is truthful and best for all people.

A spate of books in the last century gives evidence of how preoccupied we are with ourselves. This is a representative, though far from exhaustive, listing of the genre:

· Roy F. Baumeister, Escaping the Self, 1991

· Nathaniel Branden, The Disowned Self, 1972

· Robert H. Schuller, Self-Love, 1980

· Mihaly Csikszentmihalyi, The Evolving Self, 1993

· Charles M. Fair, The Dying Self, 1970

· Kenneth J. Gergen, The Saturated Self, 1991

· Arno Gruen, The Betrayal of Self, 1986

· R. D. Laing, The Divided Self, 1965

· Arnold H. Modell, The Private Self, 1993

· Avodah K. Offit, The Sexual Self, 1977

And then there is Antonio Damasio’s Self Comes to Mind: Constructing the Conscious Brain (2010). Damasio previously took Descartes to task in Descartes’ Error (1994). He writes:

(My problem is) the idea that mind and brain are related, but only in the sense that the mind is the software program run in a piece of computer hardware called the brain, or that the brain and body are related, but only in the sense that the former cannot survive without the life support of the latter.

What, then, was Descartes’ error? One might begin with a complaint, and reproach him for having persuaded biologists to adopt, to this day, clockwork mechanics as a model for life processes. . . And since we know that Descartes imagined thinking as an activity quite separate from the body, it does celebrate the separation of mind, the “thinking thing” from the non-thinking body, which has extension and mechanical parts.

The divided self that Descartes and others imagined has created a “cut & control” fracture between the mind and body.

Long before the men of ideas came on the scene in the Age of the Enlightenment, man had already declared war on Nature and set himself apart from it as its designated conqueror. Unfortunately, being of Nature himself, this set him on course to be fractured between his thinking and feeling selves.

The “cut & control” philosophy of Western man is mediated by ideas that have changed the natural world. Nothing has been more powerful or more emphatic. The soaring trajectory of self-belief that first emanated out of the Renaissance is now spiraling downward into obsessive self-contempt with no one calculating the costs along the way.

Over the past half-millennium, Western man has read with pride his story of individual freedom and economic security, with truth discovered in human reason and scientific inquiry alone. In dethroning God as the ultimate authority, he failed to realize that God is a symbolic connection, impossible to explain or justify, but an essential one.

Once man was left with no challenges to his earthly supremacy and no conscious spiritual connection to anything beyond his material world, he stepped beyond the limits of the spiritual connection that had given him identity for eons and unwittingly entered Nowhere Land. Now, as arbiter of nature and all things, he finds he is lost to himself.

Perhaps that is not quite fair, as man with his scathing self-assurance has produced incredible wealth, technological innovation, philosophical insight, and political freedom. So why is he not happy and civil, cooperative and engaged? Why is he excessively self-critical, paranoid, cruel, and avaricious?

Historian Jacques Barzun asks and answers this question in From Dawn to Decadence: 500 Years of Western Cultural Life (2000), seeing the present not as a culmination but a decline. He is not, however, a prophet of doom; he shows how, out of decadence, creativity is about to burst forth into a creative tomorrow. Here he shares Sorokin’s confidence that the Ideational Culture of the Creative Tomorrow will replace the dying Sensate Culture of today.

Westerners rushed through the 20th century seemingly too busy to learn anything, enduring horrific wars and economic setbacks in the process, yet benefiting from these largely self-imposed struggles to discover a resilience they might not otherwise have known.

Part of this insane restlessness seems endemic to Western man: blundering forward, falling, picking himself up, and stumbling on while denying his false steps, filing them away in the closet of his mind.

This leaves him with a chronic inclination to repeat the same errors. At the same time, it allows him to press constantly forward against limits, constraints he refuses to acknowledge, much less address. So now we find him, in the second decade of the 21st century, trapped in a cultural cul-de-sac of his own making in Nowhere Land.

It doesn’t stop there. Western utopianism is catching. The economic imperialism of the West is now the goal of Russia as it builds pipelines across Siberia to deliver oil to Europe and Asia, while China and India attempt to “out-west the West” by exploiting nature to the limits. In Africa, where colonizers once exploited the natives, Africans now exploit their own people in the same manner. The United States may be declining as the lone superpower, but its waste-making model is being robustly replicated across the globe.

*    *   *
This has all taken a considerable toll on Americans as a people. Author David Awbrey writes in Finding Hope in the Age of Melancholy (1999) of the hidden costs of the optimistic smugness and swagger of America’s post-WWII generations:

“The best intentions sometimes produced disastrous results. The more Americans manipulated nature and their own psyches, the more anxiety they stirred. The closer people got to having it all, the unhappier they became. For all the nation’s military power, financial wealth, and technological complexity, the American Century became the Age of Melancholy.”

Awbrey, like millions of his generation, worked his way to the top – in his case a journalistic career – only to discover that material success did not translate into immaterial (spiritual) satisfaction, sending him into a tailspin of paralyzing depression. The “cut & control” remedies of psychotherapy and therapeutic drugs provided no palliative relief for his anxiety and did little to moderate his melancholy. Small wonder that depression is the most commonly reported complaint of the affluent.

As of this writing, we are in the midst of the most acrimonious political campaign in my lifetime for the Democratic and Republican presidential nominations in the United States.

This scrum is past bizarre, even insane, as there appear to be no adults in the process. This extends beyond the contenders to the media that aspire to cover every waking move of these candidates. It has become reality television in the age of the absurd, with the lowest common denominator of civility on display: the uncouth and the ill-mannered, candidates and supporters in equal measure. The popular notion that brains have replaced brawn in the workplace is not evident in this arena.

In the age of political acrimony, economic insecurity, artistic anarchy, intellectual confusion, religious intolerance, fear of the future, and a debilitating sense of lost permanence, is it any wonder people have lost their psychological and spiritual moorings?

People feel victimized by forces beyond their control or comprehension. Personal relationships are unstable, morals are fluid, and the social foundation is shaky. Even war no longer makes sense, as the enemy lurks in the shadows and could be anyone. Warring factions destroy whole cities, displacing millions of citizens and killing tens of thousands more, yet there are no clear winners or losers.

Terrorism is a gambler’s war. We know gamblers are obsessed with losing because winning would deny the legitimacy of their self-contempt; the same goes for terrorism and war.

In Awbrey’s melancholy, Western man cries, “I have everything and nothing at all!” The Mid-Eastern terrorist responds, “I have nothing to lose and everything to gain!” The irony is that the two are aching with the same problem.

Financial security and personal stability have not alleviated the emotional misery of Western man, nor will killing the ghost most feared, the terrorist, end the nightmare. The dance of death finds these two unconscious partners on center stage mirroring each other with identical fears, both having found refuge in Nowhere Land as Nowhere Man.

We wonder how people who have everything could engage in fraud; how people with more power than they can handle could become involved in corruption; how people with no taste for food could become gluttonous; how people with waning libidos could engage in shameful debauchery; how people who have everything could still obsessively envy people who have little.

Wonder no more. This behavior is evidence of the death rattle of the dying Sensate Culture of our magnificent 600-year-day as we transition through Nowhere Land as Nowhere Man to the Ideational Culture of our creative tomorrow.

Ideas have been a bane as well as a blessing. They have changed the world and changed us in the process. We have allowed ideas to take us wherever they might, without much thought as to where that might be, or what impact these ideas might have on us and others. We take pride in being a conscious, thinking animal, which, if our actions in this transition are any indication, is an oxymoron.

NEXT: Is Nowhere Man “Soulless”? – Seven
