Wednesday, July 21, 2010

BUSINESS DEVELOPMENT OUT OF SYNC WITH THE TIMES

James R. Fisher, Jr., Ph.D.
© July 22, 2010

REFERENCE:

This is written in response to a “call for papers” from The Journal of Technology and Investment. Over the past forty years, working and living on four continents as a chemist, chemical sales engineer, and corporate executive, I have witnessed and participated in the transformation of the workforce from blue collar to professional, but with lagging institutional support. My published works have drawn attention to this lag, for economics behaves badly when the psychology and cultural programming of corporate systems fall out of sync with change.

* * *


CORPORATE MYOPIA

We recently saw Goldman Sachs of Wall Street fined $550 million for suspect investment practices. Goldman Sachs took the slap on the wrist, admitting no culpability. We have also seen Steve Jobs of Apple, Inc. react to criticism of the latest iteration of the iPhone: Jobs went public with a menu of options for customers, including a complete refund of the purchase. Goldman Sachs will pay the fine and go back to business as usual, which is how the crisis developed in the first place.

Goldman Sachs is a traditionally managed corporation structured to function with a hierarchical chain of command and the appearance of infallible authority firmly entrenched in the status quo. It is a closed system with well-defined roles and relationships, policies and procedures. The climate is constructed to support business-as-usual practices, where loyalty is construed to mean obedience.

Apple is open and fluid in roles and relationships enabling it to be responsive to changing demands in a timely fashion. Workers are self-regulated in a creative environment with the free exchange of ideas.

These complementary opposites in business development, both autopoietic and homeostatic, compose the greater whole of enterprise as the past confronts the future:

(1) Hindsight thinking (Goldman Sachs), foresight thinking (Apple)
(2) Rigid collective conformity (Goldman Sachs), individualism (Apple)
(3) Infallible (Goldman Sachs), fallible (Apple)
(4) Rules (Goldman Sachs), process (Apple)
(5) General (Goldman Sachs), local (Apple)
(6) Means emphasis (Goldman Sachs), ends orientation (Apple)
(7) Heavy emphasis on analysis (Goldman Sachs), synthesis (Apple)

Most corporations, including Goldman Sachs, although replete with professionals, have never left the “Golden Age” of American capitalism. This age extended from the end of World War II through the 1970s, when college dropouts such as Jobs and Bill Gates made their presence known. Robert Reich in “Supercapitalism” (2007) sees these innovators as the wrecking ball of the common culture.1 What he marks as the collapse of the core values of institutional society, such as the notion of the “common good,” was in fact a shift to self-interest and “personhood.”2

The infallible, closed-minded, status quo authority-oriented corporation is losing its traction. Everyone seemingly wants to climb the nonexistent managerial pyramid. The times are replete with such contradictions. The working middle class is disappearing while the wealth gap has grown to its widest since 1929. Meanwhile, an army of college dropout innovators has created a new electronic economy that marches to a different drummer. Corporate society over the last quarter century has lost not only its steam but its way.


* * *

Apple and firms like it have left the institutional model. There is no special ethics for professionals; they hold no special moral status. For self-creative individuals, obedience to the moral law comes through taking responsibility and is motivated by respect for the moral law. Obedience to civil law comes through assigned duties and is motivated by fear of punishment.

A generation ago, Jobs stepped down and back to the traditional institutional model by elevating John Sculley of Pepsi fame to run Apple. The corporation nearly went under. Jobs came back in 1997, restoring the open innovative climate and business stability. Easily missed in this transition was a clash of cultures between institutional hindsight and progressive foresight thinking. Not only was institutional infallibility stultifying for young Turks such as Jobs and Gates, it was not organized to respond to accelerating external demands.

In the mid-1970s, Jobs was designing video games for Atari, while his engineering buddy Stephen Wozniak was working on pocket calculators for Hewlett-Packard. Wozniak designed a personal computer but couldn’t get H-P to produce or market it. Jobs was touring a Xerox facility in 1979 when, suddenly, his world changed.

The Xerox computer’s graphics screen, overlapping popup windows, icons, fonts, and mouse immediately intrigued him. He found himself leaping and jumping around, yelling, “Why aren’t you doing something with this?” He was saying to himself, “If you don’t, I will,” and of course he did.

The Xerox engineers who designed the computer didn’t have the will, the way, or the persuasive skills to convince Xerox management that it was sitting on gold.3

* * *

In 1980, IBM was asleep at the wheel when hit by the personal computer surge. It eventually lost more than $70 billion in stock valuation and eliminated more than 200,000 jobs. In a panic, it rushed into the PC business by cobbling together existing components and technology instead of mounting a complete engineering effort, despite having the most sophisticated facilities in the industry. It looked for an operating system and settled on Digital Research (DR), whose CP/M was then the market-leading operating system for PCs.

IBM could not come to an agreement with the firm’s engineering head, who was out flying his plane. His wife, Dorothy McEwen, who handled the business end, met with IBM and rejected its offer as too lopsided in IBM’s favor.

Waiting in the wings was a fledgling company, Microsoft, which had more chutzpah than cutting-edge technology. It could not say “yes” fast enough. It looked at the contract as a vehicle that would let it sell its real products, its programming languages. So, late in 1980, IBM signed the agreement that would turn Bill Gates and his partners Paul Allen and Steve Ballmer into multibillionaires.

To satisfy IBM, Microsoft had to do something posthaste. It bought the rights to what was then called the “Quick and Dirty Operating System” (QDOS) from the small firm, Seattle Computer (SC). Microsoft first paid $25,000 for non-exclusive rights. SC had no idea how valuable DOS would become.

Microsoft then quickly paid an additional $50,000 for exclusive rights. In 1986, six years later, Microsoft paid SC nearly $1 million to settle a dispute over the rights to DOS. Microsoft was now home free and on its way.4

In all this excitement, missed was the fact that IBM, the quintessential infallible institution with a bible of prudent industrial practices, was yesterday’s story because it was molded in an early twentieth century mindset.5

* * *

We have watched the automotive industry spin out of control, with General Motors going bankrupt and having to be bailed out by the federal government. It was clear from the number of appearances GM’s CEO Rick Wagoner made before Congress that GM was paralyzed, unable to act. GM’s rise to power and decline toward insolvency parallels the rise and fall of infallible institutional authority.

An indicator of the slow-motion death of big institutions such as GM is that decades pass between the first authoritative portents of erosion and the end. In GM’s case, Elmer Johnson, then a candidate for GM’s CEO position, wrote this memo to his peers on January 21, 1988:

“We have vastly underestimated how deeply ingrained are the organizational and cultural rigidities that hamper our ability to execute.” 6

The mindsets are passed down year after year, intact, given free rein to accumulate. Even a drop in market share from 54 percent to 19 percent failed to trigger remedial action.


THE NEW REALITY

Obedience to authority is blind to operational consequences and indifferent both to stated goals and to the damage inflicted on clients. In the new reality, pragmatic foresight has no hierarchy; autonomy comes packaged with responsibility for outcomes. Strict rules, on the other hand, are essential to maintaining a hierarchy preoccupied with means.

Since nearly all corporations preserve the command configuration that defines them, autonomous individualism becomes a center of heresy. The choice is a narrow one: submit and be crushed, or find your way to another disposition.

Jobs and Gates did so by forming their own companies. Individuals who stay on are in the system but not of it. They cannot change the culture. The core values of operative life lie deeper than verbal descriptions and are organized into belief systems that never have to be stated. Conformity is an exacting pressure.

Individuals, confounded by this programming yet defiant of it, retreat into passive behaviors.7 This is especially true of professionals, who believe themselves trained to lead but are forced to fit. Nothing cripples an organization more than its corporate sins.8

Unfortunately, when corporations confront out-of-scope disturbances, they often confuse what has become a poison with medicine. Several iterative steps over the last eighty years have come home to roost.

In 1927, Elton Mayo conducted an industrial engineering study at the Hawthorne Works of the Western Electric Company outside Chicago. Workers responded to the attention, launching the humanistic movement. Human resource management followed.

It seems clear now that the tea leaves were misread. In 1929, the Great Depression hit, throwing the world into economic chaos. The Robber Barons of the nineteenth century had clashed with workers and felt forced to hire strikebreakers, which sullied their reputations. To distance themselves from this daily confrontation, they created a new mechanism called “management.” Managers rose out of the dark days of the Depression, when unemployment reached twenty-five percent of the labor force.

Desperate to get people back to work, the Roosevelt Administration launched a number of public works projects. These only met with modest success. World War II changed the dynamic. The civilian population was mobilized into industrial production in support of the military. This succeeded in lifting the economy out of the depression where government intervention had failed. David Halberstam captured this remarkable performance in “The Next Century” (1991):

“With our great assembly lines and our ever-expanding industrial core (and protected as we were by two great oceans in an age when weaponry could not yet cross an ocean), we became the industrial arsenal for the mightiest of war efforts. In 1942 and 1943 America alone produced almost twice as many airplanes as the entire Axis. In 1943 and 1944 we were producing one ship a day and an airplane every five minutes.”9

It was American management’s greatest hour. Management as much as any other single factor proved instrumental in winning World War II. Typically, there were only three levels of management. Workers often enjoyed some independence and control of their work: setting up machines and workstations, purchasing materials, planning schedules, and even charting their own productivity.

This all disappeared following the boom years of the 1950s.10 The union movement gained steam under such leaders as Walter Reuther of the United Auto Workers. The UAW pressed for wage and benefit concessions at the expense of control of work. Management, which was in its ascendancy, was willing to concede massive entitlements.

Management believed this would lead to greater loyalty, productivity, and labor peace. It did just the opposite. The concessions the UAW won from the automakers could not stem the growing disconnect between workers and work.

In a strange way, the automotive industry and the UAW became mirror images of each other in corporate style and greed. The problem with entitlements was that workers felt entitled to them irrespective of the level of productivity.

John Strohmeyer captured the essence of this disconnect in “Crisis in Bethlehem” (1986). The steel industry created a “furlough program” in the 1960s. The senior half of the workforce was given an additional 13-week paid leave every five years. Steel workers already had about every benefit and financial concession imaginable. The idea was to manage manpower requirements more effectively, provide the workforce with the incentive to pursue self-enhancing interests, and ultimately, to improve productivity.

Most workers got a second job. The program was a colossal failure. Strohmeyer muses, “The steel industry’s excesses crippled the goose that laid the golden egg.”11

THREE DOMINANT WORKPLACE CULTURES

Over the last forty years, the systematic process of making workers management-dependent and increasingly counterdependent on the corporation has resulted in learned helplessness and passive behaviors.

Many workers have become suspended in terminal adolescence, immature and reactive, with wants treated as needs, seeing themselves as victims of circumstance.

Such programming cannot be changed overnight. Cultural change is brought about not by war or plague but by using new artifacts in place of the old. Cultural change proceeds as corporations assimilate new artifacts; the old ways are not displaced but wither away from disuse.

In the present climate, it is best to allow the old culture to die out. There is no point in trying to change the unchangeable; better to engineer around the immovable constraints. Once the signature constraints are identified and the ironies accepted, a bypass highway can be built.

Workers are sensitive to the power structure of the organization and to their own personal goals and welfare. Company policies are followed on the presumption that they are infallible and will solve problems. Too often the policies describe a system whose practices create the very problems it then fails to solve.

We have seen this demonstrated with the Culture of Comfort. A downward spiral forms in which the presumed solution makes matters worse, prompting a redoubling of effort that drives operations into the Culture of Complacency rather than the Culture of Contribution.12

Complicating the picture further, management and organizational dominance was tolerable to workers for decades, when position power and knowledge power were assumed to be the province of management. Blue collar workers were understood to be poorly educated, and as Frederick Winslow Taylor, author of “The Principles of Scientific Management” (1911), once put it:

“Now one of the very first requirements for a man who is fit to handle pig iron as a regular occupation is that he shall be so stupid and so phlegmatic that he more nearly resembles in his mental make-up the ox than any other type.”13

Granted, it is a cognitive bias, and Taylor is not alone in having his. Such biases displace information from its axis of truth, and they have proven counterproductive.

For one, the workforce in 1950 was ninety percent blue collar and ten percent white collar; by 1960 it was already fifty percent blue collar and fifty percent white collar; in 2010 it is ten percent or less blue collar and ninety percent white or pink collar. The professional class has arrived but is essentially managed, mobilized, motivated, and manipulated as if it were still the 1950s.

It is the expediency of decision making at the level of consequences that is critical to corporate success. Bottom-up communication has functionally displaced top-down operations even if the corporate and military organizational charts stay the same.

Alas, this is reflected in the dominant cultures of the workplace. They range from a Culture of Comfort, in which workers are management-dependent, to a Culture of Complacency, in which workers are counterdependent on the organization for their total well-being, even as a workforce primed with professionals stands ready for the Culture of Contribution.14

Author William Livingston claims this is a cultural predilection to a hindsight orientation at the expense of foresight. He writes:

“The snub of intelligence-based foresight can be seen embedded throughout society. For example, more than 99% of all courses offered in the educational system, top to bottom, are hindsight based. The subject matter taught, art or science, is based upon history, lessons-learned, and past discoveries. Pragmatic foresight is not offered at all. It takes hindsight to get into the university and hindsight is all you take out of it.”15

We have cosmetic change instead of real change because we have failed to understand:

(1) The structure of work determines the function of work;
(2) The function of work creates the workplace culture;
(3) The workplace culture dictates the organizational biases and behaviors;
(4) The organizational biases and behaviors establish the vitality and viability of the organization.

An invisible hand guides the process. You only have to step back and observe to see how effective it is.

THE PARADOXICAL DILEMMA

There has been a nearly imperceptible shift in business development going on since the beginning of the last century, a shift very much in line with a variant of the predictions of Marx and Engels. Workers have not gained control of the means of production, as the two insisted, but of the means of information, which has proven even more dramatic, for this is the Age of Information.

Organizational scholars have for the past half-century focused on management and the corporation giving far less attention to the individual worker and the transformation of the American workforce from predominantly unskilled to predominantly professional.

Douglas McGregor led the charge with “The Human Side of Enterprise” (1960) and his Theory “X” and Theory “Y” styles of management; Blake and Mouton followed with “The Managerial Grid” (1964), casting managers as either task centered or people centered; years later came William Ouchi’s “Theory Z” (1981), applying Japanese management to the American corporate culture; Peters and Waterman followed with “In Search of Excellence” (1982), profiling the best-run companies; and Harold Geneen, in “Managing” (1984), made the audacious claim that a good manager could manage anything.

Strangely missing was the worker in the trenches, a worker who no longer needed management, motivators, or elaborate schemes to put him in the mix, but required only a positive work climate to do creative work in an open system. “Work Without Managers” (1991) was written with this in mind, recognizing the passive behaviors toxic to operations.16

* * *

“In Search of Excellence” proved the opposite of its own hypothesis. Business Week illustrated this in its cover story “Who’s Excellent Now?” (November 5, 1984): two years after the book’s publication, many of the companies profiled were struggling. As for Geneen’s declaration that a good manager can manage anything, John Sculley of Apple proved the exception to this idea.

Fast forward to today. Commentators on business, management, and leadership still write books only obliquely directed at the new professional. Most keep management at the center of the story. The exceptions are such books as “Trust & Betrayal in the Workplace” by Dennis Reina, “Succeed on Your Own Terms” by Patrick Sweeney, “The Versatile Leader” by Robert Kaplan, and “BeliefWorks” by Ray Dodd.

When it comes to personhood, such books as “Moral, Believing Animals: Human Personhood and Culture” by Christian Smith and “A Whole New Mind” by Daniel Pink get it about right, especially Pink.

Pink has the professional clearly in his sights. He recognizes that exclusive left-brain vertical thinking with deductive reasoning is not sufficient to transition from the Information Age to the Conceptual Age, which is just ahead.

The paradoxical dilemma is that perhaps only twenty percent of the professional workforce works in a climate and workplace culture conducive to creative thought. It is the reason for this discussion and other efforts to that end.17

* * *

1 Robert B. Reich, Supercapitalism: The Transformation of Business, Democracy, and Everyday Life, Knopf, 2007.
2 James R. Fisher, Jr., Work Without Managers: A View from the Trenches, The Delta Group Florida, 1991, pp. 33-36.
3 Randall E. Stross, Steve Jobs and the NeXT Big Thing, Atheneum, 1994; David Sheff, Game Over, Random House, 1994; Charles H. Ferguson and Charles R. Morris, Computer Wars, Time Books, 1994.
4 Stephen Manes and Paul Andrews, Gates: How Microsoft’s Mogul Reinvented an Industry and Made Himself the Richest Man in the World, Touchstone Books, 1994.
5 Paul Carroll, Big Blues: The Unmasking of IBM, Crown, 1994.
6 William L. Livingston IV, Design for Prevention, FES Publishing, Ltd., 2010, p. 89.
7 James R. Fisher, Jr., Six Silent Killers: Management’s Greatest Challenge, CRC Press, 1998.
8 James R. Fisher, Jr., Corporate Sin: Leaderless Leadership and Dissonant Workers, AuthorHouse, 2000.
9 David Halberstam, The Next Century, William Morrow & Company, 1991, p. 59.
10 Management swelled to as many as twelve levels. Executive compensation rose from five to ten times the typical worker’s pay in 1945 to sixty-six times the typical GM worker’s pay for the CEO of General Motors in 1968. Today this would be considered modest.
11 John Strohmeyer, Crisis in Bethlehem, Adler & Adler, 1986.
12 James R. Fisher, Jr., “The Three Dominant Cultures of the Workplace,” National Productivity Review, Spring 1997, pp. 37-48.
13 Frederick Winslow Taylor, The Principles of Scientific Management, 1911; Norton Library edition, W. W. Norton & Company, 1967, p. 59.
14 James R. Fisher, Jr., “How a Culture of Contribution Gives Your Company a Grow-Up Call,” AQP Journal, July/August 1999, pp. 7-11.
15 William L. Livingston IV, Design for Prevention, FES Publishing, Ltd., 2010, p. 14.
16 See "Six Silent Killers."
17 James R. Fisher, Jr., “The Need for Lateral Thinking in the New Century,” National Productivity Review, Spring 2000, pp. 1-12.
