This basic, fundamental human constant is due to the fact that we are all pretty stupid, and no amount of information, learning or technological expertise seems to alter this subtlety one iota. The problem is that we have ready-made, socially condoned, psychologically acceptable explanations for crucial events. Unexplained is the curiosity that things routinely go wrong without evident cause and despite everyone's best efforts.
The trouble really is, of course, with the explanations, which contribute to failure by explaining away not only the inexplicable but the explicable as well. We need the assurance of having answers, so if necessary, we make them up. These myths, in turn, can prevent us from discovering valid answers to our questions. Particularly elusive is the answer to the perpetual human riddle: why are our best efforts not good enough?
Well, first of all, our efforts may not be our best because we are biased toward the particular schema which defines our ability to cope. Not only does this bias inhibit cultural improvement by limiting competence, but the majority of people, with marginal abilities, support those who goof up, feeling that they will then get similar support when their turn comes. Thus, the weak support the corrupt, because just as efficiency is regarded as a threat by the inept, accuracy of perception and analysis is regarded as a threat by the powerful.
If we want to escape this self-constructed impasse, we would do well to make fresh inquiries into our shortcomings and imperfections. Our cultural liabilities are so decisive in the way they undermine our institutions that we are compelled to understand them if we intend to be exceptions to the rule of civilized failures. Thus far, the balance sheet on Western Civilization is more extensive but no more favorable than that of any society that has passed before us. As fast as wealth piles up here, poverty springs up there. Increases in material abundance are matched by increases in bitter resentment as production and success beget scarcity and jealousy. Scientific advances are matched by human failings, construction by decay and happiness by misery. These harmonious equations are maintained by the characteristic errors, ignorance, ill will and general stupidity of civilized people.
Western Civilization owes its technological predominance to the application of reason to the study and control of nature, but a major stumbling block to the study and control of ourselves has been the assumption that, since we can use reason, we are reasonable. We have had 100 years since Freud to acknowledge that we are basically irrational, but the models for human behavior proposed by the methodical scientific community are invariably idealized constructs which are much more self-consistent and orderly than people would ever want to be. The problem for behavioral scientists is that logic must be used to explain irrationality.
Although reason is useful for extending a line of thought to the next point, it is of limited value in untangling complexity. Logic is certainly a useful analytical tool, but the overall physiological condition of an organism, for example, is not particularly rational and cannot be comprehended by anyone limiting his thinking to linear logic. (E.g., there is no logic in balancing hunger and thirst, sleep or sex. These are drives or states by which competing physiological systems cooperate to maintain the dynamic imbalance we recognize as life.) The best that can be done in analyzing such phenomena is to use polygraphs to provide data for statistical models which allow us to predict the probability of normal behavior. In fact, approximation is the best way to represent matters of such complexity.
This basic principle is even more important when one attempts to understand human behavior. Behavior is very much a compromise phenomenon. It may be analyzed logically, but as a functional whole, it is comprehensible only in terms of relationships among interacting systems. Only by accepting a compromise model of the human being in all its inconsistent ineptitude based on misperceptions of the environment can one begin to understand what being human means. Although we gather a lot of information, we also ignore a lot and may even be pointedly ignorant in matters of great importance to us simply because our schema directs us to be ourselves. Likewise, the information people possess may be used inappropriately because certain behavioral patterns are preprogrammed into or excluded from the response repertoire. This is both human and stupid.
As all indications are that there was and now is more than enough stupidity to go around, to the extent that the past is a guide to the future, we should expect stupidity to continue to be our constant companion as history unfolds. Certainly, it has been an integral component of Western Civilization since the beginning. The ancient Greeks indicated their firsthand familiarity with it when they formulated Cassandra's Curse: that those who prophesy the truth will not be believed. There have been numerous examples throughout history of accurate warnings wasted because recipients were not disposed to alter their beliefs simply to accommodate new and better information.
In his last plays, Euripides paired moral evil with folly and asserted that people would have to confront both as part of their being, but we have been very reluctant to do so. The problem seems to be that however brilliant the human mind may be in other ways, it is not geared to compensate for its own deficiencies. The reason for this is that cognitive deficiencies (which take the form of opposition to integrity) are expressions of the social dimension of life. It is this which shapes the schema as an individual becomes a member of a reference group.
The condemnation of idealism is a constant theme coursing through the history of Western stupidity. Socrates was a case study in the stupidity of civil obedience. Christ was crucified for living up to ideals. John Huss was a religious reformer burned at the stake as a heretic in 1415. Giordano Bruno was perhaps a little too philosophical a philosopher to have profited from Huss's experience and so followed his fate in 1600. Not long thereafter, Galileo was forced, under threat of physical torture, to disavow the truth about motion in the solar system. As shameful as all this was, it is embarrassing to note that for all our sophistication and technological expertise, contemporary civilization is as morally retarded and ethically handicapped as any that ever existed. In this sense, there has been no progress throughout history. Worse yet, there is no prospect for any, because apparently no one in the research-oriented educational establishment is even aware of the problem, much less addressing the issue.
Thus, we are still imprisoned in our belief systems. For millennia, Western Civilization was enslaved by its belief in God. After She died in the eighteenth century, there was a period of enlightened rationalism when Europeans sank by their own bootstraps into revolutions and intercontinental wars. Around the turn of the twentieth century, Dr. Freud reinforced the growing recognition that we are basically irrational creatures.
Sometimes not only truth is sacrificed but the general supporting culture as well. This willingness to write off one's extended human environment for the benefit of the self-aggrandizing in-group is most obvious in the mighty.
Whatever catchy image is used to express them, two contrasting trends have coexisted throughout history. One is the tendency of people to accept their fate; the other is the tendency to rebel against it, and the history of civilization is largely a record of the interplay between the two.
Whatever its superficial appeal, the missionary complex is often darkened by a deliberate effort to create fate. Those determined to remake the world in their own image cannot accept the stupidity of the world as it is: they feel compelled to add to it. In 1961, the Kennedy administration suffered a crusading compulsion to guide the Vietnamese away from their own objectives and toward those of American policy. This mission was doomed because we could not perceive the native anti-colonial sentiments as anything but Communist threats to democracy.
When events fail to confirm beliefs, the mental condition of cognitive dissonance exists until some adjustment of or to the incoming data can be made. Usually in the face of challenge, the schema becomes rigid, and data conflicting with it are sacrificed for the sake of emotional and ideological stability. During the Johnson years, the administration was frozen in a dream world completely at odds with clear evidence that official policies were not just ineffective but counter-productive. As would happen again five years later, it remained for the media and the people to save the country from the government, since those loyal to the President had become incapable of making realistic assessments of the effects of their actions on the real world.
It is noteworthy that the Kennedy team liked to refer to themselves as 'Crisis managers'. In a similar vein, before becoming President, Richard Nixon wrote a book which covered his six favorite crises up to that time. One must wonder to what extent our leaders may be disposed to create crises to test themselves, to discover how much control they have, what their limits are and who they are. Too often, rulers give themselves the choice of the disastrous or the incorrigible and then choose both.
Since many of the major, specific problems confronting contemporary civilizations are not cultural universals, they should be (theoretically, at least) solvable. Poverty, racism, sexism, family disorganization, political exploitation, ideological oppression and war are not defining characteristics of the human condition. They are all products of certain circumstances which could be altered. Whether they will be altered or not is the solemn matter we address here. The great human tragedy is that we know which conditions to alter and how to alter them in order to eliminate most of the problems mentioned above. Nevertheless, we usually fail to do so because our leaders keep us from adapting to new conditions while our schemas keep us from compensating for our cultural limitations.
Indeed, it is the mark of a truly wise person to be able to put himself in his own place: to view the world accurately from his own perspective. Making due allowances for one's own values permits accuracy in perception so that behavior may be based on relevant considerations. However, it is most difficult for people to penetrate their religious myths, comprehend their plight and then apply their cognitive skills objectively so as to deal successfully with their problems. In fact, what we really need to overcome ourselves is as little humanity as possible in the scientific process of gathering and analyzing data and as much as possible in the technological process of applying knowledge and understanding. Generally, however, there is no clear distinction between the two processes of gathering and using information. As we interact with our environment, we monitor the results of our behavior; we apply what we know toward the solution of problems and then learn how effective we have been.
Unfortunately, society is better set up to learn what it believes than how ineffective it is. If lessons of life cannot be massaged into conformity with ideology, they will be rejected for the good of the directing schema. It was this very human commitment of cultural priests to cognitive consistency which led such stubborn visionaries as Huss and Bruno to the stake and Galileo to humiliation.
There is nothing so unnerving for established powers as having their assumptions challenged, but challenging assumptions has been the stock in trade of great scientific revolutionaries throughout the ages. Copernicus was the first. In fact, the term 'Revolutionary' is derived from his notion that the earth revolves around the sun. A major step in the development of this insight was his realization that the prevailing astronomical assumption of his day (that the earth was a fixed point around which everything else rotated) was just a subjective view which everyone took for granted. Although this view was fundamentally incorrect, it was part of an ideology which was considered a consistent whole by the religious establishment. An attack on any part was construed as an attack on Christianity in general and was met with determined resistance in the form of extraneous criticisms. Basically, the gist of the counterargument was that the earth had to be the center of the universe because that was where God obviously would have placed the home of important creatures like ourselves.
Although objective observations and rational theories count for little when one attempts to refute the absurdities which sustain the power structure, science can help us understand our universe and our place in it. As a heuristic device, it has been remarkably successful, but as both a schema breaker and maker, its potential was and is invariably affected by the human need for a positive self-image. This may very much affect the selection of research projects and evaluation of gathered data. The ubiquitous and eternal human reluctance to know who we really are is, nevertheless, yielding to those committed to finding out. Naturally, the success of science has often been at the expense of those wishful fantasies which stifled our cognitive development for centuries.
Science dealt human narcissism three devastating blows courtesy of Copernicus, Darwin and Freud. In all three cases, the scientific explanations (of cosmology, biology and psychology) were resented and resisted by all those who favored the more flattering established notions that we were rational beings created especially by God and placed in the center of Her universe. The scientific theories survived despite the fact that they lacked any intrinsic appeal to people in love with themselves.
Scientific theories are appealing only in an icy, intellectual way. Science is really a system of established rules for gathering and analyzing data and is supposedly accepting of conclusions derived by the process regardless of their emotional appeal. In fact, the success of science is due to the institutional establishment of the means of schema formation so that the popularity of a particular interpretation will have minimal impact on the evaluation of experimental results. As the end of science is understanding, not the establishment or perpetuation of any particular idea, it is something of a contradictory institution, being set up to both confirm and refute prevailing theory. In their ideal moments, scientists are totally objective, and they replace bias and prejudice with accuracy and integrity.
Unfortunately, real scientists are all too human, so the institutionalized enterprise of science is too encrusted with stupidity for it to save people from themselves. A classic and tragic example of scientific stupidity was the vacuum of indifference which greeted Gregor Mendel's work on the genetics of pea plants. Scientists of the day simply were not able to appreciate his findings. He would have had greater impact had he been able to generate some controversy. As it was, he simply presented his results, which were roundly ignored by everyone else as irrelevant to what they were doing and thinking, until, thirty-five years later, biologists were doing things and thinking about problems which led them to comprehend the value of his contribution.
Although it may take a generation or two, the scientific community will eventually catch up with its unnatural selection of ideas and correct the markedly unscientific tendency of laboratory priests to adhere to familiar theories. Their typically human reaction to a new revelation is to compare it to the prevailing schema (i.e., theory). However, this is usually a one-way process, with the entrenched explanation being accepted as the defining standard of reference to which data and new hypotheses are expected to conform. Scientists are quite human in their propensity to ignore or reject, for as long as possible, findings inconsistent with the popular theory of the day.
The bottom line is that science is really a religion, with the devoted believers sticking to dogma whether it makes good sense or not. Every difficulty is placed in the path of the heretic who dares challenge a sacred tenet of the faith. Research which might disprove an established theory may not be funded because it would prove to be at best a waste of money and at worst rather disturbing. Experimental results which are at odds with holy expectation are scrutinized very carefully if they cannot be rejected outright. If valid, disquieting results still may not be published by journal editors indoctrinated in revered theory and likely to perceive novel findings only as threats deserving of suppression. If published, original interpretations and hypotheses can always be ignored by practitioners of ye olde-tyme religion.
It is rather tragic to note some of the works which were not even ignored. A case in point was John J. Waterston's paper on the kinetic theory of gases. This was rejected by the Royal Society of London in 1845 as being 'Nothing but nonsense'. It was finally published in 1892 when it no longer posed a threat to the establishment. One can but wonder how many possible advances in scientific thought have been thwarted by professionals imposing their conventional expectations onto new, inventive ideas. For all their training and sanctimonious pronouncements about objectivity, scientists are no more tolerant than other people when their self-evident, hallowed, unassailably correct and righteous views are challenged.
In fact, a Young Turk starting out in science (or any other field for that matter) should keep to himself any good ideas of importance which might threaten to advance his profession or improve his reference group. Specifically, the young scientist is well advised to begin his career by contributing some bricks of knowledge to the wall of ignorance. Initial research proposals should not challenge the major theories of the day. Revolutionary ideas should be put on 'Hold' for a few years until the initiate is clearly a member of the club. Then he will have the prestige needed to get any offbeat ideas he might still entertain accepted for publication. Of course, this is all good advice well wasted on anyone cursed with an ounce of integrity or a passion for understanding.
It will come as no surprise to cynics that the payoff in science is not fallible knowledge but money, with most going to those who publish most. These tend to be ideological conservatives who concoct little research projects which support established theory. Coupled with this tendency toward financial support for the orthodox is an organizational trend toward teamwork in research groups at the expense of the individual go-getter. The scientist is becoming decreasingly an independent thinker and increasingly a fellow worker who fits in and gets along with the team.
Outside the lab, the relationship of science to the community is supposed to be one of mutual support. Scientists are really specialists at converting money into socially acceptable knowledge, so from the standpoint of the scientist, the need for financial support can be a restriction on the questions which may be asked and the answers which are permitted. In the Third Reich, anthropologists produced research which supported policies of racial supremacy. On the other hand, Arthur R. Jensen found that contemporary American culture is generally hostile to his suggestion that there is a genetic basis for the difficulties black children have in academics. Fortunately, our interest here is not in the validity of this or any other study but in the social attitudes which cause controversial findings to be embraced or rejected by a given culture. It is simply irrelevant to evaluate the scientific validity of a theory or research results in terms of the effects they might have on a particular social cause. Still, that is usually how societies judge which research programs will be supported and what results will be accepted.
In a similar way, a basic concept like 'Mental health' has been shaped by two stupid cultural factors. The first of these is confusion as to just what kind of world it is to which the mentally ill are supposed to adjust. The second is the tendency of those who use and define labels to take them and themselves a bit too seriously.
In terms of mental health and illness, the problem confronting all of us is that we are expected to adjust to an idiotic society. This is what makes the goal of most psychotherapy so tautologically self-defeating. As therapy proceeds, the individual is to become more self-accepting and more realistic, which is just fine, if the 'Self' is realistic. However, what is to be done when realism leads one to the overwhelming conclusion that the self is a bundle of contradictory needs and emotions, maniacal and depressing drives, brilliant and stupid ideas? The problem then becomes a matter of accepting this while trying to adjust to a wacky world of contradictory organizations and institutions.
When the problem of adapting to ourselves boils down to the prayer of Alcoholics Anonymous to have the serenity to accept what cannot be changed, the courage to change what can be changed and the wisdom to know the difference, one is practically driven to drink. Even professional staff members in mental hospitals do not know the difference when they attempt to distinguish sane from normal people, or sick from healthy patients or whatever it is they are so subjectively doing. David Rosenhan demonstrated this with a study in which seven 'Pseudopatients' gained admission to mental hospitals by feigning symptoms; once inside, they behaved normally, yet the staff never detected their sanity.
Labels can be used not only to make people look sick to doctors but to cure them as well. This was accomplished by the trustees of the American Psychiatric Association on Dec. 15, 1973, when they voted to remove homosexuality from the psychiatric classification system. In one deft stroke, millions of people formerly labeled as mentally ill were redefined as healthy. It is sad to note that the general medical community has not picked up on this method of legislating health. Cancer is so common that it could be voted a 'Normal condition' and abolished by acclamation.
1973 was a good year for cosmetics, as that was also the year in which 'Mental retardation' in America was redefined: the diagnostic cutoff was moved from one to two standard deviations below the average IQ, and millions of the mildly retarded became officially normal overnight.
Physicists must envy social scientists who can cure the ill by voting and convert the abnormal to acceptability by redefining terms, but physical scientists are actually busy playing their own subjective games with nature. In fact, they have gone overboard to the point of giving up on 'Reality' as a limiting condition in research. Modern physics is built on the principle that anyone's version of reality is so structured by his schema that there are as many realities as there are observers.
In the good old days of Newtonian mechanics, physicists worked in a precise, objective, determined universe which ran along like some grand celestial clock. Quantum mechanics has changed the clock into something even Dali could not have recognized. The universe is now perceived as a grand expression of undetermined microevents from which humans can garner only generalized statistical conclusions.
If anything, modern physicists seem a bit too willing to dismiss reality as simply a field for subjective impressions. There is reason to believe that physicists find what they seek because they create conditions which will produce results supporting their assumptions. If they expect to observe particles, they find particles; if waves, they get waves. Electrons spin as expected, and the axis of spin conforms to the investigator's prediction. Such findings prove little except that the subatomic world is as determinate as it is accommodating to experimenters. There is, thus, an alternative explanation for what physicists assert are undetermined microevents: it may be they are determined by methods of investigation which are too crude to permit objective studies of subatomic phenomena.
It is one of the great ironies of science that the assumption of cause/effect cannot be proven. Events may be correlated, but all a true scientist can assert is that under certain conditions, particular, specified couplings are more or less probable. For example, there is a good correlation between mangled bodies and car wrecks, but which causes which cannot be proven. An unfortunate result of this philosophical limitation is a tendency to disregard the obvious fact and basic tenet of science that events are caused, much to the benefit of many of our stupider myths.
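Because correlation is a symmetric statistic, it carries no causal arrow at all, and a toy computation makes the point. The sketch below, in Python with NumPy, uses fabricated numbers for the car-wreck example; the variable names and data are illustrative assumptions, not anything drawn from the text.

    # Correlation is symmetric: corr(x, y) == corr(y, x), so the coefficient
    # itself says nothing about which variable (if either) causes the other.
    import numpy as np

    rng = np.random.default_rng(0)
    wrecks = rng.poisson(3.0, size=1000)             # fabricated crash counts
    injuries = 2 * wrecks + rng.normal(0, 1, 1000)   # injuries track wrecks

    r_xy = np.corrcoef(wrecks, injuries)[0, 1]
    r_yx = np.corrcoef(injuries, wrecks)[0, 1]
    print(f"corr(wrecks, injuries) = {r_xy:.3f}")
    print(f"corr(injuries, wrecks) = {r_yx:.3f}")    # identical: no causal arrow

The two coefficients are equal by definition, so the statistic alone can never say whether wrecks mangle bodies or mangled bodies wreck cars; any causal claim has to be smuggled in from outside the correlation.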
The classic case is, of course, the controversy over the presumed effects of tobacco on the health of those who use it. The general presumption is that smoking causes cancer, heart disease, strokes, etc. However, spokespeople for the tobacco industry are philosophically if not morally justified in pointing out the possibility that both smoking and ill health may be due to a common cause. Hypertension, for example, might make one prone to disease (by lowering general, systemic resistance) and given to smoking for the release of tension provided by oral gratification.
Of all the myths which thrive in the face of scientific limitations, however, 'Free will' is the most fundamental. Although study after study confirms that human behavior is conditioned by the interactions of the environment and people on each other, the Western belief in freedom cannot be laid to rest. Although every successful experiment in the behavioral sciences theoretically undercuts the notion of freedom, there is no great soul-searching confrontation developing on this issue. Just as God adapted to Charles Darwin, freedom is adapting to B. F. Skinner and his behaviorist colleagues so that our traditional schema may be retained. In this great unacknowledged battle between science and our favorite secular religion, our cultural priests play 'Mindguards', ignoring and interpreting accumulating evidence so as to minimize our awareness and anxiety as to just who and what we are. In this sense, the concept of human freedom is to the contemporary Western world what the Ptolemaic planetary system was to medieval culture: an idea that makes us feel important.
This myth is sustained not only by those who revel in the limitations of statistical analysis but also by the Existential-Humanists. These are behavioral philosophers who sort of play the sad clowns in the circus of psychology. They are very much in love with the illusion of human freedom and feel the behaviorists' assertion that humans respond predictably to combinations of internal and environmental factors robs people of their dignity. They prefer to view people as creative and inherently good beings who are striving to fulfill their potential. According to them, Adolf and Attila the Huns were essentially good people just trying to realize themselves. Collectively, they constitute the 'Aw, shucks' school of psychology, and if there ever was a religious myth masquerading as philosophical idiocy, this is it.
By way of sympathy, it should be noted that the Existential movement developed as an attempt to understand how people, during the horrors of World War II, could 'Rise above themselves' and find meaning in their lives. Sartre, who made a career of telling people what they wanted to hear and already believed, emphasized self-determination, choice and responsibility for rising above immediate circumstances. The maxim was 'We are our choices', as existence and meaning were considered to be in our own hands. We alone are supposed to decide freely what our attitudes and behavior will be.
Of course, this is nonsense! Specifically, it is nonscientific nonsense. It may make good religion, but it is lousy philosophy and no kind of psychology at all. The phrase 'Rise above themselves' may sound better in French, but it is meaningless in any language. Self-control, choice and responsibility are elements of a conceptual schema people can learn, and it may be awesome but not totally surprising that some people clung to them during their desperate experiences during the war. A pat on their collective heads by self-serving Humanists might make people feel good about themselves, but it will not help anyone understand anything.
The one thing we do not want to understand is that our vaunted self-control is so patently superficial. Self-control is the ability to change behavior by consciously directing actions to achieve specific goals. However, this whole notion is rendered irrelevant by the realization that the selection of the specific goals is predetermined by a person's cultural background and individual experience. Further, people usually are and wish to remain unaware of themselves and thus may unwittingly create more problems than they solve while trying deliberately to achieve their subconsciously determined goals.
Although self-control may be illusory if not impossible, belief in it and in personal freedom have been, are and probably will continue to be major contributing factors to the normal malfunctioning of Western society. This belief (as opposed to a belief in determinism) is easy for us to accept because the English language is so implicitly moral in connotation: e.g., 'Innocence', 'Guilt', 'Courage', 'Pride' and countless other words imply a sense of 'Free responsibility'.
However, what we please ourselves to perceive as our choices have been conditioned by our personal history, immediate environment and future expectations. This means, among other things, that the concept of 'Guilt' is totally inappropriate in our legal system, as there is no possible justification for punishing those who chose 'Wrong'. More important, our final criterion for determining stupidity is invalidated. We found earlier that the criteria of 'Knowing' and 'Maladaptiveness' are much too subjective to be reliable guides to stupidity. Now we find that people cannot even choose to be stupid: they just are or are not stupid, depending on circumstances with which they interact but cannot control.
Nevertheless, and as nonsensical as it seems, there remains a moral dimension to Western stupidity simply because of our ability, imperfect though it may be, to anticipate the results of our actions. By virtue of our intentions, we must accept responsibility for our actions. Regardless of external and subconscious factors, the fact that people consciously direct their behavior toward certain ends places a moral burden on them to be accountable for the future.
This Western ethic based on individual responsibility is simply our specific form of the universal human requisite for a moral code. Although the particular code will differ from group to group, within the microcosm of a given society, its system of ethics has significance and meaning. Every group has behavioral guidelines, usually both formal laws and informal morals. All of these systems reflect the cultural imperative of people to pass judgment upon themselves.
The odd thing is that we are so often 'Wrong'; that is, we are stupid according to our own standards of judgment. Often, we are wrong because we really cannot perceive what is right or wrong when we are actively and emotionally involved in a situation. The cause of this perceptual difficulty obviously is that we have schemas which guide the misapplication of misinformation by misconstruing our behavioral context.
It is all too human to know better but still do something wrong. The drug addict knows what his habit costs him day to day and may cost him in the future, just as we all know the price of deficit spending in terms of both personal credit cards and the national debt. Nevertheless, to the extent that personal and official stupidity of the future will be the result of conscious, unethical efforts on our part to permit our schemas to keep us unaware of the dangers of our behavior, we will be stupid for the worst of all possible reasons: because we want to be.
One of the reasons people so often seem to want to be stupid is that they are trying to achieve subconscious goals rather than those formally defined by society. For example, a public official may indulge in graft for his own short-term aggrandizement and counter to his role of public trustee. Likewise, your archetypical 'Pig' policeman may eschew law and order for the immediate satisfaction of pushing around some hapless soul. On the other hand, and on a grander scale, the Watergate and Vietnam debacles might not have occurred had the irresponsible megalomaniacs involved restricted themselves to acts which were both legal and conscionable.
More to the point, the Presidents and their advisors would have fared better had they limited their behavior to what the average American considered conscionable. The real problem with the insiders of both the Johnson and Nixon administrations (as well as those involved in the Irangate scandal under Reagan) was that they considered their actions conscionable. According to their standards of evaluation (covered by such catchwords as 'National security' and 'Executive privilege'), their behavior was at least acceptable if not correct. Even more telling, in the case of the Watergate cover-up, acts known to be illegal were not considered illegal. Instead, they were simply considered political tricks or public relations ploys. Somehow, the country managed to survive those leaders who considered their acts to be both legal and moral and were sure no one would catch them anyway.
In their sordid way, Nixon's advisors were simply striking examples of people who let loyalty to a person or reference group replace intellectual honesty as a higher form of morality. In such cases, personal integrity is not so much sacrificed as it is redefined by group values, which become the standards for judging everyone and everything. Members may come to believe in their leader or reference group with religious devotion to the point that even attempts to improve him or it may be construed as attacks. Followers and members may prove their loyalty and gain the immediate social reward of group support by lying and falsifying and distorting information. Of course, anyone who questions group assumptions or subscribes to the explicit values of the general culture is regarded as a heretic and treated as an outsider. Still, occasionally, a whistle blower will arise and assert that any leader or organization that suppresses truth and punishes virtue is not worthy of loyalty.
It is sad enough that stupidity is built into the human condition by language and social reinforcement. Much of this is effected subconsciously and must be accepted as a given of human life. However, if we have contributed anything to the cosmic design of stupidity, it is that we have converted innocent animal stupidity into conscious immorality. In the zoological kingdom, neural systems (i.e., brains) have always blocked relevancies and some have paired irrelevancies. We have compounded subjective stupidity with rational, arbitrary stupidity as we engage in calculated efforts to be unfair and dishonest. When lying and distorting information became a conscious, rational effort, stupidity became a problem with a moral dimension. We became the first and only species to take pride in and credit for deliberately blundering into disaster after disaster. If we can but survive ourselves, stupidity is all but assured of a bright future by leaders who insult our intelligence in order to gain support by making themselves appear sanctified and righteous.
In a grander sense, thanks to modern technology, righteous leaders can find themselves suffering the throes of mental anguish for knowing more than they might have wished. Such was the ordeal of Winston Churchill when experts decoded a German message indicating an imminent attack on Coventry: to warn the city would have betrayed the fact that the British had broken the German code, so it was left to suffer the raid in the interest of keeping the secret.
It is sad enough that our leaders must play God, but it is then all the more disturbing that our culture has coupled the most awesome technology with a general indifference toward the human problems that technology creates. In the simple world of the !Kung tribes, the technology of bows and arrows and spears is complemented by knowledge of the total environment. In the sophisticated world of modern, computerized stupidity, technology is the environment. We have created an artificial culture which we believe floats above and independent of nature. Our telephones call each other up; our machines talk to each other; our computers amuse themselves with chess matches, and the robots are delighted.
While we glory in our hardware, what has become of people? They starve by the millions even as wealth piles up around them.
These are but some examples of a general and disturbing trend in the world today. Clearly, our cultural compromise between technology and humanology is out of balance. Not only the individual but humanity itself is obsolete. In the American political tradition, there is an amusing myth that the government exists for the people. In our technological tradition, we do not even have such a myth. We exist for our machines. We do not have computers; they have us.
As a cultural force, technology is narrowing and dehumanizing in its methodology. It is very effective in its limited range, but computers tend to limit the range of those devoted to them. Although the scientific method in the form of the social sciences has been successfully applied to human affairs, this success has been confined to what we can learn about ourselveswhich is all science can and should do anyway. What we do with the knowledge we gain from science is another matter entirely, and it is on this point that we are floundering. The problem is that all our scientific and technological know-how and knowledge, all our machines and computers cannot tell us what we should do. Scientific methods may project what results we can expect if we select a particular course of action, but that is not the same as indicating that we should or should not do it.
Thus, our faith in and commitment to scientific research are misplaced because no amount of information is going to make us better people. No amount of data would have made Hitler or Nixon better leaders: more knowledge might have made them more efficient but not better. Hence, at the most basic and general level, the crisis in Western Civilization is due not to a need for more knowledge and research data but to a failure of our ethics of action and shortcomings of our informational morality.
As for our ethics of action, there is good news and bad news. Currently, we are in a phase of consolidating, organizing and institutionalizing stupidity: concentrating it in a technoelitist computer/communication complex whose effects are broadly and democratically distributed. In the future, we should expect more and more planned stupidity, as centralized, standardized bureaucrats base blunders and design disasters upon our ever deepening foundation of amorality and for an ever expanding base of dependent victims. If this is not to be, if this prognosis proves false, it will be because we finally recognize that science and technology are ethically barren and morally neutral. That is the good news.
The bad news is that our used and abused moral values have provided the ethical guidelines, rationalizations and justifications for all the political corruption, social ills and idiotic wars we have forced ourselves to endure. If the past is any guide, it will not be much of a guide for the future. If our past (im)morality brought us to the brink of nuclear war, created slums, fostered crime, starvation and misery, how will those values help us cope with the new challenges technology imposes upon us? Now that we can transplant organs, someone has to decide when the donor is dead. Euthanasia will become more common as an alternative release from the lingering suffering of those afflicted with incurable but non-fatal conditions which modern medicine can prolong indefinitely. If we are to maintain our historic tradition of stupidity, we are going to have to devote more time and energy to planning our immorality. Further, we will have to develop new forms of stupidity to prevent us from coping with the problems we are creating. Futurists should take note that stupidity will be one of the more dynamic fields in our coming cultural development.
Genetic engineering and eugenics are but two fields which will pose increasing problems for society. For years, people have selectively bred birds, dogs, horses and cattle and peas, beans and melons. Is it or is it not stupid to improve our own species by similar methods? Whatever the answer, it is based on morality, if not intelligence. Historically, the answer has been 'No' to the suggestion of selective human breeding. It is considered immoral to use the knowledge we possess in this field to improve ourselves by deliberate planning. The basic problem is that of finding broad agreement as to just what would constitute 'Improvement'. While this is a difficult matter, it should be possible to find some general principles to which everyone would agree, if we were to but try.
Such principles will themselves be determined by the values used when we judge the application of knowledge in the cause of humanity. Unfortunately, 'Sci-tech' will not be much help in this regard and, as suggested earlier, may even be limiting the ethical development of Western culture by its very success with 'Quantitative reductionism'. Science helps us learn about nature by breaking down complex phenomena into measurable units. However, all the essential complexities of biological and social systems do not lend themselves to being reduced to quantifiable bits of information. Nor do these complexities of life readily lend themselves to the stepwise logic of linear analysis. Computers which can help analyze simultaneous interactions of phenomena help overcome this limitation of dealing with one thing at a time, but they are limited to handling information which can be reduced to Computerese. Hackers just do not seem to be able to appreciate the vital importance of the human element which cannot be translated into their language.
A tragic example of this was the failure of the modern American Army to calculate morale as a factor in the Vietnam war. Secretary of Defense Robert McNamara was the consummate computer man, and everything that could be was quantified and analyzed: number of troops, amount of equipment, tons of supplies, etc. Not only on print-outs but in reality as well, the government forces enjoyed a ten to one ratio in everything calculable over the Vietcong. However, all this was outweighed by the fact that the Vietcong troops were at least ten times more willing to fight and die than the soldiers of the South Vietnamese Army. The inability of the Pentagon to appreciate the crucial element of motivation and incorporate it into their intensely statistical schema was a major contributing cause of American stupidity during the conflict.
Looking forward in more general terms, it is with discouraged resignation that we must accept our fate of a future shaped by all kinds of stupidity, with the specific dominant form depending primarily on the evolving relationship of technology to the society it creates. As life becomes reduced to a silicon chip, knowledge will become an end in itself to the point that society is dehumanized. The best that we might hope for is that scientists will honor their own ethics for gathering information and, secondarily, promote a humane technology when applying knowledge to the creation of problems. In any event, stupidity will be an integral part of the compromise condition of social life in the future, with its precise role and style being shaped by what we expect of ourselves.
If we want to make our expectations a bit more realistic, there are a number of questions we can ask when analyzing our stupid behavior. Was it an individual or group effort? Who made the crucial decision? Did he know what he was doing? Was he trying to find out? What made it a defective decision? Did external conditions contribute to making it stupid?
For such clear-cut questions, there are ambiguous answers. To the extent that stupidity is behavioral irrelevance, one source may be found in the subjectivity of decision makers. They may be excessively concerned with their own status (maintaining or advancing it), or they may be preoccupied with the social cohesion of their reference group. On the other hand, one can be stupid by pushing objectivity to the point of social disruption, as when pointing out the silliness of someone else's religion. Normally, stupidity tailored to enhance a leader's status or a group's cohesion tends to be conservative, with relief provided when some crackpot devises a new way to be idiotic.
To the extent that future stupidity will be caused by individuals making defective decisions, an understanding of individual stupidity will help us appreciate the irrationality of the years ahead. Unlike corporations and institutions, which are incapable of feelings, a person may be emotional. That certainly can reduce one's objectivity and mental efficiency. Further, an individual invariably has developed blind spots due to the specifics of his particular life experiences. Finally, shortcomings of information processing by any single mind prevent an individual from comprehending all the complexities of any but the simplest decisions.
Unfortunately, the growing trend toward institutionalized stupidity will not change the essential fact that it will still be stupidity. Only the type will change somewhat as the past predominance of individual idiocy created by enthusiastic bursts of brilliant lunacy will be overshadowed by plodding, methodical committees which can draw upon the collective and compounded drawbacks and limitations of their members. While being unemotional may encourage institutional logic, the resultant rationality may run over people's feelings and moral sensibilities. Finally, perceiving the complexities of a situation could lead to no decision at all. After all, very few policies are totally pleasing to everyone. At some point some kind of action must be taken, and it is stupefying to analyze and debate each and every possible ramification of each and every possible act under all possible contingencies.
Nor will computers really help us avoid stupidity in the future. First, much of the human experience cannot be programmed. Feelings, hopes and emotions are not reducible to quantified bits of Computerese. Neither can any program work out all possible costs and benefits of contemplated actions. Worse yet, although computers can help us deal accurately with the data we deem relevant to the major issues relating to a given problem, these suffer deification once they are entered. Computers have become our sacred cows, and their contents and pronouncements are now holy beyond critique. Disputes are considered settled when the computer speaks, and to many priests in the field, the 'Garbage in, garbage out' problem is secondary to the systematic processing of garbage. One seldom finds computer operators enthusiastically rushing to make corrections of either input or programs so that they can improve the quality of their garbage.
Garbagewise, normal human language will also make its contribution to stupidity in the future. As long as we communicate by language and use it to construct our cognitive schemas, we will misperceive events, misinterpret data and misapply principles. After all, that is what being human is all about.
If computers and language need an ally in frustrating informational morality, the basic commitment of people to adhere to their reference groups all but guarantees stupidity a rosy future. While there is no iron law of stupidity which dictates that people have to wreck their own civilizations, it just always turns out that way, and nothing in the contemporary world indicates that we are going to be exceptions to this rule. It might help were we to establish an 'Information ethic' (i.e., let the facts speak), but society probably could not stand the strain of cognitive honesty and cultural consistency. A demand for intellectual integrity might reduce the establishment's abusive application of information possessed, but no one, in or out of power, can claim to be objective: everyone's schema is a composite synthesis of the obliquely interrelated worlds of factual data, social cohesion and political power, so any information ethic must be somewhat compromised by our inherent subjectivity.
While there is sure to be stupid behavior in the future, there are some strategies which can be adopted so as to minimize its role and impact. At the personal level, idiocy often results from misguided efforts of people trying to avoid the psychic discomfort of cognitive dissonance. It is unfortunate that the methods adopted usually result in a maladaptive schema being preserved at the expense of crucial information about the environment. Warnings go unheeded; facts are ignored; and behavior becomes less and less relevant to reality. Although education should and could be a way to develop in people effective coping strategies for dealing with such challenges to their schemas, the history of modern science indicates that academic training as practiced up until now is no guarantee against stupidity. In fact, most educational institutions seem to inculcate specific belief systems rather than train people to find their own when traditional schemas bring themselves into disrepute.
Within institutions, stupidity can be inhibited by breaking down the isolation and compensating for the bias which contribute so much to the collective idiocy of groupthink. Those indulging in it could correct their resultant errors if they are willing to reverse an earlier decision. Unfortunately, egos often trip on themselves as people become so committed to a course of action that even its obviously negative consequences cannot induce a reconsideration of the matter.
The well known Peter Principle, whereby people are promoted one grade above their ability to function effectively, is another example of institutional stupidity which can be corrected if options remain open. If promotions were made provisional for a short period of time so that performance could be evaluated, there might be fewer people put permanently into positions beyond their abilities to cope. (The military's 'Brevet' promotional system is a step in this direction, but it is usually used to save money by paying a person the salary of his lower rank while he assumes greater responsibilities.) There would be, of course, some loss of face for any workers who were returned to their earlier positions after provisional trials, but their short-term disappointment would be the price they would pay for finding the level at which they could function effectively. In the long run, this probably would be best for everyone: the institution as well as the individuals.
The likelihood of institutional stupidity can also be reduced if decision makers acknowledge the dangers or negative consequences which may result from their actions. There often is a tendency to minimize risks inherent in a given policy. This penchant to ignore risks can be an open invitation to disaster. Risks should be neither minimized nor maximized, just recognized. They should be given probability and severity ratings which then should be multiplied by each other, with the product granted due consideration in deliberations.
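For those who want the arithmetic made explicit, the rating rule above amounts to computing an expected-severity score for each identified risk and ranking the risks by it. Here is a minimal sketch in Python; the 0-to-1 probabilities, the 1-to-10 severity scale and the sample risks are illustrative assumptions, not anything prescribed by the text.

    # Expected-risk scoring: each identified risk gets a probability (0.0-1.0)
    # and a severity rating (an assumed 1-10 scale); their product ranks the
    # risks so deliberations can weigh the worst expected outcomes first.

    def risk_score(probability: float, severity: float) -> float:
        """Product of probability and severity, per the rule above."""
        if not 0.0 <= probability <= 1.0:
            raise ValueError("probability must lie between 0 and 1")
        return probability * severity

    # Hypothetical risks attached to some contemplated policy.
    risks = [
        ("public backlash",      0.60, 4),   # likely but survivable
        ("legal challenge",      0.30, 7),
        ("catastrophic failure", 0.05, 10),  # rare but ruinous
    ]

    for name, p, s in sorted(risks, key=lambda r: -risk_score(r[1], r[2])):
        print(f"{name:22s} p={p:.2f} severity={s:2d} score={risk_score(p, s):.2f}")

Ranking by the product keeps the rare-but-ruinous case visible alongside the likely-but-trivial one, which is precisely the recognition, rather than minimization or maximization, that the paragraph calls for.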
An explicit discussion of the morality of a contemplated act might also prevent stupid behavior. Along with the legal, political, economic and social consequences of an act, its morality should be considered as well. Morality is an underlying, defining factor in any controversial endeavor, and anyone who ignores it may well wish he had not.
In fact, many people might have profited from the advice a former country lawyer gave a young man starting out in the legal profession. 'Strive to be an honest lawyer,' he said. 'If you can't be an honest lawyer, be honest.' The former country lawyer was, of course, Abraham Lincoln, who made something of a career out of embodying the mores of society beyond petty role playing.
At the institutional level, the best way to promote honesty is publicity. As awkward as it would be for major political and corporate figures to conduct their business in goldfish bowls, steps in that direction would induce them to behave responsibly when considering the data at hand and attendant options. Certainly, we would not have had the Bay of Pigs and Watergate fiascos had those deliberations been open to public scrutiny.
Finally, although we must use language, jargon should be avoided or at least minimized. The use of loaded terms can distort judgment by inducing a sense of self-righteous overconfidence. Especially when referring to an enemy, use of respectful labels may prevent an underestimation of the opponents' capacities and abilities.
While it is nice to have a list of strategies for reducing the role of stupidity in the future, it is appropriate to ask whether it is really possible for any organization to protect itself from something so characteristically human. Is it possible, for example, to have an intelligent, enlightened government? The answer is, apparently, 'Not really'! Plato's ideal of breeding and nurturing an elite of rational and wise leaders for government service was never tried in its purest form, although the medieval Catholic Church came pretty close to the order he envisaged.
If we are justly concerned with how to reduce stupidity, we must also consider by how much it should be reduced. After all, stupidity lets us live together while making it difficult for us to live with each other. The stupidest thing of all would be to eliminate stupidity completely, as we would soon be at each others' throats in a rage of realism and rationality.
Thus, future reformers who aspire to get people to live up to or (in the idiotic terms of the Existentialists) transcend their potential would do well to bear in mind the plight of Nietzsche's Superman as well as that of Nietzsche himself. In order to be happy, his Superman had to overcome his Will to Power: that obsession with dominance and control which usually nets disdain and resentment. In short, he had to overcome himself. As the mighty rarely care to exercise this option, idealists may have to accept that, for better and worse, people are going to be themselves.
As for Nietzsche, he was happiest when he was clearly insane. The Will to Truth was for him and still is something of a terrifying, hostile, destructive principle because we really do not want to know our own nature. Like the physicists who create the phenomena they want to observe, we create the perceptions we want to hold. Whether it is to our advantage or not, we can create anything out of human nature because it is and we are so subjective.
It is this subjectivity which makes operational definitions of stupidity (and so many other behavioral attributes: aggression, intelligence, etc.) so elusive. While there is a temptation to throw up our hands in dismay at the confusion inherent in the ambiguity of subjective phenomena, we must realize that this is not an end point for us but a beginning. It is our subjectivity which makes it not only possible but probable that we can and will be stupid, since it permits us to rationalize our behavior with unlikely explanations which are psychologically gratifying and socially acceptable. In our relativistic culture, both our abstract art and absurd theater indicate that the answer to the human riddle is not that there is no answer but that there is any answer we want. The bottom line is that there is no bottom line, just a number of fuzzy borders, each of which provides a suitable perspective for a given reference group. Subjectivity has triumphed, and all things being considered equal (whether they are or not), humanity will both flourish and fail.
As for stupidity, we may as well accept it as a limitation language and society place on our intellect. Like death, which clears away the old for the new, stupidity is an incongruity inherent in life. Humans have certainly developed, expanded and promoted it. We do so each time we endeavor to construct yet another flimsy utopia while doing our worst to keep the power structure evermore entrenched within itself. What we cannot acknowledge is that ideals are the rainbows of life: only the pursuit of illusion is real. It is an ultimate of human stupidity that we must seek what we cannot attain in a manner which prevents us from attaining it.
What we need in order to survive are systems which are not too systematic. They must be both functional and credible. This is the great human trade off. A realistic, functional system is unacceptable to super-ego standards which require inspiring beliefs. On the other hand, trying to live according to a static moral system leads to insurmountable, pragmatic problems. Fortunately, stupidity permits us a compromise blending so that we can entertain beliefs in all kinds of self-contradictory, conflicting systems while coping with some problems and creating others.
While we are capable of all kinds of compromise blendings, that needed for survival is fortunately not one of trading off the conflicting opposites of nihilists and realists. Nihilists aver there exists no eternal standard by which to judge and live, while traditional realists have argued society must be based on some universal, absolute truthwhich invariably turns out to be a subjective viewpoint at best. What we all need is an eternal moral compounded from a respect for intellectual ethics and a commitment to human rights. Such a moral would be compatible with academic integrity, consistent with individual dignity and based on the compelling need for people to find meaning in their lives.
Equally compelling is the need to find meaning for the deaths squandered in all the bloody crusades of the past and the lives wasted in the quiet despair of our ghettos. If experience gives us the opportunity and wisdom the ability to recognize mistakes when we repeat them, we must be very stupid indeed to have been party to so much carnage and indifference so that we can create more. In honor of all those who have been sacrificed so pointlessly at the altar of stupidity, we can resurrect meaning by reflecting on our behavior and reexamining ourselves. There is no shame in admitting that our basic flaw is our need to belong, that our greatest fault is our need to be loved.