Jay C. Brown. 21st Century Psychology: A Reference Handbook. Editors: Stephen F. Davis & William F. Buskist. 2007. Sage Publications.
As the old millennium closed, there was a shift from behavioral perspectives in psychology to cognitive perspectives (Goodwin, 2004)—that is, a shift from a strict reliance on empirical sources for knowledge to an acceptance of rationality as a source for knowledge. In the first decade of the new millennium, psychology is taking on another new look, or perhaps it is simply returning to whence it came. Psychology, and in fact all of society, is embracing spirituality, accepting faith as a once-again legitimate source of knowledge. The media are awash with words such as intuition and faith (the popularity of intuitive thinking is attested by Malcolm Gladwell’s best-selling book Blink), reminding us of this acceptance of non-empirical sources of knowledge. Advances in information technology, such as the Internet and the 24-hour news networks, have made psychology in all of its forms accessible to everyone. This increased globalization of psychology is mirrored in the most recent edition of the DSM (the Diagnostic and Statistical Manual of Mental Disorders), which now acknowledges that mental health practitioners should be aware of cultural issues when diagnosing and treating mental illness.
At the close of the previous chapter, Goodwin presented a section titled “At the Close of the 20th Century” and discussed four trends in psychology that have carried over into the new century: neuroscience, evolution, computers, and fragmentation. This chapter includes a discussion of these four trends and concludes with a discussion of the future of psychology by examining the applications of psychology in the world today.
Trends in Psychology
The American Psychological Association (APA) declared the 1990s the “Decade of the Brain,” and an acceleration of neuroscience research marked that decade. For example, researchers discovered that well over half of all human genes are devoted to brain-related functions and that those genes can influence a wide range of behaviors. Brain scans had long shown the structure of the brain, and crude neuroscience techniques of the past (such as ablation) had indicated which “areas” of the brain were responsible for which behaviors. The major recent advance, however, is the ability to scan the brain while the participant performs complex tasks. This technique reveals much more precisely which areas control which behaviors. At the turn of the century, a multidisciplinary initiative (including the APA) created the “Decade of Behavior” to bring social and behavioral science into the mainstream. This initiative was founded on the assumption that although an understanding of the brain is critical, it is ultimately human behaviors that are at the root of all accomplishments, problems, and solutions.
In addition to the recognition in the academic sphere, there has been an increased understanding of the importance of neuroscience research in the private sphere as well. Articles dealing with brain functioning appear frequently both in magazines and the news; rarely does a week go by without some new neuroscience development being reported by the cable news networks. Moreover, the Allen Brain Atlas (2006), created to be freely available to all, was funded entirely with private donations by the philanthropist Paul Allen. This project was created because a private citizen realized the necessity for a comprehensive resource against which individual researchers could compare genetic patterns.
In an almost Orwellian development, it appears that MRI scans of the brain can predict at least some psychological characteristics. The brains of highly intelligent children develop in a unique pattern. Though this pattern is quite complex, the pattern for highly intelligent children is definitely different from that for children with average intelligence (Haier et al., 1995; Shaw et al., 2006). Because of this different pattern, it should be (at least theoretically) feasible to “predict” the intelligence of an unborn child. Brain differences in personality, temperament, abnormal behaviors, and other cognitive functions have been reported for years, and more are sure to be found shortly. Obviously, serious ethical discussions need to take place in this area, particularly concerning the potential for prediction. If a person’s potential is known beforehand, why waste precious resources on people who cannot benefit from them? Similarly, if the potential for violence in an individual can be assessed through brain scanning, should society prevent such “potential” violence from occurring? Not surprisingly, such a scenario was proposed in the recent hit movie Minority Report, in which crimes could be predicted before they occurred and the criminals were arrested for crimes they had not yet committed. Neuroscience may be the route that will allow psychology to finally advance fully beyond the realm of social science, where predictions are probabilistic, and into the realm of the natural sciences, where prediction is closer to absolute. Rather than merely predicting the behavior of the “average” individual, neuroscience may someday allow us to predict the behavior of a specific individual.
Since its inception, the theory of evolution has been woven into the history of psychology. The Structural School of Psychology, which had asked about the “what,” “why,” and “how” of mental life, was supplanted by the Functionalists, who incorporated evolutionary thinking to ask about the “what for” of mental life (Hergenhahn, 2005). In the 1960s, the field of sociobiology (a forerunner of today’s evolutionary psychology), in an effort to explain altruistic behavior, shifted the basic driving force behind evolutionary thinking from “survival of the fittest” (the idea that characteristics allowing an individual to survive and reproduce would be passed on) to the propagation of an individual’s genetic material (which can be enhanced by ensuring that individuals with similar genes survive and reproduce).
A recent cover story in Time magazine asks, “What makes us different?” and refers to those characteristics that separate us from other animals, namely chimpanzees (Lemonick & Dorfman, 2006). The human genome project was launched in 1986, and a preliminary count reveals only about 30,000 genes in the entire human genome. Of these 30,000 genes, we appear to share about 99 percent with chimpanzees (Pan troglodytes), our nearest living relative. Similarly, as of this writing, work is in progress to reveal the genome of Neanderthal man (Homo sapiens neanderthalensis), with whom we are expected to share even more genes than we do with the chimpanzee. Full understanding of the human genome should reveal important insights into behavioral, mental, and physical illnesses. Early applications include the ability to screen for predispositions for certain disorders, such as breast cancer. Such “predictions” have opened other ethical debates that are being played out in medicine today. For example, if a woman’s genetic testing reveals a predisposition for breast cancer, should she have a radical mastectomy just to be safe? Taken further, if genetic testing reveals predispositions for any number of illnesses (or cognitive impairments) in a fetus, should the fetus be given a chance?
From the information-processing approach to memory of the 1960s, to the connectionist models of language in the 1980s, to the field of artificial intelligence and neural networks today, computers have been influencing the way we think about and study the mind and its processes. However, there is a consensus among researchers that current computer technology, though extremely fast, is limited in its ability to truly imitate a human brain because of its reliance on serial processing. Until a true parallel processing system can be devised, the field of artificial intelligence will forever remain artificial. In the words of Donald Hoffman (1998), “You can buy a chess machine that beats a master but can’t yet buy a vision machine that beats a toddler’s vision” (p. xiii).
Computers have also been shaping psychology via their roles in research. For example, computerized presentation of test items allows for far more standardization and better control in research than was ever possible in the past. Additionally, statistical software programs have allowed researchers to pursue complex statistical relationships that might otherwise be lost.
Long gone are the days of the “gentleman scientist” of the 19th century. The trend of the past century, in which psychology has become more and more fragmented (specialized), seems certain to continue into the future. The APA currently has 54 divisions, with more inevitably to follow, each specialized in its own small slice of psychology. Researchers in one area of psychology might belong to a highly specific society, such as the Society for Research in Child Development (SRCD), that uses specialized terminology unrecognizable to someone researching in a different area of psychology. Researchers in one area can hardly keep up with advances even in their particular slice of the research world. Human beings are incredibly complicated, and it seems hardly possible to understand one area of psychology in isolation from the others. However, evidence of an increased amount of interdisciplinary cooperation (particularly among psychology, biology, and computer science) is encouraging. Academic psychologists are often no longer fully recognizable as psychologists; the bookshelves of a psychologist might be filled with books written by anthropologists, linguists, economists, paleontologists, and so forth. It has been suggested that B. F. Skinner was perhaps the last completely recognizable figure in the field of psychology.
Psychology, like all sciences, must ultimately produce useful knowledge, knowledge that can benefit all of society. Psychology’s usefulness and acceptance by the general public have been rising, and we have seen an increasing integration of psychological research and ideas into the mainstream of 21st-century life. Psychology is being applied to almost all aspects of our daily lives with little understanding by the general public that it even is psychology. One particularly clever integration of psychological research into the mainstream would have made B. F. Skinner very proud—the Fisher-Price Throne of Their Own, a toilet-training chair that uses music to reinforce successful elimination. Similarly, Newsweek recently published an article titled “How to Read a Face” discussing an emerging new field called social neuroscience (Underwood, 2006). Perhaps the most interesting part of this article is that the word “psychology” is not mentioned even once! Reminiscent of George Miller’s impassioned plea in his APA presidential address that we give psychology away, it seems that the very success of psychology could lead to its demise as a separate discipline and its fusion with mainstream thinking. In the following sections, we briefly highlight a few of those areas in which psychology (and the research tools it has created) has gained considerable influence.
Psychology and Opinions
The foremost name in the history of psychology related to consumer behavior is John Watson (Goodwin, 2004). Watson is credited with implementing, at the J. Walter Thompson Advertising Agency, the classical conditioning principles he originally developed with Rosalie Rayner in the “Little Albert” study. Today we are constantly bombarded with new products (CS) being paired with people or places (US) that have positive feelings (UR) associated with them. Through constant bombardment (acquisition), the companies hope that the presence of the new product (CS) will eventually come to elicit positive feelings (CR).
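The acquisition process described above can be illustrated with a brief computational sketch. The Rescorla-Wagner learning rule is not discussed in the text; it is one standard formal model of classical conditioning, and the parameter values and function names below are illustrative assumptions, not part of the original chapter.

```python
# Illustrative sketch (assumed model, not from the chapter): the
# Rescorla-Wagner rule says the associative strength V of a CS grows
# on each CS-US pairing in proportion to the "surprise" (lam - V),
# where lam is the maximum strength the US can support and alpha is
# a learning-rate parameter.
def rescorla_wagner(trials, alpha=0.3, lam=1.0):
    """Return the associative strength of the CS after each pairing."""
    v = 0.0
    history = []
    for _ in range(trials):
        v += alpha * (lam - v)  # large change early, smaller as v nears lam
        history.append(v)
    return history

strengths = rescorla_wagner(10)
# Strength rises steeply at first and then levels off toward lam,
# the familiar negatively accelerated acquisition curve.
```

On this account, repeated product-person pairings yield diminishing returns: each additional exposure adds less associative strength than the one before, which is one reason advertisers rely on sheer volume of exposures.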
When it comes to the development of new products, no successful company would even attempt to market a new product without testing it. The research techniques of polling and focus groups come directly out of psychological research. The availability of mass audiences via the Internet and e-mail makes this type of work faster and cheaper (though not necessarily better).
Political strategists have taken the psychological tools of polling and focus groups and created a new art form. Sometimes it appears as though the image created by a politician is more important than what is actually being said or done (especially in the middle of heated political campaigns). Everything a candidate plans is shown first to a focus group to gauge how it will affect people’s opinions of the candidate. Newer techniques involving micro-polling allow political strategists to predict with much more precision which candidate each household is likely to vote for. This knowledge is then used for concentrated “get out the vote” efforts.
Psychology and the Legal System
The presence of psychological principles in legal areas is hardly news. The insanity defense was one of the first applied uses of psychological principles in any domain. However, newer developments, such as the rash of “recovered memory” trials in the 1990s, call into question a basic assumption of the legal system: namely, that the words that come out of a witness’s mouth (for either the defense or the prosecution) are a valid reflection of reality.
In a strange twist highlighted by the recent movie Runaway Jury, we are reminded that the truth of a situation is of little importance when considering the outcome of a trial. In the movie, psychologists were employed to ensure that the jury members who were selected from the pool of potential jurors were sympathetic to the prosecution or defense, as the situation dictated. Situations such as these remind us that as psychologists we must maintain high ethical standards that go far beyond patient confidentiality.
Psychology for Prediction and Assessment in Education and Industry
Beginning with the first large-scale use of psychological tests by Yerkes and his colleagues during World War I to assess draftees’ intellectual functioning (the modern ASVAB [Armed Services Vocational Aptitude Battery] is a testament to the ongoing efforts in this vein), employers and academic institutions have been employing psychological tests at an ever greater pace. Despite protests concerning the “fairness” of aptitude tests (such as the SAT, ACT, or GRE in higher education), they remain useful and valid predictors of job and academic success. Other predictive uses of psychological testing, including the Myers-Briggs Type Indicator (MBTI) and the Minnesota Multiphasic Personality Inventory (MMPI), are also plagued with questions of validity and fairness (as well as proper administration). It is likely that psychological tests will have their largest future impact in the area of assessment. More and more, employers and educational institutions want “proof” that both new and accepted techniques are working. Proper implementation of the No Child Left Behind Act of 2001, the largest change in education in a generation, requires assessment techniques to determine which schools reach their goals. Such developments guarantee the future of psychological assessment techniques.
Psychology and Therapy
In therapy, changes have been brewing for several generations. Traditional psychoanalytic therapists are now a minority, their ranks overshadowed by cognitive therapists. The cognitive approach to therapy empowers patients by putting them in charge of their own recovery, though of course it can be applied only to a certain range of disorders.
Similarly, there have been shifts in who provides therapy. The cost of medicine has skyrocketed in recent years, and the rise of health maintenance organizations (HMOs) brought with it a push for “cheaper” mental health care. Therapy, once dominated by psychiatrists, is now practiced by psychologists and therapists as well as by physicians and members of the clergy. A shift from individual therapy to group therapy is also evident. The shift in mental health care to “cheaper” methods and providers has given a larger portion of the population access to mental health care than had ever been possible in the past. Along with these shifts come further questions of cost efficiency driven by the HMOs, namely prescription privileges. Traditionally only psychiatrists have been allowed to prescribe medications, but there is now a movement toward extending these privileges to licensed clinical psychologists as well (currently Louisiana and New Mexico give clinical psychologists prescription privileges, and several other states have legislation pending).
In the fast-paced world of the late 20th and early 21st centuries, in which economic decisions are made based on the “bottom line,” it seems inevitable (i.e., the zeitgeist is right) that Francis Bacon’s (1620/1994) notions at the beginning of the scientific revolution should come full circle. “Human knowledge and human power meet in one; for where the cause is not known the effect cannot be produced” (p. 43). All fields of science, in order to continue to exist, must provide useful information. As seen in the mission statement of the APA (2006), “The objects of the American Psychological Association shall be to advance psychology as a science and profession and as a means of promoting health, education, and human welfare.” Perhaps psychology is finally starting to grow up enough to prove its usefulness.