Historians do not deal with the future. Yet one justification for the writing and reading of history is that it helps us better to understand the present and to interpret more intelligently the future as it rushes in upon us.
In 1846 the young French poet Charles Baudelaire (1821-1867) defined modernity as that phase of experience in which life is lived in fragments, in which the pace of change and an inability to separate the important from the unimportant create a sense of confusion, of one’s life being out of control or in the control of others.
The rapidity of observed change in one’s own time would lead to the illusion that life in the past was both more coherent and understandable and, somehow, actually slower. Such a view of history produces hope: a fixed past that one might study and comprehend could be contrasted with a future that lacked confining definition and around which no walls had yet been set. But the historian also knows that the past helps to define the future.
What changes of “our times” would be most important in shaping the next times? What might history conclude before the fragmentation of life as experienced takes over? Certainly technology will continue to transform daily life, massive population growth will induce vast social change, and the world will become ever more interconnected. But what beyond this?
One obvious trend of recent decades was a worldwide movement of peoples, of emigrations and immigrations, so that once ethnically or racially homogeneous societies were dramatically less so and the cultures of once-distant peoples became parts of other cultures.
Perhaps no impact of World War II and the period of decolonization and localized postcolonial wars has been more obvious than the movement of great numbers of Asians, Africans, West Indians, and others into countries once overwhelmingly European in their cultures. Such movements of people have changed clothing styles and eating habits, enriched European languages, and utterly altered communication, education, and medical knowledge.
By the 1990s the concept of human rights was thoroughly embedded in Western societies and increasingly throughout the world, and even where given only lip service, concern for “world opinion” often shaped diplomacy, military strategy, and business tactics. The range of agreed-upon rights had been greatly widened to include the right to privacy, the right not to incriminate oneself, and the right of access to education, health care, and public safety.
Many claimed a right to an unpolluted environment as well. By no means did everyone enjoy such rights, even in the most stable of democracies, but they represented agreed-upon aspirations for all societies. Before our times there was little agreement on what constituted basic human rights at the international level and little concept of “world opinion.”
The last thirty years have brought the most remarkable social progress and increase in standard of living in history. In the Western democracies, purchasing power for the average employed person increased by nearly half, overseas travel increased twenty-fold, and two-family incomes became commonplace. Thirty years ago most of the world lacked personal computers, microwave ovens, audiocassettes, videos, television, FM radio, electronic toys, and credit cards or automatic bank tellers.
There were no CAT scans, lens implants, or artificial joints; there was no microsurgery, magnetic imaging, or in-vitro fertilization. There was no genetic engineering as the term is understood today. The divorce rate was one fifth of its present level, and in most of the Western world “crime in the streets” was not a political issue. Life expectancy in the industrial world rose to nearly eighty, creating an aging population.
The past three decades have also brought, in much of the Western world, the emancipation of women. The contraceptive pill, which became widely available in the 1960s, made family planning possible where social custom or religion did not prevent it. The movement in the 1970s to award equal pay for equal work, followed by equal opportunity and affirmative action programs, opened up a vast new labor market, with millions of new earners.
Families were smaller, income was larger, and thus real income rose for the great majority. Yet the number of single-parent families rose rapidly, with half of the children in such families living below the poverty line as defined by Western societies. In many nations gay and lesbian individuals were also assured of the full exercise of their rights.
Two major changes influenced the way history was studied, giving rise to the question, Who owns history? “Women’s history” was increasingly replaced by gender history. The former had tended to mean simply adding women to the basic historical story, making certain that the achievements of women were recorded. This “add-on” approach also meant that sections of texts, courses, and curricula included women more systematically and in far greater depth than before.
But gender history meant far more. First, it weakened class as a primary category of analysis, making it no more than equal to gender, ethnicity, and sexuality. This meant that labor history, for example, which had tended to be heavily male in its orientation, had to be reexamined, for more was required than simply placing women at the scene, in workplaces and strikes or on revolutionary barricades. Gender does not equate with women, just as race relations are not “about” nonwhites; gender is part of an inquiry even if women are not.
As men had felt threatened by the feminization of work, which assailed their class positions, they also felt threatened by the idea that masculinity was a feminist issue. Thus gendered history insisted on a reconstruction of categories so that women would not be required to fit into units of study that had been developed largely to explain the actions of men. Kinship, family, sexuality, and production were seen to be aspects of the dispersal and use of power.
The second change was a challenge to the idea that there was a coherent story of Western civilization that could be outlined between the pages of a book. That books were written on, and students studied, Western societies as opposed to other world societies was argued by some to be a form of racism or, at least, of cultural blindness.
At first schools and universities simply turned to the “add-on” solution, adding courses on Africa, Asia, or Latin America, or on a variety of ethnic histories, especially in the United States. The organization of university education around dominant masculine and Western values was attacked by those who were convinced that society must be freed of organizing principles that seemed to “empower” one group or another.
However, demands for revised curricula often resulted in an irony: members of ethnic communities preferred to study only themselves and with individuals of the same background, precisely the organizing principles about which they had often complained when knowledge was said to be dominated by white, middle-class, or Protestant values. For the most part the discipline of history was less caught up in this debate than most subjects in the social sciences and humanities, since historians could hardly abandon the principal organizing force of historical narrative, which is sequence.
To argue that the great African author Wole Soyinka (1934– ), who won the Nobel Prize for Literature in 1986, was a writer of equal stature to Shakespeare was one thing; to argue that Shakespeare was irrelevant and Soyinka relevant to history was quite another, simply because one preceded the other: Shakespeare influenced Soyinka, while the Nigerian writer did not influence Shakespeare. Where pure matters of chronology were involved, there was little debate; where aesthetic judgments about relative merit or, even more dramatically, about relevance to understanding the present time were at issue, there was frequent discussion. Such discussions also led to the question, Who owns history?, something not often asked in the previous century.
Typically, Western governments responded to unemployment, soaring interest rates, falling educational levels, and rising crime, poverty, and illness by creating social welfare programs. Some, such as medical care in Great Britain following World War II, sweeping programs concerning sexual mores in Scandinavia, or housing loan insurance in Australia, became long-term changes within the society, while others were mere palliatives, tried and abandoned, attempted largely for their political effect. By the end of the 1980s the United States, for example, had 200 separate welfare programs operated by the federal government.
Taxes rose throughout the Western nations, in many countries to half the income of the middle class, and while substantially lower in North America, they were nonetheless by 1991 the single largest item in the budget of a typical citizen. The state, the government, had in all societies become a player in the life of the individual to a greater extent than ever before.
By 1995 nearly everywhere citizens appeared to want less government. Still, in most of the West, less than five years from the twenty-first century, there continued to be a basic faith in democracy rather than in some allegedly more efficient form of government, in the value of one’s labor, in hope for one’s fellow beings, in respect for the environment, and in love of country.
There was, by the 1980s and 1990s, much talk about the decline of the West. Certainly there was a decline, if a monopoly on technology or the ability to force the world into a Western mold by trade, industrial efficiency, or military expertise were the criteria. Yet looking beyond nation to culture, the historian could easily note the continuing global ascendancy of the English language and of popular Western mass culture, especially in its American variant, if not of elite culture.
American culture continued to absorb the ideas of others, to cross borders, to change; attempts to stave off this global culture, most notably in Muslim lands, posed vast transnational problems of respect for other societies, and the future was by no means certain.
But the cultures customarily described as “Western civilization” remained vigorous, demonstrably attractive to the rest of the world, and relatively coherent. A historian could guess that in the twenty-first century the body of ideas embraced by that collective term would remain interesting, significant, and usefully true.