Science is progressive.
—THOMAS JEFFERSON TO JOHN ADAMS, 1813
All that is human must retrograde if it does not advance.
—EDWARD GIBBON, 1776
To speak of the world as having made progress has gone out of fashion. “Writers nowadays who value their reputation among the more sophisticated hardly dare to mention progress without including the word in quotation marks,” observed the economist Friedrich Hayek in 1960. The situation had got worse by 1998, when, as one journalist noted, “No one talks seriously about ‘progress’ any more.” Yet like sex in Victorian England, progress is constantly present even if shunned in polite conversation. By almost any empirical measurement, the centuries since the advent of science and liberalism have seen the greatest improvements in human health, wealth, and well-being in all history. So why do so many sophisticated thinkers disdain the very concept of progress as inappropriate or naïve?
Some feel that the terrible violence of the twentieth century invalidated the promise of science and liberty. Others argue that material gains were purchased at the price of environmental degradation. Many object less to the general prospect of progress than to the utopian assertion that progress is inevitable. They critique Enlightenment thinkers who imagined that the future would produce a “new man,” whose rational judgment would render war and injustice obsolete; as no such new man has appeared, the scholars dismiss the concept of progress itself.
It certainly is the case that several otherwise sensible Enlightenment thinkers pushed the prospects of progress to extremes. One such was the Marquis de Condorcet. His “Sketch for a Historical Picture of the Progress of the Human Mind” was prescient in many regards. It portrayed a future in which scientific, liberal-democratic government has become the norm, food is abundant and cheap, and the human population has grown significantly; many lethal diseases having been eradicated, the average man’s lifespan extends to “the time when naturally, without illness or accident, he finds life a burden.” In this happy future, prejudices are in retreat—notably “the prejudices that have established inequality between the sexes, fatal even to the sex it favors”—while universal public education works to “speed up the advances of [the] sciences.” So far, so good: By and large, these predictions came true. But Condorcet did not stop there. In his enthusiasm for rationality—“The time will therefore come when the sun will shine only on free men who know no other master but their reason”—he lapsed into advocating what he called “the perfectibility of man.” This notion has cropped up in thinkers from Plato to the behaviorist B. F. Skinner, misleading so many of them that the Australian philosopher John Passmore was moved to devote an entire book, The Perfectibility of Man, to debunking it. Human beings “are capable of more than they have ever so far achieved,” Passmore writes, but their achievements will always “be a consequence of their remaining anxious, passionate, discontented human beings. To attempt, in the quest for perfection, to raise men above that level is to court disaster; there is no level above it, there is only a level below it.”
Setting aside as imaginary the perfectibility of human nature, two steps may be taken to clarify the question of progress.
First, the present is compared to the past, rather than to an imagined future, our concern being empirical facts about what has actually happened rather than the rise and fall of utopian ideas about what might have happened.
Second, progress is defined as the creation of more options, for more people, over time. To eat rather than to starve is an option. To see a doctor rather than suffering or dying from a curable disease is an option. To leave your place of birth and seek education and employment, and to enjoy these opportunities notwithstanding your race, creed, or gender, is an option. To speak your mind, vote, and send money home to your aging parents are options. Such a definition may fail to satisfy those who would prefer to see the people as toiling together toward a common goal, but when people are free to do as they please they pursue many goals, some of which look contradictory. In the liberal-democratic world today people watch television, surf the Internet, and buy more books than in the past; play video games, and score higher on standardized IQ tests than their parents and grandparents did; drive cars, and run marathons; squander money on frivolities, and graduate from college in unprecedented numbers. Some eat junk food, others organic produce; some fear that science debunks religion, while others contentedly integrate science and religion into their lives. Such societies may look haphazard, but their diversity and freedom are their source of strength, and they challenge the notion that progress must be planned. It may well be, as Oliver Cromwell used to say, that a man never mounts so high as when he knows not where he is going.
The greatest eighteenth-century exemplar of a commonsensical, nontranscendent view of progress was Benjamin Franklin. Unwaveringly cheerful and companionable, Franklin drew much of his optimism from his firsthand experience in conducting scientific investigations and coming up with inventions. He pioneered the science of weather forecasting (his fascination with electrical storms leading him to conceive of his most famous experiment, in which lightning, captured by flying a kite, is stored in a battery), made the first studies of the Gulf Stream (his maps comporting well with today’s infrared satellite images), and discovered that a thin coat of oil will calm troubled waters. (To impress the ladies, always a personal priority, Franklin wielded a hollow cane loaded with oil with which he could as if by magic quiet roiled ponds at his command.) His lightning rods saved homes and lives despite the protestations of certain divines that they interfered with the fiery instruments of God’s righteous wrath. Believing as he did in practical progress, Franklin championed universal education, freedom of (and from) religion, freedom of speech and of the press, free-market competition, self-improvement, and the other middle-class virtues of what he liked to call “we the middling people.” Initiates into the Junto, a club of young workingmen that Franklin founded in Philadelphia in 1727, were asked to affirm that they respected the current members, felt affection for people regardless of their profession or religion, and would love and pursue the truth for its own sake. His Poor Richard’s Almanack of 1733–58 imparted scientific and technological knowledge in a spirit of personal improvement (it was sometimes titled Poor Richard Improved), seasoning the mix with a sage realism: “Love your neighbor; yet don’t pull down your hedge” and “Three may keep a secret, if two of them are dead.”
Franklin’s career, like Voltaire’s, fell into three acts—first the pursuit of wealth and fame, then of scientific knowledge, and finally of liberal political reform—and he generally kept his priorities straight. When he signed the Declaration of Independence he was a wealthy, powerful, seventy-year-old celebrity who appreciated the risks he and his fellow delegates were taking—telling them, “We must, indeed, all hang together, or most assuredly we shall all hang separately”—but he anticipated the future with cheerful benignity. “The rapid Progress true science now makes, occasions my regretting sometimes that I was born so soon,” he told Joseph Priestley, predicting that as science advanced, “Agriculture may diminish its labor and double its produce; all diseases may by sure means be prevented or cured, not excepting even that of old age, and our lives lengthened at pleasure.” He suggested to Sir Joseph Banks, the president of the Royal Society in London, that, “Furnished as all Europe now is with academies of science, with nice instruments and the spirit of experiment, the progress of human knowledge will be rapid and discoveries made of which we have at present no conception.” He literally bet on the future, providing in his will for the disposition of £2,000, plus accrued interest, two hundred years after his death.
Franklin’s cheerful practicality sat poorly with Romantics like the poet John Keats, who dismissed him as “not sublime” (predictably so, since Keats thought that “the humanity of the United States can never reach the sublime”) and Keats’s friend Leigh Hunt, who sniffed at Franklin’s having been a printer, saying that “something of the pettiness and materiality of his first occupation always stuck to him.” Franklin’s technological optimism was ridiculed by Henry David Thoreau, the American Rousseau, who camped out a mile from his boyhood home and wrote a book about it, subtitled Life in the Woods, full of lamentations about the alleged spiritual poverty of civilization. “Our inventions are wont to be pretty toys,” wrote Thoreau, “which distract our attention from serious things…. We are in great haste to construct a magnetic telegraph from Maine to Texas; but Maine and Texas, it may be, have nothing important to communicate.” Railroads, which would have delighted Franklin, aroused Thoreau’s particular contempt: “A few are riding, but the rest are run over…. No doubt they can ride at last who shall have earned their fare, that is, if they survive so long, but they will probably have lost their elasticity and desire to travel by that time.” Franklin’s bourgeois values were satirized in the works of Sinclair Lewis, who denounced “the easy-going descendants of the wise-cracking Benjamin Franklin” (It Can’t Happen Here, 1935) and invited his readers to sneer at the fictional Babbitt, who, if asked about his religion, “would have answered in sonorous Boosters’ Club rhetoric, ‘My religion is to serve my fellow men, to honor my brother as myself, and to do my bit to make life happier for one and all’”—a flawless sentiment that Lewis discounted as self-evidently unsublime.
Yet people somehow did find uses for the telegraph, the telephone, and the railroads, and persisted in believing that they could do worse than to toil on behalf of themselves and their brethren in hopes of making the world a somewhat better place. And what did they achieve, in terms of health, wealth, and happiness?
Regarding health: In 1800, a decade after Franklin’s death, half of the world’s nearly one billion people died by age thirty. Two centuries later, the global population had swelled to six billion but life expectancy had more than doubled, to over sixty-five years of age. (In developing nations it was sixty-three; in the developed nations, seventy-nine. The World Health Organization estimates that average global life expectancy, which was forty-eight years for those born in 1955, will be seventy-three years for those born in 2025.) Thanks to advances in agriculture, medical research, and social hygiene, the rate of premature deaths at the end of the twentieth century had dropped to under 6 percent worldwide, and in the developed nations to under 1 percent, while the number of people aged sixty-five and older rose to 15 percent of the general population. (In 1800 it had been under 5 percent.) These gains were attained despite such disastrous twentieth-century setbacks as the influenza epidemic of 1918–19, which killed over forty million people, the two World Wars (over seventy million dead), the communist Chinese famine of 1959–61 (fourteen to thirty million), and the HIV/AIDS epidemic (twenty-five million and counting).
People not only lived longer but became taller, more robust, and generally healthier. Stature—the maximum lifetime height attained by the average individual—increased throughout the nineteenth and twentieth centuries, due in part to significant reductions in vitamin- and mineral-deficiency diseases such as scurvy, pellagra, rickets, and goiter (which are prevented by vitamin C, niacin, vitamin D, and iodine, respectively). It is estimated that almost half of the labor involved in building the canals, railroads, highways, bridges, and buildings associated with the Industrial Revolution came not from machines but from the increased strength of bigger and stronger laborers, whose personal energy levels increased by roughly 50 percent from 1790 to 1980. Such progress continues today: World food production grew by 52 percent per person from 1961 to 2001, and malnutrition in the developing world dropped from 45 percent in 1945 to 18 percent in 2001. While much remains to be done in reducing such scourges as diarrheal diseases, which killed almost 3 million infants and children worldwide in 1990; measles (2 million deaths annually, almost all of them children); and tuberculosis (over 2.5 million deaths, mostly adults under age fifty), humankind has already come a long way toward realizing what the historian James C. Riley calls “the hope that all people will enjoy a long and healthy life.”
Regarding wealth: In 1800, the world’s per capita GDP was about seven hundred dollars per year, and was growing at a fraction of 1 percent annually. With the Industrial Revolution the global economy took off, reaching a per capita GDP of $6,460 and a growth rate of 3 percent for all humanity by 2008. Although this unprecedented increase in wealth was concentrated in the liberal-scientific nations, the poor were not entirely left behind. In 2007 the developing world enjoyed growth rates of over 7 percent, three times that of the developed world, and the peoples of emerging nations such as Mexico were earning more money than British subjects did in 1900, when England was near its peak of power. The efficacy of science and liberalism in realizing such results is underscored by the tragic example of sub-Saharan Africa, which has persistent extreme poverty, and where neither science nor democracy has yet made many inroads. By contrast, in East Asia, where science and democracy have been on the increase, the number of people living in extreme poverty has dropped from eight hundred million to under three hundred million in only three decades.
Happiness is harder to measure but a good place to start is with education, the root of which is literacy. In a very real sense, those who cannot read are unable to make much sense of the wider world. “We are all illiterate here and so we don’t know what is happening,” a Chinese farmer told Patrick E. Tyler of the New York Times in 1996. “If something is harmful, we don’t know it. If something is good, we don’t know it.” A neighbor added, “Because I have never been to school, I am like a blind person, even though I have eyes.” So it is cheering to observe that world illiteracy has long been in retreat. Europe went from a medieval adult literacy rate of under 10 percent (estimated by checking old contracts to see how many signed their names rather than making a mark) to nearly half the adult population by 1750 and 99 percent today. Japan’s 1872 Fundamental Code of Education, aimed at ensuring that there would be “no community with an illiterate family, nor a family with an illiterate person,” was so effective that by 1913 more books were being published in Japan than in Britain. More than 80 percent of all the adults in the world today are literate, up from 73 percent in 1985, and the literacy rate continues to grow by about half of 1 percent annually. In 1970, 37 percent of all people over age fifteen were illiterate; today, fewer than 18 percent are. Lingering illiteracy is almost entirely a matter of supply; there is little lack of demand. When several African nations eliminated school fees in 2004, they found their classrooms crammed with new students almost overnight. Uganda’s public-school enrollment doubled in one year, while in Kenya, a million pupils enrolled within a matter of days. 
(When one of them, Joseph Lolo, age sixteen, was urged by his impoverished father to return to work to help support his family, he replied, “What will save me if I don’t go to school?”) Except in sub-Saharan Africa, where almost 40 percent of the children still have no access to education, nearly all the world’s children are enrolled in primary schools; 70 percent of those of high-school age are enrolled, and one in four goes on to some form of higher education. In 1953, only 7 percent of adult Americans had a college degree; today nearly a third do.
Wider-reaching indicators of happiness include the United Nations Human Development Index (which combines literacy, income, and life expectancy) and various surveys of “subjective well-being.” Such studies indicate that the residents of wealthy nations like France and Japan are on the whole happier with their lot than are those of poor nations like Oman and Albania, but there are some interesting wrinkles. A sense of happiness and contentment grows with increasing income up to about $15,000 per year—but above that point, more money evidently makes little difference. Intriguingly, those living in nations that distribute their income the most equitably report themselves happiest: The inhabitants of relatively socialistic nations like Iceland, Holland, Finland, and Sweden show up as happier than Americans, although the Americans make more money. (The Danes, who fork over nearly half their income to the tax collector, were named the world’s happiest people in 2008.) Saudi Arabia has a per capita GDP of nearly $9,000 but distributes its wealth so vertically—Saudi princes living like princes while many ordinary Saudis cannot even go to a decent public school—that it ranks lower in perceived quality of life than do poorer nations like Poland and Costa Rica. The picture is complex—overall, the wealthiest people in virtually every nation are happier than their poorer compatriots—but it seems safe to say that the general rise in global income has at least made most people today feel happier than their grandparents were. If any deep spiritual gratification is to be realized by being poor and unhealthy, nothing of it shows up in the social-science polls.
All these gains were made following the ascent of science and liberal governance. Global spending on science and technology R&D now surpasses a trillion dollars a year, while the number of liberal-democratic nations has increased from about a dozen in 1900 to thirty-six in 1960 and sixty-one in 1990. (In 1977, only 28 percent of the world population was classified as “free”; by 2007 the ranks of the free had grown to 47 percent.) But while the connection between science and improving health seems clear enough, to what extent do science and liberty deserve credit for the world’s increasing wealth?
Science generates profits, of course—most of America’s annual GDP growth comes from science, technology, and invention, while nearly half of the world’s one hundred largest corporations make their money chiefly from science and technology—but liberalism also plays a role. Worldwide, the growth of wealth correlates with “economic freedom”—which is not the same thing as political freedom (see China) but is characteristic of the liberal democracies. The Economic Freedom of the World index defines economic freedom as “personal choice, voluntary exchange, freedom to compete, and protection of person and property.” Its 2007 survey of more than 120 countries found that the least-free fifth (or “quintile”) of nations, which included Syria, Myanmar, and Zimbabwe, had a per capita GDP of only $3,305 and were growing economically by less than half of 1 percent annually, while the most-free quintile, including Hong Kong, Switzerland, the UK, and the U.S., averaged a GDP of over $26,000 and a 2.3 percent growth rate. So economic liberalism, at least, correlates with wealth.
Worldwide, per capita income rose eightfold in constant dollars from 1820 to 2000, a remarkable achievement. But this would mean little if only a few were enriched while the majority stayed poor. (If Warren Buffett and Bill Gates walk into the restaurant where you’re having dinner, statistically the average diner is suddenly worth hundreds of millions of dollars, but that’s no help in paying your check.) In the United States all five quintiles enjoyed rising incomes from 1980 through 2005 but the rate of rise for the top quintile outpaced the rest, making the nation more economically vertical than it had been since the Roaring Twenties. In the long term, though, the American financial picture has not been one of titans ruthlessly exploiting workers. The number of hours put in by the average American laborer declined steadily from 1870, when it stood at just under 3,000 hours per year, to less than 1,600 hours in 1998, yet per capita GDP increased more than eightfold during the same period, rising from $6,683 to $55,618. In the late fifties, when John Kenneth Galbraith characterized the United States as an “affluent society,” the average American made under $9,000 a year, compared to $20,000 in 2005, by which time the median American family’s net worth (meaning assets minus liabilities) exceeded $100,000.
What about the world as a whole? Are the rich nations getting richer at the expense of the poor nations? Many have claimed that this is the case, and that globalization—meaning international free trade—is to blame: “Globalization has dramatically increased inequality between and within nations”; “global income inequalities are widening”; “inequality is soaring through the globalization period”; and even, “The attacks on the World Trade Center and the Pentagon stemmed from tensions created by the widening gulf between rich and poor nations.” Alarming if true, but is it true?
A conspicuous gap opened up between rich and poor in about 1820, when the Industrial Revolution reached critical mass, and it continued to widen until about the early 1970s. Globalization got going around 1980, so the relevant question is whether economic inequality has increased since then. It may stand to reason that a booming global economy benefits the rich and leaves the poor behind, but applied reason is only as good as the data on which it is based. An objective analysis of the best available data (which, admittedly, are far from flawless) indicates that while larger inequalities did crop up here and there, global inequality overall ceased growing from 1980 to 2000 and in many respects began to shrink. More to the point, income inequalities tend to be largest within the nations least touched by globalization. The poorest 10 percent of those living in the rich, free, “globalized” nations earn 2.5 percent of the total national income, while their counterparts in the least-free, least globalized nations get only 2.2 percent of the pie. (The disparity is even worse in absolute terms: The bottom 10 percent in the least economically free nations make under $1,000 a year, while in the freest nations they earn over $7,000.) So there is little evidence that the rich got that way by exploiting the poor.
One source of confusion on this point arises from the statistical matter of how economists bin their data. Even if economic inequality were increasing within every nation (which it is not) and between all nations (which it also is not), the gap between rich and poor for all humanity could still be decreasing, if for instance large numbers of poor people in the most populous nations were making more money. And this is just what is happening, notably in India and China, the two biggest countries in the world. China in 2004 had a per capita GDP of under $4,000, compared to over $35,000 for the United States, but it also had four and a half times the population of the United States and was growing economically at triple the U.S. growth rate. Nor was this just a brief spurt: From 1980 to 2000 China’s economy grew an average of 8.3 percent annually, and India’s 3.7 percent, while the highest-income nations grew by under 2 percent. So a massive elevation of poor people in a large nation can reduce global inequality, even if that nation grows more economically vertical within its borders.
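The binning point can be made concrete with a toy calculation. The figures below are purely illustrative (they are not the Chinese or American data discussed above): a populous poor nation whose incomes rise fast but unevenly can become more unequal internally even as the gap between it and a rich nation shrinks.

```python
# Toy illustration: inequality can rise WITHIN a nation while the gap
# BETWEEN nations falls, if the populous poor nation grows quickly.
# Each pair holds hypothetical per-person incomes for the poorer half
# and the richer half of a nation's population.

def gap(poorer, richer):
    """Ratio of the richer income to the poorer income."""
    return richer / poorer

def avg(pair):
    """Average income of the two halves."""
    return sum(pair) / 2

# Before growth: big nation B is uniformly poor; small nation S is rich.
b_before = (1_000, 1_000)      # no internal inequality in B
s_before = (30_000, 40_000)

# After growth: B's incomes rise sharply but unevenly; S grows slowly.
b_after = (2_000, 4_000)       # an internal 2x gap opens inside B
s_after = (33_000, 44_000)

print(gap(*b_before), gap(*b_after))        # within B: 1.0 -> 2.0 (worse)
print(avg(s_before) / avg(b_before))        # between nations before: 35.0
print(avg(s_after) / avg(b_after))          # between nations after: ~12.8 (better)
```

Everyone in nation B ends up better off, and the world as a whole is more equal, even though B itself has become more economically vertical.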
The index customarily employed to measure economic inequality is called the Gini ratio. It ranges from zero to one, where zero represents perfect equality (everybody has exactly the same income or worth) and one indicates absolute inequality (one person or nation has all the money and the rest have none at all): The lower the Gini ratio, the less the inequality. A few studies suggest a modest rise in the Gini since 1980, but most indicate that it has been falling—meaning that nations are becoming economically more equal, not less so. This tendency holds up even if we set aside 80 percent of the world’s population and compare only the top and bottom 10 percent. Growing inequality is found mainly by restricting our attention to just 1 percent at each end of the economic spectrum—the very richest of the rich, and poorest of the poor, with about 60 million persons in each category. There, inequality is indeed on the rise. But since the extremely poor, many of whom are in sub-Saharan Africa, are the least touched by globalization, the disparity argues for more, not less, global trade and economic liberalism.
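For readers who want to see the arithmetic, the Gini ratio can be computed directly from a list of incomes as the mean absolute difference between every pair of incomes, divided by twice the mean income. A minimal sketch (the income lists are invented for illustration):

```python
def gini(incomes):
    """Gini ratio: 0 = perfect equality, 1 = one holder has everything.

    Computed as the mean absolute difference between all ordered pairs
    of incomes, divided by twice the mean income.
    """
    n = len(incomes)
    mean = sum(incomes) / n
    diff_sum = sum(abs(x - y) for x in incomes for y in incomes)
    return diff_sum / (2 * n * n * mean)

print(gini([1, 1, 1, 1]))      # perfect equality -> 0.0
print(gini([0, 0, 0, 100]))    # one person has it all -> 0.75
                               # (approaches 1.0 as the population grows)
```

Note that for a finite population the "absolute inequality" case yields (n − 1)/n rather than exactly 1, which is why the second call prints 0.75 for four people; real-world Gini estimates, computed over millions of earners, are not affected by this edge case.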
These findings are consistent with what Nathan Rosenberg and L. E. Birdzell Jr., in their book How the West Grew Rich, call “an oddity of Western economic growth that, while it made some individuals extremely rich, it benefited the lifestyle of the very rich much less than it benefited the lifestyle of the less well-off [because] the most lucrative new products were those with a market among the many, rather than among the few.” These include high-tech products once regarded as the province of the well-to-do but now widely available: Worldwide, one in every five persons circa 2008 had access to a computer, and more than half had mobile phones.
In retrospect, it is clear that the Industrial Revolution generated global economic inequalities and that colonialism may have exacerbated the situation. But inequality is to some extent the unavoidable consequence of any sudden increase in wealth. Imagine a nation in which everybody is equally poor: If the population is divided into fifths by income, each quintile has about 20 percent of the income. The citizens of this imaginary nation now set about training themselves in, say, computer science and technical support—as has happened in Ireland and India—and are rewarded with an economic boom so powerful that soon, 80 percent of them have doubled their incomes. Since nearly all this fresh money was earned by the top four-fifths of the population, the income distribution now shows a much more unequal picture. This may strike some as undesirable or even unjust—one hopes that the bottom fifth will come to share the wealth—but for a formerly poor nation to have doubled the earnings of four-fifths of its population is a major achievement, one that cannot be dismissed as a statistical artifact of the “Warren Buffett and Bill Gates walk into a restaurant” variety.
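The thought experiment is easy to put in numbers (the incomes below are arbitrary units chosen for illustration): start with five equal quintiles, then let the top four double their incomes and recompute each quintile's share of the total.

```python
# Five quintiles, initially equal; then the top four-fifths double
# their incomes. Shares are each quintile's percentage of total income.
before = [20, 20, 20, 20, 20]
after  = [20, 40, 40, 40, 40]

def shares(quintiles):
    total = sum(quintiles)
    return [round(100 * q / total, 1) for q in quintiles]

print(shares(before))   # [20.0, 20.0, 20.0, 20.0, 20.0]
print(shares(after))    # [11.1, 22.2, 22.2, 22.2, 22.2]
```

Measured by income shares, inequality has clearly worsened: the bottom quintile's share falls from 20 percent to about 11 percent. Yet nobody's absolute income fell, and four-fifths of the population doubled theirs—which is the distinction the paragraph above is drawing.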
Liberalism fosters scientific and technological progress by accommodating change without pretending to know where changes will lead. Unlike the political right, it denies that the answers to contemporary problems are necessarily to be found in the past; unlike the left, it refuses to sacrifice extant freedoms in order to attain progressive goals. “All institutions of freedom are adaptations to [the] fundamental fact of ignorance,” notes Friedrich Hayek. “Nowhere is freedom more important than where our ignorance is greatest—at the boundaries of knowledge, in other words, where nobody can predict what lies a step ahead.” Scientific and technological creativity stands at those frontiers, which by virtue of their inherent unpredictability are ill-suited to hierarchical organization and planning. “The Western scientific community was more, rather than less, efficiently organized by reason of its lack of a hierarchy,” write Rosenberg and Birdzell. Top-down organization may spur development in the early stages of a scientific or technical endeavor, when the questions are still relatively simple and the variables few, but thereafter it will tend to hinder progress unless it gets out of the way. 
Joseph Needham, whose magisterial Science and Civilization in China helped awaken Europeans and Americans to the technological ingenuity of the East, blamed government bureaucracy for the fact that China failed to expand its scientific research after about the fifteenth century: “‘Bureaucratic feudalism’ at first favored the growth of natural knowledge and its application to technology for human benefit,” he writes, “while later on it inhibited the rise of modern capitalism and of modern science”; progress “was constantly inhibited by the scholar-bureaucrats.” Notwithstanding a few exceptions such as the Manhattan Project’s development of nuclear weapons, neither government nor industry can expect to get results by assembling a lot of scientists and ordering them to produce a wished-for result—as was confirmed by President Richard Nixon’s futile War on Cancer.
Planners seek to direct human effort toward useful endeavors, but in a changing world even the most fruitful projects can look pointless at first. When an associate commiserated with Thomas Edison over his having conducted nine thousand unsuccessful experiments in trying to devise a new type of battery, saying, “Isn’t it a shame that with the tremendous amount of work you have done you haven’t been able to get any results?” Edison, grinning, replied: “Results! Why, man, I have gotten a lot of results! I know several thousand things that won’t work!” The imperious motto of the 1933 Chicago Century of Progress exposition, “Science Finds—Industry Applies—Man Conforms,” was off the mark on all three points. The most fertile facts found by science are seldom of immediate technological value, although they may become so eventually. Faraday’s research laid the basis of modern electronics, but it took a century for the electronics industry to come into its own; Einstein’s relativity showed the path to nuclear energy, but the path was so obscure that Einstein himself originally thought it impassable. Nor does industry merely apply scientific knowledge; it often makes its own discoveries, for its own reasons. Food preservation was invented in 1810 by the Paris confectioner Nicolas Appert, who won a cash prize, put up by Napoleon Bonaparte, by establishing that foods would not spoil if they were sealed in boiling-hot bottles with no room for air; all this happened sixty-three years before Louis Pasteur explained why it worked. Superconductivity—the ability of an electrical current to flow at zero resistance through certain materials—was discovered decades before scientists began to comprehend its physics. Velcro was the brainchild of a Swiss electrical engineer who knew little about materials properties, and while computers were pioneered by scientists, the first PC was assembled by a couple of youthful hobbyists working in a suburban garage.
Nor have the products created through free scientific and technological inquiry tended to make people conform; rather, they have presented them with new choices. People say that they hate their mobile phones but cannot live without them. If they are conforming to anything when they use their cell phones—which, among their other virtues, have proven effective in reducing poverty—it is to their own preferences rather than to the dictates of industry, which had mobile-phone technology in the 1940s but could find no market for it.
This is not to argue for anarchy. Like liberalism, scientific and technological progress requires law, an underlying structure that works to keep things fair and honest and to protect the intellectual property rights of inventors; hence the importance placed on the U.S. Patent Office by the American founders. Scientific experimentation takes place within a structured framework of publication, peer review, grant applications, and free—often scathing—debate, the general effect of which is to weed out inferior results. Free-market capitalism requires legal and ethical restrictions too, not only to protect ordinary folks from fat-cat predators but also to assure the normal functioning of markets. It is no coincidence that the British, who borrowed the old Roman phrase “laws of nature” and applied it to scientific findings, are widely respected for their legal system and the quality of their scientific research alike.
One of the most hopeful, if least appreciated, signs of world progress is growing urbanization. Improved agricultural productivity is now having much the same effect in the developing world as it previously had in Europe and America: Freed from the necessity of remaining on farms, billions of people are moving to cities. The number of urban dwellers quadrupled between 1950 and 2005; today, for the first time in history, the majority of people live in cities. Although the political left likes to portray urbanization as driven by ruthless capitalism or environmental problems, in the main it appears to be the result of free individual choice. Pastoral romances to the contrary, over the course of history the vast majority of those residing in the countryside did so because they had no real alternative. They lived as their ancestors had lived, farming or fishing at subsistence levels, and if they happened to visit a city—which typically required a costly, arduous, and often dangerous journey—they seldom did more than briefly conduct their business, gawk at the sights, and express astonishment at the prices before hastening home. This historic logjam broke when scientific agriculture diminished the demand for farm labor. No longer needed at home, inspired by the prospect of better jobs and schools in the cities, and able to get there by bus on decent roads rather than by trudging through dark forests infested by highwaymen, people—young, enterprising people especially—moved to town.
Their vast migration, involving roughly two billion people in just a half century, has put the brakes on population growth, which in the developing world slowed from 2.06 percent annually in 1955 to 1.48 percent by 2005. (Rural families, whose children can help with the chores, tend to have more children than do urbanites, for whom children mean more mouths to feed in cramped quarters.) As cities now account for most of the world’s population growth, the global growth rate is expected to continue to decline. The civil engineer Robert Ridgway had a point when he said, back in 1925, that “perhaps the most notable effect of the application of those laws of nature which have been brought to light by the patient investigations of the scientist during the past century and a half is evidenced in the wonderful growth of cities.”
The new arrivals are seldom greeted with open arms. Poor, undereducated, and unable to compete for any but menial jobs, they have congregated in the vast shantytowns that now ring megacities like Rio de Janeiro, Mexico City, Nairobi, Lagos, Jakarta, Karachi, Shanghai, Delhi, and Bombay. There they contend with inadequate water supplies, sewage disposal, electricity, and health care. Even the most hard-hearted capitalists should like to see their living conditions improved, if only for the self-interested reason that slums are potential breeding grounds for epidemics that could break out and sweep across a jet-girdled world.
But while this new rise of slums is unprecedented in scale, the same phenomenon has occurred many times before, whenever agricultural improvements freed the rural poor to urbanize themselves. In Europe, urbanites rose from 10 percent of the population in 1800 to nearly 30 percent by 1890. The pace of urban growth was particularly swift in industrial regions such as the northern British Isles; the population of Manchester and Leeds tripled between 1810 and 1841, and Glasgow grew 40 percent in a single decade, 1831–41. Little new housing was built to accommodate the influx, and the resultant overcrowding and squalor fueled epidemics of typhoid, typhus, smallpox, cholera, and tuberculosis. Life expectancy at birth in Manchester in 1841 was 26.6 years, half that of Bangladesh and Haiti today. The grim spectacle of the working poor crammed into pestilent slums and working in dank, dangerous factories created an image of industrial progress as having been purchased at an exorbitant cost in human lives, environmental desolation, and the rise of a hard-hearted, class-conflict society in which every human interaction was reduced to cash on the barrelhead.
London, beset by successive waves of immigration from the countryside beginning in the late sixteenth century, was a conspicuous laboratory of this rapid, unplanned, and widely unwanted new urbanization. Barred from entering by the restrictive employment practices of the urban guilds, immigrants from the farms squatted outside the city walls in shantytowns called rookeries (or suburbs, meaning “under the city”). Alarmed to find growing hordes of rustics at their doorstep—many of whom had brought their farm animals with them, giving rise to the term “pigsty” to mean a filthy dwelling—Londoners from 1662 to 1697 passed a series of Poor Laws that required the poor to return home if they wanted to get on the relief rolls or obtain certificates of urban settlement. These harsh measures had little effect, and the slums had become even more overcrowded by the nineteenth century. Charles Dickens’s Oliver Twist, serialized in 1837–39, described “unwashed, unshaven, squalid, and dirty figures…thieves, idlers, and vagabonds of every low grade…constantly running to and fro” in “nearly ankle-deep…filth and mire.” The London Times protested, in an editorial of October 12, 1843, that “within the most courtly precincts of the richest city of GOD’s earth, there may be found…FAMINE, FILTH AND DISEASE.” Fyodor Dostoevsky, visiting London in 1862, called it “a biblical sight, some prophecy out of the Apocalypse being fulfilled before your very eyes.”
Reformers were quick to blame the British economic system: “Rich London is the creature of slum-London, of poor London,” wrote William Morris in 1881,
and though I do not say that the London slums are worse than those of other big cities, yet they together with the rich quarters make up the monstrosity we call London, which is at once the centre and the token of the slavery of commercialism which has taken the place of the slaveries of the past…. The sickening hideousness of London, the metropolis of the nation, which has worked out the sum of commercialism most completely, seems to me a mark of disgrace branded on our wire-drawn refinement to show that it is based on the worst kind of theft—legal stealing from the poor.