Angelo State University

Historical Perspectives on the American "Underclass"

by Arnold R. Hirsch

A conference on "Genetic Factors in Crime" was scheduled this month at the University of Maryland before its cancellation amidst charges of "veiled racism." The conference was to be part of a larger federally funded "violence initiative" announced by Frederick Goodwin, director of the National Institute of Mental Health. Comparing decaying inner cities to "jungles" and their youth to "rhesus monkeys," Dr. Goodwin hoped to launch a massive program to identify potentially violent inner-city children on the basis of biological or genetic "markers." Ignoring the social, economic, and political contexts for urban poverty and violence, the search for highly individualized psychiatric or behavioral controls betrays a disturbing -- if hardly novel -- perspective. It reveals the bitter fruit that has grown out of a decade's worth of attention devoted to the so-called "underclass."

For historians, the invention of the "underclass" in the 1980s is an old story. It echoes debates concerning the discovery of a "culture of poverty" in the 1960s, as well as those regarding "pauperism" and the "undeserving poor" that stretch back to the first quarter of the 19th century. Its attractions are similarly timeless. Its very imprecision makes it a nearly empty vessel into which almost anything can be poured. The "underclass" is, after all, not a scientific concept, despite some early pretensions to that lofty designation. It is a social construct, a popular perception that fits a particular political context and purpose. Its ultimate attraction is that it explains the paradox of persistent poverty in the midst of plenty in a way that satisfies a majority of the American people.

This is not to say that the so-called "underclass" is a mere figment of our collective imagination. There is something new, or at least something newly visible in the last generation, that gives the notion some of its cachet. The feminization of poverty, the segregation by class within urban minority communities, and the growing concentration and isolation of the poorest members of those communities are striking recent developments. Nevertheless, the fig leaf of cover provided by these new fragments of reality is not enough to hide the character of the "underclass" phenomenon itself. It represents the homogenization and objectification of a segment of our poor in the attempt to split the American people into a "them" and "us." It also represents a diversion that moves our eyes away from long-standing structural economic and social problems, and fixes them on a clutch of individuals who are massed indiscriminately at the base of our social ladder.

The image of a society placed in danger by the immoral, reckless, and uncontrolled behavior of its "have nots" took deep root in the early 1980s because of the climate of that particular time, but also because it tapped deeply rooted, almost visceral feelings among the American people. Before we can lay bare the inner appeal of the notion of an "underclass," however, we must be clear on what we are talking about.

What is the "underclass"? It seems immediately apparent that, as commonly used, the notion of the "underclass" does not encompass the majority of American poor, who are white. Neither does it account for historical, regional patterns of poverty that have particularly encumbered rural America and especially the South. The accepted usage seems -- whether accompanied by winks and nudges or not -- to focus on those racial minorities among the urban poor who haunt the metropolitan shells of a fading industrial America. It is to them that values alien to "mainstream" culture are imputed.

This conception of a perceived "underclass" composed of poor, urban minorities combines a triad of deeply rooted biases that carry great historical weight. A pronounced public disdain for the poor emerged in the United States with the appearance of a modern, commercial economy early in the 19th century. Such class prejudices, when reinforced by a traditional suspicion and fear of cities themselves, have led Americans since the Revolution to be leery of the seemingly alien cultures that have flourished in our urban cores. Indeed, bouts of urban mob violence emanating from the depths of the American social order -- or at least the fear of such eruptions -- have numerous times been perceived as threats to the very existence of the republic. When overlaid, finally, by the deepest anxieties held in American society -- those related to race and ethnicity -- we have a deadly mix.

What I propose to do is trace the emergence and evolution of this mutually reinforcing trio of fears and to show how they have joined in a particularly devastating popular image in our own time. A look at traditional American conceptions of the poor, of the cities, and of racial minorities will help explain contemporary calls by respected analysts for "targeting" troublesome individuals and demands for their social, or perhaps literal, sedation. The point is that the "underclass" is in the eye of the beholder, and those who must have their values scrutinized in this debate do not all reside in the central city.

America's initial confrontation with urban poverty on a massive scale, as well as its attempts to come to grips with the problem in modern terms, encompassed the century that stretched from the 1820s to the 1920s. It was an era of intense urbanization, industrialization, and immigration that transformed a pastoral nation into the world's leading economic power. At the time of the first census in 1790, only 5 percent of all Americans lived in cities and the nation had only five urban centers with a population of 10,000 or more. By 1860, the end of the antebellum era, there were 101 cities housing 10,000 or more people, and by 1920, a majority of Americans lived in urban centers. This represented a shift of epic proportions that redefined the nation and -- holding questions of class and race in abeyance for the moment -- witnessed the emergence of new urban social values that subverted the idyllic image of the independent Jeffersonian yeoman farmer.

This transition was made all the more jarring by the social fallout that accompanied a booming capitalist economy. The growth of cities simultaneously generated the concentration of a rapidly rising number of poor people and enhanced the visibility of the social problems attendant to their gathering -- the appearance of slums, epidemic disease, crime, mob violence, prostitution, and drunkenness, to cite only a few. The reasons behind the growth of poverty and urban disorder in an expanding industrial economy, if barely discerned at the time, seem clear enough in retrospect. Wage levels were such that even those fortunate enough to find work had great difficulty in supporting families above any rationally defined poverty line. The so-called "working poor" were legion, and the vicissitudes of the business cycle frequently pushed those living on the margin into abject poverty. Poor pay and the irregularity of work, in short, contributed mightily to the ranks of those in need. The need for more than one wage earner in a family also meant that any unforeseen adversity -- illness, accident, or death, all quite common in the 19th century -- could immediately plunge a family into destitution. Widows, given their restricted employment possibilities, were particularly vulnerable, and the mortality rates of working-class men who toiled long hours in the most hazardous jobs, in the least healthful conditions, guaranteed that there would be many. In an age before pensions, unemployment compensation, and Social Security, it did not take much to reduce a family to dependency.

Such economic and demographic realities, however, did not occupy center stage as explanations for these disturbing conditions which were evident as early as the 1820s and 1830s. Increasingly, commentators began to distinguish between the poor and paupers; between poverty, which was seen as an accidental condition, and pauperism, which, they suspected, had become a way of life, a cultural imperative. In 1834, the Reverend Charles Burroughs, in opening a new chapel in the poorhouse in Portsmouth, New Hampshire, made the distinction explicit. "In speaking of poverty," he said:

"let us never forget that there is a distinction between this and pauperism. The former is an unavoidable evil, to which many are brought from necessity. . . It is the result, not of our faults, but of our misfortunes . . . Pauperism is the consequence of willful error, of shameful indolence, of vicious habits. It is a misery of human creation, the pernicious work of man, the lamentable consequence of bad principles and morals."

Reverend Burroughs's poorhouse oration and moral critique signaled the dominant response to antebellum urban poverty. The poorhouse itself was symptomatic of a wave of institution building designed precisely to sweep the most dangerous individuals off the streets while instilling the values of deference, order, and prudence. Public hospitals, asylums, prisons, public schools, houses of correction, and professional police departments all emerged in the middle third of the 19th century with precisely those ends in mind. The revival and transplantation of evangelical Protestantism to American cities, which took place in the same period, also produced urban missions, Bible tract societies, temperance movements, Sunday schools, and efforts to enforce the prohibitions of the Sabbath, all to combat the kind of moral degradation detected by the Reverend Burroughs.

In the postbellum era, the professionalization of social services and the emergence of the social sciences built on these foundations. The "scientific charity movement" resulted in Charity Organization Societies being formed in most major cities after 1878. Seeking to systematize the dispensing of charity, these societies strove mightily to separate the "worthy" from the "unworthy" poor, campaigned to end all "outdoor relief" (what we would today call welfare), and, by employing "friendly visitors" (agents who would check on the economic and moral condition of their clients), tried to repair relations between the classes while making certain their money was not being wasted. The most striking development, though, was the contribution of social science to the analysis of urban poverty. The Charity Organization Societies, in fact, grew out of a study conducted by Dr. Charles S. Hoyt, secretary of the New York State Board of Charities. The study confirmed popular notions about the moral origins of pauperism. In examining nearly 13,000 institutionalized dependents, Dr. Hoyt charged that:

". . . the greater number of paupers have reached that condition by idleness, improvidence, drunkenness, or some form of vicious indulgence . . . These vices and weaknesses are very frequently, if not universally, the result of tendencies which are to a greater or less degree hereditary. The number of persons in our poor-houses who have been reduced to poverty by causes outside their own acts is . . . surprisingly small."

Ignoring the economic depression that gripped the nation as he collected his data and classifying anything less than total abstinence as the "intemperate" use of alcohol, Dr. Hoyt wrapped a cloak of scientific objectivity around prevailing prejudices.

Hoyt was not alone in this academic enterprise. In his analysis of the 1880 census, Frederic Howard Wines collapsed the "blind, insane, prisoners, deaf-mutes, idiots, paupers, and homeless children" into a single social category. Constructing, in the words of historian Michael B. Katz, a "metaclass" of "defective, dependent, and delinquent" human beings, Wines legitimized what he called a "morphology of evil." In so doing, he railed against the "defective types of humanity" that threatened to overwhelm society and reasoned that policymakers needed a sense of "the whole extent of the evil to be contended against." The cutting edge of American scholarship had consigned those on the economic fringe, those with physical or other problems, to a moral "underclass" of its own design.

The demographic explosion of the presumed "dangerous classes" in the 19th century fed a long-standing suspicion of the cities as unwholesome places with baleful influences. Thomas Jefferson clearly equated republican health with an independent and autonomous rural population, even as he nursed deep fears of the urban mobs he witnessed in revolutionary Europe. He even went so far as to offer grudging praise of a yellow fever epidemic in Philadelphia for its presumed effect in discouraging future urban concentrations.

By the 1830s, one of the most astute observers of American society, Alexis de Tocqueville, also noticed the combustible elements gathering in American towns and included among them, not just the native white poor, but also freed blacks and immigrants. "I look upon the size of certain American cities," de Tocqueville wrote:

". . . and especially on the nature of their population, as a real danger which threatens the future security of the democratic republics of the New World; and I venture to predict that they will perish from this circumstance, unless the government succeeds in creating an armed force which, while it remains under the control of the majority of the nation, will be independent of the town population and able to repress its excesses."

Despite the familiarity of the rhetoric, we should note that de Tocqueville was not as rhapsodic about the prospect of uniformed and armed young men taking back the streets of our cities as was Pat Buchanan at the 1992 Republican Convention in Houston. The 19th-century French visitor believed, unlike many who followed, that the seemingly "hereditary misery and degradation" that was the lot of the urban multitudes stemmed, not from their genes or culture, but from public opinion and the laws of the land. Still, de Tocqueville's warning and Buchanan's ritual incantation to "take back our cities, . . . take back our culture, . . . take back our country" drew on long-standing, deeply held fears.

There are two additional measures that highlight traditional doubts and suspicions about American cities. The first is the trend toward suburbanization, evident since the middle of the 19th century. Cities have functioned as historical halfway houses as the native white American middle class moved from the small town and countryside to suburbia. Not only did the relatively well-to-do vote with their feet in deserting the central city, almost as rapidly as they arrived, but successful suburban resistance to forced annexations led to the revelation in the 1970 census that America's urban majority had skipped town -- the city's demographic hegemony lasted only a brief half century, from Harding to Nixon.

Second, the seemingly alien character of American cities was reinforced in the late 19th and early 20th centuries by the presence of, well, aliens. The era of mass immigration that spanned the years between 1880 and the city's landmark year of 1920 coincided with America's rise as a modern economic power. It was closely associated with the emergence of an urban industrial way of life, and the passing of the American Eden. Indeed, writing in New York at the turn of the 20th century, Jacob Riis, author of How the Other Half Lives (1890), noted that:

". . . one thing you shall vainly ask for in the chief city of America is a distinctively American community. There is none; certainly not among the tenements."

That was precisely the point. More than 25 million immigrants entered the United States between the end of Reconstruction and World War I, and they overwhelmingly settled in the manufacturing metropolises of the northeast and midwest. In the 1880s, over 70 percent came from the familiar regions of northwestern Europe. In the decade before the Great War, only about 17 percent emanated from that source. Most now came from southeastern Europe. English was not their primary language, and they were predominantly Catholic and Jewish -- not Protestant.

Quickly linked to the expansion and intensification of existing urban pathologies, the newcomers were also associated with a host of new ones. A fresh round of coercive moral crusades targeted, not only the saloon, brothel, and gambling den -- three of the more iniquitous institutions then identified with the city's assault on traditional values -- but also lotteries, gambling, pornography, birth control, boxing, horse racing, college football, Sunday baseball, cigarette smoking, card playing, dance halls, amusement parks, vaudeville shows, and movies. Virtually every aspect of modern urban life, particularly in the teeming immigrant quarters, seemed to mock traditional American agrarian values.

Most significant was the rise of the urban political machine and its relationship to the presumed debasement of American democracy. Collectively organized immigrants proved so prominent in this regard -- at least as machine participants, if not always architects -- that they became the focus of a stinging attack in Lord Bryce's The American Commonwealth (1888). It was there that Bryce made his famous comment about city government being the "one conspicuous failure" of the United States.

The distinction between the "old" and the "new" immigration was subsequently cultivated as assiduously by those in the vanguard of scientific inquiry as was the prior division between the "worthy" and "unworthy" poor. The ethnic and religious diversity of industrializing America, moreover, provided a new edge to this debate. Originally constructed in the 1890s by New England and Ivy League theorists interested in restricting immigration through the application of a literacy test, the border separating the "old" immigrants from the "new" had, within a generation, been defined as a racial frontier. When, finally, the door to continued immigration from southeastern Europe was slammed shut by a series of nationality-based quotas in the early 1920s, it was done so on explicitly racist grounds. The scientific justification for isolating and denigrating the bulk of America's Catholic and Jewish immigration was broadcast to a wide audience by popular writers such as Madison Grant, the respected anthropologist of the American Museum of Natural History, who, in The Passing of the Great Race (1916), had no trouble separating European peoples into hierarchically arranged categories called Nordics, Alpines, and Mediterraneans. The inferiority of the last-named, evident in their social characteristics and rooted in their "blood," represented for Grant, indeed for a majority of Americans of the time, a danger that had to be arrested.

Such popular notions were further legitimized by the government of the United States. A detailed study of the immigration problem ordered by the U. S. Senate and conducted under its auspices reported to the American people in 42 data-laden volumes in 1911. The Dillingham Commission took three years, employed a staff of 300, and spent a million dollars to develop a "Dictionary of Races," and to conclude that "the recent immigrants as a whole . . . present a higher percentage of inborn socially inadequate qualities than do the older stock."

Dr. Harry H. Laughlin of the Carnegie Institution, the eugenics expert of the House Committee on Immigration and Naturalization, echoed and extended the Dillingham Commission's findings in a 1922 report that provided the immediate background to the final passage of the National Origins Act of 1924. Employing dubious methodologies and assumptions familiar to readers of the earlier Hoyt and Wines studies, both the Senate and the House research sanctioned popular prejudices and, indeed, granted them the status of conventional wisdom.

The popular attitudes, scholarly literature, and government reports that documented the racial inferiority of the "new" immigrants sparked a series of cultural and religious confrontations that should give pause to today's eager advocates of an American Jihad. The 1920s witnessed not only racially justified immigration restriction, but a revival of the Ku Klux Klan, the imposition of Prohibition, long-standing conflicts in the public schools, and ugly political confrontations that set North against South, city against country, "wets" against "drys," and immigrants against natives. The clash of cultures was perhaps best symbolized by the famous 1925 "monkey trial" in Dayton, Tennessee. Chicago agnostic Clarence Darrow was matched against the former populist and presidential candidate William Jennings Bryan in a legal confrontation over whether or not evolution should be taught in the schools. If the issue could be neatly disposed of in a state such as Delaware, where proposed legislation prohibiting the teaching of theories that humankind evolved from lower animals was buried in the committee on Fish, Game, and Oysters, other states had to struggle bitterly to contain the passions and divisions it evoked.

The "new" immigrants' occupation of distinctive spatial, economic, and social niches in urban America, their very "inbetweenness" (they did not appear black, but they were "different" enough so that they were not perceived as "white") confused traditional conceptions. They constituted, it seemed, a unique peril; one so malignant its roots had to be found in race, and one so novel that a race had to be invented to account for it. The European ethnics' stay, however, in their peculiarly American racial purgatory was relatively brief. Their release came with their acculturation and the maturing of industrial society. In time, they simply became "white."

The rapid assimilation of the ethnics that followed the cutoff of continued immigration severed the link forged between a perceived racial threat and a poverty-stricken urban underclass. The Great Depression of the 1930s similarly severed, at least temporarily, the connection between economic failure and popular explanations couched in individual and moral terms. The massive unemployment of the depression decade did wonders in focusing attention upon the social and political causes of crippling poverty. The new perspective helped usher in a brief season of federal activism that lasted slightly more than a generation -- from the New Deal to the Great Society. The flurry of reform activity that encompassed the middle third of the 20th century succeeded in planting the foundation of a rudimentary welfare state before another round of reaction caused the national government to sound retreat and disengage from urban America.

The very intractability of America's urban problems lent weight to the critiques of those who scoffed at the notion that human agency, through political action, might successfully lighten the burden of city dwellers. Moreover, the renewed linkage between a perceived racial threat and the inner city, and the image of an urban rabble out of control, resurrected a traditional view of American society that eschewed social or systemic analyses of poverty and insisted again on the individual and moral failings of the poor themselves. This time, however, it was not the temporarily exotic descendants of European immigrants, but primarily African Americans who commanded the urban stage.

When Abraham Lincoln signed the Emancipation Proclamation, over 90 percent of an overwhelmingly rural black population lived in the south. Successive waves of migrants deserted the region in the 20th century, largely for the cities of the midwest and northeast, although increasing numbers found urban homes out west and within the south as well. Between 1940 and 1970 alone, nearly 5 million left the rural south, and in the latter year, only 53 percent of all African Americans remained in the region. A majority of blacks had become urbanized by 1950, some 30 years later than whites, but by 1960 their degree of urbanization surpassed that of whites. By 1980 African Americans became the most highly urbanized segment of the American people. The kinship networks that disciplined this migration, the family economies that fueled it, and the geographic mobility of these landless agricultural laborers, revealed more the purposeful resourcefulness of "strivers" facing a slender array of grim choices than the easily assumed "shiftlessness" of the morally deficient. It should also be noted that much the same could be said of the waves of immigrants entering the United States from Mexico and the Caribbean Basin in recent years. Indeed, when combined with Asians, the 7 million immigrants who arrived in the 1970s eclipsed the previous single decade record of 6.3 million new arrivals who entered between 1900 and 1910.

The appearance of the urban ghetto accompanied this demographic revolution. Continued white abandonment of the central cities for suburbia and the flood of racial minorities to our metropolitan cores produced, in the post-World War II era, more rigidly segregated cities. There are two points that need to be made in conjunction with this development. First, it was not the "natural" result of the migrants' own actions or desires, nor the inevitable outcome of a free market economy. Relentless pressures, willful acts of discrimination -- emanating from both the public and private sectors -- and even acts of mob violence dictated minority settlement patterns. Second, an African American mass found itself rooted in the heart of older, decaying industrial cities even as the United States deindustrialized. The advantages of central location, which served the European immigrants well earlier in the century, were lost to the later arrivals. Instead, the disappearance of well-paying manufacturing jobs and the turn to a service-based economy placed those dependent on the public services of financially debilitated cities at a competitive disadvantage.

These difficulties were hardly evident, even to the most prescient observers, at first blush after World War II. Indeed, it is telling that when poverty was rediscovered as a national political issue in the Kennedy administration, it was defined neither by an exclusively urban locus, nor a dominant racial component. Much of the earliest attention focused, not on blacks in Chicago and New York, but on whites in Kentucky and West Virginia. Appalachia provided both inspiration and destination. By the mid-1960s, debates concerning a presumed "culture of poverty" had been renewed and the focus shifted toward urban minorities. The limitations of the ensuing War on Poverty notwithstanding, the Great Society programs of the Johnson administration capped an era of reform that -- if it did not eliminate poverty -- served millions well and contributed mightily to the historical reduction of the proportion of Americans suffering genuine want. Using contemporary definitions of poverty which would understate the shift, we can estimate that 40 percent of the American people fell among the truly needy at the turn of the century. Franklin D. Roosevelt was not terribly inaccurate when he noticed one-third of a nation ill-housed, ill-clothed, and ill-fed during the Great Depression. By the 1970s, though, in the backwash of the Great Society, perhaps no more than 15 percent of all Americans fell below the poverty line.

Such real accomplishments carried little weight in the shifting political climate that followed the riots of the 1960s. Moreover, the increasing visibility and spatial isolation of poor urban minorities combined with the intellectual weakness of a spent liberalism and new demographic realities -- the hegemony of the suburbs and the Sunbelt -- to facilitate reaction. A reinvigorated conservatism seized the initiative and refocused public debate.

Two books, Charles Murray's Losing Ground (1984) and Lawrence Mead's Beyond Entitlement: The Social Obligations of Citizenship (1986), captured both the mood and the thrust of the Reagan years while following in the tradition of Charles Hoyt, Frederic Howard Wines, the Dillingham Commission, and Harry Laughlin. Murray not only breathed fresh life into the image of the undeserving poor, but he coupled it to a critique of the social programs of the previous generation. In caricaturing the former, he proceeded, as a number of critics have shown, from dubious factual assertions to even more questionable conclusions. In lambasting the latter, he merely misrepresented or ignored the real achievements of social reform -- the decline in poverty among the elderly, increased availability of medical care, the drop in infant mortality, and the near abolition of real hunger before the 1980s. Nowhere, as he heaped scorn and blame on the social programs of the preceding generation, did he display the slightest awareness that the images and rhetoric he employed antedated him by a century or more.

Mead represented a different strain of conservative thought and, in a more secular age, he substituted the state for the church in proposing a succession of coercive measures designed to elicit properly responsible and deferential conduct from the poor. Cyclical recessions, deindustrialization, and the novel demands of a post-industrial economy -- not to mention the continued practice of racial discrimination, let alone the legacies of uprooted past practices -- were airily dismissed, as they were by Murray. "Unemployment," he declared, "has more to do with functioning problems of the jobless themselves than with economic conditions." That argument was somewhat easier to make in the middle 1980s than in the early 1990s. Mead called for an authoritarian state to impose its will on a presumed recalcitrant few. Low wage work would be "mandated" by the state, just as a military draft filled army ranks in days past, and government coercion would be instrumental in socializing the poor to "mainstream" values. If it was Murray's contention that government had tried too much, it was Mead's that it needed to do more. It is not a giant step, then, from Mead's prescriptions for state action to those contemplated by the sponsors of the federal "violence initiative" mentioned at the outset.

Where does all of this leave us? As for the inner-city residents themselves, for the so-called "underclass," there remain warring assessments of what must be done either for them or to them. In late 19th-century New York, Jacob Riis described "a great meeting . . . of all denominations of religious faith, to discuss the question [of] how to lay hold of these teeming masses . . . with Christian influences, to which they are now too often strangers." Seen as an absolutely necessary effort, one sympathetic observer raised a nonetheless disturbing question. "How," he asked, "shall the love of God be understood by those who have been nurtured in sight only of the greed of man?" In the 1990s, following a decade that even conservative analyst Kevin Phillips described as a second Gilded Age, we might profitably begin by asking the same question.