Source: The Meanings of Mobility
The Family in America
March 03, 2012
When President Obama delivered an address at Osawatomie, Kansas, in December 2011, he credited the values of hard work and responsibility with helping America overcome the Great Depression at home in the 1930s and fascism abroad in the 1940s. After those triumphs, he contended, the same dispositions “gave rise to the largest middle class and the strongest economy that the world has ever known.” Today, however, “the basic bargain that made this country great has eroded. Long before the recession hit, hard work stopped paying off for too many people.” We confront, as a result, “a make-or-break moment for the middle class, and for all those who are fighting to get into the middle class.” At stake, the president declared, is whether “working people can earn enough to raise a family, build a modest savings, own a home, secure their retirement.”
One doesn’t rise far in our democratic politics, certainly not all the way to the White House, without a talent for discerning and addressing citizens’ worries. President Obama correctly pointed out that widespread fears of falling, combined with fatalistic doubts about the prospects of rising, are much older than the 2008 financial crisis and ensuing recession. A 1989 article by Nicholas Lemann reported that in Naperville, an upper middle-class suburb west of Chicago, “the word ‘stress’ came up constantly in conversations. People felt that they had to work harder than people a generation ago in order to have a good middle-class life.” Lemann connects this sentiment to an idea that “holds sway” in “much of the rest of the country”: The “middle class is downwardly mobile and its members will never live as well as their parents did. . . . The feeling is that anyone who becomes prosperous has beaten the odds.”
The president also correctly reflected the popular belief that the three decades after the end of World War II were a golden age for the American economy. As Lemann wrote, “Suburbanites of the fifties were confident of a constantly rising standard of living, level of education, and gross national product in a way that most Americans haven’t been since about the time of the 1973 OPEC [oil] embargo.” Surprisingly, there’s now a broad consensus that the postwar era—which artists, writers, and academics derided at the time for being intolerant and inane—was the Good Years. Those on the political left are nostalgic for strong labor unions, the “Great Compression” of income differences between the rich and the rest, and high marginal-tax rates on the most affluent. Conservatives fondly recall those years as the time when economic circumstances permitted, and cultural norms encouraged, men and women to get and stay married, and support a family on the father’s income while the mother raised their children and managed the household.
The Good Old Days in Perspective
There are problems with these recollections, however. For one thing, America’s postwar economic boom rested on a set of unprecedented economic circumstances, ones that no conceivable mix of public policies will recreate. As the only major industrial nation to escape World War II without physical devastation or civilian casualties, America enjoyed remarkable preeminence after 1945. With “7 percent of the world’s population in the late 1940s, America possessed 42 percent of the world’s income and accounted for half of the world’s manufacturing output. American workers produced 57 percent of the planet’s steel, 43 percent of electricity, 62 percent of oil, 80 percent of automobiles,” according to historian James Patterson’s book on the postwar era.
These advantages, though enormous, were not eternal. By the 1970s, America was competing against former industrial nations that had rebuilt themselves, countries since joined by others that, until recently, had offered little to world markets besides raw materials. The effortlessness of the postwar boom—the feeling that American prosperity had only to be administered, not continually replenished—had been lost.
Some commentators, left and right, describe the boom as sui generis. Jefferson Cowie, a historian sympathetic to the labor movement, writes, “Perhaps one of the primary interpretive problems of working-class history was that the baseline of comparison had too often been the extraordinary postwar period.” Michael Barone, the conservatively inclined political analyst, emphasizes the sociological uniqueness of an era in which 16 million Americans, more than a tenth of the population, had recently served in the military. As a result,
Americans in the Midcentury Moment were unusually conformist, content to be very small cogs in very large machines: They married and bore children at record rates for an advanced society; they worked as organization men and flocked to mass-produced suburbs; they worshipped in seemingly interchangeable churches. This was an America that celebrated the average, the normal, the regular. . . . The huge menu of lifestyle choices from which we can choose today was a very short menu with very few choices then.
Another complication: the Bad Years that Americans have endured since the Good Years drew to a close in the 1970s have not been all that bad from an economic standpoint, according to a sizeable amount of evidence. A study by the Pew Charitable Trusts’ “Economic Mobility Project” followed a sample of children born in the 1950s and 1960s for three decades, finding that “67 percent of Americans who were children in 1968 had higher levels of real [i.e., inflation-adjusted] income in 1995–2002 than their parents had in 1967–1971.” Children who grew up with less were especially likely to wind up with more: 82 percent of those raised in families in the bottom quintile of the 1968 income distribution had higher incomes by the turn of the century than their parents had lived on, as did 74 percent of those raised in the second quintile.
Furthermore, “the average family size for adults in their 30s shrank from 4.5 to 3.2 persons between 1969 and 1998.” The commonly used adjustment for comparing incomes among households of different sizes divides income by the square root of the number of people in the household. Thus, a household of one with an income of $50,000, a household of two with an income of $70,711, a household of three with an income of $86,603, and a household of four with an income of $100,000 are all treated, for comparative purposes, as having incomes of $50,000. Making that further correction raises the percentage of Americans who had a higher income by 2000 than their parents did some 30 years earlier from 67 percent to 81 percent. (This square root correction for the number of people living in a household is an Aristotelian mean between ignoring household size, which leaves us saying that a family of five living on $50,000 a year is no worse off financially than a single person with that income, and over-correcting by using per-capita incomes as the yardstick, which would mean that a family of five needs $250,000 to live at the same level as a household of one making $50,000. This adjustment, being curved rather than linear, helpfully reflects increasing economies of scale in domestic life. Other things being equal, married couples generally live in larger homes than single people, for example, but not ones twice as large, even as families of six generally live in larger homes than childless couples, but not ones three times as large.)
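The square-root adjustment described above is simple enough to sketch in code (a hypothetical illustration in Python; the function name is ours, not the Pew study's):

```python
import math

def equivalized_income(income, household_size):
    """Divide household income by the square root of household size,
    the adjustment described above for comparing households of
    different sizes."""
    return income / math.sqrt(household_size)

# The article's examples: each of these households is treated,
# for comparative purposes, as having an income of $50,000.
for income, size in [(50_000, 1), (70_711, 2), (86_603, 3), (100_000, 4)]:
    print(f"{size} person(s), ${income:,}: ${equivalized_income(income, size):,.0f}")
```

Because the divisor grows with the square root rather than linearly, the scale sits between ignoring household size altogether and dividing by the full head count, which is exactly the "Aristotelian mean" the parenthetical describes.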
Other studies tell a similar story. The economists Bruce Meyer and James Sullivan found “evidence of considerable improvement in material well-being for both the middle class and the poor” since 1980. In 1981, for example, only 27 percent of households in the middle quintile of the income distribution lived in homes with central air conditioning; 67 percent did so in 2009. Between 1989 and 2009, the size of the houses or apartments middle-quintile households inhabited increased by 20 percent, measured in square feet and adjusted for the number of people in the household. Even when the focus shifts to households in the bottom quintile of the income distribution, we find that 54 percent lived in homes with central air conditioning by 2009, twice the percentage of middle-quintile households who enjoyed that amenity in 1981. Those homes in 2009 were only 2 percent smaller, on average, than the ones middle-quintile households had lived in twenty years earlier, and 18 percent larger than the ones households in the bottom quintile of the income distribution had inhabited in 1989. Two-fifths of households in the bottom quintile of the income distribution lived in a home with a dishwasher, three-quarters owned an automobile, and 22 percent owned more than one. Overall, Meyer and Sullivan’s data show that Americans’ median household income was more than half-again as high in 2009 as it had been in 1980.
The Impact of Mothers in the Labor Force
If Americans are doing as well as these numbers suggest, why do we feel as if we’re struggling to get by? Part of the explanation for the post-postwar boom’s discontents is that modern middle-class Americans are convinced they have less of one thing than their parents did: time. The Naperville residents Lemann interviewed “talk as if the slack had been taken out of life. They complain that between working long hours, traveling on business, and trying to stay in shape they have no free time.”
This “under-the-gun” feeling was closely connected, Lemann argued, to the widespread participation of mothers in the work force, a fairly recent development in 1989 but a more established feature of American life by the early twenty-first century. The higher incomes and bigger houses have been purchased, in other words, at the price of domestic tranquility and family cohesion. In 1970, according to the Bureau of Labor Statistics, only 30.3 percent of married women with children under the age of six held jobs, whether full-time or part-time, outside the home. By 1990, that proportion had nearly doubled, to 58.9 percent, and it has stayed around 60 percent since. For married women with school-aged children (those between the ages of 6 and 17), 49.2 percent held jobs in 1970, a proportion that increased to 73.6 percent by 1990, and rose above 75 percent by 2000, where it has stayed.
The most doleful interpretation of these developments is that millions of mothers have been forced to take jobs as the only way to offset their husbands’ declining economic prospects. Two paychecks are better than one, not only because they buy more goods during boom times but also because the odds that both earners will be laid off are longer than the odds of a single layoff, a hedge against destitution during hard times. A recent study by the Brookings Institution economists Michael Greenstone and Adam Looney showed that inflation-adjusted earnings of American men between the ages of 30 and 50 were 27 percent lower in 2009 than in 1969. The decline is the effect of two causes: diminished earning power and diminished labor-force participation. Men in their 30s and 40s with full-time jobs earned 5 percent less in 2009 than their counterparts in 1969. (The decline correlates, inversely and strongly, with educational attainment: Men who didn’t finish high school earned 38 percent less from their full-time jobs in 2009 than such men had in 1969, while men with college degrees and full-time employment earned only 2 percent less.) Greenstone and Looney also found that men between the ages of 30 and 50—the vast majority of whom were too old to be students and too young to be retirees—suffered a 15.5-percent decline from 1969 to 2009 in the likelihood that they would hold full-time jobs, a decrease that corresponds to a 5.3-percent increase in the number of men working part-time, plus a 10.2-percent increase in the number not working at all.
Are more American women working because men’s earnings have diminished, or have men’s earnings diminished because more women are working? There are plausible arguments for each conclusion about the predominant cause-and-effect flow. If men’s declining economic prospects result from changes in the global and national economy, women probably feel compelled to enter the labor force in high numbers out of necessity. If they’re married, they probably are making up for the money their husbands don’t earn; if they’re single, they are likely strengthening their résumés to prepare for the eventuality of providing some of their household income when they do marry, and all of it if they don’t.
On the other hand, if more women are entering the labor force because of changing expectations, theirs and society’s, about the kind of lives they should lead, then the supply-and-demand forces within the labor market mean that American men today face more competition for jobs than they did fifty years ago. This makes it harder for men to get jobs, including better and more lucrative ones, and leaves them with less leverage to bargain for compensation packages. As manufacturing and agricultural employment has declined, the number of office jobs that entail indoor work and no heavy lifting has grown, meaning that more men are competing against women for jobs where men’s natural advantages in physical size, strength, and stamina are irrelevant.
Scott Winship, the former research manager for Pew’s Economic Mobility Project, rejects the “gloomy narrative” spun by “economic pessimists” who claim that “household incomes have kept pace only because wives have been forced into work to make up for the shrinking bacon their husbands bring home.” He cites “the long-term trend of women’s obtaining more education in industrialized nations around the world, presumably with an intention to put it to use in the work force someday.” Furthermore, “employment grew more among the wives of better-educated men than among the wives of less-educated men.”
If women were taking jobs mostly as a matter of necessity, one would expect that the wives of men with lower earning prospects would be more likely to enter the work force than women married to husbands with greater earning power. That the opposite is true suggests that women choose to take jobs, the opportunity cost of being full-time mothers being higher for women with more education than for those with less. In 1970, 13.5 percent of American men ages twenty-five and older were college graduates, compared to 8.1 percent of women. By 2010, the proportions were nearly identical: 30.3 percent of men compared to 29.6 percent of women. (Because, largely as a result of longer life expectancies, women outnumber men in the adult population—103.6 million in 2010, compared to 96.3 million men—the number of women who have completed college exceeds the number of men.) Women are certain to constitute an increasing proportion of college-educated Americans for many years to come. In the 1970 census, men accounted for the majority—58.8 percent—of college students. By 1980, women were the majority and have been ever since. Of the 11.6 million Americans between the ages of 18 and 24 who were enrolled in college in 2009, 54.3 percent were female.
Absolute v. Relative Mobility
Leaving aside the complex, contentious question of whether the ubiquity of the two-income household is primarily an economic or sociological phenomenon, there is another reason why growing incomes correspond to growing rather than diminishing dissatisfaction: We have more than our parents did, but expect much more. As a result, Lemann argues, the home movies in Baby Boomers’ heads about middle-class Americans’ lives during the postwar boom years exaggerate their prosperity: “A ranch-style tract house, a Chevrolet, and meat loaf for dinner will not do any more as the symbols of a realized dream.”
Widespread misperceptions about our circumstances and prospects result from comparing ourselves to our contemporaries, and from making invalid comparisons to our parents and grandparents. As Time recently argued, just as “we don’t feel grateful to have indoor plumbing or multichannel digital cable television, we don’t necessarily feel grateful that we earn more than our parents did.” Increasing the ability to afford more, better, and bigger consumption items will never address those feelings of economic insecurity and even deprivation, since “our sense of well-being is tied not to the past but to how we are doing compared with our peers.”
For this reason, researchers such as those with the Pew Economic Mobility Project distinguish between absolute and relative mobility. Absolute mobility concerns having a higher income than one’s parents. Relative mobility, according to Winship, is about “whether those whose parents were at the bottom or at the top relative to Americans as a whole end up in the same place in adulthood.” Indeed, the Pew research suggests that relative mobility in America is more attenuated than absolute mobility. The data in the following table show the distribution of households among the quintiles of the income distribution at the turn of the century according to the income-distribution quintiles those adults had been raised in thirty years earlier:
Adult-Children Income Distribution (Percentages)
In a mobile, fluid society, 20 percent of the children raised in each quintile of the income distribution would wind up, as adults, in each quintile of the income distribution. Quintiles can be vast in a nation as populous and diverse as the United States—in 2011, each one had a population in excess of 62.5 million, more people than live in the United Kingdom—and people might cluster near a quintile’s top or bottom depending on their childhood circumstances. A zealot’s definition of perfect intergenerational mobility would preclude this possibility by stipulating that 1 percent of the children raised in the top percentile of the income distribution wind up in the top percentile as adults, 1 percent wind up in the second percentile, and so forth throughout each percentile of the income distribution, past and present. The deck gets reshuffled, that is to say, constantly and completely. In an absolutely stratified society, by contrast, every child raised in the top percentile of the income distribution would inhabit that percentile as an adult, and none of the children raised in the other 99 percentiles would take even one small step up or down the ladder in their adult lives.
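The two polar cases described above can be written as transition matrices whose rows give the probability of a child from a given stratum ending up in each stratum as an adult (a minimal sketch in Python, using quintiles rather than percentiles for brevity; the variable names are ours):

```python
n = 5  # quintiles, as in the Pew table

# Perfect mobility: a child's adult quintile is independent of the one
# he or she was raised in, so every row of the transition matrix is uniform.
perfect_mobility = [[1 / n] * n for _ in range(n)]

# Absolute stratification: every child ends up in the parental quintile,
# so the transition matrix is the identity.
absolute_stratification = [[1.0 if i == j else 0.0 for j in range(n)]
                           for i in range(n)]

# In either society, each row sums to 1: every child ends up somewhere.
for matrix in (perfect_mobility, absolute_stratification):
    assert all(abs(sum(row) - 1.0) < 1e-9 for row in matrix)
```

A real society's table, like Pew's, falls somewhere between these two extremes, which is what the following paragraphs measure.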
In a highly mobile society, each quintile of the income distribution would look like the middle quintile of the Pew grid. Reading vertically, we see that approximately one fifth of the children reared in the middle quintile of the late-1960s income distribution wound up in each of the quintiles of the income distribution thirty years later. (There’s a slight downward drift—41 percent of middle-quintile children wound up in the bottom two quintiles as adults, while 36 percent wound up in the top two—the significance of which only better statisticians than I can assess.) Reading the Q3 row horizontally, we see that the middle quintile of the turn-of-the-century income distribution is made up, in roughly equal measure, of adults who grew up in each of the income quintiles thirty years previously. (Again, evidence suggests that children will more likely ascend to the middle than descend to it, with 43 percent of those in the 1995–2002 middle quintile coming from the bottom two quintiles in 1967–71, compared to the 33 percent that grew up in the top two quintiles.)
But the income distributions’ middle quintiles, past and present, are not representative of the entirety. The corners of the table tell a very different story, as 62 percent of those who grew up in the top quintile of the income distribution some 40 years ago were living, three decades later, in one of the top two quintiles, as were 58 percent of those reared in the fourth quintile of the income distribution. Reading horizontally, we see that only 35 percent of those who wound up in the top quintile of the income distribution started out south of the sixtieth percentile. By contrast, nearly two-thirds (65 percent) of those who grew up in the bottom quintile of the income distribution wound up in either it or the one just above it, while only one-sixth (17 percent) rose to the highest or next-highest quintile of the income distribution. Downward mobility, of the sort captured in many novelists’ published works and tax returns, does remain a feature of American life: Nearly a fourth of those raised in the top two quintiles of the income distribution found their way to the bottom two in adulthood.
Winship cautions that trying to make sense of these insights by comparing America’s economic mobility to other countries’ is “surprisingly tricky.” Some data needed for those comparisons simply don’t exist, and the rest can be rendered useable for valid comparisons only with considerable difficulty. (The sort of longitudinal data that would confirm, non-anecdotally, whether America is more or less stratified than it was fifty or a hundred years ago is also, apparently, unobtainable.) Nonetheless, it appears that in America, whose citizens emphasize opportunity and mobility when characterizing what makes their nation distinctively admirable, childhood circumstances are more determinative of life prospects than they are in other modern nations. With respect to the nation closest to the United States, geographically and sociologically, Winship writes, “In Canada, a boy whose father earns twice as much as his friend’s dad can expect to have about 25 percent more in earnings as an adult than his friend. In the United States, he’ll have on average 60 percent more.”
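Winship's Canada-versus-United States comparison can be translated into the intergenerational earnings elasticity that economists conventionally report, assuming the standard log-linear model relating a father's earnings to his son's (a back-of-the-envelope sketch; the function name is ours):

```python
import math

def implied_elasticity(son_premium, father_ratio=2.0):
    """Back out the elasticity beta implied by the model
    log(son's earnings) = a + beta * log(father's earnings).
    Under that model, a father earning `father_ratio` times more
    implies a son earning father_ratio**beta times more,
    i.e. (1 + son_premium)."""
    return math.log(1 + son_premium) / math.log(father_ratio)

# Winship's figures: a 25 percent premium in Canada, 60 percent in the U.S.
print(f"Canada: beta is roughly {implied_elasticity(0.25):.2f}")         # ~0.32
print(f"United States: beta is roughly {implied_elasticity(0.60):.2f}")  # ~0.68
```

The higher implied elasticity for the United States is just another way of stating the paragraph's point: childhood circumstances are more determinative of adult earnings here than in Canada.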
As our polity addresses widespread economic anxieties and frustrations, it will matter whether we emphasize absolute or relative mobility. In theory, a high degree of absolute mobility is compatible with negligible relative mobility. We could, that is, have a society where it was routine to have a significantly larger income than one’s parents, thanks to a growing economy, but unusual to wind up in a different precinct of the income distribution than they did. Conversely, we can imagine a society with lots of relative mobility but very little economic progress. Childhood economic circumstances would correlate weakly to adult economic standing, but people in each tenth of the income distribution would be living only about as prosperously as people in that tenth lived a decade or generation ago.
Growing Compensation Differentials Everywhere
Who says we can’t have it all? A society with a high degree of absolute and relative mobility, being economically dynamic and socio-economically fluid, is also conceivable. The practical impediments to attaining it are formidable, however. In modern societies, augmenting wealth is closely associated with augmenting economic inequality. The watchword of the age is William James’s maxim: “There is very little difference between one man and another, but what there is is very important.” Increased competition makes it valuable to hire, or be, the best performer rather than the second- or tenth-best. The economist Robert H. Frank points out that many people deplored the news that salaries for presidents of some of America’s biggest and most famous colleges recently surpassed $1 million. The real surprise, he says, is that they’re not much higher. A college’s president is its chief fundraiser, and hiring an exceptional president rather than one who’s merely very good could mean tens of millions more endowment dollars.
Similarly, a prospective CEO whose decisions would boost revenues 1 percent higher than those made by an alternative candidate would be worth an extra $100 million per year to a corporation with $10 billion in revenues. The steady increase in economic inequality attributable to the growing compensation differentials between people performing at different levels, Frank writes, is not just a society-wide phenomenon, but also one that has been “replicated for virtually every subgroup that’s been studied. It holds for dentists, real-estate agents, authors, attorneys, newspaper columnists, musicians, and plastic surgeons. It holds for electrical engineers and English majors.”
The studies showing more Americans counting their higher incomes in their centrally air-conditioned homes suggest that greater competition and larger rewards to those who help colleges, corporations, or other enterprises come out ahead have increased absolute mobility. The pie has grown, and even those who have narrower slices than their parents wind up with more on their plates.
A trend is not a law of nature, however, so absolute mobility won’t necessarily keep increasing as steadily intensifying competition and growing premiums paid to the best competitors expand the Gross Domestic Product. For one thing, there will be no next phase, providing third and fourth paychecks, to the shift that has turned two-income households from the exception to the rule in American life. For another, while those determined to arrive at or stay in the top slots may find greater competition rewarding, psychologically and materially, many of those in the broad middle of the income distribution feel beleaguered. Constant competition presents itself to them as a threat to fall back rather than an opportunity to advance.
The growing differentiation of rewards based on each individual’s contribution to some collective endeavor is more clearly detrimental to relative economic mobility. “When the rungs on the ladder are further apart, it’s harder to climb up them,” says Isabel Sawhill of the Brookings Institution’s Center on Children and Families. It’s not clear how much Americans care about doing better, relative to their contemporaries, if the entire ladder stands on higher ground than it did in the past. Winship cites Pew polling data showing 82 percent of Americans ascribing greater importance to financial stability than “moving up the income ladder.” On the other hand, research by behavioral economists shows that Americans harbor complex feelings about mobility, and are willing to entertain tradeoffs between the two types. “Many people would prefer earning $70,000 per year in an organization where most people earn $60,000,” in Cass Sunstein’s summary of several studies, “to earning $80,000 in an organization where most people earn $90,000.”
Americans are not conflicted, according to the Pew surveys, about deploring the growth of a lower caste whose hopes for breaking into the middle or upper parts of the income distribution are increasingly remote. A majority of Americans, including half of self-identified conservatives, consider it a “major problem.” That problem is hard to define, however, much less solve. David Brooks of the New York Times recently argued that “much more important” than the inequality between the top economic percentile and the other 99 percent, Occupy Wall Street’s obsession, is the inequality “between those with a college degree and those without.” This difference has economic consequences—the average income premium for those with college degrees over those whose education ended with high school graduation doubled from 38 percent in 1979 to 75 percent today—but seems to have a sociological rather than an educational basis:
In the 1970s, high school and college grads had very similar family structures. Today, college grads are much more likely to get married, they are much less likely to get divorced and they are much, much less likely to have a child out of wedlock.
Specifically, in the late 1990s, just 11 percent of American women with a bachelor’s degree saw their first marriage end in divorce before celebrating their tenth wedding anniversary. For women without a four-year college degree, the figure was about 37 percent. Likewise, among all births from 2006 to 2008 to women with at least a bachelor’s degree, only 6 percent were to unmarried mothers. Among women who had finished high school but not gone on to earn a four-year degree the figure was 44 percent, while it was 54 percent for women who had not finished high school.
The differences in personal circumstances linked to differences in education extend to less profound but still revealing areas of life. In 2009, according to the National Center for Health Statistics, only 5.7 percent of college graduates smoked cigarettes, compared to 24.3 percent of those who completed high school but went no further, the same percentage as for those who dropped out of high school. One third of adults with a high-school degree or less were obese, while just one fifth of those with a bachelor’s degree or more were.
The lecture on the consequences of reckless, shortsighted, and self-indulgent conduct writes itself. Good things happen—not always, but reliably—to those who assimilate capitalism’s ethos of deferred gratification: Study and work hard, get as much education as you can acquire and use, get and stay married, and avoid the kinds of mistakes or dissipations that can unravel your life prospects, suddenly or inexorably. The return on those investments will be handsome, and the penalties for ignoring this tried-and-true advice are at least equally certain.
The logic of deferred gratification can run in the other direction, however. After all, it’s neither a brief for asceticism nor an argument that virtue is its own reward. Instead, deferring gratification is a wager that a regimen of self-discipline now will make us healthy, wealthy, and wise in the future. It is, in other words, pro-gratification. People who, confronting dwindling opportunities for the modestly talented, conclude that years or decades of disciplined self-denial will never lead them even to the outskirts of the American Dream can infer a sanction to go ahead and get their gratifications now. Deferring, to them, means forswearing.
Advantages of the Marriage Script
The pressing, intricate question, then, is how much of the curtailment of mobility and growth of economic anxiety we should ascribe to sociological causes having economic effects, and how much to economic causes having sociological effects. Are some people stuck near the bottom of the ladder because they don’t apply themselves to meeting the challenges of education, career, marriage, and family, or do they give up on those tests because they’re stuck at the bottom of the ladder? Either way, we’re looking at an ominous combination. At the same time that economic rewards are growing for top performers and challenges are growing for marginal performers, childhood circumstances—including the marital status of one’s parents—shape adult-life prospects. College graduates “have become good at passing down advantages to their children,” Brooks writes. The corollary is that Americans with less education pass disadvantages on to theirs.
There are three possible explanations for the durable impact of childhood socio-economic status. First, high-status parents can use their social capital and connections to game the system. Generous alumni, for example, might secure a legacy admission for their academically marginal child to a very selective college. Prosperous parents are also likely to have the savvy and wherewithal to get their children tutors and test-preparation coaches, and to underwrite foregone earnings while young Amanda and Jason make connections and acquire skills in unpaid internships. Parents of modest means, by contrast, have fewer cards to play and a more fragmentary understanding of the game’s unwritten rules.
Second, high-status parents inculcate habits and attitudes—such as hard work, attention to detail, figuring out whose favor to curry and how—that make economic success considerably more likely. Young strivers marry other strivers, whom they met while attending striver colleges, or later on in striver professional and graduate schools, or still later in striver law firms, medical practices, or investment banks. These striver couples go on to raise their little strivers to run the same obstacle course with even greater determination. If, the Atlantic’s Megan McArdle worries, high-quintile parents are very good at imparting “a strong education and an absolutely ferocious work ethic,” then we could be witnessing the founding of a new “aristocracy” that perpetuates and justifies itself by bequeathing “the actual skills required to earn more money than everyone else.” The corresponding dilemma is that those in the lower part of the income distribution, some of whom got there by enjoying life today and worrying about tomorrow when it comes, are likely to impart the same outlook to the children they rear.
Finally, to the extent native intelligence is both determinative and heritable, we face the prospect of increasingly stark, durable “cognitive stratification,” the late Richard Herrnstein and Charles Murray’s term for a world where the smartest people get the biggest paychecks. An article by Murray that appeared after The Bell Curve was published examined a subset of the National Longitudinal Study of Youth (NLSY) on which much of that book’s analysis had been based. The subset consisted of NLSY subjects who had at least one full biological sibling, raised in the same home, also in the study. Murray found that raw cognitive horsepower, measured by IQ tests, explained more of the variation in income than did environmental factors like family socio-economic status. So, assume that I, of average IQ, grow up in the same home with the same parents as my high-IQ sister and my low-IQ brother. Our incomes as adults are likely to be bunched more closely together than the incomes of three unrelated children raised in three different homes but with IQs identical to my siblings’ and mine—but only a little more closely. My lifetime-earnings trajectory, in other words, is likely to be quite similar to that of a contemporary not related to me, raised in different circumstances, but with the same IQ as mine, and to look very different from those of my sister and brother, who were raised in the same home as I was but with very different IQs.
A system where inequality persists and grows because some parents are more discerning, more aggressive, and better equipped to advance their children’s careers than others seems like the easiest problem to correct. New, more transparent and even-handed rules may indeed make the competition fairer. Even then, however, the new rules will have their own complexities and offer their own shortcuts. It will be a surprise if the parents who figure out how to game the new system are a very different crowd from the ones who know how to game the old one.
Counteracting the other two tendencies looks much, much harder. A society where strivers impart the Striver Code to their little strivers, and slackers raise little slackers to observe the Slacker Way, is going to have predictable socio-economic outcomes. Imagine an egalitarian dystopia in which all children are adopted, assigned randomly to parents at birth to prevent genetically heritable advantages and disadvantages from compounding over time. Even there, strivers’ children, though not measurably smarter than any other group of children, would have an advantage over the children raised by slackers, and that advantage would grow in the second and third generations.
Now jettison that thought experiment and add in the real-world plausibility that cognitive firepower is, at least to some non-trivial extent, a heritable trait. Striver parents will be raising children who, as a group, are brighter than a random sample of their peers, and slacker parents raising children who, as a group, are of less than average intelligence. Nature and nurture will be interacting to promote socio-economic stratification. If full siblings raised in the same home by the same parents have big differences in the trajectories of their lives attributable to IQ differences, it’s impossible to see how government programs like Head Start are going to make a dent in long-term income inequalities. Unless America is prepared to become a kibbutz of 312 million people, the economic equalizing effected by social programs on a large scale will be a tiny fraction of the modest equalizing that devoted parents raising differently endowed children can bring about.
Americans are comfortable with the bromide about favoring equality of opportunity while opposing equality of result. Following or even understanding that rule turns out to be challenging, however, since one generation’s results play a significant role in defining the next generation’s opportunities. Changes in economic mobility, inequality, and security have had large, widely discussed consequences for the American family. It turns out that the most basic functions of the family—transmitting genetic traits and cultural norms—have a large, disquieting impact on inequality and mobility.
We face, then, a widening gyre, a growing and ominous divergence between a virtuous cycle among the affluent and a vicious cycle among the precarious. For the former, stable families encourage children to pursue educational and career opportunities. Those who do so are increasingly likely to establish strong, stable families of their own. For the latter, family breakdown increases the likelihood that children will leave the education system early, which curtails their earning power and increases the likelihood of family instability that harms the next generation in turn. A necessary condition for avoiding a rigidly stratified society, or irresistible political pressures to address that grim reality with massive Rawlsian redistribution, is to reinvigorate the institution of marriage throughout the entire socioeconomic structure. The alternative to recapturing the social patterns and standards that characterized America as recently as the 1970s is to acquiesce in becoming a nation where stable, two-parent families are the norm in the top two quintiles and the exception everywhere else.