How the American college went from pitiful to powerful
By David Labaree, Edited by Sam Dresser
From the perspective of 19th-century visitors to the United States, the country’s system of higher education was a joke. It wasn’t even a system, just a random assortment of institutions claiming to be colleges that were scattered around the countryside. Underfunded, academically underwhelming, located in small towns along the frontier, and lacking in compelling social function, the system seemed destined for obscurity. But by the second half of the 20th century, it had assumed a dominant position in the world market in higher education. Compared with peer institutions in other countries, it came to accumulate greater wealth, produce more scholarship, win more Nobel prizes, and attract a larger proportion of talented students and faculty. U.S. universities dominate global rankings.
How did this remarkable transformation come about? The characteristics of the system that seemed to be disadvantages in the 19th century turned out to be advantages in the 20th. Its modest state funding, dependence on students, populist aura, and obsession with football gave it a degree of autonomy that has allowed it to stand astride the academic world.
The system emerged under trying circumstances early in U.S. history, when the state was weak, the market strong, and the church divided. Lacking the strong support of church and state, which had fostered the growth of the first universities in medieval Europe, the first U.S. colleges had to rely largely on support from local elites and tuition-paying student consumers. They came into being with the grant of a corporate charter from state government, but this only authorized these institutions. It didn’t fund them.
The rationale for starting a college in the 19th century usually had less to do with promoting higher learning than with pursuing profit. For most of U.S. history, the primary source of wealth was land, but in a country with a lot more land than buyers, the challenge for speculators was how to convince people to buy their land rather than one of the many other available options. (George Washington, for instance, accumulated some 50,000 acres in the western territories, and spent much of his life unsuccessfully trying to monetize his holdings.) The situation became even more desperate in the mid-19th century, when the federal government started giving away land to homesteaders. One answer to this problem was to show that the land was not just another plot in a dusty agricultural village but prime real estate in an emerging cultural center. And nothing said culture like a college. Speculators would “donate” land for a college, gain a state charter, and then sell the land around it at a premium, much like developers today who build a golf course and then charge a high price for the houses that front onto it.
Of course, chartering a college is not the same as actually creating a functioning institution. So speculators typically sought to affiliate their emergent college with a religious denomination, which offered several advantages. One was that it segmented the market. A Presbyterian college would be more attractive to Presbyterian consumers than the Methodist college in the next town. Another was staffing. Until the late-19th century, nearly all presidents and most faculty at U.S. colleges were clergymen, who were particularly attractive to college founders for two reasons. They were reasonably well-educated, and they were willing to work cheap. A third advantage was that the church just might be induced to contribute a little money from time to time to support its struggling offspring.
Often the motives of profit and faith converged in the same person, producing a distinctive American character – the clergyman-speculator. J.B. Grinnell was a Congregational minister who left the church he founded in Washington, D.C., to establish a town out west as a speculative investment. In 1854 he settled on a location in Iowa, named the town Grinnell, gained a charter for a college, and started selling land for $1.62 an acre. Instead of organizing a college from scratch, he convinced Iowa College to move from Davenport and assume the name Grinnell College.
This process of college development helps to explain a lot of things about the emergent form of the U.S. higher-education system in the 19th century. Less than a quarter of the colleges were in the strip of land along the eastern seaboard where most Americans lived. More than half were in the Midwest and Southwest: the sparsely populated frontier. If your aim is to attract a lot of students, this was not a great business plan, but it was useful in attracting settlers. The frontier location also helps to explain the nominal church support for the colleges. In the competitive U.S. setting where no church was dominant, it was each denomination for itself, so everyone wanted to plant the denominational flag in the new territories for fear of ceding the terrain to the opposition. Together, land speculation and sectarian competition help to explain why, by 1880, Ohio had 37 colleges – and France just 16.
The sheer number of such college foundings was remarkable. In 1790, at the start of the first decade of the new republic, the U.S. already had 19 institutions called colleges or universities. The numbers grew gradually in the first three decades, rising to 50 by 1830, and then started accelerating. By the 1850s the total had reached 250; it more than doubled in the following decade (563), and in 1880 stood at 811. The growth in colleges vastly exceeded the growth in population, with a total of five colleges per million people in 1790, rising to 16 per million in 1880. In that year, the U.S. had five times as many colleges as the entire continent of Europe. This was the most overbuilt system of higher education the world had ever seen.
Of course, as European visitors liked to point out, it was a stretch to call most of these colleges institutions of higher learning. For starters, they were small. In 1880, the average college boasted 131 students and 10 faculty members, granting only 17 degrees a year. Most were located far from centers of culture and refinement. Faculty were preachers rather than scholars, and students were whoever was willing to pay tuition for a degree whose market value was questionable. Most graduates joined the clergy or other professions that were readily accessible without a college degree.
On the East Coast, a small number of colleges – Harvard, Yale, Princeton, William and Mary – drew students from families of wealth and power, and served as training grounds for future leaders. But closer to the frontier, there were no established elites for colleges to bond with, and they offered little in the way of social distinction. The fact that every other town had its own college led to intense competition for students, which meant that tuition charges remained low. This left colleges to operate on a shoestring, making do with poor facilities, low pay, struggles to attract and retain students and faculty, and continual rounds of fundraising. And it meant that students were more middle- than upper-class, there for the experience rather than the learning. The most serious students were those on scholarship.
Another sign of the lowly status of these 19th-century colleges is that they were difficult to distinguish from the variety of high schools and academies that were also in abundance across the U.S. landscape. For students, it was often a choice of going to high school or to college, rather than seeing one as the feeder institution for the other. As a result, the age range of students attending high schools and colleges was substantially the same.
By the middle of the century, a variety of new forms of public colleges arose in addition to the independent institutions that today we call private. States started establishing their own colleges and universities, for much the same reasons as churches and towns did: competition (if the state next door had a college, you needed one too) and land speculation (local boosters pushed legislatures to grant them this plum). In addition, there were the colleges that arose from federal land grants and came to focus on more practical rather than classical education, such as engineering and agriculture. Finally came the normal schools, which focused on preparing teachers for the growing public school system. Unlike the privates, these newer institutions operated under public control, but that did not mean they had a steady flow of public funding. They didn’t start getting annual appropriations until the start of the 20th century. As a result, like the privates, they had to rely on student tuition and donations in order to survive, and they had to compete for students and faculty in the larger market already established by their private predecessors.
By 1880, the U.S. system of higher education was extraordinarily large and spatially dispersed, with decentralized governance and a remarkable degree of institutional complexity. This system had established a distinctive structure early in the century, and then elaborated on it over the succeeding decades. It might seem strange to call the motley collection of some 800 colleges and universities a system at all. “System” implies a plan and a form of governance that keeps things working according to the plan, and that indeed is the formal structure of higher-education systems in most other countries, where a government ministry oversees the system and tinkers with it over time. But not in the U.S.
The system of higher education in the U.S. did not arise from a plan, and no agency governs it. It just happened. But it is nonetheless a system, which has a well-defined structure and a clear set of rules that guides the actions of the individuals and institutions within it. In this sense, it is less like a political system guided by a constitution than a market-based economic system arising from an accumulation of individual choices. Think urban sprawl rather than planned community. Its history is not a deliberate construction but an evolutionary process. Market systems just happen, but that doesn’t keep us from understanding how they come about and how they work.
People did try to impose some kind of logical form and function on to the system. All U.S. presidents until Andrew Jackson argued for the need to establish a national university, which would have set a high standard for the system, but this effort failed because of the widespread fear of a strong central government. And a number of actors tried to impose their own vision of what the purpose of the system should be. In 1828, the Yale faculty issued a report strongly supporting the traditional classical curriculum (focused on Latin, Greek and religion); in the 1850s, Francis Wayland at Brown argued for a focus on science; and the Morrill Land-Grant Act of 1862 called for colleges that would “teach such branches of learning as are related to agriculture and the mechanic arts … in order to promote the liberal and practical education of the industrial classes in the several pursuits and professions in life.” These visions provided support for a wide array of alternative college missions within a diversified system that was wed to none of them.
The weaknesses of the college system were glaringly obvious. Most of the colleges were not created to promote higher learning, and the level of learning they did foster was modest indeed. They had a rudimentary infrastructure and no reliable stream of funding. They were too many in number for any of them to gain distinction, and there was no central mechanism for elevating some of them above others. Unlike Europe, the U.S. had no universities with the imprimatur of the national government or the established church, just a collection of marginal public and private institutions located on the periphery of civilization. What a mess.
Consider Middlebury College, a Congregational institution founded in 1800, which has since become one of the premier liberal arts colleges in the country, considered one of the “little Ivies.” Yet in 1840, when its new president arrived on campus (a Presbyterian minister named Benjamin Labaree, my grandfather’s grandfather), he found an institution that was struggling to survive, and in his 25-year tenure as president this situation did not change much for the better. In letters to the board of trustees, he detailed a list of woes that afflicted the small college president of his era. Hired for a salary of $1,200 a year (roughly $32,000 today), he found that the trustees could not afford to pay it. So he immediately set out to raise money for the college, the first of eight fundraising campaigns that he engaged in, making a $1,000 contribution of his own and soliciting gifts from the small faculty.
Money worries are the biggest theme in Labaree Sr.’s letters (struggling to recruit and pay faculty, mortgaging his house to make up for his own unpaid salary, and perpetually seeking donations), but he also complained about the inevitable problems that come from trying to offer a full college curriculum with a small number of underqualified professors:
I accepted the Presidency of Middlebury College, Gentlemen, with a full understanding that your Faculty was small and that in consequence a large amount of instruction would devolve upon the President – that I should be desired to promote the financial interests of the Institution, as convenience and the duties of instruction would permit, was naturally to be expected, but I could not have anticipated that the task of relieving the College from pecuniary embarrassment, and the labor and responsibility of procuring funds for endowment for books, for buildings etc, etc would devolve on me. Could I have foreseen what you would demand of me, I should never have engaged in your service.
At one place in the correspondence, Labaree Sr. listed the courses he had to teach as president: “Intellectual and Moral Philosophy, Political Economy, International Law, Evidences of Christianity, History of Civilization, and Butler’s Analogy.” U.S. college professors could not afford to have narrow expertise.
In short, the U.S. college system in the mid-19th century was all promise and no product. Nonetheless, it turns out that the promise was extraordinary. One hidden strength was that the system contained nearly all the elements needed to respond to a future rapid expansion of student demand and burgeoning enrollments. It had the necessary physical infrastructure: land, classrooms, libraries, faculty offices, administration buildings, and the rest. And this physical presence was not concentrated in a few population centers but scattered across the landmass of a continental country. It had faculty and administration already in place, with programs of study, course offerings, and charters granting colleges the ability to award degrees. It had an established governance structure and a process for maintaining multiple streams of revenue to support the enterprise, as well as an established base of support in the local community and in the broader religious denomination. The main thing the system lacked was students.
Another source of strength was that this disparate collection of largely undistinguished colleges and universities had succeeded in surviving a Darwinian process of natural selection in a fiercely competitive environment. As market-based institutions that had never enjoyed the luxury of guaranteed appropriations (this was true for public as well as private colleges), colleges survived by hustling for dollars from prospective donors and marketing themselves to prospective students who could pay tuition. They had to be adept at meeting the demands of the key constituencies in their individual markets. In particular, they had to be sensitive to what prospective students were seeking in a college experience, since they were paying a major part of the bills. And colleges also had a strong incentive to build longstanding ties with their graduates, who would become a prime source for new students and for donations.
In addition, the structure of the college – with a lay board, strong president, geographical isolation, and stand-alone finances – made it a remarkably adaptable institution. These colleges could make changes without seeking permission from the education minister or the bishop. Presidents were the CEOs of the enterprise, and their clear mission was to maintain the viability of the college and expand its prospects. They had to make the most of the advantages offered to them by geography and religious affiliation, and to adapt quickly to shifts in position relative to competitors concerning such key institutional matters as program, price, and prestige. The alternative was to go out of business. Between 1800 and 1850, 40 liberal arts colleges closed, 17% of the total.
Successful colleges were also deeply rooted in isolated towns across the country. They represented themselves as institutions that educated local leaders and served as cultural centers for their communities. The college name was usually the town’s name. The colleges that survived the mid-19th century were well-poised to take advantage of the coming surge of student interest, new sources of funding, and new rationales for attending college.
U.S. colleges retained a populist aura. Because they were located in small towns all across the country and forced to compete with peers in the same situation, they became more concerned about survival than academic standards. As a result, the U.S. system took on a character that was middle-class rather than upper-class. Poor families did not send their children to college, but ordinary middle-class families could. Admission was easy, the academic challenge moderate, the tuition manageable. This created a broad popular foundation for the college that saved it, for the most part, from Oxbridge-style elitism. The college was an extension of the community and denomination, a familiar local presence, a source of civic pride, and a cultural avatar representing the town to the world. Citizens did not have to have a family member connected with the school to feel that the college was theirs. This kind of populist base of support came to be enormously important when higher education enrollment started to skyrocket.
One final characteristic of the U.S. model of higher education was its practicality. As it developed in the mid-19th century, the higher-education system incorporated this practical orientation into the structure and function of the standard-model college. The land-grant college was both an effect and a cause of the cultural preference for usefulness. The focus on the useful arts was written into the DNA of these institutions, as an expression of the U.S. effort to turn a college for gentlemen or intellectuals into a school for practical pursuits, with an emphasis on making things and making a living, rather than on gaining social polish or exploring the cultural heights. And this model spread widely to the other parts of the system. The result was not just the inclusion of subjects such as engineering and applied science into the curriculum but also the orientation of the college itself as a problem-solver for businessmen and policymakers. The message was: “This is your college, working for you.”
All of this was quite popular with consumers, but it didn’t make U.S. colleges centers of intellectual achievement and renown. That, however, began to change in the 1880s, when the German research university burst on to the U.S. educational scene. In this emerging model, the university was a place that produced cutting-edge scientific research, and provided graduate-level training for the intellectual elite. The new research model gave the institutionally overbuilt and academically undistinguished U.S. system of higher education an infusion of scholarly credibility, which had been so clearly lacking. For the first time, the system could begin to make the claim of being the locus of learning at the highest level. At the same time, colleges received a large influx of enrollment, which remedied another problem with the old model – the chronic shortage of students.
But the U.S. did not adopt the German model wholesale. Instead, the model was adapted to U.S. needs. The research university was an add-on, not a transformation. The German university was an elitist institution, focused primarily on graduate instruction and high-level research, which were possible only with a strong and steady flow of state support. Since such funding was not forthcoming in the U.S., graduate education and scholarly research could exist only at a modest level and only if grafted on to the hardy stock of the U.S. undergraduate college. It needed the financial support that comes from a large number of undergraduate students, who paid tuition and drew per-capita appropriations for state institutions. It also needed the political support and social legitimacy that came from the populism and practicality of the existing U.S. college. High-level graduate learning depended on an undergraduate experience that was broadly accessible and not too demanding intellectually. In short, it needed students. And in the 20th century, the students arrived.
By then, the U.S. higher-education system was in a strong position to capitalize on the capacities it had built during its competitive struggle for survival in the preceding years. Compared with the much older and more distinguished European institutions, it enjoyed a broad base of public support as a populist enterprise that offered a lot of practical benefits. It felt like our institution rather than theirs. To survive, the system had to go out of its way to make students happy, which meant providing a rich array of social entertainments – including fraternities, sororities, and, of course, football – and an academic program that was not overly challenging. The idea was to get students so enmeshed in the institution that they come to identify with it – which helps to ensure that later in life they will continue to wear the school colors, return for reunions, enroll their own children, and make generous donations.
One way you see this populist quality today is in the language people use. Americans tend to employ the labels college and university interchangeably. Elsewhere in the world, however, “university” refers to the highest levels of postsecondary education, which offers bachelor’s and graduate degrees, while “college” refers to something more like what Americans would call a community college, offering associate degrees and vocational training. So when Brits or Canadians say: “I’m going to university,” it carries an elitist edge. But for Americans, the term university is considered a bit prissy and pretentious. They tend to prefer saying: “I’m going to college,” whether that institution is Harvard or the local trade school. This is quite misleading, since U.S. higher education is extraordinarily stratified, with the benefits varying radically according to the status of the institution. But it is also characteristically populist, an assertion that college is accessible to nearly anyone.
Coming into the 20th century, another advantage enjoyed by the system was that U.S. colleges and universities tended to enjoy a relatively high degree of autonomy. This was most obvious in the case of the private not-for-profit institutions that still account for the majority of U.S. higher-education institutions. A lay board owns the institution and appoints the president, who serves as CEO, sets the budget, and administers faculty and staff. Private universities now receive a lot of government money, especially for research grants and student loans and scholarships, but they have broad discretion over tuition, pay, curriculum, and organization. This allows the university to adapt quickly to changing market conditions, respond to funding opportunities, develop new programs, and open research centers.
Public universities are subject to governance from the state, which provides appropriations in support of core functions and also shapes policy. This limits flexibility about issues such as budget, tuition, and pay. But state funding covers only a portion of total expenses, with the share declining as you go up the institutional status ladder. Flagship public research universities in the U.S. often receive less than 20% of their budget from the state; for the University of Virginia, the portion is below 5%. Regional state universities receive around half of their funds from the state. So public institutions need to supplement their funds using the same methods as private institutions – with student tuition, research grants, fees for services, and donations. And this gives them considerable latitude in following the lead of the privates in adapting to the market and pursuing opportunities. Public research universities have the greatest autonomy from state control. And the public universities that have long topped the rankings – the University of California and the University of Michigan – have their autonomy guaranteed in the state constitution.
It turns out that autonomy is enormously important for a healthy and dynamic system of higher education. Universities operate best as emergent institutions, in which initiative bubbles up from below – as faculty pursue research opportunities, departments develop programs, and administrators start institutes and centers to take advantage of possibilities in the environment. Central planning by state ministries of higher education seeks to move universities toward government goals, but this kind of top-down policymaking tends to stifle the entrepreneurial activities of the faculty and administrators who are most knowledgeable about the field and most in tune with market demand. You can quantify the impact that autonomy from the state has on university quality. The economist Caroline Hoxby at Stanford and colleagues did a study that compared the global rankings of universities with the proportion of university funding that comes from the state (using the ranks computed by Shanghai Jiao Tong University). They found that when the proportion of the budget from state funds rises by one percentage point, the university falls three ranks. Conversely, when the proportion of the budget from competitive grants rises by one percentage point, the university goes up six ranks.
In the 19th century, weak support from church and state forced U.S. colleges to develop into an emergent system of higher education that was lean, adaptable, autonomous, consumer-sensitive, partially self-supporting, and radically decentralized. These humble beginnings provided the system with the core characteristics that helped it to become the leading system in the world. This undistinguished group of colleges came to top world rankings. By the 21st century, U.S. universities accounted for 52 of the top 100 universities in the world, and 16 of the top 20. Half of the Nobel laureates in the 21st century were scholars at U.S. institutions. At the same time, the system’s hand-to-mouth finances turned into extraordinary wealth. The university in the U.S. with the largest endowment is Harvard, at $35 billion; the largest in Europe is Cambridge, at $8 billion. The largest endowment in Continental Europe is held by a brand-new institution, Central European University in Budapest with $900 million, thanks to a donation from George Soros. This would place CEU in the 103rd position in the U.S., behind Brandeis University.
Rags to riches indeed. No longer a joke, the U.S. system of higher education has become the envy of the world. Unfortunately, however, since it’s a system that emerged without a plan, there’s no model for others to imitate. It’s an accident that arose under unique circumstances: when the state was weak, the market strong, and the church divided; when there was too much land and not enough buyers; and when academic standards were low. Good luck trying to replicate that pattern anywhere in the 21st century.
David Labaree is the Lee L. Jacks professor at the Stanford University Graduate School of Education. He is the former president of the History of Education Society and former vice president of the American Educational Research Association. His most recent book is A Perfect Mess: The Unlikely Ascendancy of American Higher Education.
This essay was originally published in Aeon. Photo courtesy: A group portrait, thought to be members of the Ranters, Bethany College, Virginia, 1851.