This series by American studies professor Ben Railton explores the connections between America’s past and present.
On March 22, President Trump issued an Executive Order that affects one of America’s most significant, shared spaces and communities: our institutions of higher education. Entitled “Improving Free Inquiry, Transparency, and Accountability at Colleges and Universities,” the EO threatens to remove federal funding from institutions of higher education that do not “promote free inquiry” on their campuses. Although the EO itself does not spell out the meaning of that phrase further, it has been celebrated by conservative groups such as Turning Point USA, who believe the EO to be an endorsement of their critiques of universities where far-right speakers have been either disinvited from speaking or protested when they did speak.
There isn’t a free speech crisis on America’s college campuses, nor is this striking Executive Order genuinely seeking to address such an issue. Yet this moment nonetheless offers us an opportunity to consider a legitimate crisis facing public higher education in America: the abandonment of public funding for these institutions. This disinvestment threatens the founding and enduring mission of these schools in our society.
America’s first public universities were created as overt alternatives to the Revolutionary era’s elite, religious private colleges such as Harvard and Yale. Thomas Jefferson’s University of Virginia, for example, represented the nation’s first non-sectarian university, one intended to have no affiliation with or sponsorship from a religious faith. The other earliest public universities, such as the University of Georgia (founded in 1785) and the University of North Carolina (founded in 1795), were likewise intended to open up higher education to students beyond the wealthy planter elites in these states. While certainly their target audiences remained white male students, they nonetheless represented first steps in the democratization of American higher education.
Over the course of the 19th century, that democratizing promise was gradually extended to additional American communities. The creation of the institutions that have become known as Historically Black Colleges & Universities (HBCUs), a process that began with the 1837 founding of Cheyney University of Pennsylvania, made higher education available to African American students for the first time. Many of the earliest women’s colleges in America were private institutions, but in 1884 the Mississippi state legislature established the public women’s Industrial Institute & College (later Mississippi University for Women), and over the next two decades Georgia (1889), North Carolina (1891), and Florida (1905) founded their own public women’s colleges as well.
The 19th century also saw the founding across the country of a number of public teaching academies, generally known as Normal Schools, that likewise included (if they were not indeed initially limited to) female students and paved the way for another evolving system of public colleges and universities that endures to this day. The history of the institution where I teach, Fitchburg State University in Massachusetts, illustrates that legacy: the State Normal School in Fitchburg was founded in 1894, added a Bachelor’s degree in “practical” arts in 1930, changed its name to the State Teachers College at Fitchburg in 1932, and then became Fitchburg State College in 1965 (and Fitchburg State University in 2010).
Yet FSU’s 21st-century situation also reflects the central challenge and crisis facing public higher education in Massachusetts and around the country. Over the last two decades or so, Massachusetts has largely disinvested in public higher education: between 2001 and 2018, the amount the state spent per resident student in its public universities decreased by 31 percent. And Massachusetts is hardly alone. According to the Center on Budget and Policy Priorities, of the 49 states analyzed over the full 2008-2018 period, 45 spent less per student (adjusted for inflation) in the 2018 school year than in 2008.
While this decline in funding affects every aspect of public higher education, it has hit students particularly hard: According to the National Center for Education Statistics, prices for undergraduate tuition, fees, room, and board at public institutions rose 34 percent between 2006 and 2016 (adjusting for inflation). Total student loan debt is $1.52 trillion, a 302 percent increase since 2004.
Each of these trends is linked to and influenced by a number of factors, but the overall pattern is all too clear: legislators and politicians have increasingly chosen not to fund public higher education in the 21st century, and the costs have been passed on to the institutions and, especially, their students. While that trend has been perhaps particularly pronounced in Massachusetts (which now ranks 34th in the nation in state spending per resident student), it has at the same time been taking place throughout the nation (and of course is paralleled by similar disinvestments in public secondary and primary education). If these trends continue, public higher education might soon become something it has not been since its earliest iterations (if ever): a community that only the wealthiest and most privileged Americans can afford to join.
Here in Massachusetts, two proposed state laws seek to counter these trends and begin to chart a new way: the Promise and Cherish Acts, which would offer significant funding shifts and increases for public primary, secondary, and higher education over the next five years (among many other features). This past Tuesday, Fitchburg State hosted one of the numerous Fund Our Future forums taking place throughout the state, to highlight these crises and the proposed bills, and to feature the voices of students, faculty and staff, and community members testifying to both the challenges they face and the value of public higher education. What I saw and heard there was the best of what public higher education has meant in America and can mean in the 21st century, if we live up to our legacies, reverse our current trends, and recognize the true crisis facing American higher education.
Featured image: Students at Cheyney University of Pennsylvania, the first historically black college. (Cheyney University)
You won’t get many people interested in discussing “the problems of higher education” until you bring up the trillion dollars. That’s the amount that America’s students now owe in loans for their college educations.
It’s hard to comprehend how much money $1,000,000,000,000 is. Consider this: It’s more than twice the amount, adjusted for inflation, that America paid to build its vast interstate highway system.
That’s a lot of money to repay. And with today’s sluggish economy and unemployment, more than one economist is losing sleep over whether we’ll ever clear that debt.
It’s no surprise that higher education is starting to draw the same amount of media attention, and criticism, as other big businesses. Critics are now challenging colleges’ admission policies, the merit of high-prestige universities, the need for the traditional college lecture, and, of course, the cost.
Criticizing higher education is nothing new. Back in 1920, a Post article entitled “What a Man Loses in Going to College” questioned whether higher education wasn’t a handicap to young men and women. “The average college man [loses] association with older people and that intimate contact with concrete issues which are absolutely essential in making a man out of boy stuff,” wrote E. Davenport.
“Instead of thinking men’s thoughts about a world during his most formative years, [the student] becomes engrossed in student activities, which have about as much connection with the real world as a wart on the end of the nose has with vision; it may obscure but it cannot illuminate.”
The author also claimed that young men, after spending four years among a juvenile cohort, became apathetic, vain, egotistical, argumentative, unreliable, and addicted to slang.
The Post editors also held a low opinion of college training. In a 1923 editorial, editors argued that a four-year degree could be earned in half the time if only students were taught a capacity for drudgery and self-discipline. Instead, colleges bred effete snobs. “We see thousands of young men turned out of college who have never learned how to work, who would scorn to yield to the obligation to do any kind of manual labor other than golf or tennis.”
By 1927, Albert W. Atwood claimed colleges were lowering their standards to admit mediocre and marginally intelligent scholars. He quoted criticism from the Association of University Professors, which sounds as if it could be written today: “When a university numbers its students by the thousands and the tens of thousands, when it admits almost anybody and teaches almost anything, when its classrooms are manned, as is inevitable, by inferior teachers, whenever endowment or appropriation must be sought in a vain effort to keep pace with its numerical growth, when each tries to outstrip its rivals in the externals and trappings of education, then the very character of the university is bound to change for the worse.”
In 1938, Dr. Robert M. Hutchins, the University of Chicago’s president, added his criticism. In “Why Go to College?” he admitted that higher education was often used as a dumping ground for young people. It was where some parents sent their children to get them out of the house for four years, and where young men and women went to avoid adulthood and responsibility.
Twenty years later, a journalism professor at the University of Indiana claimed colleges had become little more than marriage mills and fun factories. In “Are We Making a Playground Out of College?” Jerome Ellison wrote that colleges had developed a “Second Curriculum”: “that odd mixture of status hunger, voodoo, tradition, lust, stereotyped dissipation, love, solid achievement, and plain good fun sometimes called ‘college life.’ It drives a high proportion of our students through college chronically short of sleep, behind in their work, and uncertain of the exact score in any department of life.”
In 1965, Dr. Hutchins returned to the Post, this time declaring “Colleges Are Obsolete.” Higher education, he wrote, had become an industry concerned with numbers, not values. Colleges were only concerned with helping students amass the right number of semester hours for graduation. They couldn’t help students become more intelligent because they were no longer intellectual communities “thinking together about important things.” Instead, the campus had become just a collection of isolated specialties. “The student is never compelled to put together what the specialists have told him, because he is examined course by course by the teacher who taught the course.”
Hutchins’ article touches on the ultimate question of college: “Do our colleges help their students become more intelligent? The answer is, on the whole, no.”
Americans expect a college education to do something important, valuable, and lasting for a student. It’s difficult to assess whether a student is more intelligent after graduation. Let’s consider the value of college by a more practical measure: How much more can a graduate earn?
By this standard, college is still doing its job. Labor statistics from 2013 show that Americans with a four-year degree earned, on average, almost twice as much per hour as those without one.
However, every college student must still individually solve the following problem: Does this earning advantage, extended over a 20-year career, exceed the investment of four years and a pre-interest cash outlay of $18,000 to $46,000?
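That question can be sketched as a simple back-of-the-envelope calculation. In the sketch below, only the $18,000-$46,000 cost range comes from the article; the hourly wage premium and hours worked per year are hypothetical assumptions, and the arithmetic ignores loan interest, taxes, wage growth, and the four years of forgone earnings.

```python
# Illustrative sketch of the article's cost-benefit question.
# ASSUMPTIONS (not from the article): a $10/hour wage premium for
# degree holders and a 2,000-hour work year. Only the $18,000-$46,000
# upfront cost range is taken from the article itself.

def degree_payoff(hourly_premium, hours_per_year, years, upfront_cost):
    """Return the career earning advantage minus the upfront cost of
    the degree (pre-interest, ignoring taxes and wage growth)."""
    advantage = hourly_premium * hours_per_year * years
    return advantage - upfront_cost

for cost in (18_000, 46_000):
    net = degree_payoff(hourly_premium=10, hours_per_year=2_000,
                        years=20, upfront_cost=cost)
    print(f"upfront cost ${cost:,}: net 20-year advantage ${net:,}")
```

Under these hypothetical numbers the 20-year advantage comfortably exceeds the upfront cost at either end of the range, though a smaller wage premium, interest on the debt, or interrupted employment would narrow the margin considerably.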