In 1882, when a journalist asked robber baron William H. Vanderbilt about the harmful impact his railroad monopoly was having on the public good, he scornfully responded, "The public be damned!"
Within fifteen years, such arrogance would give rise to the Progressive Movement in American politics, whose goals included protecting the public against the excesses of corruption and greed.
While Progressivism was not terribly successful in this regard, it gave rise to the idea that public institutions were a sacred trust of American democracy. Following the stock market crash of 1929, labor agitation, widespread anti-business sentiment, and New Deal policies furthered this idea during the 1930s, giving rise to affordable public housing for lower- and middle-class Americans, public jobs programs for the unemployed and destitute, and public retirement pensions (Social Security) to ensure that old folks would not be left penniless. After the Second World War, this humanistic impulse was extended to the fostering of public education, particularly at the university level.
Over the past quarter century, however, the term "public" has become a byword for the disreputable. Public housing, public assistance, public education, and even public "welfare" (definition: health, happiness, and material well-being) have all been assaulted as drains on society, as have the many Americans who have relied on these institutions in good times and bad. Social Security is in danger of being privatized, and the idea of a guaranteed public health care system has been denounced as antithetical to democratic values.
This cynical and well-organized depiction of "public" has provided an effective PR smokescreen for the systematic dismantling of a social contract between the government and the people, one that a generation of Americans saw as worthy and inviolable.
Today, pejorative views of public institutions have become so widespread that it is routinely taken for granted that private enterprises are of a higher quality than those supported by public funding, even after Enron. For many people, this attitude is a given when it comes to education, ensuring that graduates of public schools or universities are viewed as less educated than those who graduate from "the privates." (In many cases, it should be added, miserly support from well-lobbied politicians ensures that this judgment will become a self-fulfilling prophecy.)
Such attitudes are evident in a December 20, 2006, article published in the New York Times, entitled "Public Universities Vie to Join the Top 10 in Academic Rankings." According to the article, a number of public universities in the United States are engaged in campaigns to "upgrade" their status in the U.S. News & World Report rankings. The Times, in its reporting, never questions the premise that private is better.
As a professor at CUNY (the City University of New York), which collectively has one of the most eminent faculties of any university in the world, as well as one of the most interesting, informed, and worldly student bodies, I can attest to the undeserved inferiority complex that afflicts many public universities. It is seen in ongoing efforts to elevate status by cutting remediation (while it remains available at Harvard), raising numerical (and often abstract) entrance requirements, and creating elite "honors" colleges within universities that treat the chosen few as members of an exclusive eating club. It is also seen in a costly and unprecedented focus on "branding."
A central fact affecting public universities is that the states and the federal government have been defunding public education, forcing venerable institutions to remake themselves as semi-privates, raise tuition beyond affordable levels, and employ more and more poorly paid adjuncts to teach classes, or else face extinction. Doing more with less has been the common reality for faculty and students at public universities since the so-called "fiscal crisis" of the mid-1970s, a trend that accelerated during the 1990s, when private profits were soaring while budget cuts for education were becoming the rule. Under the present regime, the situation has deteriorated even further in the wake of an overt policy of anti-public propaganda and malign neglect.
Nonetheless, the premise that the privates are superior to the publics is a well-known joke in academia. While most public institutions hold to hard-and-fast admissions rules regarding what it takes to be accepted, even the most prestigious private colleges and universities routinely rely on extenuating circumstances in making admissions decisions, particularly if an applicant is the child of an alumnus (a "legacy") or of parents likely to become big donors.
While graduates of Harvard, Princeton, or Yale no doubt benefit from being part of a network that includes scions of the ruling classes, a brief conversation with many of these graduates will make clear that none of these schools requires superior intelligence for admission or graduation. Delusions of superiority, however, are easy to find.
While the conservative U.S. News & World Report college and university ranking list pretty much toes the private line, many familiar with American higher education know that public institutions are among the finest universities in the nation.
Berkeley, the University of Wisconsin, the University of Michigan, and many other public universities, including the oft-maligned City University of New York, offer a dedicated student a far greater opportunity to receive a quality education than do many of the privates, whose faculties are smaller and often less prestigious. Look at where the highest levels of research funding go, and you will see where organizations turn for access to the best of the best. The publics and the privates are intimately intermixed. While material conditions at public institutions are deteriorating, their academic quality is, in many cases, incontestable.
With notable exceptions, the privates also tend to be bastions of conservatism in the students they serve and the ideas they promote. While the Yale History Department has enormous cachet, the History Department at the University of Wisconsin has consistently been the best and most innovative department in the country, a standing that goes back more than 100 years and that most historians are aware of.
At the same time, corporations and a corporate-oriented state are at war with public funding for education, as if a well-educated public were a detriment to society. The Bush regime has been particularly vocal in spreading this libel. This endangers not only the future of education but also the principle of "the common good," the cornerstone of democracy.
It is time for the term "public" to be restored to the honorable position it deserves. This will require challenging one of the most frequently used and socially damaging stereotypes of the present era: the idea that "public" is the realm of undeserving deadbeats.
This started out as a comment by Steve Gorelick, but the editors thought it demanded a full post. We invite comments, questions, and responses.
The tragedies of Ota Benga and the Hottentot Venus are vivid reminders of a time when virtually no shame or scruples got in the way of loud, brazen, even joyous expressions of racism. Hating was still good public fun. Mammies, coons, minstrels in blackface, African exotics, and pickaninnies were wildly successful “brands” that marketed every imaginable consumer product. They were cherished forms of entertainment. Even our Presidents could still lead us all in a good coon joke, and we could still rely on almost any daily newspaper to include a few humorous anecdotes about the antics of some hilariously stupid black man.
Wasn’t that a time? We could hate as a collectivity and enjoy that special solidarity we only feel when we join hands, mobilize against a common folk-devil, and share an image of the odious “other.” And beyond the sheer fun of racism as entertainment, our common folk-devil served another purpose: He or she could be a stand-in for every social stress, anxiety or problem that we feared. Pull out an old coon joke and everything was alright. Or so we thought.
But in the 20th century we screwed up all the fun. Not content with just a good laugh, we refused at mid-century to even consider laws that would have outlawed lynching, insisting that states should have the right to decide whether it was or was not alright to hang a black man up by his neck from a tree. We overplayed our hand and tried to maintain an educational system that was both separate and unequal. And the result was a whole host of laws and Supreme Court decisions that ended all the fun. Hate had to move underground. It had to adapt to changing political and social circumstances and, like any aggressive virus, had to mutate into a form in which it could survive.
So we entered the “age of veneer.” No more Amos and Andy. We wised up and started answering public opinion polls in ways that touted our new tolerance. And as long as our kids didn’t get bused across town, we were happy to agree that Brown v. Board of Education was an important decision. And even when our kids were affected, we didn’t have to run around screaming like racists. We could continue to claim the mantle of tolerance while stand-ins like Louise Day Hicks did the dirty work on our behalf. Besides, our neighborhoods were still safely white and segregated – from the suburbs of Los Angeles to South Boston. Why not talk the talk of tolerance? We could even send our kids to all-white summer camps and have them come back singing Kum-Ba-Yah. Bless our hearts, we didn’t even get the joke when a songwriter named Phil Ochs wrote that we “love Puerto Ricans and Negroes as long as they don’t move next door.” We beamed at the progress.
So what perverse impulse leads me to miss the old days when hate was hate? What did we lose when the people in my suburban Los Angeles neighborhood finally started taking down the black lawn jockey statues? And why was I so thrilled when Michael Richards went postal on stage?
Don’t get me wrong. I don’t miss any hate or racism or stereotyping that has genuinely gone to wherever dead stereotypes go when they are retired. You know, the place where Sambo’s, the national pancake chain, went, or wherever they buried all the old radio tapes of Freeman Gosden and Charles Correll imitating the voices of two black buffoons.
But I am downright nostalgic for the days when you could see and smell and feel hate. When you saw it coming. When politicians didn’t try to package their institutional racism in supposedly benevolent public policy. When the face of racism was a man named Lester Maddox, who gave out souvenir head-bashing baseball bats, and not affable racists like Ronald Reagan, whose racism was so coded and re-packaged and dressed up that even he didn’t know it was there.
In almost every area of human endeavor, subtlety and nuance are signs of progress and intellectual sophistication. But subtlety might have been the worst thing that ever happened to racism. Because by wrapping it in benevolence, by re-branding institutionalized cruelty as welfare reform, by getting rid of Amos and Andy and replacing them with Fred Sanford, we have allowed ourselves the collective illusion of progress. We have driven racism so far and so deep into our collective political consciousness that we can convince ourselves it is gone. Worst of all, those of us who claim the mantle of tolerance, who deeply believe that we have dodged the bullet of hate, can fully convince ourselves that at least we are somehow immune from ugliness.
So that’s why I was perversely pleased by the Michael Richards incident. For one brief moment, the fault line between who we think we are and who we really are opened up and good old-fashioned rageful hate came pouring out in all its primal viciousness.
We could see that all the shrewd and skilled re-branding of racism, all the new costumes for stereotypes, have not changed a thing. It’s still there, embedded everywhere from public policy to our very consciousness. Or at least it will be there for the next day or two, until Angelina Jolie finds a new baby in Zambia and we can forget about a pathetic comic who made the mistake of spewing bile into the lens of a cell phone camera.
Posted by: Steve Gorelick | 25 November 2006 at 12:02 PM