
In Defense of a Liberal Education

FAREED ZAKARIA

W. W. NORTON & COMPANY New York • London

For my children, Omar, Lila, and Sofia

We are drowning in information, while starving for wisdom. The world henceforth will be run by synthesizers, people able to put together the right information at the right time, think critically about it, and make important choices wisely.

—E. O. Wilson

Contents

1: Coming to America
2: A Brief History of Liberal Education
3: Learning to Think
4: The Natural Aristocracy
5: Knowledge and Power
6: In Defense of Today’s Youth

Notes

Acknowledgments

In Defense of a Liberal Education

1

Coming to America

IF YOU WANT to live a good life these days, you know what you’re supposed to do. Get into college but then drop out. Spend your days learning computer science and your nights coding. Start a technology company and take it public. That’s the new American dream. If you’re not quite that adventurous, you could major in electrical engineering.

What you are not supposed to do is study the liberal arts. Around the world, the idea of a broad-based “liberal” education is closely tied to the United States and its great universities and colleges. But in America itself, a liberal education is out of favor. In an age defined by technology and globalization, everyone is talking about skills-based learning. Politicians, businesspeople, and even many educators see it as the only way for the nation to stay competitive. They urge students to stop dreaming and start thinking practically about the skills they will need in the workplace. An open-ended exploration of knowledge is seen as a road to nowhere.

A classic liberal education has few defenders. Conservatives fume that it is too, well, liberal (though the term has no partisan meaning). Liberals worry it is too elitist. Students wonder what they would do with a degree in psychology. And parents fear that it will cost them their life savings.

This growing unease is apparent in the numbers. As college enrollment has grown in recent decades, the percentage of students majoring in subjects like English and philosophy has declined sharply. In 1971, for example, 7.6 percent of all bachelor’s degrees were awarded in English language and literature. By 2012, that number had fallen to 3.0 percent. During the same period, the percentage of business majors in the undergraduate population rose from 13.7 to 20.5.

Some believe this pattern makes sense—that new entrants into higher education might simply prefer job training to the liberal arts. Perhaps. But in earlier periods of educational expansion, this was not the case. In the 1950s and 1960s, for instance, students saw college as more than a glorified trade school. Newcomers, often from lower-middle-class backgrounds and immigrant families with little education, enthusiastically embraced the liberal arts. They saw it as a gateway to a career, and also as a way to assimilate into American culture. “I have to speak absolutely perfect English,” says Philip Roth’s character Alex Portnoy, the son of immigrants and hero of the novel Portnoy’s Complaint. Majors like English and history grew in popularity precisely during the decades of mass growth in American higher education.

The great danger facing American higher education is not that too many students are studying the liberal arts. Here are the data. In the 2011–12 academic year, 52 percent of American undergraduates were enrolled in two-year or less-than-two-year colleges, and 48 percent were enrolled in four-year institutions. At two-year colleges, the most popular area of study was health professions and related sciences (23.3 percent). An additional 11.7 percent of students studied business, management, and marketing. At four-year colleges, the pattern was the same. Business led the list of majors, accounting for 18.9 percent of students, and health was second, accounting for 13.4 percent. Another estimate found that only a third of all bachelor’s degree recipients study fields that could be classified as the liberal arts. And only about 1.8 percent of all undergraduates attend classic liberal arts colleges like Amherst, Swarthmore, and Pomona.

As you can see, we do not have an oversupply of students studying history, literature, philosophy, or physics and math for that matter. A majority of students are specializing in fields because they see them as directly related to the job market. It’s true that more Americans need technical training, and all Americans need greater scientific literacy. But the drumbeat of talk about skills and jobs has not lured people into engineering and biology—not everyone has the aptitude for science—so much as it has made them nervously forsake the humanities and take courses in business and communications. Many of these students might well have been better off taking a richer, deeper set of courses in subjects they found fascinating—and supplementing it, as we all should, with some basic knowledge of computers and math. In any event, what is clear is that the gap in technical training is not being caused by the small percentage of students who choose four-year degrees in the liberal arts.

Whatever the facts, the assaults continue and have moved from the realm of rhetoric to action. The governors of Texas, Florida, North Carolina, and Wisconsin have announced that they do not intend to keep subsidizing the liberal arts at state-funded universities. “Is it a vital interest of the state to have more anthropologists?” Florida’s Rick Scott asked. “I don’t think so.” Wisconsin is planning to cut money from subjects that don’t train students for a specific job right out of college. “How many PhDs in philosophy do I need to subsidize?” the radio show host William Bennett asked North Carolina’s Patrick McCrory, a sentiment with which McCrory enthusiastically agreed. (Ironically, Bennett himself has a PhD in philosophy, which appears to have trained him well for his multiple careers in government, media, nonprofits, and the private sector.)

It isn’t only Republicans on the offensive. Everyone’s eager to promote the type of education that might lead directly to a job. In a speech in January 2014, President Barack Obama said, “I promise you, folks can make a lot more, potentially, with skilled manufacturing or the trades than they might with an art history degree.” He later apologized for what he described as a “glib” comment, but Obama has expressed similar sentiments during his presidency. His concern—that in today’s world, college graduates need to focus on the tools that will get them good jobs—is shared by many liberals, as well as conservatives and independents. The irrelevance of a liberal education is an idea that has achieved that rare status in Washington: bipartisan agreement.

The attacks have an effect. There is today a loss of coherence and purpose surrounding the idea of a liberal education. Its proponents are defensive about its virtues, while its opponents are convinced that it is at best an expensive luxury, at worst actively counterproductive. Does it really make sense to study English in the age of apps?

In a sense, the question is un-American. For much of its history, America was distinctive in providing an education to all that was not skills based. In their comprehensive study of education, the Harvard economists Claudia Goldin and Lawrence Katz note that, historically, Britain, France, and Germany tested children at a young age, educated only a few, and put them through a narrow program designed specifically to impart a set of skills thought to be key to their professions. “The American system,” they write, “can be characterized as open, forgiving, lacking universal standards, and having an academic yet practical curriculum.” America did not embrace the European model of specific training and apprenticeships because Americans moved constantly, to new cities, counties, and territories in search of new opportunities. They were not rooted in geographic locations with long-established trades and guilds that offered the only path forward. They were also part of an economy that was new and dynamic, so that technology kept changing the nature of work and with it the requirements for jobs. Few wanted to lock themselves into a single industry for life. Finally, Goldin and Katz argue, while a general education was more expensive than specialized training, the cost for the former was not paid by students or their parents. The United States was the first country to publicly fund mass, general education, first at the secondary-school level and then in college. Even now, higher education in America is a much broader and richer universe than anywhere else. Today a high school student can go to one of fourteen hundred institutions in the United States that offer a traditional bachelor’s degree, and another fifteen hundred with a more limited course of study. Goldin and Katz point out that on a per capita basis, Britain has only half as many undergraduate institutions and Germany just one-third. Those who seek to reorient U.S. higher education into something more focused and technical should keep in mind that they would be abandoning what has been historically distinctive, even unique, in the American approach to higher education.

And yet, I get it. I understand America’s current obsession. I grew up in India in the 1960s and 1970s, when a skills-based education was seen as the only path to a good career. Indians in those days had an almost mystical faith in the power of technology. It had been embedded in the country’s DNA since it gained independence in 1947. Jawaharlal Nehru, India’s first prime minister, was fervent in his faith in big engineering projects. He believed that India could move out of its economic backwardness only by embracing technology, and he did everything he could during his fourteen years in office to leave that stamp on the nation. A Fabian socialist, Nehru had watched with admiration as the Soviet Union jump-started its economy in just a few decades by following such a path. (Lenin once famously remarked, “Communism is Soviet power plus the electrification of the whole country.”) Nehru described India’s new hydroelectric dams as “temples of the new age.”

I attended a private day school in Bombay (now Mumbai), the Cathedral and John Connon School. When founded by British missionaries in the Victorian era, the school had been imbued with a broad, humanistic approach to education. It still had some of that outlook when I was there, but the country’s mood was feverishly practical. The 1970s was a tough decade everywhere economically, but especially in India. And though it was a private school, the tuition was low, and Cathedral catered to a broad cross section of the middle class. As a result, all my peers and their parents were anxious about job prospects. The assumption made by almost everyone at school was that engineering and medicine were the two best careers. The real question was, which one would you pursue?

At age sixteen, we had to choose one of three academic streams: science, commerce, or the humanities. We all took a set of board exams that year—a remnant of the British educational model—that helped determine our trajectory. In those days, the choices were obvious. The smart kids would go into science, the rich kids would do commerce, and the girls would take the humanities. (Obviously I’m exaggerating, but not by that much.) Without giving the topic much thought, I streamed into the sciences.

At the end of twelfth grade, we took another set of exams. These were the big ones. They determined our educational future, as we were reminded again and again. Grades in school, class participation, extracurricular projects, and teachers’ recommendations—all were deemed irrelevant compared to the exam scores. Almost all colleges admitted students based solely on these numbers. In fact, engineering colleges asked for scores in only three subjects: physics, chemistry, and mathematics. Similarly, medical schools would ask for results in just physics, chemistry, and biology. No one cared what you got in English literature. The Indian Institutes of Technology (IITs)—the most prestigious engineering colleges in the country—narrowed the admissions criteria even further. They administered their own entrance test, choosing applicants entirely on the basis of its results.

The increased emphasis on technology and practicality in the 1970s was in part due to domestic factors: inflation had soared, the economy had slumped, and the private sector was crippled by nationalizations and regulations. Another big shift, however, took place far from India’s borders. Until the 1970s, the top British universities offered scholarships to bright Indian students—a legacy of the raj. But as Britain went through its own hellish economic times that decade—placed under formal receivership in 1979 by the International Monetary Fund—money for foreign scholarships dried up. In an earlier era, some of the brightest graduates from India might have gone on to Oxford, Cambridge, and the University of London. Without outside money to pay for that education, they stayed home.

But culture follows power. As Britain’s economic decline made its universities less attractive, colleges in the United States were rising in wealth and ambition. At my school, people started to notice that American universities had begun offering generous scholarships to foreign students. And we soon began to hear from early trailblazers about the distinctly American approach to learning. A friend from my neighborhood who had gone to Cornell came back in the summers bursting with enthusiasm about his time there. He told us of the incredible variety of courses that students could take no matter what their major. He also told tales of the richness of college life. I remember listening to him describe a film society at Cornell that held screenings and discussions of classics by Ingmar Bergman and Federico Fellini. I had never heard of Bergman or Fellini, but I was amazed that watching movies was considered an integral part of higher education. Could college really be that much fun?

My parents did not push me to specialize. My father had been deeply interested in history and politics ever since he was a young boy. He had been orphaned at a young age but managed to get financial assistance that put him through high school and college. In 1944, he received a scholarship to attend the University of London. He arrived during the worst of the blitzkrieg, with German V-2 rockets raining down on the city. On the long boat ride to England, the crew told him he was crazy. One member even asked, “Haven’t you read the newspapers? People are leaving London by the thousands right now. Why would you go there?” But my father was determined to get an education. History was his passion, and he worked toward a PhD in that subject. But he needed a clearer path to a profession. So, in addition, he obtained a law degree that would allow him to become a barrister upon his return to Bombay.

Though my mother was raised in better circumstances, she also faced a setback at a young age—her father died when she was eight. She briefly attended a college unusual for India at the time—a liberal arts school in the northern part of the country called the Isabella Thoburn College, founded in 1870 by an American Methodist missionary of that name. Though her education was cut short when she returned home to look after her widowed mother, my mother never forgot the place. She often fondly reminisced about its broad and engaging curriculum.

My parents’ careers were varied and diverse. My father started out as a lawyer before moving into politics and later founding a variety of colleges. He also created a small manufacturing company (to pay the bills) and always wrote books and essays. My mother began as a social worker and then became a journalist, working for newspapers and magazines. (She resigned from her last position in journalism last year, 2014, at the age of seventy-eight.) Neither of them insisted on early specialization. In retrospect, my parents must have worried about our future prospects—everyone else was worried. But to our good fortune, they did not project that particular anxiety on us.

My brother, Arshad, took the first big step. He was two years older than I and fantastically accomplished academically. (He was also a very good athlete, which made following in his footsteps challenging.) He had the kind of scores on his board exams that would have easily placed him in the top engineering programs in the country. Or he could have taken the IIT exam, which he certainly would have aced. In fact, he decided not to do any of that and instead applied to American universities. A couple of his friends considered doing the same, but no one quite knew how the process worked. We learned, for example, that applicants had to take something called the Scholastic Aptitude Test, but we didn’t know much about it. (Remember, this is 1980 in India. There was no Google. In fact, there was no color television.) We found a pamphlet about the test at the United States Information Service, the cultural branch of the U.S. embassy. It said that because the SAT was an aptitude test, there was no need to study for it. So, my brother didn’t. On the day the test was scheduled, he walked into the makeshift exam center in Bombay, an almost empty room in one of the local colleges, and took the test.

It’s difficult to convince people today how novel and risky an idea it was at the time to apply to schools in the United States. The system was still foreign and distant. People didn’t really know what it meant to get into a good American university or how that would translate into a career in India. The Harvard alumni in Bombay in the 1970s were by no means a “Who’s Who” of the influential and wealthy. Rather, they were an eclectic mix of people who either had spent time abroad (because their parents had foreign postings) or had some connection to America. A few friends of ours had ventured to the United States already, but because they hadn’t yet graduated or looked for jobs, their experiences were of little guidance.

My brother had no idea if the admissions departments at American colleges would understand the Indian system or know how to interpret his report cards and recommendations. He also had no real Plan B. If he didn’t take the slot offered by engineering schools, he wouldn’t be able to get back in line the next year. In fact, things were so unclear to us that we didn’t even realize American colleges required applications a full year in advance. As a result, he involuntarily took a gap year between school and college, waiting around to find out whether he got in anywhere.

As it happened, Arshad got in everywhere. He picked the top of the heap—accepting a scholarship offer from Harvard. While we were all thrilled and impressed, many friends remained apprehensive when told the news. It sounded prestigious to say you were going to attend Harvard, but would the education actually translate into a career?

My mother traveled to the United States to drop my brother off in the fall of 1981, an uneasy time in American history. The mood was still more 1970s malaise than 1980s boom. The country was in the midst of the worst recession since the Great Depression. Vietnam and Watergate had shattered the nation’s confidence. In our minds, the Soviet Union was ascendant. Riots, protests, and urban violence had turned American cities into places of genuine danger. Our images of New York came from Charles Bronson movies and news reports of crack and crime.

All of this was especially alarming to Indians. The country’s traditional society had interpreted the 1960s and 1970s as a period of decay in American culture, as young people became morally lax, self-indulgent, permissive, and, perhaps most worrisome, rebellious. The idea that American youth had become disrespectful toward their elders was utterly unnerving to Indian parents. Most believed that any child who traveled to the United States would quickly cast aside family, faith, and tradition for sex, drugs, and rock and roll. If you sent your kids to America, you had to brace yourselves for the prospect that you might “lose” them.

In his first few weeks abroad, Arshad was, probably like all newcomers to Harvard, a bit nervous. My mother, on the other hand, returned from her trip clear of any anxiety. She was enchanted with the United States, its college campuses, and the undergraduate experience. She turned her observations into an article for the Times of India titled “The Other America.” In it, she described how concerned she had been before the trip about permissiveness, drugs, and rebellion at American colleges. She then went on to explain how impressed she was after actually spending time on a campus to find that the place focused on education, hard work, and extracurricular activities. The students she met were bright, motivated, and, to her surprise, quite respectful. She met parents who were tearfully bidding their children good-bye, talking about their next visit, or planning a Thanksgiving reunion. “I feel I am in India,” she wrote. “Could this be the heartless America where family ties have lost their hold?”

Indians had it all wrong about the United States, my mother continued. She tried to explain why they read so much bad news about the country. “America is an open society as no other. So they expose their ‘failings’ too as no other,” she wrote. “[Americans] cheerfully join in the talk of their own decline. But the decline is relative to America’s own previous strength. It remains the world’s largest economy; it still disposes of the greatest military might the world has known; refugees from terror still continue to seek shelter in this land of immigrants. It spends millions of dollars in the hope that someone, somewhere may make a valuable contribution to knowledge. America remains the yardstick by which we judge America.” As you can see, she was hooked.

In those years, it was fashionable in elite Indian circles to denounce the United States for its imperialism and hegemony. During the Cold War, the Indian government routinely sided with the Soviet Union. Indira Gandhi, the populist prime minister, would often blame India’s troubles on the “foreign hand,” a reference to the CIA. But my mother has always been stubbornly pro-American. When my father was alive, he would sometimes criticize America for its crimes and blunders, partly to needle my brother and me and partly because, as one who had struggled for India’s independence, he had absorbed the worldview of his closest allies, who were all on the left. Yet my mother remained unmoved, completely convinced that the United States was a land of amazing vitality and virtue. (I suspect it’s what has helped her accept the fact that her sons chose the country as their home.)

Along with photographs and information brochures from her trip, my mother also brought back Harvard’s course book. For me, it was an astonishing document. Instead of a thin pamphlet containing a dry list of subjects, as one would find at Indian universities, it was a bulging volume overflowing with ideas. It listed hundreds of classes in all kinds of fields. And the course descriptions were written like advertisements—as if the teachers wanted you to join them on an intellectual adventure. I read through the book, amazed that students didn’t have to choose a major in advance and that they could take poetry and physics and history and economics. From eight thousand miles away, with little knowledge and no experience, I was falling in love with the idea of a liberal education.

I decided to follow in my brother’s footsteps and didn’t pursue the Indian options available to me. I took the SAT and wrote the required essays and applications. If you had asked me why I was so determined to go to the United States, I couldn’t have given you a coherent response. Indian universities seemed limiting and limited. I thought about applying to British universities, but I would have needed a scholarship and few existed. The idea of “reading” just one subject at Oxford or a narrow set of subjects at Cambridge seemed less interesting when compared with the dazzling array of opportunities at the Ivy League schools. And, of course, there was the allure of America.

I had always been fascinated by America. I had visited once as a teenager, but most of my knowledge about the country came from Hollywood. While the Indian market was too poor and distant to get any newly released movies, we watched the ones we would get, a few years delayed—anything from The Poseidon Adventure to Kramer vs. Kramer—as well as old classics, like the Laurel and Hardy comedies, which I loved. Television arrived in the country in the mid-1970s, initially with just one government-run black-and-white channel that mostly aired documentaries on the glories of Indian agriculture. Every Sunday night, my family would gather around the television set to watch the one unadulterated piece of entertainment it would air, a Bollywood movie. Preceding that was a single episode of I Love Lucy, presumably all that Indian television could afford to import from the United States. Everyone watched it with pleasure, laughing along with Lucy and her crazy family. To this day, I have a soft spot for that show.

By the late 1970s, technology had begun to bring more of the West to India. A few of my friends had video recorders, and after a while, so did we. It was impossible to acquire actual copies of American movies and shows, but we did get many bootlegged versions. Somewhere in the United States, a relative would tape the latest television shows and send them to the family back home. These bootlegged Betamax tapes would be passed around in Bombay like samizdat publications in the Soviet Union.

The hottest show at the time was Dallas, which we all devoured. The scenes during the opening credits were my window into the American dream: shining shots of gleaming skyscrapers, helicopters landing in office parks, men in ten-gallon hats getting in and out of cavernous Cadillacs. And Victoria Principal—she was certainly part of my American dream. Whatever the newspapers said about problems in the United States, who could believe it with these images flashing across the screen? America seemed vast, energetic, and wealthy. Everything happened in Technicolor there.

The U.S. Information Service, set up to promote American culture and ideas during the Cold War, would hold screenings of older American classics. A friend and I would often attend these showings. There, in a small room in Bombay, sitting amid aging expats, I was introduced to Hollywood’s golden age. I kept a scrapbook on these movies, from It Happened One Night to Adam and Eve to How the West Was Won. In a sense, they were my first real introduction to American history. And they added to my sense of the country as the world’s most exciting place.

Let me be honest, though: while the soft attraction was great, so was the cold cash. My parents were well-paid professionals, but India was one of the poorest countries in the world. Their annual salaries combined would have equaled just half of one year’s tuition abroad. At the time, American colleges did not offer need-blind admissions to foreign students like me—the schools all had much smaller endowments in those days—but they did distribute merit scholarships. And if you were admitted, they worked out a combination of grants, loans, and on-campus jobs that would allow you to attend. My brother’s reports from Harvard were that between his scholarship and a campus job, he was making do quite well. He even had enough money for books and incidental expenses. Yet realizing that I needed not only admission but also a scholarship added to my anxiety.

I got very lucky and ended up going to Yale. I have no idea why they let me in or why I chose to go there. I marvel today at college-bound American kids who take two or three trips to campuses, sit in on classes, have long discussions with counselors, and watch student theater productions—all to decide where to go to college. In comparison, I made an utterly uninformed choice from halfway around the world. I didn’t get into Harvard, but I was fortunate to be able to choose between Princeton and Yale and couldn’t really decide. I knew little about either. If I made a list of each university’s objective merits, which I did, Princeton usually came out on top. It was smaller and richer and had offered me a bigger scholarship. Everyone had heard of it in India because of Albert Einstein. Very few knew of Yale. This seems hard to believe, but Yale really was quite obscure in India. My father, like many Indians, couldn’t pronounce the name, and to his dying day he called it “Ale.” In general, American universities that have great name recognition in India—and in Asia more generally—are those with strong engineering programs, science departments, and business schools. These were not Yale’s strengths.

Eventually, I decided to use the only mechanism I could think of: a coin toss. Heads, I would go to Yale; tails, I would go to Princeton. I flipped the coin. It was tails. So I decided to make it a “best of three” and tossed again. I don’t remember if Yale won the coin toss at that point or if I kept going until it did. But in doing the exercise, I realized that I wanted to go to Yale. I don’t quite know why. It is an example of the power of intuition. Though obviously both are great institutions, Yale was the perfect place for me. I knew something at the time that I couldn’t explain or even understand.

Yale offered then, and still does now, a rigorous first-year academic program called Directed Studies. It is a sweeping survey of the Western literary and philosophical tradition from ancient Greece to modernity. This seemed like a wonderful opportunity for a kid from India. It would have introduced me to a number of great Western classics that I had heard about but never read. You had to apply to be able to take the courses, which I did. Some months later, I was thrilled to get a note informing me that I had been accepted into the program.

I chickened out. When I got to Yale, it was time for me to finalize my choices. I tallied up the subjects that I believed I had to take—courses like math, computer programming, and physics—and realized that if I were going to enroll in Directed Studies, it would fill up most of my schedule. I panicked at the idea of committing so completely to something that seemed so impractical. I remember thinking to myself, “When people ask me in India over the summer about my courses, I could talk about computers and math. How would I explain this?” So I dropped Directed Studies and signed up for courses that seemed more sensible.

In my first year, however, I allowed myself to pick one class simply out of sheer interest. The course was a popular lecture on the history of the Cold War, taught by a political science professor named H. Bradford Westerfield. His lectures were packed with vivid details and delivered with gusto. I was hooked.

International politics and economics had always appealed to me. As a teenager in India, I would avidly read the major international newspapers and magazines, which sometimes arrived weeks after they were published. The great global drama of the times was the clash of the superpowers, and it echoed in India, a country that was torn between the two camps. I remember devouring the excerpts of Henry Kissinger’s memoirs when they came out, though I’m sure I didn’t understand them. (I was fifteen at the time.) Yet I never thought that one studied these kinds of subjects in college. I had assumed that I would major in something that was practical, technical, and job oriented. I could always read newspapers on the side. Westerfield’s course, however, made me realize that I should take my passion seriously, even without being sure what it might lead to in terms of a profession. That spring, I declared my major in history. I was going to get a liberal education.

But still, I couldn’t have answered the question, what is a liberal education?

2

A Brief History of Liberal Education

FOR MOST OF human history, education was job training. Hunters, farmers, and warriors taught the young to hunt, farm, and fight. Children of the ruling class received instruction in the arts of war and governance, but this too was intended first and foremost as preparation for the roles they would assume later in society, not for any broader purpose. All that began to change twenty-five hundred years ago in ancient Greece.

Prior to the change, education in Greece had centered on the development of arête, roughly meaning excellence or virtue. The scholar Bruce Kimball notes that the course of study largely involved the memorization and recitation of Homeric epic poetry.* Through immersion in the world of gods and goddesses, kings and warriors, children would master the Greek language and imbibe the lessons, codes, and values considered important by the ruling elite. Physical training was a crucial element of Greek education. In the city-state of Sparta, the most extreme example of this focus, young boys considered weak at birth were abandoned to die. The rest were sent to grueling boot camps, where they were toughened into Spartan soldiers from an early age.

Around the fifth century BC, some Greek city-states, most notably Athens, began to experiment with a new form of government. “Our constitution is called a democracy,” the Athenian statesman Pericles noted in his funeral oration, “because power is in the hands not of a minority but of the whole people.” This innovation in government required a simultaneous innovation in education. Basic skills for sustenance were no longer sufficient—citizens also had to be properly trained to run their own society. The link between a broad education and liberty became important to the Greeks. Describing this approach to instruction centuries later, the Romans coined a term for it: a “liberal” education, using the word liberal in its original Latin sense, “of or pertaining to free men.” More than two thousand years later, Frederick Douglass saw the same connection. When his master heard that young Frederick was reading well, he was furious, saying, “Learning will spoil the best nigger in the world. If he learns to read the Bible it will forever unfit him to be a slave.” Douglass recalled that he “instinctively assented to the proposition, and from that moment I understood the direct pathway from slavery to freedom.”

From the beginning, people disagreed over the purpose of a liberal education. (Perhaps intellectual disagreement is inherent in the idea itself.) The first great divide took place in ancient Greece, between Plato, the philosopher, and Isocrates, the orator. Plato and his followers, including Aristotle, considered education a search for truth. Inspired by Socrates, they used the dialectic mode of reasoning and discourse to pursue knowledge in its purest form. Isocrates, on the other hand, hearkened back to the tradition of arête. He and his followers believed a person could best arrive at virtue and make a good living by studying the arts of rhetoric, language, and morality. This debate—between those who understand liberal education in instrumental terms and those who see it as an end in and of itself—has continued to the present day.

In general, the more practical rationale for liberal education gained the upper hand in the ancient world. Yet the two traditions have never been mutually exclusive. The Roman statesman and philosopher Cicero, one of the earliest writers on record to use the term artes liberales, wanted to combine the search for truth with rhetoric, which was seen as a more useful skill. “For it is from knowledge that oratory must derive its beauty and fullness,” the philosopher-statesman wrote around 55 BC. While debate continues, the reality is that liberal education has always combined a mixture of both approaches—practical and philosophical.

Science was central to liberal education from the start. Except that in those days, the reason to study it was the precise opposite of what is argued today. In the ancient world, and for many centuries thereafter, science was seen as a path to abstract knowledge. It had no practical purpose. Humanistic subjects, like language and history, on the other hand, equipped the young to function well in the world as politicians, courtiers, lawyers, and merchants. And yet the Greeks and Romans studied geometry and astronomy alongside rhetoric and grammar. In the first century BC, this dualistic approach to education was “finally and definitively formalized” into a system described as “the seven liberal arts.” The curriculum was split between science and humanities, the theoretical and the practical. Centuries later, it was often divided into two subgroups: the trivium—grammar, logic, and rhetoric—was taught first; the quadrivium—arithmetic, geometry, music, and astronomy—came next.

Soldiers and statesmen, naturally, placed greater emphasis on subjects they thought of as practical—what today we would call the humanities. But even so, the idea of a broader education always persisted. In the eighth century, Charlemagne, king of the Franks (a Germanic tribe that inhabited large chunks of present-day Germany, France, Belgium, Netherlands, and Luxembourg), consolidated his empire. Bruce Kimball notes that Charlemagne then established a palace school and named as its master Alcuin, an English scholar (even then Englishmen were the ideal headmasters). Alcuin and his followers concentrated on grammar and textual analysis and demoted mathematics, but they continued to teach some version of the liberal arts. And the deeper quest for understanding never disappeared. Even during the Dark Ages, medieval monasteries kept alive a tradition of learning and inquiry.

Why did European learning move beyond monasteries? One influence might have been Islam, the most advanced civilization in the Middle Ages—something difficult to imagine today. Within the world of Islam there were dozens of madrasas—schools where history, politics, science, music, and many other subjects were studied and where research was pursued (though not all Islamic educational institutions were called madrasas). Islamic learning produced innovations, especially in the study of mathematics. Algebra comes from the Arabic phrase al-jabr, meaning “the reunion of broken parts.” The name of the Persian scholar al-Khwārizmī was translated into Latin as algoritmi, which became “algorithm.” By the eleventh century, Cairo’s al-Azhar and Baghdad’s Nizamiyah were famous across the world for their academic accomplishments, as were many other centers of learning in the Arab world. This Islamic influence found a home in the Muslim regions of continental Europe as well, in the madrasas of Moorish Spain, in Granada, Córdoba, Seville, and elsewhere.

By the late Middle Ages, Europe’s stagnation was ending. The expansion in global trade and travel meant that its leaders needed greater knowledge and expertise in areas like geography, law, and science. As city-states competed with one another economically, they sought out individuals with better skills and education. Because of its long coastline, Italy became a place where commerce, trade, and capitalism were beginning to stir. Groups of scholars started coming together in various Italian cities to study theology, canon and civil law, and other subjects. These scholars came from great distances and were often grouped by their geographical origins, each one being called a “nation,” an early use of the word. Some of these “nations” hired local scholars, administered exams, and joined together into groups that came to be called universitas. These organizations sought and were granted special protections from local laws, thus allowing them necessary freedoms and autonomy.

In 1088, Europe’s first university was founded in Bologna. Over the next century, similar institutions sprouted up in Paris, Oxford, Cambridge, and Padua. By 1300, western Europe was home to between fifteen and twenty universities. These schools were initially not bastions of free inquiry, but they did become places where scholars discussed some taboo subjects, recovered, translated, and studied Aristotle’s writings, and subjected laws to close scrutiny. Yet most research took place outside of universities in those days because of their religious influence. It was heretical, for instance, for scientists to speculate on earth’s place amid the stars. In most cities, while students were accorded some of the same freedoms and exemptions as the clergy, they desired even more. The University of Padua’s motto was Universa universis patavina libertas—“Paduan freedom is universal for everyone.”

In the fourteenth century, the balance between practical and philosophical knowledge shifted again. Some Italian scholars and writers believed that universities had become too specialized. They looked to return European education to its Greek and Roman roots. These humanists rejected the highly detailed, scholastic approach to learning and theology that was pervasive in medieval universities. Instead, as the late scholar Paul Oskar Kristeller notes, they encouraged a “revival of arts and of letters, of eloquence and of learning” that “led to a new and intensified study of ancient literature and history.” Over the next two centuries, what has been called Renaissance humanism spread to the rest of Europe.

These traditions of scholarship, however, did not create the experience we now think of as a liberal education. That modern tradition had less to do with universities and more with colleges. And “college as we know it,” writes Columbia University professor Andrew Delbanco, “is a fundamentally English idea.”† The earliest English colleges were founded in the thirteenth century for scholars of divinity whose duties, Delbanco notes, “included celebrating mass for the soul of the benefactor who had endowed the college and thereby spared them from menial work.” Religious influences were strong—the public lecture, for instance, was a secular outgrowth of the Sunday sermon—though the curriculum was varied and included non-theological subjects.

Colleges grew more secular by the nineteenth century as seminaries assumed responsibility for training ministers. They also began to develop a character distinct from European universities, which were becoming increasingly focused on research, especially in Germany. Unlike universities, which often lacked a clear physical embodiment, colleges were defined by their architecture. An imposing stone building was usually constructed with an open courtyard in the center and student dormitories arrayed around it. The “common” room was where students could meet, the chapel where they could pray, and the library where they could read. This model of a residential college originated in England and spread to the Anglo-American world, where it remains the distinctive form for undergraduates.

In the early twentieth century, among the major universities, Harvard and Yale adopted the full-fledged residential college model for student housing, partly in an effort to retain the intimate setting of liberal arts colleges while pursuing their ambitions to become great research universities. The residential college has come to be seen as possessing certain qualities that enhance the experience of liberal education beyond the curriculum. The advantages of such an arrangement are often described today in terms like “living-learning experiences,” “peer-to-peer education,” and “lateral learning.” Samuel Eliot Morison, the legendary historian of Harvard, best described the distinctive benefits of the residential setting: “Book learning alone might be got by lectures and reading; but it was only by studying and disputing, eating and drinking, playing and praying as members of the same collegiate community, in close and constant association with each other and with their tutors, that the priceless gift of character could be imparted.” An emphasis on building character, stemming from the religious origins of colleges, remains an aim of liberal arts colleges almost everywhere, at least in theory.

America’s earliest colleges were modeled on their English predecessors. Many of the founders of Harvard College, for example, were graduates of Emmanuel College at Cambridge University. Perhaps because, in America, they did not start out strictly as seminaries, colonial colleges often incorporated into their curricula a variety of disciplines, including the sciences, humanities, and law. Students were expected to take all these subjects and relate them to one another because it was assumed there was a single, divine intelligence behind all of them. In Cardinal John Newman’s nineteenth-century formulation of this approach to education, “The subject-matter of knowledge is intimately united in itself, as being the acts and the work of the Creator.” It was a theological version of what physicists today call the unified field theory.

America’s first colleges stuck to curricula that could be described as God and Greeks—theology and classics. But a great debate over this approach emerged at the beginning of the nineteenth century. People wondered why students should be required to study ancient Greek and Latin. They suggested that colleges should begin to incorporate modern languages and subjects into their instruction. After all, the country was growing rapidly and developing economically and technologically, making the college course of study seem antiquated in comparison. After much deliberation, the Yale faculty issued a report in 1828 defending the classical curriculum. It powerfully influenced American colleges for half a century—delaying, some might say, their inevitable evolution. It also, however, outlined a central tension in liberal education that persists till now.

The Yale report explained that the essence of liberal education was “not to teach that which is peculiar to any one of the professions; but to lay the foundation which is common to them all.” It described its two goals in terms that still resonate: training the mind to think and filling the mind with specific content.

The two great points to be gained in intellectual culture, are the discipline and the furniture of the mind; expanding its powers, and storing it with knowledge. The former of these is, perhaps, the more important of the two. A commanding object, therefore, in a collegiate course, should be, to call into daily and vigorous exercise the faculties of the student. Those branches of study should be prescribed, and those modes of instruction adopted, which are best calculated to teach the art of fixing the attention, directing the train of thought, analyzing a subject proposed for investigation; following, with accurate discrimination, the course of argument; balancing nicely the evidence presented to the judgment; awakening, elevating, and controlling the imagination; arranging, with skill, the treasures which memory gathers; rousing and guiding the powers of genius.

Though its particular aim historically was to defend the classical curriculum, the Yale report’s broader argument was that learning to think is more important than the specific topics and books that are taught. A Harvard man revived the argument fifty years later, as he battled to undo the report’s recommendations.

Charles Eliot was an unlikely candidate for the presidency of Harvard. He was a scientist at a time when the heads of schools like Harvard, Yale, and Princeton were still generally ministers. After graduating from Harvard in 1853, Eliot was appointed to be a tutor and later an assistant professor of mathematics and chemistry at the school. But he was not made a full professor as he had hoped, and at about the same time, his bad luck compounded as his father’s fortune collapsed. Eliot decided to travel to Europe, where he saw firsthand the rapidly changing state of higher education on the Continent and the rise of the great research universities in Germany. He then returned to the United States to take up a professorship at the Massachusetts Institute of Technology in 1865. At the time, like many other colleges, Harvard was in the midst of a tumultuous period in its history. It faced calls for more vocational education in order to prepare Americans for the workforce in the rapidly industrializing economy just emerging from the Civil War.

To address these concerns, Eliot penned a two-part essay in the Atlantic Monthly titled “The New Education.” It began with words that could be uttered by any parent today, adjusted for gender: “What can I do with my boy? I can afford, and am glad, to give him the best training to be had. I should be proud to have him turn out a preacher or a learned man; but I don’t think he has the making of that in him. I want to give him a practical education; one that will prepare him, better than I was prepared, to follow my business or any other active calling.” Eliot’s answer was that Americans needed to combine the best developments of the emerging European research university with the best traditions of the classic American college.

Eliot proposed that America’s great universities embrace the research function, but that they do so at the graduate level, leaving undergraduates free to explore their interests more broadly. He showed a strong understanding and mastery of the emerging trends in education, like the difference between scientific and humanistic fields and the rise in technical training. He wanted colleges to distinguish carefully between a skills-based and a liberal education, the latter of which he considered more important. Months after his essays were published, at the age of thirty-five, Charles Eliot was offered the presidency of Harvard, a post that he held for four decades—exactly—and from which he reshaped the university and the country.

Eliot made so many transforming changes at Harvard that they are impossible to recount—he essentially established the modern American university. Yet perhaps his most influential reform, at least for undergraduates, was his advocacy for a curriculum based on the “spontaneous diversity of choice.” In other words, under his new system, students had very few required courses and many electives. Previously in American colleges, much of the curriculum had been set in stone. Students had enrolled in courses and studied topics in a predetermined sequence from one year to the next. The faculty had believed, in the terms of the Yale report, that it should choose “the furniture” that was to inhabit the students’ minds.

Eliot disagreed profoundly. He was probably influenced by his Protestantism, which saw the individual as the best mediator of his own fate. But perhaps more than anything, he was imbued with the spirit of Ralph Waldo Emerson and his distinctively American ideas, which were deeply influential at the time. For Emerson, the task of every human being was to find his or her voice and give expression to it. “Trust thyself,” Emerson wrote in “Self-Reliance.” “Every heart vibrates to that iron string.” Emerson’s notion of the importance of authenticity, as opposed to imitation, and his praise of unique thinking could have been turned into copy for Apple ad campaigns (“Think Different”).

In an 1885 speech, Eliot outlined the case for his elective system using language that remains radical today—and with which many parents might still disagree. “At eighteen the American boy has passed the age when a compulsory external discipline is useful,” Eliot wrote. “A well-instructed youth of eighteen can select for himself—not for any other boy, or for the fictitious universal boy but for himself alone—a better course of study than any college faculty, or any wise man who does not know him and his ancestors and his previous life, can possibly select for him.” Eliot believed that American liberal education should allow you to choose your own courses, excite your own imagination, and thus realize your distinctive self. Many responded that some subjects are not worthy of being taught. The solution, he believed, was to let faculty members offer what they want and students to take what they like.

Eliot’s views were not shared by many influential educators of the time, most notably the president of Princeton, James McCosh. (In fact, Eliot’s speech that I quote from above was from a public debate with McCosh on the topic in New York City.) A Scottish minister and philosopher, McCosh thought that universities should provide a specific framework of learning and a hierarchy of subjects for their students—or else they were failing in their role as guardians. In particular, religion could not simply be treated like any other subject, to be taken or dropped at an undergraduate’s whim. Eliot’s ideas, however, were more in sync with American culture and its emphasis on individualism and freedom of choice. Over time, the elective system in some form or another has come to dominate American higher education, with a few notable exceptions.

In the early years of the twentieth century, a swell in the tide of immigrants entering the United States prompted concern among some citizens, educators, and public officials that the country was losing its character. Against that backdrop, an English professor at Columbia University, John Erskine, began offering a two-year course called General Honors in 1920. Erskine “wanted to provide young people from different backgrounds with a common culture, something he thought was already thin in the United States,” writes the Harvard scholar Louis Menand. Erskine believed that the best way to become truly educated was to immerse oneself in great works of the past.

In 1930, Mortimer Adler, an educator who had taught a section in Erskine’s program, left Columbia for the University of Chicago. His friend Robert Maynard Hutchins had recently been appointed president of the school, and the two began teaching a seminar together for underclassmen on classic works in the Western canon. The course evolved into a “great books” program—a core curriculum in which students read prescribed works of history, literature, and philosophy and then gather for small-group discussions guided by faculty members. Several years later, two professors named Stringfellow Barr and Scott Buchanan moved from Chicago to St. John’s College in Annapolis to start their own great-books program. Barr and Buchanan radically altered the undergraduate curriculum at the small school with the tradition of seven liberal arts in mind. Even science was taught from a great-books perspective, reading classic accounts that were, in many ways, outdated or had been superseded. The program left no room for electives.

By the 1930s and 1940s—perhaps because the immigrant tide had receded with the introduction of national quotas in 1921 and 1924—interest in the common core waned. Today, about 150 schools in the United States offer some kind of core program based on great books, though very few require that all undergraduates take it, as Columbia, Chicago, and St. John’s do.

Whatever its merits, the idea of a curriculum based on some set of great books has always been debated. In a 1952 essay, Hutchins, who could be considered the father of the great-books movement, made what has become a familiar case for it. “Until lately the West has regarded it as self-evident that the road to education lay through great books,” Hutchins wrote. “No man was educated unless he was acquainted with the masterpieces of his tradition.” Times have changed, but political and social changes cannot “invalidate the tradition or make it irrelevant for modern men,” he insisted. Except that, as we have seen, this account is not entirely true. Everyone who has ever set up a great-books program based it on the belief that, in the good old days, people used to study a set of agreed-upon classics. In fact, from the start of liberal education, there were disputes over what men (and women) should read and how much or how little freedom they should have to follow their curiosity. Martha Nussbaum, a philosopher at the University of Chicago, argues that the Socratic tradition of inquiry by its nature rejected an approach dependent “on canonical texts that had moral authority.” She writes, “It is an irony of the contemporary ‘culture wars’ that the Greeks are frequently brought onstage as heroes in the ‘great books’ curricula proposed by many conservatives. For there is nothing on which the Greek philosophers were more eloquent, and more unanimous, than the limitations of such curricula.”

I’ve found that my own views on this subject have changed over time, from my days as an undergraduate, then as a teacher in graduate school, and now as a parent. In college, I was attracted to the idea of a common core—though I didn’t end up studying one. And yet I wished I had more of a grounding in some areas, and found myself playing catch-up. When teaching undergraduates in the 1980s, I was struck not only by how bright they were but also by how little they knew about, say, the basic outlines of American history. They could analyze everything placed in front of them, but if you asked them to put six events in chronological order, they would get many of them wrong. I thought it would be worthwhile to require exposure to a set of facts or books—furniture for the mind—that would give students a foundation from which to then roam freely. And they had room to roam. Remember, most advocates of a core do not consider it sufficient for a liberal education. The programs at Columbia and Chicago allow for many electives. Proportionally, the core represents only a part of the overall curriculum.

There are also social benefits to a common core. All students are able to share an intellectual experience. They can discuss it together, join in its delights, and commiserate over its weaknesses. It’s ultimately a bonding opportunity. “Once they have gone through the Core,” writes Delbanco, referring to Columbia’s program specifically, “no student is a complete stranger to any other.” That sense of being part of a larger group becomes even more useful later in life, when one is expected to work with one’s peers and colleagues toward common goals in a professional setting. As campuses get more diverse and students spend time pursuing more narrowly focused studies and highly targeted extracurricular activities, something needs to define the collective educational experience.

I still sympathize with arguments in support of a core, but I have come to place a greater value than I once did on the openness inherent in liberal education—the ability for the mind to range widely and pursue interests freely.

In my own experience, the courses I took simply because I felt I needed to know some subject matter or acquire cultural literacy have faded in my memory. Those that I took out of genuine curiosity or because I was inspired by a great teacher have left a more lasting and powerful impression. After all, one can always read a book to get the basic information about a particular topic, or simply use Google. The crucial challenge is to learn how to read critically, analyze data, and formulate ideas—and most of all to enjoy the intellectual adventure enough to be able to do them easily and often.

Loving to learn is a greater challenge today than it used to be. I’ve watched my children grow up surrounded by an amazing cornucopia of entertainment available instantly on their computers, tablets, and phones. Perhaps soon these pleasures will be hardwired into their brains. The richness, variety, and allure of today’s games, television shows, and videos are dazzling. Many are amazingly creative, and some are intellectually challenging—there are smart video games out there. But all are designed to get children enraptured and, eventually, addicted. The all-consuming power of modern entertainment can turn something that demands active and sustained engagement, like reading and writing, into a chore.

And yet reading—especially, I would argue, reading books—remains one of the most important paths to real knowledge. There are few better ways to understand an issue in depth than to read a good book about it. This has been true for centuries, and it has not changed. And kids need to enjoy reading—not just see it as the thing their parents make them do before they can play video games or watch a television show. If having teenagers read Philip Roth’s Goodbye, Columbus rather than Jane Austen’s novels makes this more likely, so be it. I don’t decry or condemn new forms of entertainment and technology. They open up new vistas of knowledge and ways of thinking. Our children will be smarter and quicker than us in many ways. But a good education system must confront the realities of the world we live in and educate in a way that addresses them, rather than pretend that these challenges don’t exist.

And then there are those strange college courses on, say, “transgendered roles in East-African poetry” that infuriate conservative critics of higher education. They are right to be dismayed at the bizarre and narrow content, but it comes about for reasons that are often nonpolitical. Some of the most controversial features of modern liberal education have come into being not out of intellectual conviction but from bureaucratic convenience. As America’s best colleges became the world’s best universities, the imperatives of the latter began to dominate the former. Research has trumped teaching in most large universities—no one gets tenure for teaching. But as important, the curriculum has also been warped to satisfy research. Professors find that it is dreary and laborious for them to teach basic courses that might be interesting and useful for students. It is much easier to offer seminars on their current research interests, no matter how small, obscure, or irrelevant the topic is to undergraduates. As knowledge becomes more specialized, the courses offered to students become more arcane. It is this impulse that produces the seemingly absurd courses one finds in some colleges today, as much as the subversive desires of a left-wing professoriat.

Another development, again unrelated to any intellectual theory about liberal education, has been the abandonment of rigor, largely in the humanities. Grades have risen steadily in almost all American colleges in recent decades. Today, 43 percent of all grades awarded are in the A range—up from 15 percent in 1960. This is an outgrowth of a complex set of factors, one of which is indeed the rising quality of students. But others are bureaucratic and philosophical, such as the 1960s assault on hierarchy. I can attest from personal experience that handing out high marks can be convenient for faculty interests. When I was a teaching assistant at Harvard, I quickly realized that giving B minuses or below meant that the students would come to complain at length; ask you to reconsider, maybe give them another chance to do the work over; and even raise the issue with their advisor or a dean. It meant lots of work for me. The much easier strategy was to give everyone a B plus or an A minus, reserving the straight A for works of genuine distinction. (I tried to resist but was certainly guilty of taking the easy way out more than once.) I cannot say if the incentives remain the same, but I notice that the portion of all grades that are A or A minus at Harvard has risen from a third (in 1986) to a half (in 2006). And the most commonly awarded grade at Harvard today is a straight A—not even an A minus.

The greatest shift in liberal education over the past century has been the downgrading of subjects in science and technology. Historically, beginning with Greek and Roman developments in education, scientific exploration was pursued through the lens of “natural philosophy.” In the Middle Ages, the subject was seen as part of an effort to explain God’s creation and man’s role within it. But during the age of scientific revolutions, and coming to a climax in the nineteenth century with Charles Darwin’s theory of evolution, the study of science increasingly conflicted with religion. This led to the discipline losing its central position in liberal education, which was still then grounded in a pious outlook that sought to understand not only the mystery of life but also its purpose. As Anthony Kronman writes, a rise in scientific research meant “a material universe whose structure could now be described with astounding precision but which was itself devoid of meaning and purpose. As a result, the physical sciences ceased to be concerned with, or to have much to contribute to, the search for an answer to the question of the meaning of life.” Science was relegated to scientists—a huge loss to society as a whole.

By the middle of the twentieth century, following the quantum revolution in physics, laypeople found it even more difficult to understand science and integrate it into other fields of knowledge. In 1959, C. P. Snow, an English physicist and novelist, wrote a famous essay, “The Two Cultures,” in which he warned that the polarization of knowledge into two camps was producing “mutual incomprehension . . . hostility and dislike.” He explains:

A good many times I have been present at gatherings of people who, by the standards of the traditional culture, are thought highly educated and who have with considerable gusto been expressing their incredulity at the illiteracy of scientists. Once or twice I have been provoked and have asked the company how many of them could describe the Second Law of Thermodynamics. The response was cold: it was also negative. Yet I was asking something which is about the scientific equivalent of: Have you read a work of Shakespeare’s? I now believe that if I had asked an even simpler question—such as, What do you mean by mass, or acceleration, which is the scientific equivalent of saying, Can you read?—not more than one in ten of the highly educated would have felt that I was speaking the same language. So the great edifice of modern physics goes up, and the majority of the cleverest people in the western world have about as much insight into it as their neolithic ancestors would have had.

In 2003, Lawrence Summers, then president of Harvard, echoed Snow’s concerns and advocated a return to scientific literacy for all at the undergraduate level. Former Princeton president Shirley Tilghman, herself a scientist, argued in 2010 that discussions of public policy are impoverished because of the basic ignorance of science that pervades American society today. Nonscientists need to understand science, she contends, and scientists are best off with a strong background in other subjects as well. And yet little has changed on this front in recent years.

The most interesting and ambitious effort to reform liberal education for the twenty-first century is not taking place in America. In fact, it is taking place about as far away from the United States as one can possibly get—Singapore. In 2011, Yale University joined with the National University of Singapore to establish a new liberal arts school in Asia called Yale-NUS College, and in the fall of 2013, it welcomed its first class of 157 students from twenty-six countries. When I was a trustee at Yale, I enthusiastically supported this venture.

The project—though not without risks—has the potential to create a beachhead for broad-based liberal education in a part of the world that, while rising to the center stage globally, remains relentlessly focused on skills-based instruction.

Scholars from both universities have used the venture as an opportunity to reexamine the concept of liberal education in an increasingly connected and globalized world. The curriculum of Yale-NUS reflects that thinking, in some parts drawing on the best of the old tradition, in some parts refining it, and in some parts creating a whole new set of ideas about teaching the young. In April 2013, a committee of this new enterprise set forth the ideas that will define the college. It is an extraordinary document and, if implemented well, could serve as a model for the liberal arts college of the future.

The Yale-NUS report is radical and innovative. First, the school calls itself a college of liberal arts and sciences, to restore science to its fundamental place in an undergraduate’s education. It abolishes departments, seeing them as silos that inhibit cross-fertilization, interdisciplinary work, and synergy. It embraces a core curriculum, which takes up most of the first two years of study but is very different from the Columbia-Chicago model. The focus of the Yale-NUS core is to expose students to a variety of modes of thinking. In one module they are to learn how experimental scientists conduct research; in another, how statistics informs social science and public policy. There is a strong emphasis throughout on exposing students to scientific methods rather than scientific facts so that—whatever their ultimate major—they are aware of the way in which science works.

The Yale-NUS core does include courses on the great books, but it does not treat them as simply a canon to be checked off on a cultural literacy list. The books selected are viewed as interesting examples of a genre, chosen not because they are part of a “required” body of knowledge but because they benefit from careful analysis. The emphasis again is on the method of inquiry. Students learn how to read, unpack, and then write about a great work of literature or philosophy or art. The curriculum requires students to take on projects outside the classroom, in the belief that a “work” component teaches valuable lessons that learning from a book cannot. This part has a powerful practical appeal. I once asked Jeff Bewkes, the CEO of Time Warner, what skill was most useful in business that wasn’t taught in college or graduate schools. He immediately replied, “Teamwork. You have to know how to work with people and get others to want to work with you. It’s probably the crucial skill, and yet education is mostly about solo performances.”

The greatest innovation in the Yale-NUS curriculum comes directly from the nature of the association between the two universities and their home cultures. Students study not only Plato and Aristotle but also, in the same course, Confucius and the Buddha—and ask why their systems of ethics might be similar or different. They study the Odyssey and the Ramayana. They examine the “primitivisms” of Paul Gauguin and Pablo Picasso while also looking at the woodcarvings from the South Sea Islands and the ukiyo-e tradition of Japanese woodblock prints that influenced Western artists. And, of course, as they study modern history, politics, and economics, they will naturally find themselves taking a more comparative approach to the topics than any college in the United States or Asia would likely do by itself. Multiculturalism in education is usually a cliché that indicates little of substance, or involves Western critiques of the West (like those of the writer Frantz Fanon or the historian Howard Zinn). The Yale-NUS curriculum is built to provide a genuine multicultural education in a college designed for the emerging multicultural world. In studying other societies, students learn much more about their own. It is only by having some point of comparison that one can understand the distinctive qualities of Western or Chinese or Indian culture.

Yale-NUS is in its very early days. It may not be able to implement all its ideas. It does not solve all the problems of a liberal education. The tensions between freedom of inquiry and the still-closed political system in Singapore might undermine the project. But the educators involved have conceived of the college’s mission and mandate brilliantly, and have pointed the way to a revived, rigorous liberal education that recovers the importance of science, places teaching at its heart, combines a core with open exploration, and reflects the direction the world is headed, in which knowledge of new countries and cultures is an essential component of any education. Yale-NUS should become a model studied around the world.

But what if a liberal education done well still doesn’t get you a job? In 1852, Cardinal Newman wrote that a student of liberal education “apprehends the great outlines of knowledge” for its own sake rather than to acquire skills to practice a trade or do a job. Even then, he noted, there were skeptics who raised questions of practicality. As we have seen, such questions have surrounded the idea of liberal education since the days of Isocrates, and they persist today. Newman tells us that his critics would ask him, “To what then does it lead? where does it end? what does it do? How does it profit?” Or as a former president of Yale, the late A. Bartlett Giamatti, put it in one of his beautiful lectures, “What is the earthly use of all this kind of education?” So, what is the earthly use of a liberal education?

* Bruce Kimball, Orators and Philosophers: A History of the Idea of Liberal Education (New York: Teachers College Press, 1986), is especially enlightening on ancient and early education, and I draw on it, among other sources, for the paragraphs dealing with that period.
† I draw on Delbanco’s excellent book College: What It Was, Is, and Should Be (Princeton, NJ: Princeton University Press, 2012), among others, for the early years of American higher education.

3

Learning to Think

WHEN YOU HEAR someone extol the benefits of a liberal education, you will probably hear him or her say that “it teaches you how to think.” I’m sure that’s true. But for me, the central virtue of a liberal education is that it teaches you how to write, and writing makes you think. Whatever you do in life, the ability to write clearly, cleanly, and reasonably quickly will prove to be an invaluable skill.

In my freshman year of college, I took an English composition course. My teacher, an elderly Englishman with a sharp wit and an even sharper red pencil, was a tough grader. He would return my essays with dozens of comments written in the margins, each one highlighting something that was vague or confusing or poorly articulated. I realized that, coming from India, I was pretty good at taking tests and regurgitating things I had memorized; I was not so good at expressing my own ideas. By the time I got to college, I had taken many, many exams but written almost no papers. That was not unusual even at a good high school in Asia in the 1970s, and it’s still true in many places there today.

Over the course of that semester, I found myself starting to make the connection between my thoughts and words. It was hard. Being forced to write clearly means, first, you have to think clearly. I began to recognize that the two processes are inextricably intertwined. In what is probably an apocryphal story, when the columnist Walter Lippmann was once asked his views on a particular topic, he is said to have replied, “I don’t know what I think on that one. I haven’t written about it yet.”

In modern philosophy, there is a great debate as to which comes first—thought or language. Do we think abstractly and then put those ideas into words, or do we think in words that then create a scaffolding of thought? I can speak only from my own experience. When I begin to write, I realize that my “thoughts” are usually a jumble of half-formed ideas strung together, with gaping holes between them. It is the act of writing that forces me to sort them out. Writing the first draft of a column or an essay is an expression of self-knowledge—learning just what I think about a topic, whether there is a logical sequence to my ideas, and whether the conclusion flows from the facts at hand. No matter who you are—a politician, a businessperson, a lawyer, a historian, or a novelist—writing forces you to make choices and brings clarity and order to your ideas.

If you think this has no earthly use, ask Jeff Bezos, the founder of Amazon. Bezos insists that his senior executives write memos, often as long as six printed pages, and begins senior-management meetings with a period of quiet time, sometimes as long as thirty minutes, while everyone reads the “narratives” to themselves and makes notes on them. If proposing a new product or strategy, the memo must take the form of a press release, using simple, jargon-free language so that a layperson can understand it. In an interview with Fortune’s Adam Lashinsky, Bezos said, “Full sentences are harder to write. They have verbs. The paragraphs have topic sentences. There is no way to write a six-page, narratively structured memo and not have clear thinking.”

Norman Augustine, reflecting on his years as the CEO of Lockheed Martin, recalled that “the firm I led at the end of my formal business career employed some one hundred eighty thousand people, mostly college graduates, of whom over eighty thousand were engineers or scientists. I have concluded that one of the stronger correlations with advancement through the management ranks was the ability of an individual to express clearly his or her thoughts in writing.”

The second great advantage of a liberal education is that it teaches you how to speak. The Yale-NUS report states that the college wants to make “articulate communication” central to its intellectual experience. That involves writing, of course, but also the ability to give compelling verbal explanations of, say, scientific experiments or to deliver presentations before small and large groups. At the deepest level, articulate communication helps you to speak your mind. This doesn’t mean spouting anything and everything you’re thinking at any given moment. It means learning to understand your own mind, to filter out under-developed ideas, and then to express to the outside world your thoughts, arranged in some logical order.

Another difference that struck me between school in India and college in the United States was that talking was an important component of my grade. My professors were going to judge me on my ability to think through the subject matter and to present my analysis and conclusions—out loud. The seminar, a form of teaching and learning at the heart of liberal education, helps you to read, analyze, and dissect. Above all, it helps you to express yourself. And this emphasis on “articulate communication” is reinforced in the many extracurricular activities that surround every liberal arts college—theater, debate, political unions, student government, protest groups. In order to be successful in life, you often have to gain your peers’ attention and convince them of your cause, sometimes in a five-minute elevator pitch.

The study and practice of speech actually figured far more prominently in the early centuries of liberal education. Rhetoric was among the most important subjects taught—often the most important. It was intimately connected not only with philosophy but also with governance and action. In the centuries before print, oral communication was at the center of public and professional life. The eighteenth- and nineteenth-century college curricula in Britain and the United States maintained that emphasis on oratory.
