The Letter of Recommendation: Professional Judgment Under Siege

As a veteran professional with presumed expertise in education and social science, I am often asked for advice. The requests run the gamut from where to study, to what to study, to how to succeed in one or another competitive arena. I do my best to be helpful—which often includes the admission that I don’t know enough to offer help.

Among the areas for which my professional judgment is most often sought is the letter of recommendation. I am asked to write a variety of letters. These range from recommending a young person for admission to a secondary school or college to recommending a senior colleague for a prize or a month-long residency at a picturesque conference site. In the former case, I’m buoyed by the knowledge that there are many good places where the candidate can study. In the latter case, there is often already a lot of public knowledge about the candidate, and so my support is probably symbolic rather than substantive.

The most challenging letters are those requested by young scholars who are applying for full-time tenure-track teaching jobs. (Sometimes I have been the chief doctoral adviser for the scholar; at other times, I am one of her teachers or on her dissertation committee.) These jobs are highly competitive, with dozens or even hundreds of qualified candidates for each coveted position. Not infrequently, I (as well as other colleagues) will be asked to write letters for more than one candidate for the same job!

I am always suspicious of claims that “things used to be easier” or “more straightforward” in the past—and in reading C. P. Snow’s novels about academe in England, I learned that intrigue has always hovered over coveted appointments. But things certainly used to be different.

In the first half of the 20th century, nearly all appointments at selective institutions (in the United States, United Kingdom, and other countries) came about through personal recommendations—in writing, in person, or by phone. The operating principle was the “old boys’ network”—and literally boys, since almost no “girls” were part of the network. When there was an opening at an institution, or when a senior scholar had a promising student, relevant “old boys” would get in touch with one another and have a presumably frank discussion of strengths and weaknesses. (Having read some correspondence from that era, I have been impressed by how candid the letters were—critiques were at least as prominent as raves.) In that sense, one can say that these recommendations were truthful.

But it’s equally important to point out that scholars had their favorites—and having a candidate who carried on your work, agreed with your view of the field, or was personally helpful to you undoubtedly put a finger on the scale of a positive recommendation.

The “old boys’ network” needed to be exploded, and in the last several decades, it clearly has been. To begin with, while sexism has hardly disappeared, the range and variety of candidates is much greater, with women and minorities at least in the pool even when there has not been special encouragement for their candidacy. All jobs must be publicly advertised. Further, in many places there are “sunshine” rules, such that either letters are made public or—more typically—the letter writer is warned that confidentiality cannot be guaranteed.

Efforts have been undertaken to make such letters more objective. One ploy is to ask the letter writer to compare the candidate to other candidates in her cohort—either others who are explicitly named or ones whom the writer himself is asked to nominate. Another ploy, common for admission to a highly competitive program, is to ask the letter writer to rate the candidate in terms of her percentile rank with respect to properties like originality, expression in writing, oral expression, etc. An example: “In oral expression, as compared to other candidates, is this candidate in the upper 1%, the upper 5%, the upper 10%, etc.?”

A complicating factor—especially salient in the United States—is what I’d term “letter inflation.” We are all familiar with grade inflation—the tendency over the decades to give students ever higher grades (in many institutions of higher learning, the Gentleman C has been promoted to the Gentleman A-minus). With respect to letters, I’ve observed the same trend in the United States—letters often compete with one another for superlatives. Indeed, of the many letter writers whom I know personally or “on paper,” only one of them is relatively candid about the flaws in a candidate.

So, in the light of all of these obstacles, what is left, if anything, of professional judgment? Faced with other letters that are likely to be laden with superlatives, as well as the prospect of public exposure of critical remarks (not to mention the possibility of a lawsuit filed by an unsuccessful job candidate!), are there any principles to which a letter writer should adhere in order to convey his or her professional judgment in a reliable way?

Here is what I would recommend:

1. When asked by a job candidate for a letter of recommendation, be prepared to say “no” and to give reasons that are candid, though not, of course, gratuitously nasty. I often explain that I don’t know the candidate well enough to be helpful or that I have already agreed to write for someone else or that I don’t think that the candidate is appropriate for the job. Better to be tough at the beginning than to find yourself in a quagmire.

2. Refuse to do rank orderings or checklists. Here’s the standard boilerplate that I use: “As a matter of personal policy, I do not complete ratings questionnaires as a portion of recommendations.” Why this refusal? One almost never sees checklists that are not completely skewed to the positive—so much so that checking off “Top 10%,” rather than “Top 1%,” can be the kiss of death.

3. Be purely descriptive whenever possible. For example, when it comes to a discussion of the candidate’s research, put it in your own words and be explicit about its contribution as well as its limitations.

4. State in a positive way the candidate’s strong features—letter readers will be interested in how you see her strengths.

5. If possible, touch on the candidate’s less strong features—or indicate areas where you don’t feel competent to comment (for example, if you know the candidate’s research but not her teaching, it is fine to state that).

If, for whatever reason, you cannot be explicit about a candidate’s weaknesses, be silent. Leave it to the readers of the letter to make inferences about what is not discussed. To avoid unintentionally harming the candidate, I always have a last line that reads, “Please let me know if I can provide any further information.” If there is indeed a follow-up, you are free to say, “I am not comfortable commenting on that issue.” Don’t lie!

As you can probably tell, this state of affairs does not please me. I’d much rather be completely candid and have others be equally candid with me. (In that sense, despite its obvious flaws, I have sympathy for the normative behavior of earlier times.) But that is not the world in which we live, and it is wrong to treat a job candidate in a way that unfairly jeopardizes her chance for a livelihood. And so I come to the reluctant conclusion that, at least in the United States, letters of recommendation are not a site where one can expect candid professional judgment.

Since these issues will not go away, and they likely affect all who read this piece, I’d be eager to hear others’ ideas about the professional judgment involved in letters of recommendation and how to exercise it in a responsible way. Feel free to write your own recommendations below!

Colleagues from the Netherlands Visit The Good Project

By Daniel Mucinskas


In October 2016, The Good Project welcomed five colleagues from the Netherlands affiliated with The Professional Honor Foundation (PHF), a Dutch organization that explores professional identity and behavior across many sectors. PHF is currently celebrating its 10th anniversary. In honor of this milestone, members of The Good Project and PHF teams convened in Cambridge to share updates, connections, and reflections.

During the visit, The Good Project’s researchers had the opportunity to learn more about the work of these scholars and activists, including successes and ongoing challenges.

Co-founded by Thijs Jansen and Alexandrien van der Burgt, PHF formed following the publication of Professional Pain (2005), a book which detailed the frustrations of many Dutch professionals with pervasive forces of deprofessionalization, bureaucratization, and lack of autonomy. In 2009, a second book, Professional Pride, furthered these messages and emphasized “pride” as a central value for all professionals to embrace.

Over the past several years, PHF has drawn on The Good Project’s conception of “good work” for a series of focus groups with professionals in different sectors, among them judges, physicians, educational officials, and accountants. PHF and associated scholars have also released additional books supporting their vision of a “Good Work society.”

In our sessions together, we focused on three professional domains where PHF has been able to exert influence.

Medicine: In recent years, Dutch healthcare has inched ever closer to the system in the United States, with high costs and insurers/private companies holding increasing power. Those opposed to these trends have had some success in countering them (for example, the government’s plan to abolish the free market for patients to choose their doctors was scrapped); but medical practitioners in the Netherlands are subject to new levels of policy oversight that distance them from patients. In The Alternative for Healthcare (2015), Jansen and his associates argue that the quality of care and a relationship of trust between patient and practitioner are the foundations of the medical profession. Two-thirds of Dutch general practitioners have successfully campaigned for measures that would reduce red tape, limiting the influence of healthcare insurance companies and forced competition between GPs.

Accountancy: Following the 2008 financial collapse, public confidence in accounting in the Netherlands plummeted; surveys revealed that 85% of people had no confidence in auditors. A crisis of identity for the profession resulted, with many accountants questioning how to ensure integrity and quality in their work. Margreeth Kloppenburg has been at the forefront of work encouraging Dutch accountants to be more ethical, accountable, and aware. As a result of the report “In the Public Interest” about the purposes of the accounting profession, 53 new policy measures were passed by the Dutch government. These measures include the allowance for external governors in accounting firms, penalization for individual misconduct, and the adoption of a professional oath and mandatory ethics courses. Kloppenburg is currently working on a curriculum to help accounting students tackle difficult ethical dilemmas on the job; she has launched a website called The Accountables, where accounting students reflect on vexing professional issues and share ideas and insights. While reforms in the Dutch accounting system could act as a model of “good work” practices, puzzles remain. The best methods of inculcating these ideas have yet to be determined, and students complain they are being asked to over-reflect before they have even entered the profession. By the end of 2017, PHF will publish a book on the professional honor of accountants, written by accountants and other interested parties, as a force of change for the greater good.

Teaching: Education in the Netherlands is increasingly hierarchical, with the government dictating policy down to districts, to administrators, and finally to individual teachers. In the 2013 co-authored book Het Alternatief (English title: The Alternative), Jelmer Evers, a candidate for the Global Teacher Prize, details a reversal of this unprofessional top-down power structure. According to the scheme that he has developed, education should focus more on open dialogue for teacher collaboration/association, a reduction in burdensome instructional periods, and the creation of national teacher academies. These recommendations have received attention from the Dutch Minister of Education and Parliament. As part of a global professional movement, Evers then co-authored an international follow-up book called Flip the System: Changing Education from the Ground Up (2015). Based on this work, together with Education International, Evers is organizing a worldwide educator network called TENGlobal, which seeks to increase teacher agency through greater trust, a sense of purpose and pride, and collaborative effort and support.

Overall, PHF has pushed matters of professional identity into the national consciousness of the Netherlands, presenting a counter-narrative to the marketization and systemic weakening of individual choice apparent across the professional landscape. As Gabriel van den Brink, professor emeritus at Tilburg University, put it, a new paradigm in professionalism will temper the prevalence of capitalist commercialism with more relational and creative arrangements.

Truth and Goodness: Taking a Page from Ronald Reagan!

In a previous blog, I lamented the emergence of a “post-truth,” “false news” public space—one where there is essentially no belief in truth, nor even in the possibility of establishing it. Given my interest in ethical behavior, I wondered whether it is possible to offer visions of “the good” when there is no longer a belief in the search for—indeed, even the possibility of ever establishing—truth.

I rejected two options: 1) surrender to postmodern skepticism about the possibility of rendering judgments of truth; and 2) clinging to the Olympian view that truth may ultimately be established but is not a viable goal for ordinary mortals in ordinary time.

While searching for a plausible alternative in real time, I suddenly remembered the words uttered by President Ronald Reagan as he laid out his stance toward the then still formidable Soviet Union. Reflecting on the possibility of mutually reducing or even eliminating nuclear weapons, the 40th president said, “Trust, but verify.”

(There are various wordings and translations of this phrase, which may date back to classical times—for my purposes, it’s the two key terms that are instructive.)

Turning first to verification, of course anyone can make any kind of assertion at any time. Those who encounter the assertion need to determine on what basis it has been made. And so, if, for example, the Soviet Union (or the United States) claimed to have reduced its stockpile of weapons, there needed to be surveillance methods whereby the accuracy of the claim could be ascertained.

The scholarly disciplines and forms of technical expertise that humans have developed over the centuries have embedded in them ways, methods, and algorithms on the basis of which claims can be judged. Sometimes, of course, the methods of verification are controversial, as is the realm of their proper application. Yet, within, say, economics or psychology or astronomy or civil engineering or neurosurgery, certain methods are widely accepted; only a cynic or an ignoramus would ignore or bypass them completely. Why re-invent the disciplinary wheel?

Experts frequently disagree when the evidence is inconclusive, and then these experts are challenged to indicate conditions under which claims might be supported with greater confidence.

Each of us is better off if we can judge claims and methods ourselves or in discussion with other knowledgeable peers. But that state of affairs demands that we have achieved significant expertise; and life is far too short to allow any individual to attain expertise in more than a few, usually quite closely-related areas. No more polymaths in the tradition of Leonardo da Vinci!

Enter the second arrow in the Reagan quiver: that of trust. Only a fool trusts all claims blindly; only a skeptic does not trust anyone under any conditions.

And so the challenge for all of us is to determine who(m) to trust, and under what circumstances. In my own case, there are certain publications and certain websites that I have come to trust because they are disinterested in the best sense of that word. Rather than seeking evidence to support a position to which they are already committed, these publications carry out fresh investigations, are careful in their reporting, and—importantly—are quick to point out errors and to correct course. In cases of doubt, I’ll turn to The New York Times, National Public Radio, The Economist, and their respective websites (not, of course, to their opinion pages and columns).

Depending on the area of expertise, there are also certain individuals whose judgments, opinions, and conclusions I have come to trust. (Out of respect for their privacy, I am not going to name them, but they know who they are!) What these individuals will claim or conclude with respect to a particular case cannot be anticipated; rather, these knowledgeable individuals weigh each case on its merits, come to the best conclusion that they can, and freely admit when the case remains unclear or indeterminate. And, as in the case of the publications to which I have just made reference, these trustworthy individuals do not hesitate to indicate when an earlier conclusion or claim was off base.

With respect to trust, there is one potential source about which I am particularly skeptical: one’s own intuitions. Intuitions are sometimes well-founded; but when it comes to issues of import, especially as they affect others, evidence, argument, and consideration of counterclaims need to be given pride of place. I recall an old saw: “No one ever went so wrong as the person who relied primarily on his own judgment.” (If this makes you think of a current political figure, you and I are thinking along similar lines.)

Bottom Line: If we are to continue to believe in the possibility of ascertaining what is true, we have two primary allies: 1) the methods of verification of the several fields of knowledge and practice; and 2) the existence of persons, publications, and institutions whose track record merits trust. It’s best if we can continue to draw on both of these allies, with the relative importance of each depending on the particular issue and its ramifications.

So while Ronald Reagan was contemplating reductions in the arsenal of nuclear weapons, his pithy phrase helps us to think about the validity of the various claims that we encounter—claims that are essential to consider if there is to be any progress in judging and achieving the good.

The Good: Can We Have It in the Absence of Truth?

For over half a century, I’ve been obsessed with the nature of truth, beauty, and goodness. I see them as central in education and, indeed, in life—I would not want to live in a world where human beings could not distinguish truth from falsity; did not value beauty; and did not seek what is good and desist from what is bad.

In the last quarter century, I have argued that a principal reason—perhaps the principal reason—for education is to help young people understand (and act upon) this trio of virtues. These are the themes of my books The Disciplined Mind and its update in Truth, Beauty and Goodness Reframed. This past term, I taught a course on the topic—I jokingly dubbed it “Truth Beauty and Goodness Reframed Reframed.” And in an ongoing study of education, I speak about the space between LIteracies (the goal of the first years of school) and the LIvelihoods (the attainment of reasonable employment toward the end of adolescence) as the LIberal Arts and Sciences—the study, appreciation, and realization of these three virtues.

But any thought that I had cracked the secret of the virtues has been exploded during the past year by the political events in the United States. Voters in America had the choice between one presidential candidate who approached issues of truth with the hair-splitting logic of a lawyer; and another candidate who baldly lied and then lied about his lies. As if to conduct the final funeral of truth, we have an electorate, many of whom do not seem to care about rampant lying; and the creation of a new category—fake or false news: news which is simply made up for propaganda purposes and is then circulated as if it had been carefully researched and validated.

How does this newly emerging state-of-affairs relate to the virtues? Until 2016, I had assumed that truth was a widely accepted goal—we might even say a widely accepted good—even though, of course, it is not always achieved. And so we could turn our attention to what I consider the heartland of goodness: the relations that obtain among human beings, those to whom we are close as well as those with whom we have only a distant, transactional relationship.

But I have had to come face-to-face with an uncomfortable, if not untenable situation: if we don’t agree about what is true, and if we don’t even care about what is true, then how can we even turn our attention to what is good, let alone care about what is good, and what is not? (In thinking about this issue, I’ve been aided by the excellent discussions with my students at the Harvard Graduate School of Education.)

So here’s my current thinking:

Option #1. A Post Post-Modern View: If we throw out the possibility of ascertaining truth, or even caring about truth, then goodness must be scuttled. If P and Not P are equally valid (or equally invalid), there is no possibility of making an ethical or moral judgment. All are good, all are bad, flip a coin.

Option #2. An Olympian View of Goodness: For the sake of argument, let’s concede that we ordinary humans are not able during our lifetimes to make judgments of what is true and what is not true and hence are stymied in our evaluation of “the good.” There might still be judgments of goodness which are based on some absolute standard: standards of justice (that exist in some document, be it a constitution or the Bible); standards of the good (that are made by God or by the gods); standards of posterity (that are made by historians many years hence); or standards of philosophers (what Plato or Kant or Rawls might deem to be good).

I certainly favor Option #2 over Option #1. But I propose another way of thinking of this issue.

If there is any view of good that can be put forth as universal, or close to universal, it is that one should not kill innocent people (the Sixth Commandment—Thou shalt not kill; the Golden Rule: Do unto others…). So let us stipulate that principle as a “Given Good.” In making a judgment about the relations among human beings, we can therefore conclude that one who kills one or more innocent persons is a bad person and/or has committed a bad act. (By extension, one could then say that individuals who save innocent persons or who penalize killers of innocent persons are good persons.)

Following this line of argument, we need now to determine the truth of the matter: whether a killing took place, who carried out the killing and why, what is the status of the person who was killed, and what, if anything, should be done with the identified killer.

Allegation: John killed Joe.

In what I have termed “neighborly morality,” these questions can usually be answered without too much difficulty. People who live in a neighborhood know one another, they see what is going on and why, and nowadays they can record (and replay) happenings instantly on various recording devices. If Joe’s murder is observed by other individuals, and/or recorded for posterity, then only a crazy person will deny that it has happened.

Of course, determining the motive of the killer and the status of the killed can be more challenging. But again, in a neighborhood, individuals will generally be well-known by those whom they see each day, and the planned or accidental nature of the killing will be apparent, as well as the behavior of the killer in the aftermath of the deed.

And so, in brief: within the neighborhood, establishing what happened, what is true, is relatively straightforward, and judgments of good/bad can be validly made, except by extreme post-modernists or by those who are crazy.

But now let’s consider killing that occurs outside the neighborhood, often of a large number of persons, and often by agents whose motivation and activities are far more difficult to ascertain.

Allegation: Serb leader Radovan Karadzic killed thousands of innocent Bosnians and Croats.

Allegation: Syrian leader Bashar Al Assad is killing thousands of innocent Syrians.

Allegation: Russian leader Vladimir Putin poisoned several of his political opponents.

In these latter cases, the norms of neighborly morality do not apply. The alleged killers are not known personally by most of the victims and observers. Nor do the alleged killers directly carry out the killings—the lines of authority, and the details of the killing, are much more difficult to ascertain. Indeed, in the absence of such personal culpability and of documentation of the circumstances of murder, the killings can almost seem like crimes that did not happen or perpetrator-free crimes: As Josef Stalin cynically quipped, “A single death is a tragedy; a million deaths is a statistic.”

In the second decade of the twenty-first century, such heinous crimes do not always go unpunished. Using the precedent of the Nuremberg Trials in post-World War II Europe, we now have an International Criminal Court. And at least occasionally, a leader like Karadzic can be held accountable for mass deaths—in his case, he was found guilty of genocide, war crimes, and crimes against humanity. But for this result to occur, one needs a massive amount of evidence, the power to arrest and extradite, and the decision of a court that proceeds according to international law. No wonder that more distant forms of killing typically go unpunished.

Even in the case of the conviction of Karadzic, consensus about the crime and punishment can remain elusive. The charge of genocide is very difficult to sustain; indeed, over a century after the killing of one million Armenians, Turkish leaders refuse to discuss or even use the term genocide. Militant Serbs believe that they are in a justifiable struggle to vindicate their own history and sustain their own culture, a struggle dating back to the battle of Kosovo in 1389! Paradoxically, for many Serbs, the actions of the late 20th century were a retaliation against neighbors whom they have loathed over the centuries.

So if truth is so difficult to establish, where is the dry land? Once we leave the neighborhood, on what bases can we render judgments of what is good and what is not, especially when cases are less clear-cut than the Syrian or the Serbian cases?

I find two sources of hope:

  1. Understanding the means, the methods, and the evidence on which assertions are made. If one is dealing with contemporary or historical political events, one needs to know how to make sense of journalism, eyewitness reports, historical documents, and other putative sources of evidence. This approach applies equally well to science, medicine, art, and indeed any way of marshalling and evaluating evidence.

  2. Identifying individuals and sources who are trustworthy. Even the most polymathic among us cannot be expected to evaluate all arguments and evidence by ourselves. And so it is especially important to identify those persons (known personally or known through the media) and those sources of information that we find to be regularly accurate and reliable. This does not mean that such persons or sources are always right. None can pass that test! Rather, it means that when they are wrong, they acknowledge it. It also means that their judgments are not always predictable; rather, they evaluate each case on its merits.

In my next blog, I’ll turn my attention to the ethics of roles. I’ll pursue how, on the basis of these two promising sources, we can establish—or, perhaps, more precisely RE-establish—a firmer link between truth and goodness.

The Three Virtues and the Presidential Election

Ever since I went to Wyoming Seminary secondary school, I have been obsessed by what I call the three virtues: Truth, Beauty, and Goodness (known at “Sem” as Verum, Pulchrum, Bonum, the school’s motto). I have been convinced that they are important and should be cherished; and that my own life should be guided by the pursuit of these values.

Reflecting that pursuit, in the late 1990s, I wrote a book called The Disciplined Mind, in which I developed the idea that education should be focused on developing understanding and valuing of truth, beauty and goodness. Though sincere, I was also naïve: my children, my friends, and my critics (sympathetic or otherwise) pointed out that these three virtues are hardly transparent or self-evident. People differ vociferously on how to define them, or even whether the terms are legitimate descriptors. As the years went on, the emergence and proliferation of digital media—the internet, the web, social media—further complicated the identification and the pursuit of this venerable trio. Among the disruptive features were the ease of disseminating fake information, unethical exploitation of social media, and challenges to traditional criteria of beauty.

And so, some years ago I decided to update the principal argument of The Disciplined Mind in a new book called Truth Beauty and Goodness Reframed. The subtitle of the hardback edition was “Education for the Virtues in the 21st Century.” The subtitle turned out to be too subtle! In the paperback version, I substituted the somewhat jazzier “Education for the Virtues in the Era of Truthiness and Twitter.” The book sought to pin down which aspects of the virtues could be sustained in our time, and which had to be rethought, modestly or fundamentally.

I continue to be haunted by these terms and their implications. A few years ago, I gave a series of public lectures at New York’s Museum of Modern Art; I’ve given university-wide lectures at Harvard; and this semester I am teaching a course on the topic. Informally, I have described the course as “Between the Literacies and the Livelihoods.” Almost all agree that the first job of school is to help students master the basic literacies—reading, writing, arithmetic, and perhaps now coding. And certainly within the contemporary United States, most have come to believe—rightly or wrongly—that the principal purpose of higher education is to secure a job. But what should come in between?

In an effective educational system, LIteracies are attained by the ages of 9-10; and securing a LIvelihood is the challenge of late adolescence. That leaves close to a decade to pursue the LIberal arts and sciences. In my view, the liberal arts and sciences are the time-honored avenue by means of which we approach the trio of truth, beauty, and goodness; the liberal arts teach us how to stretch our minds, evaluate evidence, confront new perspectives, and express ourselves clearly and thoughtfully.

Enter the 2016 election. At least among journalists, scholars, pollsters, and pundits (not to mention my own family and friends), the process and the outcome of the election were, to put it mildly, unexpected! Nor has it been easy or straightforward to understand the way in which a large part of the electorate approached the campaign and the vote. Since I think about truth, beauty, and goodness every day—often every hour—I’d like to share how the virtues have fared over the months.

Truth

The biggest surprise to me has been how little importance large portions of the electorate have placed on whether statements made by candidates (or their surrogates) are in accord with the facts. Time and again Donald Trump said things that could easily be demonstrated to be false—the so-called “pants on fire” metric. This did not seem to faze his supporters at all.

While Hillary Clinton’s tally for truth-telling was somewhat better, she was widely (and, I believe, correctly) perceived as being legalistically truthful but otherwise suspect. And so when Trump called her “Lying Hillary,” the label seemed more apt than, say, a characterization of “Lying Barack” or “Lying Michelle.” Truthiness triumphed over truthfulness—and whether truthfulness will count more in future political campaigns is anyone’s guess. As some have suggested, we may live in a “post-truth” society.

Beauty

While my definition of truth—the accuracy of statements—is conventional, my definition of beauty is less orthodox. I apply the descriptor “beautiful” to experiences. An experience counts as “beautiful” to the extent that it fulfills three criteria: 1) it captures one’s interest; 2) its form is memorable; and 3) the experiencer would like to repeat the experience or one like it.

Politicians blessed with charisma are likely to create beautiful experiences. Ronald Reagan, Bill Clinton, and Barack Obama created beautiful experiences for their audiences. Hillary Clinton does poorly on these criteria—no fault of hers, as beautiful experiences are very difficult to contrive, particularly in the public arena. Personally, I found it painful to watch and listen to either candidate for any length of time, but it was clear to me that many Americans found it interesting and even memorable to behold Trump and happily revisited the experience. Without doubt, his long immersion in reality television—aided by the American public’s love affair with that genre—contributed to his ability to create what for many were beautiful experiences.

Goodness

I apply the descriptor “good” to the relations that obtain among human beings. But I distinguish between neighborly morality and the ethics of roles. Neighborly morality describes how we strive to relate to family, friends, and neighbors. The Ten Commandments and the Golden Rule tell us how to behave towards these people. But the Ten Commandments cannot tell us how to resolve an ethical dilemma at work, how to petition, how to vote, how to click our support, or when to mobilize or become a whistleblower. For these situations, we need what I call the ethics of roles. Instead of the Ten Commandments, we might turn to the Hippocratic Oath, or the U.S. Constitution, to determine how to behave in our roles as citizens and workers.

I will leave it to others to draw conclusions about the extent to which Hillary Clinton and Donald Trump are actually good family members, good neighbors, good workers, or good citizens. Certainly it is possible to be good in one sphere and to fall short in another. But I will share my thinking about the recent campaign.

Each of the candidates sought to convey goodness to the public. Their platforms and policy recommendations had little impact—much of the public seemed little interested in how either candidate would function as president (worker) and whether either would in fact pursue the public interest (citizen).

Instead of judging the candidates by the ethics of roles, the public weighed them in terms of neighborly morality: whether a candidate was perceived as “my friend,” as feeling “my pain,” as speaking directly to me, as making me feel better or more optimistic.

While Clinton came out ahead in the popular vote, the margin was narrow. She was successful with many women and minorities, with those in urban regions, and with those with more education; Trump was successful with many men, with whites (who thought of themselves as “real” Americans), with those in rural regions, and with those with less education.

And Trump was successful with those voters because he proved a better reader than Clinton of which “goods” were wanted. He spoke to their economic insecurity, promising to bring back jobs. No matter that he did not explain how. What mattered was that Trump made people feel good; the majority of those in the Rust Belt felt they’d rather have a beer with Donald Trump (though he does not drink) than bourbon with Hillary Clinton.

Of course, getting elected is one thing. It remains to be determined in what ways, if any, the Trump presidency will be true, beautiful, or good.

A version of this posting also appeared via CMRubinWorld and The Huffington Post.