The Professional Ethicist

The Three Virtues and the Presidential Election

Ever since I went to Wyoming Seminary secondary school, I have been obsessed with what I call the three virtues: Truth, Beauty, and Goodness (known at “Sem” as Verum, Pulchrum, Bonum, the school’s motto). I have been convinced that they are important and should be cherished, and that my own life should be guided by the pursuit of these values.

Reflecting that pursuit, in the late 1990s, I wrote a book called The Disciplined Mind, in which I developed the idea that education should focus on developing an understanding and valuing of truth, beauty, and goodness. Though sincere, I was also naïve: my children, my friends, and my critics (sympathetic or otherwise) pointed out that these three virtues are hardly transparent or self-evident. People differ vociferously on how to define them, or even on whether the terms are legitimate descriptors. As the years went on, the emergence and proliferation of digital media—the internet, the web, social media—further complicated the identification and the pursuit of this venerable trio. Among the disruptive features were the ease of disseminating fake information, the unethical exploitation of social media, and challenges to traditional criteria of beauty.

And so, some years ago, I decided to update the principal argument of The Disciplined Mind in a new book called Truth, Beauty, and Goodness Reframed. The subtitle of the hardback edition was “Education for the Virtues in the 21st Century.” The subtitle turned out to be too subtle! In the paperback version, I substituted the somewhat jazzier “Education for the Virtues in the Era of Truthiness and Twitter.” The book sought to pin down which aspects of the virtues could be sustained in our time, and which had to be rethought, modestly or fundamentally.

I continue to be haunted by these terms and their implications. A few years ago, I gave a series of public lectures at New York’s Museum of Modern Art; I’ve given university-wide lectures at Harvard; and this semester I am teaching a course on the topic. Informally, I have described the course as “Between the Literacies and the Livelihoods.” Almost all agree that the first job of school is to help students master the basic literacies—reading, writing, arithmetic, and perhaps now coding. And certainly within the contemporary United States, most have come to believe—rightly or wrongly—that the principal purpose of higher education is to secure a job. But what should come in between?

In an effective educational system, LIteracies are attained by ages 9 or 10, and securing a LIvelihood is the challenge of late adolescence. That leaves close to a decade to pursue the LIberal arts and sciences. In my view, the liberal arts and sciences are the time-honored avenue by which we approach the trio of truth, beauty, and goodness; they teach us how to stretch our minds, evaluate evidence, confront new perspectives, and express ourselves clearly and thoughtfully.

Enter the 2016 election. At least among journalists, scholars, pollsters, and pundits (not to mention my own family and friends), the process and the outcome of the election were, to put it mildly, unexpected! Nor has it been easy or straightforward to understand the way in which a large part of the electorate approached the campaign and the vote. Since I think about truth, beauty, and goodness every day—often every hour—I’d like to share how the virtues have fared over the months.

Truth

The biggest surprise to me has been how little importance large portions of the electorate have placed on whether statements made by candidates (or their surrogates) are in accord with the facts. Time and again Donald Trump said things that could easily be demonstrated to be false—the so-called “pants on fire” metric. This did not seem to faze his supporters at all.

While Hillary Clinton’s tally for truth-telling was somewhat better, she was widely (and, I believe, correctly) perceived as being legalistically truthful but otherwise suspect. And so when Trump called her “Lying Hillary,” the label seemed more apt than, say, a characterization of “Lying Barack” or “Lying Michelle.” Truthiness triumphed over truthfulness—and whether truthfulness will count more in future political campaigns is anyone’s guess. As some have suggested, we may live in a “post-truth” society.

Beauty

While my definition of truth—the accuracy of statements—is conventional, my definition of beauty is less orthodox. I apply the descriptor “beautiful” to experiences. An experience counts as “beautiful” to the extent that it fulfills three criteria: 1) it captures one’s interest; 2) its form is memorable; and 3) the experiencer would like to repeat the experience or one like it.

Politicians blessed with charisma are likely to create beautiful experiences. Ronald Reagan, Bill Clinton, and Barack Obama created beautiful experiences for their audiences. Hillary Clinton does poorly on these criteria—no fault of hers, as beautiful experiences are very difficult to contrive, particularly in the public arena. Personally, I found it painful to watch and listen to either candidate for any length of time, but it was clear to me that many Americans found it interesting and even memorable to behold Trump and happily revisited the experience. Without doubt, his long immersion in reality television—aided by the American public’s love affair with that genre—contributed to his ability to create what for many were beautiful experiences.

Goodness

I apply the descriptor “good” to the relations that obtain among human beings. But I distinguish between neighborly morality and the ethics of roles. Neighborly morality describes how we strive to relate to family, friends, and neighbors. The Ten Commandments and the Golden Rule tell us how to behave towards these people. But the Ten Commandments cannot tell us how to resolve an ethical dilemma at work, how to petition, how to vote, how to click our support, or when to mobilize or become a whistleblower. For these situations, we need what I call the ethics of roles. Instead of the Ten Commandments, we might turn to the Hippocratic Oath, or the U.S. Constitution, to determine how to behave in our roles as citizens and workers.

I will leave it to others to draw conclusions about the extent to which Hillary Clinton and Donald Trump are actually good family members, good neighbors, good workers, or good citizens. Certainly it is possible to be good in one sphere and to fall short in another. But I will share my thinking about the recent campaign.

Each of the candidates sought to convey goodness to the public. Their platforms and policy recommendations had little impact—much of the public seemed little interested in how either candidate would function as president (worker) and whether either would in fact pursue the public interest (citizen).

Instead of judging the candidates using the ethics of roles, the public weighed them in terms of neighborly morality. They judged the candidates in terms of whether they were perceived as “my friend,” feeling “my pain,” speaking directly to me, making me feel better, or making me feel more optimistic.

While Clinton came out ahead in the popular vote, it was very close. She was successful with many women and minorities, those in urban regions, and those with more education; Trump was successful with many men, with whites (who thought of themselves as “real” Americans), those in rural regions, and those with less education.

And Trump was successful with those voters because he proved a better reader than Clinton of which “goods” were wanted. He spoke to their economic insecurity, promising to bring back jobs. No matter that he did not explain how. What mattered was that Trump made people feel good; the majority of those in the Rust Belt felt they’d rather have a beer with Donald Trump (though he does not drink) than bourbon with Hillary Clinton.

Of course, getting elected is one thing. It remains to be determined in what ways, if any, the Trump presidency will be true, beautiful, or good.

A version of this posting also appeared via CMRubinWorld and The Huffington Post.

The Personal in the Professional

Among today’s professions, medicine is generally considered to be the gold standard. Consistent with the profession’s high status, the Hippocratic oath taken by new physicians is widely regarded as a model for how professionals should behave.

Inspired by the Greek physician Hippocrates, the oath is almost two and a half millennia old. It’s difficult for us to imagine what medicine was like in the days before modern science and technology—a time without antibiotics, sophisticated surgical tools, or coherent theories of the causes of disease, the variety of treatments, and the range of possible cures. And so, not surprisingly, doctors have recently sought to update the oath—for example, considering the social and financial conditions of the patient, taking into account the patient’s knowledge as well as the doctor’s areas of ignorance, and acknowledging the importance of the psychological states of both patients and their families.

But even as the prototypical profession has attempted to incorporate facets of new knowledge and perspectives, another powerful force is at work: the advent of technologies which promise to accomplish the traditional tasks of the physician more competently than does the flesh-and-blood practitioner. Either already in use or on the horizon are surgeries performed by robots, diagnoses executed by computer programs, radiological data interpreted by apps, and numerous other computational innovations whose performances are indistinguishable from or even superior to those of well-trained doctors, armed with their Hippocratic oaths.

Nor is this substitution of the program for the person restricted to the practice of medicine. To be sure, not all professions are as well-delineated as medicine, but as detailed in The Future of the Professions by Richard and Daniel Susskind, and in other contemporary writings, many of the tasks traditionally carried out by lawyers, architects, accountants, and their respective paraprofessionals can now be done more quickly, cheaply, and correctly by machines than by human beings. In the profession that I know best—that of educator—there are well-financed efforts to make both instruction and evaluation “teacher-proof.”

Other than expressing nostalgia for the “good old days” or voicing relief that we have transcended “the bad old days,” how should we feel about these trends—trends that there is every reason to expect will continue? One has to be rigid, even doctrinaire, not to be pleased when services can be provided to individuals who previously were not served at all (for example, physical or psychological care for a person who is unable to care for himself) or can be done more swiftly, more accurately, and in a seemingly more disinterested way (for example, when one’s taxes are computed by a well-tuned program).

Yet before celebrating the replacement of the professional person by the practical program, it’s important to take into account two factors. First of all, even the well-planned and well-executed program has been created by human beings. Accordingly, such programs have built into them assumptions and biases which must be recognized and may need to be reformulated—again, by human beings (at least for now).

Rather than eliminating the trained professional altogether, a shift to “computational programs” brings to the fore a new kind of professional: the computer programmer, the individual who maintains servers, those who write, rewrite, and enforce the rules of the internet and the web. We might say that these professional services are only as “good” as the professionals who create and regulate them, and new technologies always bring about new complexities and unintended consequences. Perhaps these new professionals need to develop and promulgate their own versions of a profession-wide code or oath in order to deal with the multitude of moral and ethical quandaries that are bound to arise.

Second, while it is preferable to have some service rather than no service, the absence of a knowledgeable and caring “live” professional has its costs. A trained professional does not simply provide the requested service; she is expected to know the client as an individual, to take into account the larger concerns of the person (even those not directly linked to the requested service), to make broader connections and recommendations, and to exhibit solidarity as one human being responding to another. To capture the recent changes in the Hippocratic oath in a familiar phrase, we should expect that a good professional can “feel our pain”—though of course the professional cannot allow that pain to interfere with the needed service.

Professionals are neither fully replaceable nor completely irreplaceable. Within and across professions, there will be tasks that can and should be carried out in the absence of human beings, as well as those that benefit significantly from a human touch. Also, it may well be the case that some professions are far more dependent on the human interface than are others. For example, most of us care more about personal contact with a social worker than personal contact with an architect or an auditor. By the same token, it is more important for a pre-schooler to spend daily “face time” with a teacher than for a college student to have personal interaction with a teacher of statistics.

Returning to the opening example of the medical profession, it seems clear that every profession needs to rethink its version of the Hippocratic oath and to do so periodically. Such rethinking requires a recognition (and, in most cases, a reaffirmation) of its original set of values—the services that it most urgently and uniquely has performed over the years. In addition, professions must consider the strengths and limitations of the available and foreseeable technologies and participate in meaningful dialogue with those individuals and institutions who have been served by the profession, in order to determine those situations and conditions where the human factor is crucial, as well as those where it is less important, unnecessary, or even counterproductive. The result may well be reconfigured professions and a different professional landscape. I doubt that it will mean the disappearance or the devaluation of the professions altogether.

The Making of a Good Journalist

I am often asked to help design educational programs. Usually the requests come from individuals who have found the theory of multiple intelligences to be appealing and want to apply it with one or another population (e.g. how to assess the intelligences of middle school children and create an appropriate curriculum). Less frequently, though still often, the request reflects an interest in building on the work that my colleagues and I have carried out under the aegis of The Good Project (e.g. how to encourage cooperation among high school students). Only rarely have those who have approached me sought ways to combine MI thinking and approaches with insights and suggestions from The Good Project. And so when a request arrived to do just that—from a school of journalism in Argentina—it got me thinking.

My first thoughts centered on what intellectual capacities are important for a journalist-in-training. Newspaper, magazine, and media journalists deal with words—and so having a well-honed linguistic intelligence is crucial. Indeed, when a television personality simply reads the news stories off a teleprompter, I consider that action to be “reading” and “performing”—not journalism. Journalists typically have to form relations with individuals—those whom they are covering, those whom they are interviewing—and so interpersonal intelligence is also vital. Nowadays, much of journalism also entails mastery of visual media, as in hosting and curating websites, and this new requirement draws on spatial intelligence.

Depending on what topics you cover, other intelligences come into play. And so, for example, if you are a business or finance journalist, your logical-mathematical intelligence needs to be honed. If you cover athletics or dance, bodily-kinesthetic intelligence is at a premium.

Of course, no journalist can be expected to have all of these intelligences developed to a high degree. This is where collaboration or a division of labor becomes important. But one can also enhance an intelligence by exercising it. Journalists who are assigned to a new beat may find that initially they lack the relevant intelligence, but there is no reason why they can’t develop it. If I can use a personal example, my naturalist intelligence is quite underdeveloped. But if I were assigned to cover food and restaurants, I presumably could strengthen that form of intelligence… hopefully not gaining too much weight in the process!

So far I have been talking about the skills that one needs to be a journalist—particularly a journalist who can turn out effective “copy” on various topics. But that is not equivalent to being a good journalist. In our conceptualization at The Good Project, the good journalist must be a professional. She must understand and seek to embody the principal values of journalism: carrying out thorough research, making sure that all sides of a story are covered, sticking closely to the facts, and, perhaps most importantly, not tilting the story in a certain direction because of one’s own personal preferences or biases. The phrase that I favor is being a “disinterested professional,” in the sense of not letting one’s own personal interests color the way that one covers a story.

Becoming not just a journalist but a good journalist is not easy. It certainly helps to have good values from home or from one’s early education—be it religious or secular. But the crucial time for developing the professional stance is during one’s training—at school, on internships, and on the first job. Crucial here are role models who embody the professional stance, who can put it into words, and who demonstrate how they deal with complex ethical situations, admit when they have erred, and strive to do better the next time.

It has never been easy to become a good professional or a good journalist. Indeed, until the 20th century, the idea of a professional journalist had not been much developed. There was perhaps a “golden age” of journalism after the Second World War—particularly in Western Europe and in North America. But since the advent of digital media, the 24/7 news cycle, the desire for ever-bigger financial profits, and the public’s lust for gossip and disinclination to pay for thorough, objective coverage, it has become even more difficult to carry out good journalism.

And yet, if professional journalism were to disappear from our world, that would be a tragedy. We depend on journalists to remove the bravura surrounding persons and institutions and to help us understand what is actually going on. In the absence of such dedicated professionals, we are left with rumor, innuendo, and indeed complete lies, or “truthiness”—naïve belief in what one has heard before, without any sense of whether it has actually been verified. Ultimately, some groups would have to reinvent journalism in some form. But it makes much more sense to preserve its essential components, while adapting it to present conditions.

And in that mission, MI theory can be very helpful. Recognition of our multiple intelligences not only provides many more roles in journalism; it also opens the possibility that individuals can acquire news from many different media and modalities. And since we live in a global, hyper-connected world, these avenues of communication hold the potential of delivering news of high quality to the population of the world.

The Making of a Professional: Where and When to Start

Most of us have a sense of which professionals are impressive—the astute and reliable physician; the fair-minded journalist who will only publish a story after thorough research; the empathetic professor who keeps up with the latest findings in her field. But where do these individuals come from? How are they formed?

While at one time certain adult roles may have been passed down or inherited, we no longer think of queens or knights or jesters as professionals. However, one can discern analogies: several generations of doctors or lawyers in a family; families deeply involved in teaching; children growing up in a doctor or teacher family observing the behaviors, habits, and skills of the older generation and in many cases emulating the role models. Less frequently, older professionals may spot promising young persons and encourage a mentoring relationship. This can happen when an experienced teacher spots an adolescent who shows interest and ability in teaching peers, or when a high school debater comes to the attention of a veteran lawyer. Early signs of professionalism (we might call it “pre-professionalism”) are found in a youngster who understands the positive missions of their school, who joins in those missions, and who eventually takes a leading (if not leadership) role in the achievement of those missions.

This abstract characterization of “pre-professionalism” cries out for concrete specifics. And indeed, at every school, such specifics reliably abound. What happens when a new child comes to school and is unfamiliar with the norms? How is an incident of cheating handled? How does the school community react when there is a tragedy in the family of one of the students, or, indeed, when harsh weather or a fire damages a home, a business, or a church in the neighborhood? Or even an institution that is not much admired? And when there are epochal political events in the larger society, are these discussed in school and are reasonable (or even unreasonable) actions considered and carried out?

Whatever the prompt, some students will instinctively behave constructively and, alas, some students will be bystanders or even troublemakers. Here the role of the adults—both those in school and those parents and neighbors who are part of the broader school family—is salient. As is often quipped with respect to parents, “Children don’t listen to what parents say, but they invariably notice what their parents do.” And equally important—sometimes, more important—is what the older children, the admired role models, do (or don’t do).

In this context, negative role models must also be considered. Sometimes, particularly if they are attractive in one way or another, such seductive role models can encourage immature or even damaging emulations. However, as we learned in our own study of excellent professionals, young persons also gain lessons from those whom they do not admire. Frequently, these professionals told us of anti-mentors, tormentors, and negative role models who taught them exactly what they did not want to do and how they did not want to be.

Recently, it has become fashionable to say, “One should never waste a crisis.” Perhaps there’s an ounce of truth in this expression, but it would be foolish—in fact “stupid”—for educators to wait until an actual crisis erupts. Here is where simulations, games, dramatic enactments, and role play can be timely. From the time they enter school, young people should be aware of problems and conundrums that may arise, the range of possible responses, and which responses are likely to be effective, which not, and why. Such rehearsals, if carried out with seriousness, can be of considerable help in the formation of good young people.

And of course, the explicit curriculum is as important as the hidden curriculum. Whether it is in humanistic subjects, like history or literature, or in more scientifically oriented subjects, like biology or physics, students should be alerted to the ethical dilemmas that can arise, how they have been dealt with (or swept under the rug), and what might or should have been done.

But these class exercises gain efficacy when they are reflected in the behaviors and attitudes of teachers and administrators every day, and they lose all power and credibility if the lessons from “upstanders” in history or literature are undermined by the decisions and actions of leaders in one’s midst.

There is a long distance from the milieu of middle and high school to the atmosphere of our hospitals, law offices, and university campuses—and thus plenty of time either to further develop one’s professional muscles or, less happily, to see them atrophy. And the values learned at home (and in religious institutions) are certainly formative. But I would wager that in many cases, when we admire a professional, we are looking at behaviors and thought patterns that were launched in school, well before youngsters even knew the word “professional.”

Ethical Quandaries: What To Do? What Not To Do?

For professionals (and, no doubt, many others as well), ethical quandaries arise all the time. For example, as a teacher and adviser of students who have diverse strengths and needs, I have to decide how much time to devote to each of my students. As a researcher, I have to decide how to credit members of a team, each of whom depends on authorship for professional advancement. The key questions with respect to such dilemmas: 1) Is one aware of the issues? 2) How does one think about them? 3) How does one decide what to do and then act appropriately?

As it happens, recently, over the course of 48 hours, three ethical issues arose in my work life, broadly construed. Perhaps one day I can write about the details but, alas, not at present. (Ethical quandaries typically involve specific people and situations, and it is unethical to write about them—unless you are a skilled writer of fiction.) What I can do is reflect on the way in which I thought about and then handled each challenge. It turns out that the three issues differed on a major dimension: the degree of freedom I had to discuss them with others.

a. In Case 1, the information involved was completely privileged. If I spoke to anyone about the situation, I would risk doing something very destructive, since there was no way to hide the identities of those involved. I ruminated at length about the issue, but the rumination took place entirely in my own mind. What kept the exercise from being completely solipsistic is that I was able to imagine what others might say. I could react to their imaginary comments, anticipate their reactions, and draw comparisons with other similar issues under analogous circumstances.

b. In Case 2, the issue, while sensitive, was one about which I could speak to others. The problem was that only individuals familiar with the personalities involved, and with how they have interacted over time, could provide useful suggestions. And so I was effectively restricted to the input of a very few people (though, as in Case 1, I could have imaginary dialogues with anyone whose voice, so to speak, I could conjure up).

c. Case 3 was the easiest one to deal with in terms of advice from others. It did not involve information that was confidential, nor was it highly technical or context-specific. I was able to consult with a wide range of individuals, including those who knew the issues and personalities as well as those to whom I had to explain the situation from scratch. I did end up giving greater weight to the advice of individuals within my professional circle.

It’s been wisely said that the quickest way to a bad decision is to think about it in isolation and not to solicit a range of opinions. Thus, the most important thing not to do when confronted with a quandary is to assume that you have all the answers or, equally bad, the right answer. Having time to ruminate is also advisable; sometimes what appears to be a complex or intractable situation resolves itself or at least becomes clearer. Whenever possible, however, it is prudent to speak to as many persons as practical and to weigh their input carefully. The crucial variable, then, is to whom to speak, and how much knowledge they should ideally have.

I just used the word “wisely”—and perhaps the situation I’m describing provides some insight into the nature of wisdom. Once, when I faced an ethical dilemma which was highly sensitive, I thought of my acquaintance (let’s call him Solomon) who knew the most about the general issue without knowing any of the players. We had such a useful conversation, and Solomon’s framing and advice were so “on point,” that I felt no need to speak to anyone else. Of course, if he had not been available, I might well have sought advice elsewhere or conducted an extensive imaginary conversation with Solomon before making a decision and acting upon it.

Of what does Solomon’s wisdom consist? On one point I am quite clear: he has encountered many comparable situations over the years. And whether or not he remembers the details of each, he remembers the configuration and the possible solutions well enough so that he can arrive readily at apt advice.

In the end, the buck stops with the person faced with the ethical dilemma—in this case, with me. I might or might not make the wisest decision. Contributing to the wisdom are the number and types of conversations that I can have with others, either directly or in my imagination, and how I weigh and synthesize what I’ve heard and learned.

Whatever decision is made, I should add it to my file of vexing cases; refer to it when possible; be pleased, though not smug, when the decision turns out to have been well-advised. And in the frequent cases where the decision was less than desirable, I should figure out what can be learned from this failure and how to do better the next time Fate hands me a vexing ethical dilemma.