For as long as I can remember, I have looked up to individuals who are called “professionals”—doctors, lawyers, and architects, to name a few. I aspired to be a professional, I became a professional, and I continue to hope that young persons of promise whom I encounter will become professionals. But in recent years, I’ve become increasingly uncertain about whether professions will continue to exist, at least in a form that I can admire or even recognize. There are many views about why professions are on the wane—indeed, to put it sharply, whether the professions are being murdered or whether, to maintain the metaphor, they have been committing suicide. I’ll deliver my interim judicial verdict at the end of this essay. But since my own life so closely parallels the ups and downs of the professions in the last several decades, I begin in an autobiographical vein.
Me and the Professions
My parents, Hilde and Rudolph Gaertner, escaped from Nazi Germany in the nick of time—arriving in New York Harbor on Kristallnacht, November 9-10, 1938—the infamous Night of Broken Glass. Many of their relatives and friends were not so fortunate. My parents were not themselves professionals—my father had been a businessman, and my mother’s desire to be a kindergarten teacher had been thwarted by the rise of the Nazis. With neither professions nor funds, they soon found themselves living in very modest circumstances in Scranton, Pennsylvania, where I was born five years later.
Parental aspirations for my younger sibling, Marion, and for me were high. As I was the proverbial “bright Jewish boy who hated the sight of blood,” almost everyone who thought about it (including me) assumed that I would become a lawyer. I went to Harvard College, where indeed I took a course in the law with constitutional scholar Paul Freund, who encouraged me to go to law school; but not one to leave any stone unturned, I also took some pre-med courses and, in the summer before my senior year, arranged an interview with the Dean of Admissions at Stanford Medical School. Only after graduating from college and spending a year abroad in London on a fellowship did I make what was for me a daring decision: to abandon the prototypically aspirational professions and instead to pursue graduate studies in developmental psychology. For fifteen years after receiving my doctorate, following a path which in retrospect was quite risky, I pursued full-time research on ‘soft’ (research grant) money. In 1986, I was fortunate to receive a professorship at the Harvard Graduate School of Education. At last, I was a professional—indeed, a professor!—and our family’s ambitions were fulfilled. And so I’ve remained until today.
I’ve allowed myself this autobiographical indulgence because it closely tracks the course of the professions over the last several decades. When I was growing up, at the very time when post-secondary education was exploding in the United States, becoming a professional (which did include becoming a professor) was a very high aspiration. It was not, however, equally open to everyone. For historical and cultural reasons, most professionals were white, Anglo-Saxon males, principally from privileged backgrounds. If you were an immigrant or the child of immigrants, Jewish, female, of impoverished circumstances, and/or non-white, your career choices were restricted. Supreme Court Justices Thurgood Marshall, Clarence Thomas, Sandra Day O’Connor, and Ruth Bader Ginsburg merit admiration because of the overwhelming initial odds that they overcame! Fortunately, by the time that I was choosing a career, these barriers were receding, more quickly for the first listed groups than for the latter cohorts.
While I was in college, Daedalus, the publication of the American Academy of Arts and Sciences, published an issue devoted to the professions. As the editor Kenneth Lynn phrased it, “Everywhere in American life, the professions are triumphant.” And indeed, my close friends and I almost all chose to enter one or the other of the professions. When, recently, at our fiftieth college reunion, we jointly reflected on our choices of career, 10 of 12 were doctors, lawyers, or professors—and the 11th had actually completed law school but had not practiced law (the 12th had become a movie director). If we had polled the same individuals, and others among our classmates, about the careers that our children (and grandchildren) were pursuing, a far lower percentage would have responded with the traditional professions. Instead we would have heard repeated references to Hollywood, Silicon Valley, Wall Street, and phrases like ‘start-ups,’ venture capital, angel investors, hedge funds, Google, Facebook, and Amazon (the latter started, we should note, by young persons who had attended, respectively, Stanford, Harvard, and Princeton). Few writing today, whether in Daedalus or The Huffington Post, would describe the traditional professions as aspirational, with respect to the options considered by the most sought-after graduates of elite institutions. And if young graduates did elect to pursue the traditional professions, they would likely work at one or more of the aforementioned corporations.
Professions: The Long Arc
As one considers the decline or even the possible demise of the professions, it’s relevant to consider their origins. In one sense, in the West, the professions date back to classical times. Then, as now, the prototypical profession was the physician. In the oath that is attributed to Hippocrates and still invoked today, those who chose to become physicians were expected to administer medical attention promptly to whoever needed it, without favor to any particular person or group, avoid covert payments, keep confidences, and, famously, ‘do no harm.’ While perhaps not officially enshrined, certainly the roles of legal counselor, engineer, architect, military leader, religious leader, and teacher were also delineated and continued in one or another form during the succeeding centuries.
Also forerunners of contemporary professions were the guilds and trade associations of the medieval and post-medieval eras. These were self-organized groups—some connected to the Church, others more or less secular—that carried out technical and artistic crafts. On the positive side, masters of guilds selected promising apprentices (sometimes, but not always, their blood relations); the masters trained these novices assiduously in the requisite skills until the journeymen were capable of executing pieces of mastery, or ‘master pieces.’ On a less positive note, the guilds were typically secretive and exclusionary; they did not hesitate to invoke high-stakes rewards and punishments to control the work (as well as other parts of daily life) of their members.
In much of Europe and the United States, movements to professionalize certain high status occupations began in the 19th century and came to fruition in the early years of the 20th century. In the United States, owing to the impetus provided by the Progressive movement, occupations like medicine, law, and university teaching first mandated a broad liberal arts education and then delineated specific requirements for certification, at which time the successful candidates received honorific titles (Doctor, Esquire, and Professor, respectively). No longer could an aspiring member of these fields read up on his own and hang out a shingle bearing a treasured title (no more “Abraham Lincoln, Lawyer”). Moreover, at least in principle, those professionals who abused the tenets or the ethics of the profession could lose their license and be barred from practice. As noted, in the post-World War II era, access to these professions gradually expanded, culminating in the high point recorded in Kenneth Lynn’s celebratory remarks.
And what exactly did it mean for an occupation to achieve the status of a profession? According to the experts, among them the sociologists contributing to the Lynn-edited issue of Daedalus, a profession consists of individuals who have undergone a standard form of training, culminating in some kind of recognized title and degree. In return for the status accorded the professional, the individual is expected to provide services to individuals and institutions in need, to draw on his/her technical knowledge, and to perform this task in a respectful, disinterested, and professional manner. Other attributes typically cited in definitions of the professions are the abilities to make complex technical and ethical decisions under conditions of uncertainty, to cherish and protect the key institutions and values of the profession, and to nurture, train, and certify younger aspirants, while always keeping the public interest in mind.
I would like to add one feature to the usual list of professional attributes. It is useful to think of membership in a profession as a role to be enacted. In the course of becoming a professional, a person is learning to fill a certain role in society. That role entails the various behaviors and capacities mentioned above. In speaking of taking on a role, I mean neither to glorify nor to critique the budding professional. Rather, when on the job, the complete professional exhibits an entire repertoire of behaviors and attitudes which others come to expect. The consummate professional may be a terrible parent or spouse; she may never vote and may espouse repugnant political or social attitudes. But when carrying out her role, she embodies the best features of that profession.
At their heights and at their best, the professions seemed worthy of approbation. While being aware of the temptation to romanticize the past, I feel comfortable in claiming that the majority of professionals embodied—or at least sought to embody—the traits cited above. No doubt the popularity of Marcus Welby, M.D., or Atticus Finch, Esq., involved some idealization, but their real-life counterparts in ordinary cities and towns sought to be and behave like professionals. A key feature was the effort to be disinterested: to carry out the core values of the profession in ways that avoided serving special interests and took into account the broader needs of the society. A 2012 obituary of John G. Brooks in The Boston Globe captures well the aspirations of many professionals living decades ago:
A quarter century after graduating from Harvard College, Mr. Brooks writes that he decided to go to law school “in the happy belief that being a lawyer would provide a desirable combination of a modest livelihood and freedom, intellectual and otherwise, in contrast to the shackles of employment in ‘business.’”
At the time when broader access to the professions was being enabled, other trends were also discernible. One entailed an effort to expand professional status to groups that had hitherto been marginalized. Journalists, bankers, and auditors/accountants were among the aspirants. Many observers have singled out the so-called ‘minor’ or ‘feminized’ professions—school teaching, nursing, and social work. With the passage of time, almost no sector of the workforce has been excluded. There are efforts to create educational tracks and certification for business managers, artists, and those engaged in public policy, as well as individuals clearly involved in craft-like enterprises like beauticians or gardeners or dental hygienists. While labor unions—to some extent, the successors to the guilds of earlier times—have been in decline (especially in the United States), unions for various kinds of educators and public sector workers have sprung up.
These trends were for the most part incremental; had nothing further happened, the occupational landscape of the middle of the twentieth century might well have endured to this day. It has taken two larger events—one economic, the other technological—to shake the professions to their foundations and raise the question of whether they have a future.
1980: The Triumph of the Market
It’s often been observed that, in both the United States and Great Britain, a major political and economic change occurred around 1980. Ending a long string of centrist governments, Conservative politician Margaret Thatcher became Prime Minister of the United Kingdom; she guided a regime that cut back radically on nationalized institutions and policies while deliberately seeking to release the engines of market capitalism. In the United States, Republican politician Ronald Reagan similarly critiqued the expansive role of the federal government, instead endorsing individual initiative and the mobilization of entrepreneurial energies. Less widely recognized is that, in 1979, after decades of alternation between more and less canonical varieties of communism, Deng Xiaoping became the supreme leader of post-Mao China. As a youth, Deng had spent some time in France; unlike most of the leaders of the Maoist revolution, he had witnessed a market society firsthand. Endorsing the levers and legends of market capitalism, Deng famously declared, “I don’t care if a cat is black or white, so long as it catches mice.”
And so, in a trend that would have been unthinkable a decade or two before, three of the leading powers in the world had strongly endorsed the hegemony of market capitalism. (Of course, a decade later, the Berlin Wall fell. And, admittedly only for a brief period, it looked as if there were only one form of political economy in the world—that of market capitalism, more or less unbridled.)
Given the topic of this essay, one can properly ask why the form of economy should necessarily affect the status of professions. Indeed, so long as the government does not interfere with professional practice, one should be able to have professions worthy of the name across a variety of political and economic regimes. But the celebration of markets—and one obvious measuring instrument, the accumulation of capital—gradually but inexorably has exerted two major impacts on the professions.
First, the skills of those trained as professionals were sought by institutions that could make use of those services. Large enterprises—primarily for-profit corporations but also medical centers and institutions of higher learning—wanted to attract professionals, and especially those exhibiting notable or specialized gifts. As businesses—literally or de facto—these institutions were motivated by the desire to secure the most favorable bottom line. Their leaders were willing to pay high salaries to relevant professionals, while the professionals, in turn, were expected to serve the goals of the hiring organizations—greater profit in the case of business, expanded services and higher prestige (not to mention increased endowments) in the case of non-profits. And so, as an example, General Electric has over one thousand lawyers in its employ, and their salaries, garlanded with bonuses and stock options, are significantly higher than those typically received by individual practitioners or those who belong to small partnerships.
Second (and not unrelated) was the growing belief that one’s success, indeed one’s credibility, as a professional was closely and perhaps inextricably tied to the size of remuneration—the powerful acronymic troika of ROI (“return on investment”). In earlier times, professionals had hardly been self-sacrificing; but they had generally been content to lead a reasonable middle-class life style. Recall the obituary for John G. Brooks. Leisure time was as likely to be devoted to community service as to extra billable hours. But in a society moving toward a ‘winner take all’ mentality, many professionals—and especially younger professionals—came to value their total salary more than other indices of accomplishment, such as service to the community, mentoring of junior associates, time devoted to friends and family, or sheer leisure time activities. And indeed, many in their ranks could not understand why anyone would or should apply any calculus other than net worth.
Values once central to many practitioners of the law were far less evident. Admission to the Bar in Michigan involves an oath which includes the following statements:
I will not counsel or maintain any suit which shall appear to me to be unjust; nor any defense except such as I believe to be honestly debatable under the law of the land.
I will employ for the purposes of maintaining the causes confided to me such means only as are consistent with truth and honor and will never seek to mislead the judge or jury by any artifice or false statement of fact or law.
It’s hard to argue that these considerations currently loom large on the radar screens of lawyers in Detroit—or Denver or Dayton.
The Digital Revolution
Unlike the shift in the political winds that occurred around 1980, no single year can be identified as the beginning of the major technological revolution of our time—the digital revolution. With a nod to Charles Babbage and Lady Ada Lovelace a century earlier, future historians may single out the rise of powerful mainframe computers in the 1940s and 1950s; the launching of the Internet (by ARPA) in the 1960s; the advent of the laptop in the 1980s and ever smaller devices in the 2000s; the creation of the world wide web in the 1990s; and the launching of ever more popular social media in the last two decades, Facebook and Twitter being among the current favorites. And then there are the innumerable enabled events such as online shopping, bargaining, blogging, bullying, and even, on occasion, launching or impacting a political or social revolution.
Again, in principle, nothing in the digital revolution itself disables professional practice. One expects professionals to make use of—indeed, welcome—new technologies, without, however, letting those technologies undermine or unduly distort the practice itself. And similarly, those traditionally served by professionals—be they students, suppliers to corporations, or those requiring medical or legal services—might still continue to make use of their expertise on a regular basis.
In practice, however, when it comes to the lives of professionals, the digital technologies have been at least as disruptive as the market mentality. Traditionally, professionals have been able to present themselves as highly and in fact uniquely knowledgeable individuals. Such experts not only have detailed technical knowledge; they are also able to provide services and solutions that take into account the full complexity of the client’s needs and to do so in a highly personalized fashion. The notion that these services could be done in the absence of other humans—even highly trained humans—once seemed the stuff of science fiction.
But over the last two or three decades, this consensus has disintegrated. Due to increasingly powerful services and precise search engines, potential clients can readily be in touch with peers around the globe. These seekers can approach the table manned by a professional as well informed as, and sometimes more knowledgeable and more up-to-date than, the professional herself. At the least, client and professional are coming to look at one another as equals. In a parallel set of developments, the kinds of broad services traditionally provided by professionals have been broken down into far finer sets of skills and services. Instead of having trained journalists fashion a comprehensive account suitable for all readers, online reporting systems can customize various linguistic and illustrated versions—often drawing on already posted sources—for readers of different interests and tastes and link the story to other content, including advertisements of interest. Instead of having a trained lawyer draw up a trust that is appropriate for a client, an online system poses a set of questions to a client and then produces a finished document. Such customized services are increasingly carried out not by humans billing by the hour but rather by apps that are quick, efficient, reliable, and more often than not, as good as (if not better than) professionals and at a tiny fraction of the cost.
At present, as far as I know, few digital creations purport to be completely substitutable for a well-trained human professional. And indeed, the more complex the situation, the more likely that clients are willing to pay for the expert who is likely to solve the problems with which they are faced—as I expressed it earlier, ‘approach complex technical and ethical problems under conditions of uncertainty.’ In the so-called ‘star’ or ‘winner-take-all’ system, top flight doctors, lawyers, and professors manage both to bring home large salaries and maintain their professional credibility. But those judged to have less talent and who amass less imposing incomes are not in demand, and often the tasks they used to execute are carried out by paraprofessionals. While senior partners at law firms like Skadden Arps or Simpson Thacher bring home millions if not tens of millions, in 2013 the average lawyer in the United States earned about $130,000 (which of course means that many earn far less). These ‘ordinary’ lawyers win neither big bucks nor the esteem of their fellow citizens.
We do not know when, or how, or even if computer applications will be able to replace a super-lawyer like David Boies or a physician like legendary heart surgeon Michael DeBakey. But we certainly know that many of the functions that they once carried out are readily downloaded today; that trend will continue and expand; additional ‘hands-on, face-to-face, personalized expertise’ may well turn out to be supererogatory or even illusory. Moreover, the information readily available through scans of ‘big data’ will result in scorecards that will evaluate, or at least purport to evaluate, the ‘batting averages’ of Boies or DeBakey, as compared to the scores achieved by a legal algorithm or a medical robot. Just as lab tests of canvases or inks may surpass artistic connoisseurs in discriminating a fake from an original, just as IBM’s Deep Blue was able to defeat chess champion Garry Kasparov, and IBM’s Watson surpassed winning contestant Ken Jennings in the quiz show Jeopardy!, so, too, a digital Boies or a robotic DeBakey may outflank the performances of the original master in whose image these contemporary Golems were fashioned.
Introducing The Susskinds
Surveying this state of affairs, in their new book The Future of the Professions, are the British father and son team of Richard Susskind and Daniel Susskind. Richard has long been a leading thinker about and critic of the legal profession, and his son Daniel teaches economics at Oxford. These authorities firmly believe that what has happened—and what is going to happen—to the practice of law will happen as well in the other major professions. They outline major technological innovations and training regimens for paraprofessionals, which in combination are likely to satisfy the bulk of the needs of ordinary clients as well as or better than do the ordinary run of solicitors and barristers. In their considered view, contemporary professions are failing economically, technologically, psychologically, morally, and qualitatively, and by virtue of their inscrutability (p. 33). Or, as they put it colloquially, “We cannot afford them, they are often antiquated, the expertise of the best is enjoyed only by a few, and their workings are not transparent. For these and other reasons, we believe today’s professions should and will be replaced by feasible alternatives” (p. 3).
What may surprise most readers—at least, it surprised me—is that not only the younger but also the elder generation welcomes these trends. To be sure, the Susskinds do not go quite so far as George Bernard Shaw, who famously quipped that the professions are “a conspiracy against the laity.” But they celebrate the many freeing and empowering facets of individuals taking greater control of their own lives; and they do not hesitate to critique the professions for their less attractive features—their arrogance, their secrecy, and their elitism. The Susskinds counsel young people not to enter the professions, a dying sector; instead they celebrate cohorts of knowledge engineers, process analysts, designers, system providers, data scientists, and systems engineers (p. 264). If, as they anticipate, the professions are headed for a single burial plot, or for a whole sector of the cemetery, the Susskinds will shed few tears.
As a professional and an observer of professionals, I certainly discern and often welcome the trends described by the Susskinds. In my scholarly work, I benefit from the tremendous power of search engines and the ability to share and edit documents seamlessly. In my teaching, I communicate constantly with students and exchange all kinds of digital materials with them; and in class, I often use digital materials. Even before the advent of social media, I made all of my lectures available on the university’s Intranet. And of course, I can follow both my areas of special interest and what is happening in the world with great ease—and even comment on this material, as in this essay. I would not want to return to a pre-digital Dark Age.
As thoughtful analysts and skilled debaters (characteristics of the top suite of flesh-and-blood lawyers on both sides of the Atlantic), the Susskinds not only submit a strong brief in favor of their predictions but also anticipate possible responses to their account. In a section of their book called “Objections,” they list the principal reasons why others might take issue with their analyses, predictions, and celebratory mood. This list of counter-arguments to their critique includes the trustworthiness of professionals; the moral limits of unregulated markets; the value of craft; the importance of empathy and personal interactions; and the pleasure and pride derived from carrying out what they term ‘good work.’ With respect to each objection, the Susskinds give a crisp response.
I was disappointed with this list of objections, each followed by a refutation. For example, countering the claim that one needs extensive training to become an expert, the Susskinds call for the reinstatement of apprentices, who can learn ‘on the job.’ But from multiple studies in cognitive science, we know that it takes approximately a decade to become an expert in any domain—and presumably that decade includes plenty of field expertise. Apprentices cannot magically replace well-trained experts. In another section, countering the claim that we need to work with human beings whom we can trust, they cite the example of the teaching done online via Khan Academy. But Khan Academy is the brainchild of a very gifted educator who in fact has earned the trust of many students and indeed of many teachers; it remains to be seen whether online learning à la Khan suffices to help individuals—either professionals or their clients—make ‘complex technical and ethical decisions under conditions of uncertainty.’ The Susskinds recognize that the makers and purveyors of apps may have selfish or even illegal goals in mind. But as they state, “We recognize that there are many online resources that promote and enable a wide range of offenses. We do not underestimate their impact or threat, but they stand beyond the reach of this book” (p. 233).
Whether or not one goes along with specific objections and refutations, another feature of the Susskinds’ presentation should give one pause. The future that they limn seems almost entirely an exercise in rational deduction and accordingly devoid of historical and cultural considerations.
Turning to history, in the past century alone, the world has witnessed two World Wars, a Cold War, and a Holocaust as well as other genocides, and we live at a time of strife and conflict in many corners of the world. New problems are sure to arise: the possibilities of nuclear explosions; cataclysmic climate change; unplanned mass immigrations; widespread religious strife; occupational dislocations; unintended consequences of digital technology; and an ever-increasing population that is elderly and disabled. There is every reason to believe that existing and possibly new professions and professionals may well be needed to deal with these situations. The historical lens has been left in the drawer.
As for culture, the professions have arisen largely in the West, and when they have been adopted in parts of the Far East and the Southern hemisphere, they inevitably take on their own coloration and nuance. We cannot determine whether, and if so to what extent, modern economic development was enabled by the status and disinterestedness of the ensemble of professions. We do know that in many parts of the world the Western approach to professions has been admired and emulated. In fact, research that I carried out with colleagues documents that immigrants to the United States may be critical of the professionals whom they personally encounter; but they appreciate that at least in principle American citizens can get a fair trial and an honest report of the news. At the same time, there is hardly universal agreement about how professions should be implemented. Taking journalism as a particularly vivid example, there is little agreement across nations with respect to issues of privacy, anonymity, and criticism of the government. At a time when nationalism is everywhere on the rise, and when religions continue to clash, it is naïve to think that a professional algorithm, conceived and fashioned in Silicon Valley or in Britain’s Cambridge Cluster, will carry out its work with equal efficiency and equal effectiveness around the globe. And if each region has its own ‘apps,’ who or what will mediate among them?
Even in the developed world, with its ever increasing inequities, there is much work to be done, with respect to human goals (e.g. social welfare, medical care) and humane ways of achieving them (e.g. nurturing teachers, responsive civil servants, empathic health workers). Moreover, we should note the rise of ‘counter-cultural’ phenomena. At the very time that digital devices are flexing their muscles, there is the ascent of the ‘maker movement,’ where hands-on arts and crafts are cherished. Who can possibly predict the ultimate balance between human making, mechanical making, and digital making?
None of these considerations means, of course, that it can or should be ‘business as usual’ for the professions. Technologies will alter the professional landscape, and deservedly so. But it is naïve to think that an account arrived at largely through reasoned argument suffices for anticipating the many possible futures of the professional landscape.
A Question of Values
With respect to the professions, the stakes are high. At their best, the professions arose to meet the most basic and deeply desired human needs and expectations. In much of the world and in much of recent history, professionals have been the chosen means—and the chosen role models—for meeting those needs: aspirations for physical health, mental health, justice, safe buildings, equitable financial institutions, mastery of major scholarly disciplines, and nurturance of the ‘better angels’ of the inhabitants of the planet. Again, at their best, those who assume the role of professionals represent the competent and humane ways in which individuals carry out these tasks; they serve as models for at least one kind of person to whom we can aspire and one kind of society in which we would want to live. May we never have a world wherein the phrases ‘she is acting professionally’ and ‘he is a true professional’ become devoid of meaning.
One could say that, in theory, these needs and roles could be satisfied in a completely marketized society—but we have seen that they are not. Indeed, as amply documented in 2001 (following the collapse of Enron) and 2008 (following the collapse of Lehman Brothers), the avowedly marketized society does not even prevent economic disaster; we need to regulate markets firmly and fairly. One could say that, in theory, these needs and roles could be satisfied in a completely digital society. But bits can never yield singular values; like any tool, they can be used to promote harmony or discord, greed or selflessness. There is no such thing as a purely disinterested algorithm—as examinations of the algorithms used and regularly modified by major tech companies dramatically attest. Humans create and rearrange the bits—but then the values of those creative human beings become crucial. And as humans we are simply not smart enough to be able to anticipate just how those bits will be arranged. Social media can engender strong friendships, but they can also engender bullying, perhaps even suicides. These media can build community, but as in the case of the anonymous message delivery system Yik Yak, they can rapidly destroy community. Valid values require constant human construction, surveillance, critique, reflection, rebuilding, and a commons in which we all have a stake and for which we all care, or, to put it more sharply, should care.
A folk tale exists in several versions: an artistic object is so beautiful and captivating that people come from all over to examine it. One day one of the admiring visitors enthuses, “What a magnificent chalice: it’s the most amazing thing that I’ve ever seen.” A cynic grabs a hammer and says, “You think that’s amazing—I’ll show you something even more amazing.” With that, he strikes the object, and it breaks into innumerable pieces, which then scatter to the wind.
It took many years and many people to construct the professional landscape in which many of us grew up and took for granted. The disruptive forces in our society—intentionally or simply by virtue of their existence—have the potential to slay, to murder the professions. Alas, both the greed for money and the belief in the infallibility of the market have contributed to this lamentable possibility—I consider these factors to be suicidal impulses found in too many professionals. It’s high time for those of us who continue to value the professions to reinvigorate and, as necessary, reinvent the professions. We need to acknowledge our complicity in the current undesirable situation, embody the principles and values that have enabled professional practice at its best, and work to ensure that they will be strengthened, not undermined, by the technologies to come, and insofar as possible, in harmony with the ever unpredictable winds of history and culture.
Acknowledgements: I would like to thank my Good Project colleagues and my “Good Work in Education” students for their helpful comments on this piece, as well as the Endeavor Foundation for their steadfast support that has made my work in this area possible.