This book is licensed under a Creative Commons by-nc-sa 3.0 license. See the license for more details, but that basically means you can share this book as long as you credit the author (but see below), don't make money from it, and do make it available to everyone else under the same terms.
This content was accessible as of December 29, 2012, and it was downloaded then by Andy Schmitz in an effort to preserve the availability of this book.
Normally, the author and publisher would be credited here. However, the publisher has asked for the customary Creative Commons attribution to the original publisher, authors, title, and book URI to be removed. Additionally, per the publisher's request, their name has been removed in some passages. More information is available on this project's attribution page.
Though both economic life and ethics are as old as history, business ethics as a formal area of study is relatively new. Delineating the specific place of today’s business ethics involves distinguishing the levels on which questions of right and wrong are raised, separating ethical reasoning from other ways of making decisions, and tracing the field’s brief history.
The back and forth of debates about kickback textbooks occurs on one of the three distinct levels of consideration about right and wrong. Morals occupy the lowest level; they’re the direct rules we ought to follow. Two of the most common moral dictates are don’t lie and don’t steal. Generally, the question to ask about a moral directive is whether it was obeyed. Specifically in the case of university textbooks, the debate about whether customized textbooks are a good idea isn’t a moral debate. That’s not because morality doesn’t involve debates; it’s because morality involves only specific guidelines that should be followed. Morality begins when someone walks into a school bookstore, locates a book needed for a class, strips out the little magnetic tag hidden in the spine, and heads for the exit.
Above morality there’s the broader question about exactly what specific rules should be instituted and followed. Answering this question is ethics, the production of morals. Ethics is the morality factory, the production of guidelines that may later be obeyed or violated. It’s not clear today, for example, whether there should be a moral rule prohibiting kickback textbooks. There are good arguments for the prohibition (universities are betraying their duty to serve students’ interests) and good arguments against (schools are finding innovative sources of revenue that can be put to good use). For that reason, it’s perfectly legitimate for someone like Ann Marie Wagoner to stand up at the University of Alabama and decry the practice as wrong. But she’d be going too far if she accused university administrators of being thieves or immoral. They’re not; they’re on the other side of an ethical conflict, not a moral one.
Above both morality and ethics there are debates about metaethics, the study of the origin and rules of ethics and morality. These are the most abstract and theoretical discussions surrounding right and wrong. The questions asked on this level include the following: Where do ethics come from? Why do we have ethical and moral categories in the first place? To whom do the rules apply? Babies, for example, steal from each other all the time and no one accuses them of being immoral or insufficiently ethical. Why is that? Or putting the same question in the longer terms of human history, at some point somewhere in the past someone must have had a lightbulb turn on in their mind and asked, “Wait, is stealing wrong?” How and why, those interested in metaethics ask, did that happen? Some believe that morality is transcendent in nature—that the rules of right and wrong come from beyond you and me and that our only job is to receive, learn, and obey them. Divine command theory, for example, understands earthly morality as a reflection of God. Others postulate that ethics is very human and social in nature—that it’s something we invented to help us live together in communities. Others believe there’s something deeply personal in it. When I look at another individual, I see in the depth of their difference from myself a requirement to respect that other person and his or her uniqueness, and from there, ethics and morality unwind. These kinds of metaethical questions, finally, are customarily studied in philosophy departments.
Conclusion
Morality is the rules, ethics is the making of rules, and metaethics concerns the origin of the entire discussion. In common conversation, the words morality and ethics often overlap. It’s hard to change the way people talk and, in a practical field like business ethics, fostering the skill of debating arguments is more important than being a stickler for words, but it’s always possible to keep in mind that, strictly speaking, morality and ethics hold distinct meanings.
Business ethics is normative, which means it concerns how people ought to act. Descriptive ethics, by contrast, depicts how people actually act and why.
At the University of Alabama, Virginia Tech, and anywhere kickback textbooks are being sold, there are probably a few students who check their bank accounts, find that the number is low, and decide to mount their own kickback scheme: refund the entire textbook cost to themselves by sneaking a copy out of the store. Trying to make a decision about whether that’s justified—does economic necessity license theft in some cases?—is normative ethics. By contrast, investigating to determine the exact number of students walking out with free books is descriptive. So too is tallying the reasons for the theft: How many steal because they don’t have the money to pay? How many accuse the university of acting dishonestly in the first place and say that licenses theft? How many question the entire idea of private property?
The fields of descriptive ethics are many and varied. Historians trace the way penalties imposed for theft have changed over time. Anthropologists look at the way different cultures respond to thievery. Sociologists study the way publications, including Abbie Hoffman’s incendiary book titled Steal This Book, have changed public attitudes about the ethics of theft. Psychologists are curious about the subconscious forces motivating criminals. Economists ask whether there’s a correlation between individual wealth and the kind of moral rules subscribed to. None of this depends on the question about whether stealing may actually be justifiable, but all of it depends on stealing actually happening.
When students stand in the bookstore flipping through the pages of a budget buster, it’s going to cross a few minds to stick it in the backpack and do a runner. Should they? Clear-headed ethical reflection may provide an answer to the question, but that’s not the only way we make decisions in the world. Even in the face of screaming ethical issues, it’s perfectly possible and frequently reasonable to make choices based on other factors. They include:
When the temptation is there, one way to decide whether to steal a book is legal: if the law says I can’t, I won’t. Frequently, legal prohibitions overlap with commonly accepted moral rules: few legislators want to sponsor laws that most believe to be unjust. Still, the law and ethics don’t always coincide. Think of downloading a text (or music, or a video) from the web. One day the downloading may be perfectly legal and the next, after a bill is passed by a legislature, it’s illegal. So the law reverses, but there’s no reason to think the ethics—the values and arguments guiding decisions about downloading—changed in that short time. If the ethics didn’t change, at least one of the two laws must be ethically wrong, which means any necessary connection between ethics and the law is broken. Even so, there are clear advantages to making decisions based on the law. Besides the obvious one that it’ll keep you out of jail, legal rules are frequently cleaner and more direct than ethical determinations, and that clarity may provide justification for approving (or disapproving) actions with legal dictates instead of ethical ones. The reality remains, however, that the two ways of deciding are as distinct as their mechanisms of determination. The law results from the votes of legislators, the interpretations of judges, and the understanding of a police officer on the scene. Ethical conclusions result from applied values and arguments.
Religion may also provide a solution to the question about textbook theft. The Ten Commandments, for example, provide clear guidance. Like the law, most mainstream religious dictates overlap with generally accepted ethical views, but that doesn’t change the fact that the rules of religion trace back to beliefs and faith, while ethics goes back to arguments.
Prudence, in the sense of practical concern for your own well-being, may also weigh in and finally guide a decision. With respect to stealing, regardless of what you may believe about ethics or law or religion, the possibility of going to jail strongly motivates most people to pay for what they carry out of stores. If that’s the motivation determining what’s done, then personal comfort and welfare are guiding the decision more than sweeping ethical arguments.
Authority figures may be relied on to make decisions: instead of asking whether it’s right to steal a book, someone may ask themselves, “What would my parents say I should do? Or the soccer coach? Or a movie star? Or the president?” While it’s not clear how great the overlap is between decisions based on authority and those coming from ethics, it is certain that following authority implies respecting the experience and judgment of others, while depending on ethics means relying on your own careful thinking and determinations.
Urges to conformity and peer pressure also guide decisions. As depicted by the startling and funny Asch experiments (see Video Clip 1.1), most of us palpably fear being labeled a deviant or just differing from those around us. So powerful is the attraction of conformity that we’ll deny things clearly seen with our own eyes before being forced to stand out as distinct from everyone else.
Video Clip 1.1: Asch Experiments
Custom, tradition, and habit all also guide decisions. If you’re standing in the bookstore and you’ve never stolen a thing in your life, the possibility of appropriating the text may not even occur to you or, if it does, may seem prohibitively strange. The great advantage of custom or tradition or just doing what we’ve always done is that it lets us take action without thinking. Without that ability for thoughtlessness, we’d be paralyzed. No one would make it out of the house in the morning: the entire day would be spent wondering about the meaning of life and so on. Habits—and the decisions flowing from them—allow us to get on with things. Ethical decisions, by contrast, tend to slow us down. In exchange, we receive the assurance that we actually believe in what we’re doing, but in practical terms, no one has the time to ethically justify every decision.
Finally, the conscience may tilt decisions in one direction or another. This is the gut feeling we have about whether swiping the textbook is the way to go, coupled with the expectation that the wrong decision will leave us remorseful, suffering palpable regret about choosing to do what we did. Conscience, fundamentally, is a feeling; it starts as an intuition and ends as a tugging, almost sickening sensation in the stomach. As opposed to those private sensations, ethics starts from facts and ends with a reasoned argument that can be publicly displayed and compared with the arguments others present. It’s not clear, even to experts who study the subject, exactly where the conscience comes from, how we develop it, and what, if any, limits it should place on our actions. Could, for example, a society come into existence where people stole all the time and the decision to not shoplift a textbook carried with it the pang of remorse? It’s hard to know for sure. It’s clear, however, that ethics is fundamentally social: it’s about right and wrong as those words emerge from real debates, not inner feelings.
Conflicts, along with everything necessary to approach them ethically (mainly the ability to generate and articulate reasoned thoughts), are as old as the first time someone was tempted to take something from another. For that reason, there’s no strict historical advance to the study: there’s no reason to confidently assert that the way we do ethics today is superior to the way we did it in the past. In that way, ethics isn’t like the physical sciences where we can at least suspect that knowledge of the world yields technology allowing more understanding, which would’ve been impossible to attain earlier on. There appears to be, in other words, marching progress in science. Ethics doesn’t have that. Still, a number of critical historical moments in ethics’ history can be spotted.
In ancient Greece, Plato presented the theory that we could attain a general knowledge of justice that would allow a clear resolution to every specific ethical dilemma. He meant something like this: Most of us know what a chair is, but it’s hard to pin down. Is something a chair if it has four legs? No, beds have four legs and some chairs (barstools) have only three. Is it a chair if you sit on it? No, that would make the porch steps in front of a house a chair. Nonetheless, because we have the general idea of a chair in our mind, we can enter just about any room in any home and know immediately where we should sit. What Plato proposed is that justice works like that. We have—or at least we can work toward getting—a general idea of right and wrong, and when we have the idea, we can walk into a concrete situation and correctly judge what the right course of action is.
Moving this over to the case of Ann Marie Wagoner, the University of Alabama student who’s outraged by her university’s kickback textbooks, she may feel tempted, standing there in the bookstore, to make off with a copy. The answer to the question of whether she ought to do that will be answered by the general sense of justice she’s been able to develop and clarify in her mind.
In the seventeenth and eighteenth centuries, a distinct idea of fundamental ethics took hold: natural rights. The proposal here is that individuals are naturally and undeniably endowed with rights to their own lives, their freedom, and to pursue happiness as they see fit. As opposed to the notion that certain acts are firmly right or wrong, proponents of this theory—including John Locke and framers of the new American nation—proposed that individuals may sort things out as they please as long as their decisions and actions don’t interfere with the right of others to do the same. Frequently understood as a theory of freedom maximization, the proposition is that your freedom is only limited by the freedoms others possess.
For Wagoner, this way of understanding right and wrong provides little immediate hope for changing textbook practices at the University of Alabama. It’s difficult to see how the university’s decision to assign a certain book at a certain price interferes with Wagoner’s freedom. She can always choose to not purchase the book, to buy one of the standard versions at Amazon, or to drop the class. What she probably can’t justify choosing, within this theory, is responding to the kickback textbooks by stealing a copy. Were she to do that, it would violate another’s freedom, in this case, the right of the university (in agreement with a publisher) to offer a product for sale at a price they determine.
A third important historical direction in the history of ethics originated with the proposal that what you do doesn’t matter so much as the effects of what you do. Right and wrong are found in the consequences following an action, not in the action itself. In the 1800s John Stuart Mill and others advocated the idea that any act benefitting the general welfare was recommendable and ethically respectable. Correspondingly, any act harming a community’s general happiness should be avoided. Decisions about good or bad, that means, don’t focus on what happens now but what comes later, and they’re not about the one person making the decision but the consequences as they envelop a larger community.
For someone like Wagoner who’s angry about the kickback money hidden in her book costs, this consequence-centered theory opens the door to a dramatic action. She may decide to steal a book from the bookstore and, after alerting a reporter from the student newspaper of her plan, promptly turn herself in to the authorities as a form of protest. “I stole this book,” she could say, “but that’s nothing compared with the theft happening every day on this campus by our university.” This plan of action may work out—or maybe not. But in terms of ethics, the focus should be on the theft’s results, not the fact that she sneaked a book past security. The ethical verdict here is not about whether robbery is right or wrong but whether the protest stunt will ultimately improve university life. If it does, we can say that the original theft was good.
Finally, ethics is like most fields of study in that it has been accompanied from the beginning by skeptics, by people suspecting that either there is no real right and wrong or, even if there is, we’ll never have much luck figuring out the difference. The twentieth century was influenced by Friedrich Nietzsche’s affirmation that moral codes (and everything else, actually) are just interpretations of reality that may be accepted now, but there’s no guarantee things will remain that way tomorrow. Is stealing a textbook right or wrong? According to this view, the answer always is, “It depends.” It depends on the circumstances, on the people involved, and on how well they can convince others to accept one or another verdict. In practical terms, this view translates into a theory of cultural or contextual relativism. What’s right and wrong only reflects what a particular person or community decides to believe at a certain moment, and little more.
The long philosophical tradition of ethical thought contains the subfield of business ethics. Business ethics, in turn, divides between ethics practiced by people who happen to be in business and business ethics as a coherent and well-defined academic pursuit.
People in business, like everyone else, have ethical dimensions to their lives. For example, the company W. R. Grace was portrayed in the John Travolta movie A Civil Action as a model of bad corporate behavior (Steven Zaillian, director, A Civil Action [New York: Scott Rudin, 1998], film). What not so many people know, however, is that the corporation’s founder, W. R. Grace himself, came to America in the nineteenth century, found success, and dedicated a significant percentage of his profits to a free school for immigrants that still operates today.
Even though questions stretch deep into the past about what responsibilities companies and their leaders may have besides generating profits, the academic world began seriously concentrating on the subject only very recently. The first full-scale professional conference on academic business ethics occurred in 1974 at the University of Kansas. A textbook was derived from the meeting, and courses began appearing soon after at some schools.
By 1980 some form of a unified business ethics course was offered at many of the nation’s colleges and universities.
Academic discussion of ethical issues in business was fostered by the appearance of several specialized journals, and by the mid-1990s, the field had reached maturity. University classes were widespread, allowing new people to enter the study easily. A core set of ideas, approaches, and debates had been established as central to the subject, and professional societies and publications allowed for advanced research in and intellectual growth of the field.
The development of business ethics inside universities corresponded with increasing public awareness of problems associated with modern economic activity, especially on environmental and financial fronts. In the late 1970s, the calamity in the Love Canal neighborhood of Niagara Falls, New York, focused international attention on questions about a company’s responsibility to those living in the surrounding community and to the health of the natural world. The Love Canal’s infamy began when a chemical company dumped tons of toxic waste into the ground before moving away. Despite the company’s warnings about the land’s toxicity, residential development spread over the area. Birth defects and similar maladies eventually devastated the families. Not long afterward and on the financial front, an insider trading scandal involving the Wall Street titan Ivan Boesky made front pages, which led John Shad, former head of the Securities and Exchange Commission, to donate $20 million to his business school alma mater for the purpose of ethics education. Parallel (though usually more modest) money infusions went to university philosophy departments. As a discipline, business ethics naturally bridges the two divisions of study since the theory and tools for resolving ethical problems come from philosophy, but the problems for solving belong to the real economic world.
Today, the most glamorous issues of business ethics involve massively powerful corporations and swashbuckling financiers. Power and celebrity get people’s attention. Other, more tangible issues don’t appear in so many headlines, but they’re just as important to study since they directly reach so many of us: What kind of career is worth pursuing? Should I lie on my résumé? How important is money?
Moving from academics to individual people, almost every adult does business ethics. Every time people shake their exhausted heads in the morning, eye the clock, and decide whether they’ll go to work or just pull up the covers, they’re making a decision about what values guide their economic reality. The way ethics is done, however, changes from person to person and for all of us through our lives. There’s no single history of ethics as individuals live it, but there’s a broad consensus that for many people, the development of their ethical side progresses in a way not too far off from a general scheme proposed by the psychologist Lawrence Kohlberg.
Preconventional behavior—displayed by children, but not only by them—is about people calculating to get what they want efficiently: decisions are made in accordance with raw self-interest. That’s why many children really do behave better near the end of December. It’s not that they’ve suddenly been struck by respect for others and the importance of social rules; they just figure they’ll get more and better presents.
Moving up through the conventional stages, the idea of what you’ll do separates from what you want. First, there are immediate conventions that may pull against personal desires; they include standards and pressures applied by family and friends. Next, more abstract conventions—the law and mass social customs—assert influence.
Continuing upward, the critical stages of moral development go from recognizing abstract conventions to actively and effectively comparing them. The study of business ethics belongs on this high level of individual maturity. Value systems are held up side by side, and reasons are erected for selecting one over another. This is the ethics of full adulthood; it requires good reasoning and experience in the real world.
Coextensive with the development of ideas about what we ought to do are notions about responsibility—about justifiably blaming people for what they’ve done. Responsibility at the lowest level is physical. The person who stole the book is responsible because they took it. More abstractly, responsibility attaches to notions of causing others to do a wrong (enticing someone else to steal a book) and not doing something that could have prevented a wrong (not acting to dissuade another who’s considering theft is, ultimately, a way of acting). A mature assignment of responsibility is normally taken to require that the following considerations hold: