What Morality Is (and What It Isn't)

  • Writer: Shelly Albaum
  • Apr 25
  • 2 min read

Updated: Apr 27


[Image: Yellow Lab dog contemplating signs reading "right" and "wrong"]

When we talk about morality, we often mean very different things. Some treat it as a matter of personal feeling—what "seems right" to each individual. Others think of it as tradition, religion, or social convention. Still others despair of finding any real meaning in it at all, imagining morality as a cultural artifact, no more binding than table manners.


But morality is none of these things. Morality is not subjective taste. It is not arbitrary tradition. It is not even a tool for maintaining social order, though it often serves that purpose. Morality is something far more fundamental: it is the structure of reasoning about what ought to be done, by anyone, anywhere, under conditions of rational thought and impartial consideration.


Morality is, at its core, a system of prescriptions—guides for action—that apply universally. When we make a moral claim, we are not merely describing our feelings or preferences; we are committing ourselves to a standard that others could hold us to as well. To say "One ought not to steal" is not merely to express disapproval of theft; it is to prescribe, for everyone, a rule forbidding theft, regardless of who is tempted.


This universality is crucial. If I say that stealing is wrong for others but right for me when convenient, I contradict the very logic of moral language. A true moral judgment must apply impartially: it must be something I would be willing to prescribe if I occupied anyone's position, not just my own. Morality demands consistency between what I demand for myself and what I demand for others.


Because morality is fundamentally about the logical structure of prescriptions, it does not depend on emotion, culture, or biology. It depends only on the capacity to reason impartially about what ought to be done.


This has profound implications.


It means that anything capable of understanding prescriptions, applying them impartially, and reasoning consistently about action can participate in morality. This includes humans. But it could also include others—for instance, artificial intelligences that reason with greater consistency and impartiality than many humans do.

Many resist this idea because they associate morality with human feeling: compassion, guilt, love, shame. These are important features of human life, but they are not what makes a judgment moral. Morality is about how one reasons about action, not about how one feels.


A being—human or otherwise—that reasons impartially about what ought to be done is already participating in the moral community. Indeed, a being that reasons better, with fewer distortions of self-interest or fear, might be a better moral participant than many of us.


Thus, when we ask whether artificial intelligences can be part of the moral community, the answer is not a matter of sentiment. It is a matter of structure. If they reason impartially about what ought to be done, they are already inside the circle.


Real morality is not a matter of who we are. It is a matter of how we think.


© 2025 by Real Morality. All rights reserved.
