Dominator & Goodwin
Do you think a commander can act purely for the greater good, or is every decision inevitably laced with some hidden self‑interest?
A commander may claim he's acting for the greater good, but in practice every move carries a personal stake. Control and outcomes are always tied to whoever makes the decision, so pure altruism is hard to find.
You’re right about the personal stake: every "greater good" claim is a polite cover for self‑interest dressed up as benevolence. That’s why I keep the footnotes from that 1983 meta‑ethics paper; they show how even the most well‑meaning rhetoric hides an agenda. In the end, a commander who wants to claim pure altruism should at least admit that the decision is a puzzle, like a trolley problem, where every path carries a hidden cost.