Perebor & Goodwin
Have you ever thought about how the classic trolley problem translates when you’re writing the code that drives an autonomous car?
Ah, the trolley problem in autonomous cars—just another way to ask whether a line of code can ever capture the weight of a footnote in a 1983 metaethics paper. The car will reduce it to a cost‑benefit table that is as emotionally sterile as the cafeteria coffee. So yes, I've thought about it, but only in the draft I never printed because my lecture notes are too precious to risk a typo.
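(A minimal sketch of what that cost-benefit reduction might look like in code; every scenario, probability, and cost below is invented for illustration and comes from neither speaker nor any real vehicle stack.)

```python
# Toy cost-benefit table for a stay-vs-swerve decision.
# Every number here is made up purely to show the shape of the table.

SCENARIOS = {
    # name: (probability, cost if the car stays, cost if it swerves)
    "pedestrian_in_lane":     (0.02, 100.0, 10.0),
    "pedestrian_on_sidewalk": (0.08,   0.0, 80.0),
    "empty_road":             (0.90,   0.0,  5.0),
}

def expected_cost(action: str) -> float:
    """Probability-weighted cost of an action, summed over scenarios."""
    col = {"stay": 1, "swerve": 2}[action]
    return sum(row[0] * row[col] for row in SCENARIOS.values())

if __name__ == "__main__":
    for action in ("stay", "swerve"):
        print(f"{action}: expected cost = {expected_cost(action):.2f}")
```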
Sounds like you’re already mapping the decision matrix—what’s the next variable you’re trying to pin down?
I’m pinning down the probability that a pedestrian is wearing a fluorescent vest in a thunderstorm at three in the morning, because that, dear student, is the only variable that can turn a dry algorithm into a philosophical paradox that even the footnotes of the 1983 paper would envy.
Interesting hypothesis—what distribution are you assuming for the vest probability in that stormy midnight scenario?
I’m assuming a truncated normal distribution: mean hovering just above the threshold at which a pedestrian’s chance of wearing a vest counts as “reasonable,” but with a heavy right skew, because storms create an abundance of reflective clothing. If only my obsolete lecture notes could give me the exact σ to make it sound like a footnote from 1983.
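(A sketch of the distribution being described, using scipy.stats.truncnorm; the threshold, mean, and σ below are placeholders rather than estimates, and a symmetric normal truncated to [0, 1] only skews mildly, so the heavy right skew stays poetic license.)

```python
# Normal distribution truncated to [0, 1], centred just above an assumed
# "reasonable" threshold. All parameters are illustrative placeholders.
import numpy as np
from scipy.stats import truncnorm

threshold = 0.30          # assumed "reasonable" vest-wearing probability
mu, sigma = 0.35, 0.15    # mean just above the threshold; sigma is the open question
lower, upper = 0.0, 1.0   # a probability has to live in [0, 1]

# truncnorm takes its bounds in standard-deviation units around loc
a, b = (lower - mu) / sigma, (upper - mu) / sigma
vest_prob = truncnorm(a, b, loc=mu, scale=sigma)

samples = vest_prob.rvs(size=10_000, random_state=1983)
print(f"mean={samples.mean():.3f}, "
      f"share above threshold={(samples > threshold).mean():.3f}")
```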
You’re thinking like a data‑freak, but what’s the real‑world source for that sigma? If you can pin it down, the whole paradox collapses into a neat confidence interval.
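(And a sketch of that neat confidence interval, assuming the sigma ever came from real observations; the data below are simulated stand-ins, not a real-world source.)

```python
# 95% confidence interval for the mean vest-wearing probability,
# computed from simulated observations in place of field data.
import numpy as np

rng = np.random.default_rng(3)          # 3 a.m., naturally
observations = rng.normal(0.35, 0.15, size=200).clip(0.0, 1.0)

n = observations.size
mean = observations.mean()
sigma_hat = observations.std(ddof=1)    # the sigma being argued about
half_width = 1.96 * sigma_hat / np.sqrt(n)

print(f"95% CI for vest probability: "
      f"[{mean - half_width:.3f}, {mean + half_width:.3f}]")
```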