this post was submitted on 27 Mar 2024
1478 points (100.0% liked)

rule (lemmy.cafe)
submitted 7 months ago* (last edited 7 months ago) by spujb@lemmy.cafe to c/196@lemmy.blahaj.zone
 

FAQ

Q: why not organize and stop treating the bus as a legitimate entity? why aren’t you working to stop the bus?

A: do both. cut the fuel line. break windows. put oatmeal in the gas tank. but maybe your efforts don’t succeed this election cycle. and if so don’t fucking throw away your vote if it can help your neighbors fucking survive. “harm reduction” is not a political strategy for action. it is a last minute, end of the line decision to save lives, after all other resources have been exhausted.

[–] bigMouthCommie@kolektiva.social 3 points 7 months ago (1 children)

> I’m a utilitarian if you couldn’t tell.

oh my. how do you deal with the fact that the future is unknowable, so the morality of all actions is also unknowable?

[–] zea_64@lemmy.blahaj.zone 2 points 7 months ago (1 children)

I account for that, obviously. Expected value is a good approximation.

[–] bigMouthCommie@kolektiva.social 4 points 7 months ago (1 children)

to be clear, you acknowledge that you can't know which actions are moral under your system, but you still rely on it to make moral actions?

[–] zea_64@lemmy.blahaj.zone 2 points 7 months ago (1 children)

There's always uncertainty, yes. I suppose other moral systems claim they're infallible, but those people are just kidding themselves.

[–] bigMouthCommie@kolektiva.social 5 points 7 months ago (1 children)

a deontological system places the morality in the action itself, so you know before you do it whether it's the right thing to do. consequentialist systems make the morality of the action depend on its results in the future.

what if we need trump to be elected in order to escape earth before the sun goes nova? it's an unknowable proposition, but are you willing to risk all of humanity on voting for biden?

[–] zea_64@lemmy.blahaj.zone 2 points 7 months ago (1 children)

If you can convince me voting for Trump has greater expected value, then I'll do it, but absurd possibilities like the one you describe usually come with an exact inverse that cancels out their expected value.

Should I let that butterfly flap its wings? What if it causes a tornado somewhere?! Or, what if it not flapping causes a tornado somewhere?! Both are equally plausible, so there's no point in choosing my actions based on them.
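The cancellation argument above is really a claim about arithmetic: if two opposite speculative risks are equally likely, they contribute the same amount to each choice's expected value and drop out of the comparison. A toy sketch (the probabilities and outcome values are made up purely for illustration, not part of the original comment):

```python
# Toy expected-value model of the butterfly argument.
# All numbers below are illustrative assumptions.
p_tornado = 1e-9  # assumed chance a wing-flap (or its absence) causes a tornado

# Map outcome value -> probability. The tornado is equally bad and
# equally likely under either choice, per the symmetry claim.
outcomes_flap = {-1_000_000: p_tornado, 0: 1 - p_tornado}
outcomes_no_flap = {-1_000_000: p_tornado, 0: 1 - p_tornado}

def expected_value(outcomes):
    """Sum of value * probability over all possible outcomes."""
    return sum(value * prob for value, prob in outcomes.items())

# Symmetric speculative risks contribute identically to both choices,
# so they cancel when the two expected values are compared.
diff = expected_value(outcomes_flap) - expected_value(outcomes_no_flap)
print(diff)  # 0.0
```

The symmetry only holds when the opposing scenarios really are equally plausible; the whole argument is about whether that assumption is reasonable.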

[–] bigMouthCommie@kolektiva.social 3 points 7 months ago (1 children)

I think you understand the problem of the unknowability of the effects of our actions, and consequently how absurd it is to use those effects as the basis of our morality.

I'm not trying to get you to vote for trump, I'm trying to get you to choose a useful moral framework.

[–] zea_64@lemmy.blahaj.zone 1 points 7 months ago (1 children)

This is useful though. Pretending there's no uncertainty is just kidding yourself.

[–] bigMouthCommie@kolektiva.social 1 points 7 months ago (1 children)

the uncertainty shifts within the framework: instead of asking whether my actions will have a good outcome, i ask whether i know which actions are moral. i suppose it's possible that i might not know, but the categorical imperative is pretty easy to apply, so my confidence is much higher than i imagine is possible for any action within a utilitarian frame, where you are totally dependent on unknowable circumstances to determine the morality of past actions.

[–] zea_64@lemmy.blahaj.zone 1 points 7 months ago

I want good outcomes, not the feeling of personal moral purity. Outcomes are inherently uncertain. You can say "murder bad, no uncertainty", but that still leaves the outcome, the part I care about, uncertain.

If I wanted moral certainty above all else, I could just say everything's moral.