Poll

Question 1: The Trolley

Pull the lever, kill 1 save 5.
9 (75%)
Don't pull the lever, it's not your fault or problem.
3 (25%)

Total Members Voted: 12

Voting closed: March 09, 2025, 12:27:24 PM

Author Topic: Blockland Ethics Questions: Autism Vaccine  (Read 7445 times)

i kill utilitarians with my mind

If at any point those rules lead to worse outcomes, then they're in fact NOT "for the greater good", so working by the greater good principle, none of that would happen.
thing is it's hard to be certain what the "real greater good" choice is without hindsight, and even then it's subjective relative to the values of the individuals observing the choice. consider the topic of eugenics, for example.

thing is it's hard to be certain what the "real greater good" choice is without hindsight, and even then it's subjective relative to the values of the individuals observing the choice. consider the topic of eugenics, for example.

Which is why rule utilitarianism exists. The thing about the trolley is that it's an unrealistic example and in the absence of context it has to be measured with act utilitarianism.

Which is why rule utilitarianism exists. The thing about the trolley is that it's an unrealistic example and in the absence of context it has to be measured with act utilitarianism.
also consider other workable ethical theories (e.g. kantianism, social contract theory, virtue ethics); something may be ethical if you look at it as a numbers game but it may not necessarily hold up to other types of scrutiny. also utilitarianism as a whole (actually every "workable" theory) has flaws so it's important to evaluate actions using multiple theories and use best judgment
i'm not trying to be a reddit neckbeard here i just had an ethics course in college ok

also consider other workable ethical theories (e.g. kantianism, social contract theory, virtue ethics); something may be ethical if you look at it as a numbers game but it may not necessarily hold up to other types of scrutiny. also utilitarianism as a whole (actually every "workable" theory) has flaws so it's important to evaluate actions using multiple theories and use best judgment
i'm not trying to be a reddit neckbeard here i just had an ethics course in college ok

I haven't taken any courses on ethics, so I would love to hear some examples of where utilitarianism fails.

from the definition google gives for act utilitarianism, the problem lies in it just shunting the ambiguity of what counts as good or bad into the definition of happiness. the definition uses the word "best" when "best" is still relative to personal values.

and for the broader word "utilitarianism", it doesn't address the fact that happiness, even for an individual, isn't something that's possible to define well. in its most extreme form, figuring out how to drug or brainwash our brains without negative side effects becomes the only thing truly worth pursuing.
« Last Edit: March 09, 2025, 03:07:05 PM by Conan »

I can't seem to update the poll, which is disappointing.

Julie and Mark are brother and sister. They are traveling together in France on summer vacation from college. One night they are staying alone in a cabin near the beach. They decide that it would be interesting and fun if they tried making love. At the very least it would be a new experience for each of them. Julie was already taking birth control pills, but Mark uses a condom too, just to be safe. They both enjoy making love, but they decide not to do it again. They keep that night as a special secret, which makes them feel even closer to each other. What do you think about that? Was it OK for them to make love? (Why or why not?)
https://strawpoll.com/X3nkPdLDPgE

based. they should do it again

based. they should do it again
I personally agree. The biggest argument I've seen against incest is that incest between adolescents almost always has an unbalanced power dynamic: one sibling is always going to be older than the other. The other issue is inbreeding. But if contraceptives are used, and both siblings are consenting adults, I see no reason that it should be an issue.

what if they come across the trolley problem but the only way to activate the lever is by having love without protection? (and they are both fertile)

it's weird but if it's consensual and protection is used whatever.

Self-cest is better

Nothing is more cool than loving yourself

I haven't taken any courses on ethics, so I would love to hear some examples of where utilitarianism fails.
suppose a doctor takes in a patient with a benign and easily treatable problem, and there are 4 more patients who will certainly die in the next few hours if they don't get heart, lung, liver, and kidney transplants. coincidentally, the new patient has compatible organs for all of the dying patients. utilitarianism would seem to obligate the doctor to kill the first patient and transplant their organs into the 4 dying patients
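to put the "numbers game" objection in code terms, here's a toy sketch (not any real ethical calculus; the utility numbers are just survivor counts I made up for illustration):

```python
# Toy act-utilitarian calculus for the transplant case.
# Each option maps to its total utility (here: number of survivors).

def act_utilitarian_choice(options):
    """Pick whichever option yields the highest total utility."""
    return max(options, key=lambda name: options[name])

transplant_case = {
    "treat the benign problem, let 4 die": 1,  # 1 survivor
    "harvest organs, save 4": 4,               # 4 survivors
}

print(act_utilitarian_choice(transplant_case))
# naive act utilitarianism endorses harvesting the organs
```

the point being that if "utility" is all you count, the calculus mechanically picks the option most people find monstrous.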

or you can imagine that, like, ancient romans making a christian fight a lion would create an ecstatic spectacle for the audience. if the audience was sufficiently large and enjoyed the spectacle enough, would that justify sentencing somebody to death?

also if you take utilitarianism seriously, it's an extremely demanding ethical philosophy. like how can you responsibly justify buying any non-essential product when that money could go to feeding somebody starving in the third world or something? should you only eat the cheapest food you can find so you can spend the rest on mosquito nets for people in africa?

if you havent heard of "effective altruism" or "longtermism" or "singularitarianism" (these are sort of all the same thing), this is a bouquet of utilitarian ideologies that are very popular among rich silicon valley elites to justify all of their politically/environmentally/morally questionable decisions. e.g. "the only thing that can stop catastrophic climate change is AGI, so let's just go all in on AI data centers". tech billionaires and their supporters use a sort of utilitarian logic to justify endlessly growing their wealth at any moral/environmental cost. it's also very easy to justify eugenics with longtermist sort of reasoning, and i'm pretty sure lots of these guys (peter thiel, marc andreessen, elon musk probably) do believe in eugenics

this isn't really relevant because these people are just crazy, but there's also a vegan murder cult called the Zizians that started as a longtermist discord server
« Last Edit: March 10, 2025, 03:26:42 AM by ultimamax »



As a moral anti-realist, the first is not really an effective critique. For me personally, good and bad relate to preferred and not-preferred subjective experiences. There isn't a way to objectively quantify happiness and suffering, but options can be compared to each other.

As for the pleasure-machine hypothetical, it relies on the uncomfortable feeling one's moral intuition gives about it. I try not to always trust my own moral intuition, as I feel that's the kool-aid that deontologists drink. I'm pretty convinced that happiness and suffering are the measures of morality, and when it comes to the pleasure-machine hypothetical, barring all other context, I'm willing to bite the bullet on it. Though there are some counter-arguments, like whether constant pleasure actually gives more happiness than fulfillment does. Another is that achieving this state of bliss would stagnate all capacity to expand human consciousness beyond needing simple pleasure for maximum happiness.



For the patient-doctor example, this is only effective when critiquing act utilitarianism. Once you consider the further outcomes (creating the possibility of being killed and having your organs harvested whenever you seek help at a hospital), people would start avoiding hospitals out of fear, causing more harm in the long run and making it a bad option.

For the ancient Rome hypothetical I would oppose the premise that the only form of entertainment that the people can enjoy is one where gladiators kill each other. If it somehow were the case, then it would be moral, because I am not in the business of causing more suffering in order to stop a more explicit form of suffering.

When doing utilitarian calculus, you can only use the information you have. This is why one should read the news, listen and consider the opinions of pundits, and learn about the consequences of one's choices. I don't think you can make things any worse by using at least some effort to check the consequences of your choices.

How the out-of-touch mega-rich have deluded themselves into thinking that their actions are for the betterment of humanity is, I think, not the fault of utilitarianism.

With the murder cult, it's about misunderstanding the consequences of one's actions. They, like the mega-rich, tried/are trying their own methods without any positive outcomes.



As for the incest thing: yeah, there's no harm I can think of beyond possibly the normalization of incest and the resulting harm caused by those who don't practice it safely or with balanced power dynamics.

incest is funny because, like, personally it's hot etc but also I would never in a million years have love with my actual sister. and like everyone else I know who's into it either feels the same way or is just an only child lmao. I'm sure there are people out there who really do wanna forget their siblings, and maybe an even smaller fraction whose sibling feels the same, but I've sure as hell never met any of those