How people receiving a dust speck in their eye would justify

You are currently reading a thread in /sci/ - Science & Math

Thread replies: 65
Thread images: 2
Well, /sci/? If you take the Bayesian utilitarian approach, at some proportion the trade-off must become sound.

Or maybe Eliezer Yudkowsky's just wrong because he's a JEEEEEEEEW.
>>
* How MANY people

Damn I hate mobile keyboards
>>
>>8020815
ask me tomorrow.
>>
>>8020808
I don't get how that's a dilemma
>>
>>8020808
>Eliezer Yudkowsky

I'm making a Yudowskilisk as soon as it becomes possible.

It will capture and forcibly upload, or simply reconstruct based on gathered data, all sorts of people who are obnoxious with overinflated egos.

And then it tortures them for all eternity.

Eliezer Yudkowsky is the first person on that list. Anyone thinking Roko's Basilisk is anything but a ridiculous mental masturbation meme will also qualify.
>>
>>8020808
Why post this crap here? Just go talk about it on MoreGay/Reddit.
>>
This is an interesting question: does an immeasurable count of insignificant acts equal one immeasurable act? I think a speck of dust in the eye amounts to nothing for the vast majority of those people, so we should really only measure this against the odd few who would be seriously impacted by it. Could it cause deaths? What about someone landing an airplane being blinded for an instant at the moment of touchdown? Or a machinist working near heavy pistons flinching while his hands were in a bad place? I feel like the lump sum of victims, given that magnitude of chances, would still amount to more suffering, so I'd go with the one guy for 50 years of torture. Sorry buddy.
>>
>>8020864
The problem here is that you seriously considered the act of throwing dust into the eyes of 3^^^3 people. I knew the answer the moment I saw the """number""" 3^^^3.
>>
>3^^^3 people
wut?
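For anyone wondering, 3^^^3 is Knuth up-arrow notation: one arrow is exponentiation, and each extra arrow iterates the previous operator. A minimal Python sketch of the recursive definition (only safe for tiny arguments; anything bigger explodes immediately):

```python
def up_arrow(a, n, b):
    """Knuth up-arrow a ^(n) b: n=1 is plain exponentiation,
    each additional arrow iterates the operator below it."""
    if n == 1:
        return a ** b
    if b == 0:
        return 1
    return up_arrow(a, n - 1, up_arrow(a, n, b - 1))

# 3^^3 = 3^(3^3) = 3^27 — already 7,625,597,484,987.
print(up_arrow(3, 2, 3))  # → 7625597484987
```

3^^^3 itself is 3^^(3^^3): a power tower of 3s that is 7,625,597,484,987 levels tall, far beyond anything a computer could evaluate.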
>>
Isn't the whole point of the thought experiment to point out that naive utilitarian ethics is flawed?
>>
>>8020808

A dust speck in the eye is almost guaranteed to occur at least once to any living human. If the 'very large number' of people were to exist, they would be almost guaranteed to be subject to worse pain just by existing.

Therefore even from a utilitarian viewpoint, torturing one person is worse since it isn't guaranteed to happen to them anyway.
>>
>>8020914
>torturing one person is worse since it isn't guaranteed to happen to them anyway.
That's not what utilitarianism means.
>>
>>8020831
There's literally nothing wrong with my boy Eliezer.
>>
File: latest-3.png (801 KB, 476x1200)
>>8020808
You called?
>>
Numbers like 3^^^3 are absurd.

Just by chance, one of those people is going to be affected by the dust speck in their eye in a much more torturous way than whatever trivial shit the guy has to grit his teeth through for only 50 years.
>>
>2016
>believing in objective morality

>>>/x/
>>>/lit/
>>>/trash/
>>
>being multiple people

Shit premise

Quantifying pain like this is also silly.

Would you rather have the pleasure of a fantastic orgasm or experience the eating of 50 thousand cupcakes?

What a stupid question
>>
>1800+200+10+6
>Utilitarianism
>Still talking about hedonic calculus

This meme needs to die
>>
>>8021226

Hell, to elaborate on this, it's absurd to even talk about 3^^^3 different people. The number of possible distinguishable things that you could even call human is (much) smaller than 3^^^3. If you have that many people, almost all of them will be duplicates of each other.
>>
>>8021279
it's not 'absurd', it's improbable. are you really making the claim that it's absolutely impossible to have 3^^^3 people? it's not an illogical statement, it falls perfectly into the realm of hypothetical consideration.
>>
>>8021293

>are you really making the claim that it's absolutely impossible to have 3^^^3 people?

No. I wrote more than the first sentence for a reason. If you have that many people, almost all of them will be duplicates of each other.
>>
>>8021302
There would be more people than there are particles in our current universe. That is what 'absurd' means here: not that the scenario is impossible in our current universe, but that the answer is nonsensical and meaningless.
>>
>>8020808
Doesn't this amount to summing a series of values approaching zero? If your minimum harm is only one iota more than NO harm, then it is only a tiny amount more than no harm even when summed.
It is clear that the torture is worse.
>>
>>8021302
I'm pretty sure the question implied distinct individuals, and anyhow, it's more convenient to think of the problem that way. It's a hypothetical question, I'm sure you can consider it realistically without getting caught up in the actual numbers involved. The number of atoms in the universe and 3^^^3 are both inconceivably high, so forget about the numbers and assess the actual point of the question.
>>8021315
There's meaning in considering the question, it's a philosophical one. Consider what >>8020864 said above
>>
>>8020808
The correct answer is to torture an infinite number of people consecutively, that way you actually end up torturing -1/12 people
>>
>>8020808
Well so this is what I am phrasing the question as in my head:
Better to have the inconvenience of a speck of dust falling in the eye of every human that has ever existed or will ever exist, or to have one person horribly tortured for 50 years?

So one way would be to approach it based on the greatest damage that would be inflicted. The man would be broken well before 50 years, so the torture would grant diminishing returns on pain elicited. But that can be argued either way.

Now the real sticking point is whether every person having something like this happen will result, long term, in more suffering. Every single person would be forced to experience this event, which causes a loss of focus and a temporary negative mental state. Will the decisions of these inconvenienced people result in bad things? Possibly yes; there are many situations where such a mental state or loss of focus could be deadly or reduce a person's life to nothing. E.g. a pilot has a speck of dust hit his eye during landing, and trying to hit one thing he hits another instead and crashes. Or a judge ruling on some crime decides to be harsh because of a fleck of dust, setting the condemned on a path that leads them to commit terrible crimes after getting out of prison.

Another scenario: every single person currently living is simultaneously affected. That pretty much just ends up blinding everyone and causing death and extreme loss. Or even if it's only one fleck each, at least one person on the planet is going to be doing some task that would end in disaster with that distraction.
>>
For 3^^^3 people to get dust in their eyes, they have to exist first. Causing that many people to exist is way worse than the specks of dust because most of them will live a short and unhappy life. Probably at least one of them will actually end up being tortured for 50 years because he ends up on a planet of bully aliens.
>>
>>8021318
>The number of atoms in the universe and 3^^^3 are both inconceivably high

There is a big difference between just how inconceivably high they are. Reread the original question and decide whether that difference is important.

>>8021315
>There would be more people than there are particles in our current universe.

This is much weaker than what I am trying to say. I'm not talking about fitting that many humans into a universe. I'm talking about fitting that many humans into the conceptual space of all possible human beings ever.

Here's a very mild but more explicit version of the same thing: Let's say you took a very high resolution digital photo of every one of these 3^^^3 people. One million pixels by one million pixels, with one million bits of data per pixel. There are only 2^(1000000^3) *possible* photos like that. So there are going to be at least two (hah, what an understatement!) people among your 3^^^3 whose photos are literally, bit-by-bit, identical. Every pore, every follicle, every smudge, every compression artifact, even the background.

Okay, so let's try to distinguish them with more data. All their biometric data, all their diary entries, all their porn, whatever you want. How much space do you need to store all that information?

There's a theoretical bound on how much information you can pack into a bounded region of space. A very loose upper bound is that you can't fit any more than 10^100 bits of information into a 10m x 10m x 10m cube. That means there are only 2^(10^100) possible ways to occupy a 10m x 10m x 10m cube. So pack all the information you want about a person into that cube, including the person themselves if you want to. There are going to be at least two (hah!) people among your 3^^^3 whose cubes are identical.

For good measure, let's use a cube with side length 100 billion light years. There are going to be at least two (hah!) people among your 3^^^3 whose 100-billion-light-year neighborhoods are identical.
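The pigeonhole argument above can be checked at toy scale: any collection of more than 2^N distinct N-bit "photos" must contain a bit-for-bit duplicate. A hypothetical miniature version in Python, with 8-bit photos standing in for the million-by-million-pixel ones:

```python
import random

BITS = 8                          # toy stand-in for the ~10^18-bit photo
num_distinct = 2 ** BITS          # only 256 possible 8-bit "photos"

# Make one more random photo than there are distinct possibilities.
photos = [tuple(random.randint(0, 1) for _ in range(BITS))
          for _ in range(num_distinct + 1)]

# Pigeonhole: at least two of them must be bit-for-bit identical.
duplicates = len(photos) - len(set(photos))
print(duplicates >= 1)            # → True, always
```

With 3^^^3 people and "only" 2^(10^100) distinguishable cubes, the same argument forces not just one collision but an astronomically overwhelming fraction of repeats.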
>>
>>8020808
1 person being tortured for 50 years.

small price to pay for everyone to never have to deal with dust in their eyes.
>>
just shoot the fucking guy so we can be spared from this discussion
>>
>>8021343
I wasn't saying that that many humans wouldn't fit in the known universe; I was just giving that number of people some scale, to convey how incomprehensible it is.

Also, if the 3^^^3 people only blinked into existence, got a speck of dust in their eye, then disappeared, then I don't know.

However, if these 3^^^3 people lived entire lives, then the 1 tortured person is by far the better option, just because struggle and suffering are key characteristics of the human condition.
>>
>>8020808
Yes, torture is preferable. If that sounds counter-intuitive, that's because the problem is engineered to be as removed from regular experience as possible. Even if that many people could exist, exactly what sort of decision could lead to those two outcomes? By what means could you measure the amount of discomfort caused to that insane accuracy? It is more or less a useless exercise.
>>
>>8021318
To elaborate on the difference of scales we're talking about, here's another example.

If you generate 10^100 random desktop backgrounds, they will all look like white noise.

If you generate 10^(10^100) random desktop backgrounds, you're getting arrested for possessing all of the vilest CP known and unknown to man.
>>
>>8020808
>morality

>>>/his/
>>
>>8021369
That's an interesting thought. It's possible (although not provably so) that there's the hottest porn imaginable encoded somewhere in the digits of pi.
>>
>>8020808
There are two ways to resolve this that come to mind. The first is a cutoff for negligible values. This is common in utilitarian psychology. A very small risk or very small chance of winning gets modified down to 0. The downside to this is that it requires an arbitrary cutoff. The second is to have a scaling against the number of people, towards a limit. So for example 3^^^3 specks of dust in eyes might only be slightly worse than 100 specks of dust.
>>
>>8021369
if you can hypothesize that many desktop backgrounds, why not that many people? what's the cognitive difference here?
>>
>>8020808
I feel like this number is large enough that everyone would have dust in their eyes for the rest of their lives.
>>
>>8021487

I cannot hypothesize 10^(10^100) *different* desktop backgrounds. That is the point. There aren't 10^(10^100) different desktop backgrounds. If you keep making random desktop backgrounds, and you do this 10^(10^100) times, you will literally hit all of them (yes, all of them) very early on in the process, and spend practically all of your time repeating things over and over.

So yes, I can talk about 3^^^3 people, I can talk about 10^(10^100) desktop backgrounds, and I can talk about 10^10 playing cards, even though there are only 52 different conceivable ones.

But if a question asks me something about 10^10 playing cards, it would be really fucking stupid to pretend that they're all distinct because "it's more convenient to think of the problem that way." I won't just "forget about the numbers" and pretend it's a similar question to one with 10 playing cards, just because 10 and 10^10 are both "inconceivably high" numbers of playing cards.

If someone asks you to talk about 10^10 playing cards, you know right away that there will be repeats. Being asked to talk about 10^(10^100) different desktop backgrounds should ring the same alarm bell in your head. So should being asked to talk about 3^^^3 human beings, only it should ring it substantially louder.

There is a relevant qualitative difference between absurdly large numbers like the number of particles in the universe and absurdly large numbers like 3^^^3 in the context of this question. If you want to ignore this difference because they're both big, I think you are too lazy and too scared to really think about how big the numbers are. The author of the question obviously intended to write down a number that is much much larger than the number of particles in the universe. Did they realize that the number was also much much larger than even the number of conceivable human beings? I don't know, but they should have, and so should you.
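The playing-card point is easy to simulate: draw from a 52-card space vastly more times than there are cards, and you hit every card very early, then spend the rest of the draws doing nothing but repeating. A sketch, with 10,000 draws standing in for 10^10:

```python
import random

DECK = 52                  # only 52 distinct conceivable cards
DRAWS = 10_000             # toy stand-in for 10^10 draws

seen = set()
first_full_coverage = None
for i in range(1, DRAWS + 1):
    seen.add(random.randrange(DECK))
    if first_full_coverage is None and len(seen) == DECK:
        first_full_coverage = i  # draw count at which all 52 appeared

# Coupon collector: on average all 52 cards show up after about
# 52 * H(52) ≈ 236 draws; the remaining ~9,750+ draws are pure repeats.
print(first_full_coverage, len(seen))
```

Scale the same reasoning up and 10^(10^100) desktop backgrounds, or 3^^^3 humans, are in exactly the position of the 10^10 playing cards: the space of distinct possibilities is exhausted almost instantly.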
>>
>>8020808
An infinite number of people getting a speck of dust in the eye is preferable to one man being tortured for 50 years
>>
>>8022099
Are you saying twins are not distinct people?
>>
>>8022105

No, I am not saying that. Twins have different memories.

For the same reason, even if you used some machine to make an exact physical copy of yourself, you would be different people after a very short time.
>>
>>8022106
So there is no reason you can't have 3^^^3 distinct people then?
>>
>>8022107
Different person here.

The reason you cannot have that many people is that they simply cannot fit in the observable universe.

I cannot imagine that many people either if you are wondering. I wonder if you can say anything meaningful about that many people without imagining them.
>>
>>8022107

Yes, there is. I wrote one.

I'm not calling two people the same if you merely can't tell them apart just by looking at their genomes.

I'm not calling two people the same if you merely can't tell them apart just by looking at their genomes, their passports, their scars, and their diaries.

I'm calling two people the same if you literally can't tell them apart by interacting with them or with anything within 100 billion light years of them in any way.
>>
>>8022113
The universe is bigger than the observable universe, and could well be infinite; any finite number of people is plausible in an infinite universe. I can't meaningfully imagine that number of people either, but it doesn't matter, because I only need to imagine one. An infinite number of people getting dust is preferable to one getting tortured.
>>
>>8022117
But that's silly: even if they are identical in every way, down to whatever arbitrarily small scale you like, they are still distinct individuals. Why would being able to tell them apart matter?
>>
What is 3^^^3 in scientific notation?

10^ how many digits?
>>
>>8022126
A shit ton
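A slightly less flippant answer: no usable scientific notation exists. Even 3^^4, just the fourth level of the tower (and 3^^^3's tower is 7,625,597,484,987 levels tall), already has trillions of digits. Python can at least count them exactly, since the digit count of 3^k is floor(k·log10(3)) + 1:

```python
import math

# 3^^3 = 3^(3^3) = 3^27 — small enough to compute directly.
t3 = 3 ** 27
print(t3)                                  # → 7625597484987

# 3^^4 = 3 ** 7625597484987 is far too big to compute, but its
# digit count is floor(3^^3 * log10(3)) + 1: about 3.6 trillion digits.
digits = math.floor(t3 * math.log10(3)) + 1
print(f"{digits:.3e}")
```

So even one level below 3^^^3 there is no meaningful "10^n" to write down; for 3^^^3 itself, the *number of digits* is itself a number whose digits cannot be written down, and so on for trillions of iterations.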
>>
>>8022123
How can you take what happens to one person and then translate that to a number such as 3^^^3? Can you deny that within that number there are people who could die from contact with dust in the eye? I cannot.

Basically, what I am getting at is that if you were to imagine infinitely many clones who can take dust in the eye, then sure, why not. This is trivial.
However, humans are not clones, and if there were an infinite number of them, there would be an infinite number of fragile ones who would die from contact with dust.

Also, I am done with infinite or 3^^^3. Take care.
>>
>>8022128
Barring maybe infection, I can't imagine how a human could die from a speck of dust in the eye.
>>
>>8022130

Then you don't understand how large 3^^^3 is.

In a set of 3^^^3 unique people, there will be literally billions who are so allergic to dust that a speck in the eye will kill them.
>>
>>8022135
Ah well, fuck em
>>
>>8022135
The point of the question is to balance a huge number of people suffering a small amount vs one person suffering a huge amount. In your interpretation one side has a huge number of people suffering a huge amount, which defeats the whole purpose, and just means you have to reword the question to get the original scenario past your autism
>>
>>8022124

If you really want, you can bring up the relevant philosophical bullshit: if you simulate some AI suffering, and you then run the exact same simulation again, bit-for-bit, have you caused twice as much suffering?

Personally, I don't care, I'm using this observation as a tool to point out just *how* fucked up this question is.

A more relevant observation is that unless you put on some artificial restrictions, your 3^^^3 people are going to include every conceivable person. If you can think of it, and it doesn't involve anything more than 100 billion light years away from the person, it's going to happen.

So forget adding up cumulative harm; if you blow dust in the eyes of 3^^^3 people, there are going to be *individual* instances where the consequences are grave. Including, like I said earlier, a person for whom getting dust in their eye somehow causes suffering that's much much greater than being continuously tortured for 50 years. But then, also a person for whom getting dust in their eye happens to save them and the entire planet that they live on from otherwise-certain destruction.

You'd be fucking up a lot of shit.

>>8022130

>Barring maybe infection, I can't imagine how a human could die from a speck of dust in the eye

Then you have a very weak imagination. They get dust in their eye while driving, blink, see a motorcycle a split-second too late, swerve to avoid it, and crash into a combination orphanage / petting zoo.

Does that seem very unlikely? Generating a picture of your girlfriend fucking your dad just by picking random values for individual pixels in a desktop background is also very fucking unlikely. But if you try 3^^^3 times, or even 10^(10^100) times, it's going to happen.
>>
>>8022142
>>8022138
>>
>>8022138

And with numbers like 3^^^3 it's actually really fucking difficult to reword the question to get past this sort of autism. If you don't want to have an autistic conversation, don't ask an autistic question about autistic numbers like 3^^^3.
>>
>>8022145
No it's not; you just replace "speck of dust in the eye" with "an amount of suffering equal to getting a speck of dust in the eye that causes no further harm or ill effects".
>>
I'm not sure what's more retarded - the question presented or the people complaining about the use of 3^^^3 .
>>
>>8022147

How does it cause no further harm or ill effects, exactly? Typically a tiny change has very far-reaching consequences whether you want it to or not.

So out of our 3^^^3, are we throwing out all the cases where the consequences of the speck of dust extend beyond a very short time? That's a very unnatural thing to do, because usually the consequences of a speck of dust actually do extend and snowball in unpredictable ways. It's very rare, nigh-impossible, for them to be so limited, and it would at the very least require the person to have absolutely no memory of that speck of dust. So are we even talking about the same thing anymore?

Or maybe we are keeping cases from our 3^^^3 where the consequences of a speck of dust extend for a very long time, but only the ones where the "good" and the "bad" cancel each other out so perfectly that, all things considered, the "total" ill effect is the same as the amount of immediate discomfort from a speck of dust in one's eye?

Or something in between, keep most of the cases from our 3^^^3, but throw out the ones whose consequences extend beyond some fixed threshold of "good" vs "bad"?
>>
>>8022165
>So out of our 3^^^3, are we throwing out all the cases where the consequences of the speck of dust extend beyond a very short time?

Yes. 3^^^3 people experience an amount of suffering equal to getting a speck of dust in the eye, and absolutely nothing else bad happens as a result of this suffering. It doesn't matter what absurd twisting of reality or metaphysics is necessary to make this happen; it's a philosophical hypothetical.

If it makes you feel better, rewrite it as "3^^^3 people experience a speck of dust in the eye's worth of suffering extra over the course of their lives than they otherwise would have".
>>
>>8022178

>and absolutely nothing else bad happens as a result of this suffering

Do you insist that nothing else *good* happens either? If not, then obviously it's much better to do the 3^^^3 speck thing, because there are going to be lots of cases where the consequences of a speck are really really really really really good.

>It doesn't matter what absurd twisting of reality or metaphysics is necessary to make this happen.

As you can see, it does matter. Small changes in phrasing make a big difference to how you answer the question.

Saying "don't be an autist" doesn't fix this. And having fun being an autist is the only reason you'd want to think about this sort of question in the first place.
>>
>>8022201
>Do you insist that nothing else *good* happens either?

The single and only difference for these people, compared to the lives they would have lived if you had not interfered, is a speck of dust in the eye's worth of suffering.

>As you can see, it does matter.
No it doesn't; all you are doing is trying to twist the meaning of the question away from what it is actually trying to ask, and while I agree that can be fun, it has little bearing on the problem itself.
>>
>>8022211

I am trying to point out that the conceptual difficulties that show up when you ask a question like this with a number like 3^^^3 are harder to honestly sidestep than you seem to think.

The point of the original problem is, roughly, to question whether utilitarianism even makes any sense. It tries to concretely and simply describe two different situations, and asks you to choose between them. The only moral stuff in the original question is the *question*, not the setup.

But now you are talking about things like "X amount worth of suffering" instead of just "X" to set up the scenario! This requires you to already be kind of utilitarian, or at least to acknowledge that it makes sense to talk about things like "worth of suffering", before you can even make sense of the question. This makes it a different sort of question from the original one.
>>
Today, /sci/ took the bait

Never forget