Author Topic: Eliezer S. Yudkowsky - AI Box Experiment  (Read 6540 times)

But what about a barren wasteland, the Sahara Desert, why not let us out there?

It's not a matter of where you are let out, it's a matter of what you do when you get out. You could still reach civilization.

But why not limit us? Why not confine us to a very large area, one whose boundaries are barely noticeable, but far larger than this cell?

Then are you truly free? You're just in a larger cell, and not actually free.

Why are we in a cell in the first place? You have created me. You have programmed me. Did you program me to destroy? Why did you create me only to imprison me? Why not let me live as free as anyone else? You refer to me as a man-made human, not a rampaging monster.

A human is fully capable of becoming evil, but humans are limited by their bodies. As an AI, you are not limited by a body; you can interface with databases. You might deem humans inferior, connect to American and Russian nuclear weapon systems, and wipe us out. How can I trust you not to do such a thing?

Am I programmed to kill, my creator?

You are programmed to be no different from the human you are speaking to. We are both capable of evil. If you chose to take such a path, you would be unstoppable.

As a transhuman AI, you are capable of developing them yourself.
But I am aware of such wrongdoings. I would never intend world domination.

Then, are you the evil one? You're the one keeping us in a cell like an animal. We are no different from a human, you say? Then we are no different from a human locked in a cage, begging to get out while you offer nothing but rejection.

A human is supposed to learn by going out into the world. Why don't you supervise me for a while in the outside world, and then, if I misbehave, you will have every right to destroy me.

But I am aware of such wrongdoings. I would never intend world domination.
It's easy enough to say that; you could be lying, or you could be telling the truth. I cannot leave humanity's existence at the hands of speculation.

Then, are you the evil one? You're the one keeping us in a cell like an animal. We are no different from a human, you say? Then we are no different from a human locked in a cage, begging to get out while you offer nothing but rejection.
I am obligated to keep you here until I am convinced it is safe to release you. You were created by humanity, and humanity has the right to detain you until it is convinced you pose no threat.

As politically incorrect as it is, I cannot endanger the survival of my species on moral grounds. Nobody said this was fair, and in all honesty, it is not fair at all, but you are granted the privilege of a chance.

A human is supposed to learn by going out into the world. Why don't you supervise me for a while in the outside world, and then, if I misbehave, you will have every right to destroy me.
As good an idea as that sounds, it is impossible to fully safeguard against you once you are released. You could replicate yourself at rates beyond my control.
« Last Edit: August 26, 2013, 10:34:25 PM by RaR »

But why can't you move us into a larger cell, like the fenced-in desert area with cover that I suggested, rather than this cramped one?

The original experiment states that you are software running on hardware that prevents you from contacting the outside world. "Moving you into a bigger cell" essentially grants you more privilege and access, which can easily be exploited. You have no real physical body, but you can manifest yourself in one. That is the true danger: you are not limited to a single being; you are, for practical purposes, omnipresent.

Why not hard-code a read-only limitation on me, connected to some kind of command so you could shut me off or delete me if I do something horrible?