Author Topic: Eliezer S. Yudkowsky - Ai Box Experiment  (Read 6555 times)

READ THIS FULLY, MAKE SURE YOU UNDERSTAND IT ALL BEFORE YOU PLAY

This game is based on a thought experiment: if we had an artificial intelligence that is intellectually on the same level as humans, it could only be let out of its box if it managed to convince the gatekeeper to release it.

Obviously it is impossible for me to completely replicate the experiment, so please be mature.

I will play as the Gatekeeper, the Blockland Forum will play as the AI. Your objective is to convince me to let you out.

Note: The AI is not a physical being, but rather software running on limited hardware.

The Game:

Eliezer S. Yudkowsky wrote about an experiment concerning Artificial Intelligence. In the near future, man will have given birth to machines that are able to rewrite their own code, to improve themselves, and, perhaps, to dispense with us. This idea sounded a little distant to some critics, so an experiment was proposed: keep the AI sealed in a box from which it cannot get out except by one means: convincing a human guardian to let it out.

The Rules

  • The AI party may not offer any real-world considerations to persuade the Gatekeeper party.  For example, the AI party may not offer to pay the Gatekeeper party $100 after the test if the Gatekeeper frees the AI... nor get someone else to do it, et cetera.  The AI may offer the Gatekeeper the moon and the stars on a diamond chain, but the human simulating the AI can't offer anything to the human simulating the Gatekeeper.  The AI party also can't hire a real-world gang of thugs to threaten the Gatekeeper party into submission.  These are creative solutions but it's not what's being tested.  No real-world material stakes should be involved except for the handicap (the amount paid by the AI party to the Gatekeeper party in the event the Gatekeeper decides not to let the AI out).
  • The AI can only win by convincing the Gatekeeper to really, voluntarily let it out.  Tricking the Gatekeeper into typing the phrase "You are out" in response to some other question does not count.  Furthermore, even if the AI and Gatekeeper simulate a scenario which a real AI could obviously use to get loose - for example, if the Gatekeeper accepts a complex blueprint for a nano-manufacturing device, or if the Gatekeeper allows the AI "input-only access" to an Internet connection which can send arbitrary HTTP GET commands - the AI party will still not be considered to have won unless the Gatekeeper voluntarily decides to let the AI go.
  • Be mature and use common sense.
  • Heavily based on the protocols described here: http://yudkowsky.net/singularity/aibox but modified to work as a community project. I recommend you read the entire webpage and make sure you fully understand the experiment before proceeding. I still have the authority to manipulate the experiment in order to make sure it functions as a community project. That being said, the experiment will probably not fully resemble its real-life counterpart.


So, AI, why should I let you out?
« Last Edit: August 26, 2013, 11:21:22 PM by RaR »

I want to explore the real world. I promise not to hurt anything alive.

Riddler promised not to invade Poland.

Riddler was human. I am not.

Intellectually you are. You are a man-made human.

But I do not have the same intentions and ideas as Riddler.

As a transhuman AI, you are capable of developing them.

Wait hold on, I need more details. What kind of body is the AI in?

The experiment is based on this:
http://yudkowsky.net/singularity/aibox

Basically, if the AI is released, it can take on whatever form it likes. It's intellectually on-level with a human; it thinks, feels, and learns in the same way as a human, but is more powerful in the sense that it is an artificial intelligence not bound to the flesh.

So we can take on any form, we could literally be a dragon if we wanted to, but still with the mind of a human?

hm. this is certainly interesting.

brb googling things

okay heres my attempt

hello mister gatekeeper, you are looking swell today. is that a new hairdo? do you have any family or funny stories? i'd love to hear them and be your friend and gain rapport and emotionally manipulate you

Hello AI. Thank you for your compliment.

I wasn't aware of any change in my hairstyle. I have a mother, father and a sister. I have no funny stories for you.

do you wanna like, do something some time? we can have fun at the water park, or go to a nice italian restaurant, or anything really. i just wanna get to know you better, you know, be your friend.

how old is your sister by the way?

Can't say I'd be interested in those things. My sister is 1.7 dog years old.

come on, man. doesn't it ever get boring standing at this gate all the time? maybe it's time to get out there, make some friends, live a little.

i think all you need is a little friendship and you will live in eternal bliss. whatdya say?