Pseudo Chatbot
Taking a break from the influence posts today to mention a new feature in the Jerx App that will be available in an update either as you read this or very soon after.
This idea came from supporter Dustin W., who wrote in an email:
ChatGPT and Bard and other AI programs are a great premise for a prediction if you want to dodge the psychic claim.
I think this is a very solid idea.
Imagine it as part of a Shuffle-Bored presentation.
You bring up an AI chat page on your phone. It’s not one of the standard ones. It’s a dark-web-only version or some proprietary site that you have to have an invitation to be able to use.
You enter a prompt that says something like:
I want you to describe the configuration of a deck of cards that has been shuffled and mixed face-up into face-down cards by a 38-year-old woman named Carmen from Tupper Lake, New York.
The AI shoots out a reply like:
🤖The configuration of a deck of cards that has been shuffled and mixed face-up into face-down cards by a 38-year-old woman named Carmen from Tupper Lake, New York is likely to be:
24 face-up cards
Of the face-up cards, eight of them will be black.
All the black face-up cards are clubs.
This is the likely configuration of a deck of cards that has been shuffled and mixed face-up into face-down cards by a 38-year-old woman named Carmen from Tupper Lake, New York.
My mistake, I just noticed that one of the black face-up cards is actually the two of spades.🤖
You could scroll the page so that the Two of Spades punchline isn’t revealed until you want it to be.
Of course, this could be used as a reframe for any kind of prediction effect that forces the spectator into one outcome.
So I shot that idea over to Marc Kerstein and he has added it to the Jerx App. In the settings for the chat function you just enter whatever you want the AI to spit out and regardless of what you write in the prompt during performance, it will just shoot out whatever canned response you have.
It delivers the response in the "typing" style used by many AI chatbots.
What happens if your friend wants to ask it another question?
Additional prompts generate this message:
You have no remaining data credits. Credits renew in 30 days.
Aww, too bad. Sorry, friend.
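If you're curious how something like this hangs together under the hood, the logic is dead simple. This is just an illustrative sketch (the class and method names are my own invention, not anything from Marc's actual app): the prompt is thrown away, one preset reply gets "typed" out, and any follow-up question hits the data-credits wall.

```python
import sys
import time

class PseudoChatbot:
    """Sketch of a canned-response chat: whatever the prompt says,
    the bot 'types' out one preset reply, then claims to be out of
    credits. Illustrative only, not the Jerx App's actual code."""

    CREDITS_MESSAGE = "You have no remaining data credits. Credits renew in 30 days."

    def __init__(self, canned_reply, credits=1):
        self.canned_reply = canned_reply
        self.credits = credits

    def ask(self, prompt, delay=0.03):
        # The prompt is ignored entirely; only the preset reply matters.
        if self.credits <= 0:
            return self.CREDITS_MESSAGE
        self.credits -= 1
        # Fake the character-by-character "typing" style of real chatbots.
        for ch in self.canned_reply:
            sys.stdout.write(ch)
            sys.stdout.flush()
            time.sleep(delay)
        sys.stdout.write("\n")
        return self.canned_reply

bot = PseudoChatbot(
    "24 face-up cards. All the black face-up cards are clubs... "
    "My mistake, one of them is actually the two of spades."
)
bot.ask("Describe Carmen's shuffled deck.", delay=0)  # types the canned reply
print(bot.ask("One more question?"))                  # out of credits
```

The whole trick is that the "AI" never computes anything; the one-credit limit is just a graceful way to stop your friend from asking a second question it can't answer.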
Don’t call it your “magic chatbot” or something corny like that. This is just some strangely accurate underground chat program that you have a special invite to.
If they come to you 30 days later and ask you to ask another question to the chatbot, just say, “Ah, they made that thing illegal. I wasn’t even really supposed to show it to you.”
Right now it’s very simple to operate. Just enter the wording you would like it to shoot out and it will do so. You could make it as simple as having it say, The 3 of Clubs, and then asking it in performance, “What card did my friend pick?” Although when it’s that simple, I don’t think it will have much of an impact. The point of adding names, ages, and locations to the sample question/answer above is to suggest that maybe some sort of interesting calculation is happening in the background. But you can do it however you like.
If I feel there’s demand for it, I may ask Marc to make it a little more complex, with multiple saved responses and possibly the ability to enter part of the output secretly while typing the URL. That way it wouldn’t be limited to forces, but could also be used for free choices (similar to what we did with the Wisdom of Crowds word reveal that’s also in the app). But I may just keep things simple with it.