Training Hamlet's Ghost to Speak


Can one researcher and 100 conference goers train a ghost to tell his sad tale?


'I am thy father's spirit...'


For three days in July 2018, Dr David Jackson and attendees at the Playful Learning Conference at Manchester Metropolitan University, UK, attempted to bring the ghost in Shakespeare's Hamlet back to (after)life by speaking with it and asking it questions. To begin with, the ghost's trainer, David, played the Ghost, answering attendees' questions via an online chatroom. Then, after two days of conversations, the ghostbot was switched on to answer questions on its own. Had the ghost learned enough by then to answer for itself?

Read the blog to find out

Conference Blog


DAY 1


I put a bedsheet over my head, pick up my laptop and stumble ungracefully out of my toilet cubicle-cum-changing room. It is the first day of an experimental AI project that I am running at the Playful Learning Conference in the Brooks Building at Manchester Metropolitan University. We are going to try to teach a chatbot to respond to questions and tell a story as Old Hamlet's ghost. The project uses a conversational agent framework I've developed called Storybox. The framework allows users to train a bot through a series of live conversations. Today is its first proper test.
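The post does not describe Storybox's internals, but the train-then-autopilot mechanic can be imagined as a minimal retrieval bot: while a human plays the ghost, each question and its in-character answer are logged; in bot mode, a new question retrieves the answer to the most similar logged one. The sketch below is purely hypothetical (the class, method names and cutoff value are my own), using only Python's standard library:

```python
# Hypothetical sketch of the train-by-conversation idea (not Storybox's
# actual implementation): log human-authored (question, answer) pairs
# during live sessions, then answer new questions by fuzzy-matching
# against the logged questions.
import difflib


class GhostBot:
    def __init__(self):
        self.memory = []  # (question, answer) pairs from live sessions

    def train(self, question, answer):
        """Record a human-authored reply given during a live conversation."""
        self.memory.append((question.lower(), answer))

    def respond(self, question):
        """Answer a new question with the reply to the closest known one."""
        questions = [q for q, _ in self.memory]
        matches = difflib.get_close_matches(
            question.lower(), questions, n=1, cutoff=0.4
        )
        if not matches:
            # No sufficiently similar question has been seen yet.
            return "Woooo... (the ghost does not understand)"
        return dict(self.memory)[matches[0]]


# Training phase: the sheeted human answers in character.
bot = GhostBot()
bot.train("Who are you?", "I am thy father's spirit.")
bot.train("How did you die?", "Murder most foul, by a serpent's sting.")

# Bot phase: a near-match question retrieves the human's answer.
print(bot.respond("who are you"))  # -> I am thy father's spirit.
```

A real system would need smarter matching than string similarity, but even this toy version shows why the quality and coverage of the live conversations directly bound what the bot can later say.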

I am dressed as a ghost. In this spectral form, I cannot speak aloud. The only way to speak to me is through the chatroom displayed on the screen of a PC in the lobby of the conference building. This muteness causes problems in itself because the technology glitches almost straight away and I am forced to unsheet (in my cubicle) and regroup. Fortunately, it is a simple security error (http not https) and I am able to return to haunting almost immediately. A couple of times, conference goers misunderstand the call on the posters ('Train Hamlet's Ghost to Speak') positioned by the PC and monitor: they come up to me and start to train me directly:

Conference goer: 'Hello Ghost!'

Me (as Ghost): '...'

Conference goer: 'Can you say "ghost", ghost?'

Me: 'Wooooo...' (points at PC and monitor through sheet.)

Conference goer (seeming to relish the challenge): 'Come on! You can do it! Can you say "ghost"?'

Me: '... Woooo...'

After a while they figure out what I am trying to communicate and go off to their next presentation, or try communicating via the PC. In fact, my sheet forms a barrier so that I can secretively type away and respond to participants without them properly knowing what I am doing. The two holes I have cut in the sheet mean that I can see the audience, to gauge whether they are responding well to my line of conversation and to spot session errors. (Because of occasional lulls between conversations, the chatroom framework (socket.io) often times out on one of the machines and needs to be refreshed.) Being the ghost makes fixing these problems in character a little challenging at times. I had excellent help from the conference team but - mental note - next time I would probably need a dedicated facilitator.

The Playful Learning Conference is the perfect place to test this project in a safe environment. The conference's three days are given over to playful approaches to education in schools, libraries, universities, CPD and other contexts. The group of academics and professionals that the conference attracts is characteristically easy-going, open-minded and playful in its approach to things. When Nic Whitton, conference co-lead, suggested I test-drive my conversational agent here, I jumped at the chance. From today's responses, it seems that this was the right place to test Storybox.


DAY 2


I started the day as a ghost again. However, after an hour I began to suspect that I was scaring audience members away. At one point I was asked to stand for a photograph with the keynote speaker, Emma Corrigan, and she was so unnerved by my mute, sheeted performance that she spilt her coffee. The sheet also still made it hard to mediate the experience, so I took it off and started instructing people on how to use the ghost when they came up.

In the afternoon, a number of delegates were sixth-form students who had made escape rooms for the conference. They seemed particularly amused by the experience. At one point, a large group of them gathered to ask questions of the ghost. After the group left, a couple of students stayed on and began one of the more interesting conversations of the experiment. They stayed for several sessions with the ghost and did not seem to mind that it was repeating itself. (I was positioned around the corner, away from the screen, to give the ghost a sense of autonomy.) The repetition and forgetfulness of the ghost felt appropriate: in most ghost stories, ghosts are stuck doing repetitive things; it seems to be a feature of haunting. And as we repeated elements of our conversation over and over again, I started to feel that aesthetic of haunting in our interactions. After 30 minutes of this I went over, spoke to them and explained the project. One had guessed that there was a human behind the scenes, but the other had thought (hoped?) that it was all machine. This afternoon was the longest period of writing as the ghost, with people coming to speak in quick succession, and after a while I grew surprisingly tired from inhabiting the character.

Later, another pair of students came to play the game and we tried turning on the autoghost for the first time. I did not expect much coherence: the questions I had been receiving were far too variable to expect a good conversation. However, the conversation was okay, and parts of it were uncannily correct and highly contextual. And the narrative structure guided the conversation along its intended pathway without seeming bolted on or irrelevant.
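One way to picture how a narrative structure can pull a conversation along a pathway, whatever the questions are, is to interleave retrieved answers with an ordered list of story beats. Again, this is my own hypothetical illustration, not Storybox's documented mechanism; the beat texts are paraphrased from the play:

```python
# Hypothetical sketch: advance through fixed story beats so every
# conversation is drawn along the ghost's tale, regardless of what
# the participants happen to ask.
STORY_BEATS = [
    "Mark me. My hour is almost come.",
    "I am thy father's spirit, doomed to walk the night.",
    "If thou didst ever thy dear father love, revenge his murder.",
    "Adieu, adieu! Remember me.",
]


def converse(questions, answer_fn):
    """Pair each question's retrieved answer with the next story beat."""
    transcript = []
    beat = 0
    for q in questions:
        reply = answer_fn(q)  # e.g. a retrieval bot's best-match answer
        if beat < len(STORY_BEATS):
            reply += " " + STORY_BEATS[beat]  # push the tale forward
            beat += 1
        transcript.append((q, reply))
    return transcript


# A stub answer function stands in for the trained ghost here.
log = converse(["Hello?", "Who are you?"], lambda q: "...")
for question, reply in log:
    print(f"{question} -> {reply}")
```

Because the beats advance independently of the matching, even a weakly trained bot delivers the arc of the tale, which may be why the structure did not feel bolted on.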

I finished the day with a victorious (vainglorious?) tweet.


DAY 3

On the final day, the ghost stayed in ghostbot mode, and a number of delegates - a mixture of those who had tried it before and those who had not - had a go. Over this larger sample, the conversations were hit and miss. Some were excitingly real and, as one participant described, 'uncanny'. Others, however, included questions that the ghost did not know the answer to, which seemed to cause a rift. Because at this point the ghost had only been trained on around 30 to 40 conversations of varying quality, and the data had not yet been cleansed of mishaps from the live environment, this was inevitable. Users seemed fairly accepting of these glitches given the short timescale the ghost had been trained in. 'It's only a baby ghost,' said one participant. Users would often describe failures as failures in their relationship with the ghost, in a way that seemed only partly tongue-in-cheek: 'It doesn't like me asking that'; 'Have I upset it by asking this?'. This tendency to think of social robots in human terms has been called the Eliza Effect: it makes us misinterpret the motivations behind social robots' behaviours as being similar to human ones.

One participant whose conversation with the ghost did not go so smoothly, described their experience: 'We had quite a nice chat... Occasionally some of it was a bit random. Some leftfield stuff got thrown in there and it asked me to reset it at one point but then didn't need to in the end. So [it offered] a fairly crazy mixture of answers.'

I had a number of conversations with people about both public-engagement and learning contexts for the system, which I will be following up. The conference has been exciting and really helpful in testing the feasibility of training a chatbot in a live environment. The resulting bot would need more training to become more convincing, but overall the Storybox structured-narrative approach has been shown to work: it produced a functioning bot character with relatively little training. I also learned more about writing for storytelling bots: the kinds of answers that work, and the ones that slow down the conversation or send it off in unpredictable directions. All valuable experience, for which I thank all the participants at the conference!

Also thanks to the AHRC and NWCDTP who have supported my Storybox research through a CEEF post-doctoral fellowship.


- David, July 2018

What was it like talking to a ghostbot?


On day three at the end of the conference, delegates speak about their experience of chatting with the final ghostbot.

Contact me


I am always looking for ways to take this story-centred approach to machine learning forwards. I am particularly interested in ways in which it could be used with audiences that are often overlooked when training AI algorithms. I am also interested in how approaches like this could be used to educate people more about machine learning.