A new publication on “Improvised Theatre Alongside Artificial Intelligences” will be presented at the 13th AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment (AIIDE’17) in Snowbird, Utah.
This study presents the first report of Artificial Improvisation: improvisational theatre performed live, on stage, alongside an artificial-intelligence-based improvisational performer. The Artificial Improvisor is a form of artificial conversational agent, or chatbot, focused on open-domain dialogue and collaborative narrative generation.
Using state-of-the-art machine learning techniques, spanning natural language processing and speech recognition to reinforcement learning and deep learning, these chatbots have become more lifelike and harder to distinguish from humans. Natural human conversations are seldom limited in scope: they jump from topic to topic, they are laced with metaphor and subtext, and face-to-face communication is supplemented with non-verbal cues.
Live improvised performance takes natural conversation one step further, with multiple actors performing in front of an audience. In improvisation, the audience often supplies the topic of conversation at several points during the performance. These suggestions inspire actors to perform novel, unique, and engaging scenes. During each scene, actors must make rapid-fire decisions to collaboratively generate coherent narratives.
This new work introduces Pyggy and A.L.Ex. (Artificial Language Experiment), the first two Artificial Improvisors, each with a unique composition and embodiment. The paper highlights the research and development behind them, recounts successes and failures along the way, celebrates the collaborations that enabled progress, and discusses future work in the space of artificial improvisation.