2016 Competition Results and Future Plans

We are pleased to announce that the winner of our 2016 competition is the BYU TextPlayer, by David Wingate, Daniel Ricks, Nancy Fulda and Ben Murdoch from the Perception, Control and Cognition lab at Brigham Young University.

The winning team achieved an impressive score of 18 (out of a maximum of 100) on our test environment game, and they are considering releasing their agent as open source after publishing their methodology.

The competition team are now considering how best to proceed in 2017 and are keen to hear:

  • whether there is enthusiasm from the community for the competition to run again;
  • about any issues that prevented teams from submitting this year.

Please feel free to reply via the mailing list or contact the competition organisers directly at TextAdventureAIComp@gmail.com.

Submission Instructions

With just over a week left until the competition deadline, it's great to see activity picking up on the Google Group and GitHub repo!

To submit an agent, please email the following to TextAdventureAIComp@gmail.com:

  • all files necessary to run your agent;
  • instructions for executing your agent;
  • a brief (1 page max) document explaining the method you have used.

The absolute final deadline for the submission of agents is August 31st 2016, 23:59 UTC-12. After this date no revisions of agents will be accepted, so it is worth sending us your code early and talking with the team to ensure that we can run it as you intend.

We're excited to see what you have all been working on, and look forward to hopefully meeting some of you at the conference!

Potential Prizes for Student Participants

We're excited to note that Professor Simon Lucas has just announced that there is potentially prize money available to student participants attending the 2016 IEEE Conference on Computational Intelligence and Games.

We must note that there are strict conditions on the availability of prize money, and we cannot guarantee its availability at this time. We will post any news on the blog as soon as we can after the submission deadline (August 31st 2016, 23:59 UTC-12).

Competition Announcement

We are proud to announce the 1st Text-Based Adventure AI Competition will be taking place at the 2016 IEEE Conference on Computational Intelligence and Games in Santorini, Greece.

Before the widespread availability of graphics displays, text adventure games such as Colossal Cave Adventure and Zork were popular in the role-playing gaming community. Due to their richness, text adventure games offer a useful testbed for AI research. Building a fully autonomous agent for an arbitrary text-adventure game is AI-complete. However, we plan to provide a graded series of test cases, allowing competitors to gradually increase the sophistication of their approach to handle increasingly complex games.

One of the most challenging problems in AI is still that of automatic model acquisition. In 1959, Newell and Simon's `General Problem Solver' (GPS) famously solved a variety of planning problems such as "The Monkey and Bananas" and "Tower of Hanoi". However, the domain had to be entirely operationalized beforehand by researchers, leaving GPS only the task of finding the appropriate actions. Of much greater importance, therefore, is the question of how an agent can automatically acquire the knowledge necessary to successfully solve even a simple domain. We believe that our testbed is appropriate for building towards such an agent, and it may also shed light on the relative merits of model-based and model-free approaches.

The competition will be scored according to two independent criteria:

  • C1: Score on an unseen game instance (objective, built-in to the instance)
  • C2: Freedom from a priori bias (subjective decision by the judges)

C1 is the dominant criterion, with C2 deciding in the event of a tie. C2 is intended to motivate participants to favour agents that have no (or little) prior knowledge of the problem domain built in.
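This tie-break rule amounts to ranking entries lexicographically on the pair (C1, C2). A minimal sketch in Python, using entirely hypothetical agent names and scores for illustration:

```python
# Rank competition entries: C1 (objective game score) is dominant;
# C2 (judges' score for freedom from a priori bias) breaks ties.
# Higher is better for both. All values below are made up.
entries = [
    {"agent": "AgentA", "c1": 18, "c2": 7},
    {"agent": "AgentB", "c1": 18, "c2": 9},
    {"agent": "AgentC", "c1": 12, "c2": 10},
]

# Python compares tuples element by element, so sorting on (c1, c2)
# in descending order applies C1 first and C2 only on ties.
ranked = sorted(entries, key=lambda e: (e["c1"], e["c2"]), reverse=True)

for e in ranked:
    print(e["agent"], e["c1"], e["c2"])
```

Here AgentB would place first: it ties AgentA on C1 (18) but wins the C2 tie-break, while AgentC's higher C2 cannot overcome its lower C1.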

The deadline for the submission of agents is August 31st 2016, 23:59 UTC-12. Further details on how to submit will be announced here closer to the deadline.

To get started, please visit the GitHub repo and join the conversation in our Google Group mailing list.