AI for TSP Competition

The goal of this competition is to bring AI researchers into contact with problems from Operations Research (OR), specifically logistics problems such as variants of the Traveling Salesman Problem (TSP). In this competition, participants will solve a variant of TSP that is particularly difficult for traditional OR solvers, using a method that depends on the track: online supervised learning (track 1) or reinforcement learning (track 2).
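The announcement does not specify the variant or its data format, so the following is only a sketch of the classic symmetric TSP that such variants build on: a nearest-neighbor construction heuristic of the kind traditional OR baselines start from. All function names and the coordinate-list input format are illustrative assumptions, not part of the competition.

```python
import math

def nearest_neighbor_tour(coords):
    """Greedy nearest-neighbor heuristic for the classic symmetric TSP.

    coords: list of (x, y) city coordinates. Returns a tour as a list
    of city indices, starting from city 0.
    """
    n = len(coords)
    unvisited = set(range(1, n))
    tour = [0]
    while unvisited:
        last = coords[tour[-1]]
        # Visit the closest not-yet-visited city next.
        nxt = min(unvisited, key=lambda j: math.dist(last, coords[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def tour_length(coords, tour):
    """Total length of the closed tour (returning to the start city)."""
    return sum(math.dist(coords[tour[i]], coords[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))
```

Learning-based entries in either track would typically be compared against simple constructive heuristics like this one, or against stronger OR solvers.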

Timeline of the competition:
— 29 March: Send out call for participation.
— 26-30 April: Test period, to see if the baselines work for participants.
— 17 May: Deadline for Phase 1.
— 5 July: Deadline for Phase 2, using a withheld data set.
— 9 August: Winners are contacted privately.
— 21 or 22 August: The winners are publicly announced at the DSO workshop of IJCAI 2021. 

Competition url: 

The Fourth International Competition on Computational Models of Argumentation

The International Competition on Computational Models of Argumentation (ICCMA) aims at nurturing research and development of implementations for computational models of argumentation. ICCMA 2021 focuses on decision problems that are hard for the first or second level of the polynomial hierarchy. For the first time at ICCMA, both exact and approximate algorithms for reasoning with abstract argumentation will be evaluated.
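For readers unfamiliar with the setting, an abstract argumentation framework is a set of arguments plus an attack relation, and reasoning tasks ask which sets of arguments (extensions) are acceptable under a given semantics. The sketch below computes only the grounded extension, the simplest (polynomial-time) semantics; the decision problems ICCMA 2021 targets concern harder semantics, so this is illustrative background rather than a competition solver.

```python
def grounded_extension(arguments, attacks):
    """Grounded extension of an abstract argumentation framework.

    arguments: iterable of argument names.
    attacks: set of (attacker, target) pairs.

    Computed as the least fixed point of the characteristic function:
    starting from the empty set, repeatedly add every argument all of
    whose attackers are attacked by the current set.
    """
    arguments = set(arguments)
    attackers = {a: {x for (x, y) in attacks if y == a} for a in arguments}
    extension = set()
    while True:
        attacked_by_ext = {y for (x, y) in attacks if x in extension}
        acceptable = {a for a in arguments
                      if attackers[a] <= attacked_by_ext}
        if acceptable == extension:      # fixed point reached
            return extension
        extension = acceptable
```

For example, with arguments {a, b, c} and attacks a→b, b→c, the grounded extension is {a, c}: a is unattacked, and a defends c against b.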

Timeline of the competition:
— Mar. 30: solver submission
— Apr. 15: system description submission
— April – August: the organizers will run the various tracks of the competition and collect the results
— August (during IJCAI): announcement of the results, award ceremony

Competition url: 

The Chef’s Hat Cup: Can you beat them all?

In this competition, your goal will be to develop an artificial player for the Chef’s Hat card game. We will run two parallel tracks: a competitive and a cooperative scenario. In the first track, participants will develop the most effective agent to play the Chef’s Hat card game and win it. In the second track, they will have to develop an agent that increases the chances of a dummy agent winning the game.

Timeline of the competition (tentative):
— Starting Date: April 13, 2021
— Final submission: June 15, 2021
— Winners Announcement: July 06, 2021

Competition url: 

The 12th International Automated Negotiating Agent Competition – ANAC 2021

The Automated Negotiating Agent Competition (ANAC) is an international tournament that has been running since 2010, bringing together researchers from the negotiation community and providing a unique benchmark for evaluating practical negotiation strategies in multi-issue domains. This year, we have five different negotiation research challenges: Automated Negotiation League (GeniusWeb framework), Human-Agent Negotiation (IAGO framework), Werewolf Game (AIWolf framework), Supply Chain Management (NegMas framework), and HUMAINE (HUman Multi-Agent Immersive NEgotiation). We expect that innovative and novel agent strategies will be developed, and the submitted ANAC 2021 agents will serve as a negotiating-agent repository for the negotiation community. Researchers can then develop novel negotiating agents and evaluate them by comparing their performance with that of the ANAC 2021 agents.

Timeline of the competition:
— 10 July 2021: Agent submission deadline
— 10 August 2021: Finalist announcement
— 21-26 August 2021: ANAC session at IJCAI

Competition url: 

Automatic Reinforcement Learning for Dynamic JobShop Scheduling Problem

In this competition, participants are invited to develop automatic reinforcement learning solutions to the dynamic job shop scheduling problem (DJSSP). The solutions are expected to automatically train promising agents on a distribution of DJSSP tasks. After a feedback phase where solutions can be developed and fine-tuned with daily feedback, those that perform best on a set of five unseen tasks win the competition.
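The announcement does not describe the DJSSP environment or its API, so the following only sketches a classical dispatching baseline that learned agents in such settings are commonly compared against: the shortest-processing-time (SPT) rule, shown here on the single-machine case where it minimizes total flow time. The function name and input format are assumptions for illustration.

```python
def spt_schedule(processing_times):
    """Shortest-Processing-Time-first dispatching, a classic baseline
    rule in (dynamic) job shop scheduling.

    processing_times: list of per-job processing times on one machine.
    Returns (job order, total flow time), where total flow time is the
    sum of job completion times; SPT minimizes it on a single machine.
    """
    order = sorted(range(len(processing_times)),
                   key=lambda j: processing_times[j])
    t, total_flow = 0, 0
    for j in order:
        t += processing_times[j]   # job j finishes at time t
        total_flow += t
    return order, total_flow
```

A reinforcement-learning agent for DJSSP effectively learns a dispatching policy of this kind, but one that adapts to machine routing, dynamic job arrivals, and the task distribution instead of a fixed rule.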

Timeline of the competition:
— Apr. 15th, 2021: Call for participants.
— Apr. 19th, 2021: Beginning of the competition (the Feedback Phase).
— Jul. 2nd, 2021: End of the Feedback Phase and beginning of the Check Phase.
— Jul. 9th, 2021: End of the Check Phase and beginning of the Private Phase.
— Jul. 16th, 2021: End of the Private Phase.
— Jul. 23rd, 2021: Announcement of the winner.
— Aug. 21st, 2021: Beginning of IJCAI 2021 conference.

Competition url: 

Angry Birds AI Competition

The task of this competition is to develop a computer program that can successfully play Angry Birds. The long-term goal is to build an intelligent Angry Birds playing agent that can play new levels better than the best human players. This is a difficult problem, as it requires agents to predict the outcome of physical actions without complete knowledge of the world, and then to select a good action out of infinitely many possible actions. This is an essential capability of future AI systems that interact with the physical world. The Angry Birds AI competition provides a simplified and controlled environment for developing and testing these capabilities. New in 2021 is an additional Novelty Track, where we introduce novel components to the game during the competition (e.g., new game objects, or changes in the properties of existing game objects) and agents are tasked with detecting when novelty occurs and adjusting their strategies to it accordingly.

Timeline of the competition:
— Early registration: July 10
— Late Registration: August 1
— Agent submission: August 20

Competition url:  

WhoIsWho – a Large Name Disambiguation Benchmark

‘WhoIsWho’ is the world’s largest manually-labeled name disambiguation dataset and benchmark. It has two subtasks. In the Name Disambiguation from Scratch subtask, participants will be provided a set of papers whose authors share the same name and will be asked to partition the papers into clusters, one per distinct author. In the Incremental Name Disambiguation subtask, participants will be provided a set of new papers and the paper lists of a group of authors already in the system, and will need to correctly assign the new papers to the existing authors.
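A common, simple starting point for the from-scratch subtask is co-author-based clustering: papers that share a co-author (besides the ambiguous name) are assumed to belong to the same person, and clusters are the connected components of that graph. This is only a toy baseline under that stated assumption, not the benchmark's reference method; the input format below is hypothetical.

```python
def cluster_by_coauthors(papers):
    """Toy baseline for name disambiguation from scratch.

    papers: dict mapping paper id -> set of co-author names (excluding
    the ambiguous focal name). Returns a list of clusters (sets of
    paper ids), linking papers that share at least one co-author.
    """
    parent = {p: p for p in papers}

    def find(p):                       # union-find with path halving
        while parent[p] != p:
            parent[p] = parent[parent[p]]
            p = parent[p]
        return p

    def union(p, q):
        parent[find(p)] = find(q)

    # Link all papers that mention the same co-author.
    by_author = {}
    for p, authors in papers.items():
        for a in authors:
            by_author.setdefault(a, []).append(p)
    for plist in by_author.values():
        for other in plist[1:]:
            union(plist[0], other)

    clusters = {}
    for p in papers:
        clusters.setdefault(find(p), set()).add(p)
    return list(clusters.values())
```

Real entries typically go further, combining venue, affiliation, title, and citation features, since distinct authors can also share co-authors.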

Timeline of the competition:
— May 1, 2021: Launch the competition website.
— May 10, 2021: Phase I starts, release the training data and validation data, open submission portal.
— July 15, 2021 (23:59 UTC): Team merger deadline.
— July 16, 2021 (23:59 UTC): Phase II starts, release the test data; all participants have 48 hours to download the test set and submit their results on it.
— July 18, 2021 (23:59 UTC): Phases I and II end, close submission for both validation and test prediction.
— July 22, 2021 (23:59 UTC): Announce winners.

Competition url: