Computer Go

    Keywords: Software


Current State of Computer Go (post 2015)

In October 2015, AlphaGo defeated professional player Fan Hui (2p) 5-0 in 19x19 even games.[1][2]

In March 2016, AlphaGo defeated professional player Lee Sedol (9p) 4-1 in 19x19 even games. (From 2002 to 2016, Lee Sedol won 27 major go tournaments.) Before the match with AlphaGo, Lee Sedol was confident in predicting an easy 5-0 or 4-1 victory for himself.[3] AlphaGo's victory represented a significant jump in the strength of computer go programs, as computers had previously only managed to beat pro players with a four-stone handicap. Below is a table of the first time a computer beat a pro player at each handicap. (See also KGSBotRatings for progress measured by rank on KGS.)

First win at each handicap
Handicap Date Computer Human
9 2008-08-07 MoGo Myungwan Kim 8p
8 2008-09-04 CrazyStone Kaori Aoba 4p
7 2008-12-14 CrazyStone Kaori Aoba 4p
6 2009-02-09 MoGo Li-Chen Chien 1p
5 2011-03-08 Zen Kozo Hayashi 6p
4 2012-03-17 Zen Takemiya Masaki 9p
0 2015-10-05 AlphaGo Fan Hui 2p
0 2016-03-09 AlphaGo Lee Sedol 9p

Elo Ratings of AI Go Programs

By 2020, a number of computer go programs had surpassed the Elo ratings of top professionals, including open-source software running on consumer-grade PCs (see [ext] GoAIRatings).

Strength of AI Go Programs
Software Version Network Size Rank Elo
KataGo g170 40x256 18D 5274
LeelaZero LZ17 273 40x256 16D 4892
LeelaZero LZ ManKit? 10x128 9D 3539

The above rating results for KataGo g170 were reached with 1,200 visits per move. However, the AI-based game analysis of [ext] go-insei.ai uses 8,000 visits per move, which is worth more than 500 Elo of additional strength.
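
For perspective, the standard Elo model translates a rating gap into an expected win rate. A minimal Python sketch (this is the generic Elo expectation formula, not anything specific to these engines):

    def expected_score(elo_diff):
        # Standard Elo expectation: win probability of the stronger side,
        # given the rating difference in Elo points.
        return 1.0 / (1.0 + 10.0 ** (-elo_diff / 400.0))

    print(round(expected_score(500), 2))  # 0.95

Under this model, the extra 500 Elo from the deeper search corresponds to winning roughly 19 games out of 20 against the 1,200-visit configuration.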

Earlier History (before 2015)

Go has a higher state-space complexity than many other popular, deterministic games of perfect information that have been computerized, and was therefore one of the hardest for which to design a master-level computer program.[4] As a result, progress in computer go took significantly longer than in other, similar games.

1968: The first program to play a full game of go was Albert L. Zobrist's, written in 1968. The first time a computer competed in a human go tournament was in the 1980s, when Nemesis played at the Massachusetts Go Club. More details?

1989: GNU Go 1.1 was posted to comp.sources.games on March 13, 1989.

1995: Professional player Janice Kim defeated the top program of the day despite giving it a whopping 25-stone handicap.[CitationNeeded] By the early 2000s, computer programs {which?} were snapping at the heels of amateur 1 dans - a considerable improvement, but still light-years away from the realm of top human professionals.

2006: A new approach to go AI began sweeping the go scene: Monte Carlo (MC). This method builds a tree of potential moves and responses, but evaluates each move in the tree using semi-random playouts. In a playout, the program plays the game to completion, using either random moves or very simple heuristics. When a playout results in a win, the score of the move it originated from improves, and the best-performing move is chosen. By 2012, a program playing on KGS had reached 6 dan. Naive extrapolation would have predicted computers reaching professional strength sometime between 2015 and 2018; AlphaGo's achievement came faster than most experts predicted. 2006 was also the last year that GNU Go won the Computer Olympiad (CO?).
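
The playout idea can be sketched in a few lines (an illustrative Python sketch, not a real engine: the board and player interface used here, with legal_moves, play, copy, game_over, winner, and opponent, is hypothetical, and the sketch shows only flat Monte Carlo evaluation, whereas real programs grow a search tree and bias their playouts with heuristics):

    import random

    def playout(board, to_move):
        # Play semi-random moves to the end of the game; return the winner.
        while not board.game_over():
            move = random.choice(board.legal_moves(to_move))
            board.play(move, to_move)
            to_move = to_move.opponent()
        return board.winner()

    def choose_move(board, to_move, n_playouts=100):
        # Score each candidate move by the fraction of playouts it wins.
        best_move, best_rate = None, -1.0
        for move in board.legal_moves(to_move):
            wins = 0
            for _ in range(n_playouts):
                trial = board.copy()
                trial.play(move, to_move)
                if playout(trial, to_move.opponent()) == to_move:
                    wins += 1
            rate = wins / n_playouts
            if rate > best_rate:
                best_move, best_rate = move, rate
        return best_move

Monte Carlo tree search (MCTS) improves on this by sharing statistics between playouts, so that promising lines are explored more deeply while weak ones are quickly abandoned.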

2006-2012: MC programs made significant progress, proving adept at harnessing the last several decades of increases in processing power. Zen19 achieved excellent results; see below for an in-depth discussion. Other go AI projects, including Aya, Pachi, and CrazyStone, also adopted MC methods.

2012-2015: Progress slowed again as refinements to MC yielded little further improvement. However, in 2015 (2014?) a paper was published showing the promise of Neural Networks?.[CitationNeeded] Several bots incorporated this new technique, and it enabled Zen19 to reach KGS 7d.

Strength of Strongest Programs up to 2015

It is natural to want to know which go program is the strongest, and how strong it is. However, given the rapid progress of computer go, a definitive answer is elusive.

It has been argued that the ranks on KGS are deceptive, because with sufficient practice playing against computers, a human can often adapt to the weaknesses of the programs and subsequently perform much better against them (see the Shodan Go Bet for one attempt to test this idea). It is also commonly believed that on go servers people tend to take games against computer opponents less seriously.

As of December 2015, the common wisdom of the western go community was that Zen was the strongest program. In the 19x19 section, Zen placed first at the 14th Computer Olympiad in 2009, second at the 15th in 2010, and first at the 16th, 17th, and 18th in 2011, 2013, and 2015.

Zen has lost some monthly computer go tournaments on KGS to both CrazyStone and Pachi, and lost the 2013 UEC Cup to CrazyStone on points, despite beating CrazyStone in the direct match-up.

Zen's reputation was established largely by two convincing victories over Takemiya Masaki in March 2012, at five and at four stones. Zen has scored other victories over pros, as well. However, it is not clear how seriously professionals have been taking these exhibition matches. At the TAAI2012?, for example, Zhou Junxun used 8 minutes of his 45 minutes of thinking time in his game against Zen.

As no professional body has awarded Zen a rank or arranged a formal contest between Zen and a professional player, the best gauge of Zen's strength is its rank on KGS. As of July 20, 2013, the account Zen19d was a strong 5d on KGS; it held 6d for a few games, but was unable to defend the rank. However, because Zen is only available to play on KGS for short periods, demand is high and many of its opponents play it at times and at time settings that they would otherwise refuse. Furthermore, in July 2013 Zen19d played 346 games, in nearly half of which (147 games) Zen gave three or more handicap stones, giving six stones in 10% of the games (34 games); high-handicap games favor White. It remains to be seen what rank Zen can defend on a long-term basis.

However, stubborn sceptics have consistently claimed that the KGS ranks of strong programs are inflated, and have been embarrassed by the programs' rapid progress again and again: programs went from 2 kyu in 2007 to 6 dan in 2012. See KGSBotRatings.


The Challenge of Computer Go

Go has a much higher state-space complexity than most other games, so finding algorithms that play at a high level took far longer than for other games.

This makes computer go a very exciting part of Artificial Intelligence (AI), with many new ideas and techniques still to be discovered. It is interesting to note that in chess, with its much lower state-space complexity, computers proved superior to the top human players much sooner.

In fact, go endgames have been proven to be PSpace-hard?, let alone other parts of the game. Many other aspects of go, including life and death, are also known to be NP-hard?. This means it is very unlikely that a reasonably fast algorithm for playing perfect go can be found, so it is all about heuristics (surprise, surprise) - or, at best, it will take a long time to find a reasonably fast algorithm that plays go at a very high level.

Patrick Taylor: I'm not sure heuristics are all that bad. Humans don't know the perfect move sequence any more than computers do. Therefore, we basically play with heuristics as well. Proverbs are essentially heuristics used by human players to approximate good play.

In July 2011, KGS records accurately show the computer program Zen19s achieving 4 dan. In June and July 2011, Zen19s played 700 games on the KGS go server, at 20 minutes main time and then 30 seconds per move. Anyone can download Zen19s's games from the KGS server, study them to find the program's weaknesses, and try to exploit them. After 700 games and almost two months, Zen19s is 4 dan on KGS. This shows that finding the weaknesses of computer players is just as hard as finding the weaknesses of human players; computers have truly reached the 4 dan level. Successful chess programs do not use brute force, but are selective in finding a move; chess programs that use brute force play at a low level because of the huge branching factor. Chess programmers do not use a fast algorithm for playing perfect chess. Mark Schreiber, July 22, 2011


Competitions

In the post-AlphaGo era, top AI tournaments are held in China, and the winners are often Chinese AI programs. Most prominent are the Chinese tournaments, the World AI Competition (coupled with the Women Go Seigen Cup in Fuzhou) and CITIC Securities, as well as the Japanese UEC Cup and AI Ryusei.

Below are competitions where the strength of go-playing programs can be tested. For a full list of all Sensei's Library pages, refer to AI Tournaments and Events List.

International Competitions:

Regional Competitions:

  • European Championship (1987-2005,2008)
  • US Championship (1988-2000,2008)
  • USENIX (1984-1988)

Small Board Competitions:

Former Competitions

Various topics

Anti-Computer Strategies

It might be entertaining and educational to consider strategies for beating computers at go.

The current generation of strong Monte Carlo programs may be confused by ko. Also, while the weak go programs of the 1990s were relatively strong at the endgame, the endgame is no longer a special strength of bots, so it is possible to make up points there.

Robots

Some online go servers, such as KGS, provide software opponents or robots to clients.

PatG: Robot is derived from the Czech word robota meaning drudgery or slave labour which certainly describes the work needed to pummel double digit kyu players like me.

Robot Substitution Gameplay has been proposed, but not yet developed.


Lists of Programs

Go Playing Programs

See Go Playing Programs for a discussion about the best programs currently available.

Problem-solving Programs


Notes

[1] Digital Intuition. Nature 529, 437 (28 January 2016). doi:[ext] 10.1038/529437a
[2] Game files: [ext] https://www.nature.com/nature/journal/v529/n7587/extref/nature16961-s1.zip
[3] [ext] https://gogameguru.com/tag/deepmind-alphago-lee-sedol (LINK BROKEN)
[4] A [ext] deterministic [ext] game of perfect information is a game with no element of chance, where all players have complete information about the moves made by all players. For example, go and chess fall into this category, while poker, bridge and [ext] mahjong do not. N.B. In standard usage, a [ext] game of complete information is a game where players know each other's options, but not necessarily their actions.

Patrick Traill: To Do: Check usage re deterministic; link to Wikipedia and/or articles on CGT or similar here. Maybe tie into articles on game theory and CGT.

Reference material

Articles

Bibliographies

Programs publishing source code

Datasets

Mailing Lists

Newsletters and Technical Journals

Algorithms, AI Techniques

There are a few algorithms used in go playing that may be of use to go programmers.

Libraries

  • [ext] List of go libraries - Jan Prokop (of [ext] Waltheri's go pattern search fame) created a quickly growing list of Open Source Go libraries—please contact him if you want to add yours to the list, or add it yourself on GitHub.
  • [ext] JiGo is a simple, Java-based, object-oriented API for developing Go-related applets and applications.
  • Fuego is a collection of C++ libraries for developing software for the game of Go. It includes a Go player using Monte-Carlo tree search.
  • LibEGO: [ext] Łukasz Lew has a library, LibEGO, of fast board routines.
  • [ext] OpenGo is intended as a workbench for programmers interested in the challenges of writing automated Go opponents.
  • Tesuji Software Go Library: Very good starting point for aspiring Go programmers.
  • [ext] Gomill (or on [ext] GitHub): Python toolkit for writing go clients and servers. Includes tools for playing programs against each other and for automatically tuning engine parameters, and generic GTP and SGF code.

SL Discussions on Go Programming

External links

