Computer Go

    Keywords: Software

Current state of computer go

Since 2006, Go AI programmers have focused on Monte Carlo (MC) methods, and between 2010 and 2012 they made quick progress. The MC programs have proven adept at harnessing the last several decades of increases in processing power. Zen19 has achieved excellent results; see below for an in-depth discussion. Other go AI projects, including Aya, Pachi, and CrazyStone, are also working with MC methods.

This represents a great improvement in the power of Go programs in only a few years. As recently as 2010, John Tromp was able to beat the most powerful Go program on the market in the Shodan Go Bet. (Tromp was actually 2d EGF at the time of the series.) In a 2012 rematch, Zen beat John Tromp. On the other hand, it is still uncertain how far MC methods can bring computer Go. The most powerful computers still cannot defeat strong amateurs in even games, and programs that run on mobile devices cannot defeat competent beginners. Thus, despite promising developments, computer Go still lags far behind the programs that play chess and other traditional games.

(This section describes the state of computer Go as of July 2013. If it is not updated, this information may become outdated.)

Historical background

Go has the highest state-space complexity of any popular board game, which makes it the hardest such game for a computer program to master. It is therefore expected to take longer for programs to master Go than any other classic game.

The first Go program to play a full game was written by Al Zobrist in 1968. The first time a computer competed in a human Go tournament was in the 1980s, when Nemesis played at the Massachusetts Go Club. {More details?}

In 1995, professional player Janice Kim defeated the top program of the day {which?} despite a whopping 25-stone handicap. By the early 2000s, computer programs {which?} were at the doorstep of amateur 1 dan - quite a large improvement, but still light-years away from the realm of top human professionals.

In 2006 a new approach to Go AI began sweeping the Go scene: Monte Carlo. This method builds a tree of potential moves and responses, but evaluates each move in the tree using semi-random playouts. In a playout, the program plays the game to completion using either random moves or very simple heuristics. When a playout results in a win, the score of the move it originated from improves, and the best-performing move is chosen. As of 2012, a program playing on KGS has reached 6 dan. Naive extrapolation would say that computers will reach professional strength sometime between 2015 and 2018. Unfortunately, no one knows any way to reliably estimate future progress: strength improvements could slow at any time, or new ideas could accelerate them.
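The playout idea above can be sketched in a few lines of Python. This is a toy illustration, not code from any actual engine: to keep it self-contained it uses a trivial Nim-like take-away game in place of Go, and the function names (`legal_moves`, `playout`, `monte_carlo_move`) are invented for the example. A real Go program would substitute a board representation, Go rules, and heuristic playout policies, and would grow the tree selectively (UCT) rather than sampling every move equally.

```python
import random

# Toy stand-in for Go: a pile of stones; players alternately remove
# 1-3 stones, and whoever takes the last stone wins.
def legal_moves(pile):
    return [m for m in (1, 2, 3) if m <= pile]

def playout(pile, to_move):
    # Play purely random moves to the end of the game;
    # return the winning player (0 or 1).
    while pile > 0:
        pile -= random.choice(legal_moves(pile))
        to_move = 1 - to_move
    # The player who just moved took the last stone and wins.
    return 1 - to_move

def monte_carlo_move(pile, player, n_playouts=2000):
    # Score each candidate move by the fraction of random playouts
    # won from the resulting position; pick the best-performing move.
    best_move, best_rate = None, -1.0
    for move in legal_moves(pile):
        wins = sum(playout(pile - move, 1 - player) == player
                   for _ in range(n_playouts))
        rate = wins / n_playouts
        if rate > best_rate:
            best_move, best_rate = move, rate
    return best_move
```

Even with uniformly random playouts, the statistics favour moves that lead to positions with many winning continuations, which is the core insight that made Monte Carlo Go work.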

It has been argued that these ranks are a bit deceptive, however, because with sufficient practice playing against computers, a human can often adapt to the weaknesses of the programs and subsequently perform much better against them (see the Shodan Go Bet for one attempt to confirm this idea). It is also commonly believed that on Go servers people tend to take games against computer opponents less seriously. Today, the top programs have won against professionals at a 4 or 5 stone handicap.

Thus computer Go is a very exciting part of Artificial Intelligence (AI), and many new ideas and techniques are yet to be discovered. It is interesting to note that chess has a much lower state-space complexity than Go, and computers surpassed the top human players much sooner in chess than in Go. Because Go's state-space complexity is so much higher than that of most other games, finding an algorithm that plays at a high level can be expected to take correspondingly longer.

In fact, Go endgames have been proven to be PSPACE-hard, let alone other parts of the game, and many other aspects of Go, including life and death, are known to be NP-hard. This makes it very unlikely that a reasonably fast algorithm for playing perfect Go exists. So it looks like it's all about heuristics (surprise, surprise) - or, at best, it will take a long time to find a reasonably fast algorithm that plays Go at a very high level.

Patrick Taylor: I'm not sure heuristics are all that bad. Humans don't know the perfect move sequence any more than computers do. Therefore, we basically play with heuristics as well. Proverbs are essentially heuristics used by human players to approximate good play.

In July 2011, KGS records show the computer program Zen19s achieving 4 dan. In June and July 2011, Zen19s played 700 games on the KGS Go server, at 20 minutes main time and then 30 seconds per move. A player can download the games of Zen19s from the KGS server, study them to find the program's weaknesses, and try to exploit them. After 700 games and almost two months, Zen19s is 4 dan on KGS. This shows that finding the weaknesses of computer players is just as hard as finding the weaknesses of human players; computers have truly reached the 4 dan level. Successful chess programs do not use brute force, but are selective in finding a move; chess programs that use brute force play at a low level because of the huge branching factor, and chess programmers do not use fast algorithms for playing perfect chess. Mark Schreiber, July 22, 2011


(Note: 2006 was the last year that GNU Go won the Computer Olympiad.)

Strength of strongest programs

It is natural to want to know both which Go program is the strongest, and how strong it is. However, due to the recent progress of computer Go, a definite answer is complicated.

As of July 2013, the common wisdom of the western Go community is that Zen is the strongest program. Zen placed first in the 19x19 section of the 14th Computer Olympiad in 2009 (second: Fuego, third: MoGo) and second in the 19x19 section of the 15th Computer Olympiad (losing to Erica, but beating Many Faces of Go). {Are results for 2011 and 2012 available online?} In the last year Zen has lost monthly Computer Go tournaments on KGS to both CrazyStone and Pachi, and lost the 2013 UEC Cup to CrazyStone on points, despite beating CrazyStone in the direct match-up.

Zen's reputation was established largely by two convincing victories over Takemiya Masaki in March 2012, at five and at four stones. Zen has scored other victories over pros as well. However, it is not clear how seriously professionals have been taking these exhibition matches. At TAAI2012?, for example, Zhou Junxun used only 8 of his 45 minutes of thinking time in his game against Zen.

As no professional system has awarded Zen a rank or arranged a formal contest between Zen and a professional player, the best gauge of Zen's strength is its rank on KGS. On July 20, 2013, the account Zen19d was a strong 5d on KGS. Zen19d was 6d for a few games, but was unable to defend the rank. However, because Zen is only available to play on KGS for short periods, demand is high, and many of its opponents play it at times and at time settings that they would otherwise refuse. Furthermore, in July 2013 Zen19d played 346 games; in nearly half of them (147 games) Zen gave three or more handicap stones, and in 10% (34 games) it gave six stones - and high-handicap games favor White. It remains to be seen what rank Zen can defend on a long-term basis.

However, stubborn skeptics have consistently claimed that the KGS ranks of strong programs are inflated, and have been embarrassed by the programs' fast progress again and again. Programs went from 2 kyu in 2007 to 6 dan in 2012. See KGSBotRatings.

References

See the [ext] Computer Go Bibliography maintained by the [ext] Computer Go Group at the [ext] University of Alberta.


List of Existing Go Playing Programs

Please see Go Playing Programs for a discussion about the best programs currently available.


List of Existing Problem Solving Programs

Robots

Some online Go servers, such as KGS, provide software opponents, or robots, to clients. Robot is derived from the Czech word robota, meaning drudgery or slave labour, which certainly describes the work needed to pummel double-digit kyu players like me. PatG


Anti-Computer Strategy

It might be entertaining and educational to consider strategies for beating computers at Go.

The current generation of strong Monte Carlo programs may be confused by ko. Also, while the weak Go programs of the 1990s were relatively strong at the endgame, this is no longer a special strength of bots, and it is possible to make up points in the endgame.


Competitions

Below are competitions where Go playing programs can be tested.

International Competitions:

Regional Competitions:

  • European Championship (1987-2005,2008)
  • US Championship (1988-2000,2008)
  • USENIX (1984-1988)

Small Board Competitions:


History of Go Programs vs Human Professional Matches

Program vs Professional


History of Go Programs

  • [1968]: Albert L. Zobrist wrote the first ever program which played complete Go games.
  • [1989]: GNU Go 1.1 was posted to comp.sources.games March 13 1989
  • Ishi - old file format
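Zobrist's name also survives in modern engines through Zobrist hashing, his scheme for hashing board positions so that placing or removing a stone updates the hash with a single XOR. A minimal sketch in Python (the names `ZOBRIST` and `position_hash` are illustrative, not from any particular program):

```python
import random

BOARD_POINTS = 19 * 19
COLORS = 2  # 0 = black, 1 = white

# One fixed random 64-bit key per (point, color) pair, chosen at startup.
random.seed(2013)
ZOBRIST = [[random.getrandbits(64) for _ in range(COLORS)]
           for _ in range(BOARD_POINTS)]

def position_hash(stones):
    """stones: dict mapping point index -> color (0 or 1)."""
    h = 0
    for point, color in stones.items():
        h ^= ZOBRIST[point][color]  # XOR is order-independent and self-inverse
    return h
```

Because XOR is its own inverse, an engine never rehashes the whole board: applying or undoing a move is one XOR per stone changed, which also makes superko (repeated-position) detection cheap.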

Articles in Magazines


Links

(Smaller Go Bibliography, with comments)

  • [ext] http://gobase.org/go-7.html : Jan van der Steen's computer Go page has numerous links to famous programs and programmers as well as programming resources and articles.

Maillist(-Archives)

  • gnugo-list

Authors: Gounter, Chestnut



Computer Go last edited by 172.0.9.139 on July 27, 2013 - 01:42