New York: Achieving a landmark in Artificial Intelligence (AI), a Google-run programme has defeated a reigning human champion of Go - a complex Chinese board game that has confounded AI experts for decades.
While machines have outplayed the best human players at games such as chess, draughts and backgammon in the past, this is the first time a computer programme has defeated a professional human player at Go.
DeepMind, Google's secretive artificial intelligence arm, said that its programme AlphaGo defeated Fan Hui - the reigning European champion of a game that originated in China more than 2,500 years ago.
Fan Hui, who has devoted his life to the game since the age of 12, lost to AlphaGo five games to nil in a match held in London in October last year.
Go -- a game of profound complexity -- is played by more than 40 million people worldwide. The number of possible board positions is greater than the number of atoms in the observable universe.
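That scale can be checked with simple arithmetic: each of the 361 intersections on a 19x19 board is either empty, black or white, which gives a crude upper bound of

\[
3^{361} \approx 1.7 \times 10^{172},
\]

far more than the roughly \(10^{80}\) atoms estimated to exist in the observable universe. The count of legal positions is lower than this bound, but still astronomically large.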
Go involves two players taking turns to place black or white stones on a board, aiming to capture the opponent's stones or to surround empty space and so score points of territory.
"When we set out to crack Go, we took a different approach. We built a system, AlphaGo, that combines an advanced tree search with deep neural networks," Demis Hassabis, the chief executive of Google DeepMind, said in a statement.
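A minimal illustration of the tree-search half of that combination: the sketch below runs Monte Carlo tree search (UCT) on a toy take-away game, with random playouts standing in for the policy and value networks AlphaGo actually uses. The game, node structure and function names here are invented for illustration and are not DeepMind's code.

```python
import math
import random

# Toy game standing in for Go: a single heap of stones; players alternate
# removing 1 or 2 stones, and whoever takes the last stone wins.
def legal_moves(heap):
    return [m for m in (1, 2) if m <= heap]

def rollout(heap):
    """Random playout; returns 1 if the player to move at `heap` wins."""
    player = 0
    while True:
        heap -= random.choice(legal_moves(heap))
        if heap == 0:
            return 1 if player == 0 else 0
        player ^= 1

class Node:
    def __init__(self, heap, parent=None, move=None):
        self.heap, self.parent, self.move = heap, parent, move
        self.children = []
        self.untried = legal_moves(heap)
        self.visits = 0
        self.wins = 0.0  # from the view of the player who moved into this node

    def ucb1(self, c=1.4):
        # Standard exploration/exploitation trade-off used during selection.
        return (self.wins / self.visits
                + c * math.sqrt(math.log(self.parent.visits) / self.visits))

def mcts(root_heap, iterations=3000):
    root = Node(root_heap)
    for _ in range(iterations):
        node = root
        # 1. Selection: descend until a node still has untried moves.
        while not node.untried and node.children:
            node = max(node.children, key=Node.ucb1)
        # 2. Expansion: add one child for an untried move.
        if node.untried:
            m = node.untried.pop()
            child = Node(node.heap - m, parent=node, move=m)
            node.children.append(child)
            node = child
        # 3. Simulation: result = 1 if the player to move at `node` wins.
        result = 0 if node.heap == 0 else rollout(node.heap)
        # 4. Backpropagation: flip the result at each level up the tree.
        while node is not None:
            node.visits += 1
            node.wins += 1 - result
            result = 1 - result
            node = node.parent
    # The most-visited move is the search's answer.
    return max(root.children, key=lambda c: c.visits).move
```

Running `mcts(4)` returns 1: taking one stone leaves the opponent a losing heap of three. AlphaGo's key step was to replace the random playouts with a learned value network and to bias move selection with a policy network, pruning Go's enormous game tree far more effectively than uniform search can.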
"After all that training, it was time to put AlphaGo to the test. First, we held a tournament between AlphaGo and the other top programmes at the forefront of computer Go," added David Silver of Google DeepMind, the lead author of the study.
Facebook is also working on mastering Go, and on Tuesday the social network shared its progress. Its CEO Mark Zuckerberg posted about the game, noting that while his AI scientists have not beaten it yet, they are “getting close”.
“The ancient Chinese game of Go is one of the last games where the best human players can still beat the best artificial intelligence players. Scientists have been trying to teach computers to win at Go for 20 years,” he wrote.
“We are getting close, and in the past six months we have built an AI that can make moves in as fast as 0.1 second and still be as good as previous systems that took years to build,” he posted.
The first game mastered by a computer was noughts and crosses (also known as tic-tac-toe) in 1952. In 1997, IBM's Deep Blue computer famously beat Garry Kasparov at chess.
In a paper published in the journal Nature on 28 January 2016, the researchers announced the achievement and detailed the techniques used.
(With Agency inputs)