Software used to play games (such as chess) is an example of the virtual-agent type of robot. Sometimes, but not always, the software type is called a bot (instead of robot) to differentiate between the two types.
Edit: See the comment here for a better definition/explanation. Thanks to RoboJenny for her comments and clarification. (She's a software developer and a bridge player.)
Poker has had matches between humans and bots. If that interests you, then click here to read the article "Get to know your bots," from the Two Plus Two Internet (poker) Magazine.
Bridge software has generally taken a different route. The robots usually compete against each other (rather than against humans). The World Computer Bridge Championship (WCBC) began in 1997 at the World Bridge Championships in Albuquerque NM. It was first sponsored by the ACBL, but is now a joint project between the ACBL and the World Bridge Federation.
The competition is held annually in cities around the world; it was in Estoril, Portugal, in 2005, for example, and in Shanghai, China, in 2007. The developers bring their products and take the event very seriously. The 2008 competition was held in Las Vegas NV in conjunction with the ACBL Summer NABC, and I observed the robots playing. In one match, the developers watching their robot face a competitor said "Duck, duck, duck, oohhhh no!" when the only chance to set a contract was for their robot to duck a trick, giving the other robot a chance to make the wrong play. They were definitely into it.
Seven robots entered the team competition in Las Vegas: Bridge Baron (U.S.A.), Jack (the Netherlands), Micro Bridge (Japan), Q-Plus Bridge (Germany), RoboBridge (the Netherlands), Wbridge5 (France) and Shark Bridge (Denmark). (BlueChip Bridge (U.K.) played in the one-day individual tournament, but not the teams.) The two favorites going in were five-time winner Jack and Wbridge5, the defending champ that also won in 2005.
A robotic team consists of four bots: one pair sitting North-South and another pair sitting East-West (at the other table), just as four humans would. During the qualifying stage, each team played a 32-board match against every other robotic team, with the IMPs converted to a 30-point Victory Point scale. This round robin took three days to complete.
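As a sketch of how an IMP-to-VP conversion works, here is a minimal Python version. The threshold table below is hypothetical, for illustration only -- the actual WCBC table depends on match length and the scale in force that year:

```python
def imps_to_vps(imp_margin, thresholds):
    """Convert a match's IMP margin into Victory Points on a
    30-point scale: a tie scores 15-15, winner and loser always
    sum to 30, and the winner's share is capped at 25.
    (Real scales also taper the loser below 5 on a blitz;
    that refinement is omitted here.)"""
    winner = 15
    for t in thresholds:  # each threshold crossed is worth one more VP
        if imp_margin > t:
            winner += 1
    winner = min(winner, 25)
    return winner, 30 - winner

# Hypothetical thresholds: the largest IMP margin worth 15, 16, ... VPs.
EXAMPLE_THRESHOLDS = [0, 2, 5, 8, 12, 16, 20, 25, 30, 40]
```

With this table, a tied match scores 15-15 and any margin over 40 IMPs is a maximum 25-5.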
The top four teams advanced out of the round robin to the semifinal knockout stage, scored by IMPs plus carryover from the round robin. They engaged in 64-board heads-up matches -- the winner advanced, the loser went home. Jack played Micro Bridge and won 166-112. Wbridge5 advanced by beating Shark Bridge 139-121. The two favorites met in the final.
In the final, also a 64-board match, Wbridge5 held on to defeat Jack by a 172-157 margin. You can read more about it in the Daily Bulletin -- look on the bottom left of page 1, and then go to the jump on page 8. After I took a photo of the winner, I wanted one of the second-place team. One of the two developers was so upset that he didn't want to be photographed! Finally, he cooled off and I got his picture. I told you they take this very seriously.
This deal was played by the humans at the 2006 World Championship in Verona, Italy, and then replayed by the robots in their competition:
6♦ and 6♥ both make on the lie of the cards, but 6♠ goes down after a club lead. When the humans played this in the women's competition, no pair bid either red-suit slam. Three pairs bid 6♠, making once. In the open competition, 6♥ was reached seven times, 6♦ once and 6♠ was bid six times, making twice.
North
♠ K 9 7
♥ Q J 5
♦ K Q J 9 4
♣ Q 9

South
♠ A 10 6 5 2
♥ A K 9 7 3
♦ A 3
♣ 5
None of the robots bid 6♠. Micro Bridge bid the slam via 1NT (12--14 high-card points) by North, a transfer to spades by South, then 3♥ by South to show its second suit, 4♦ by North to show extra values, and finally 6♥ by South, which asked North to pick a slam.
Q-Plus Bridge started with 1♦ by North, and had to contend with a 3♣ bid by East, yet it also reached 6♥.
According to WCBC organizer Al Levy, of Commack NY, the bidding and play of the top robots have improved each year. That's impressive given how complex bridge is. Even so, computer bridge is still in its infancy, especially compared to computer chess. Chess is a pure brain-power game with complete information, and thus more suitable for programming. Bridge has a partnership/communication aspect and an intuitive aspect that make it more challenging from a programming standpoint.
Jack and Wbridge5 squared off in the final of the World Computer Bridge Championship.
When a robot encounters a situation that isn't covered by the "rules" it goes by, it runs many simulations. It constructs possible hands that its partner or the opponents may hold, consistent with the information available at that point. Then it analyzes those constructed deals and makes the bid it thinks has the best chance to succeed. Here's a nice article by Jason Rosenfeld and Stephen Smith, How Computers Play Bridge, that explains this in more detail.
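A minimal sketch of that simulate-and-evaluate loop, assuming illustrative names throughout. The `evaluate` callback stands in for the double-dummy or rule-based analysis a real engine would perform; here it is just a parameter:

```python
import random

RANKS = "AKQJT98765432"
SUITS = "SHDC"
DECK = [r + s for s in SUITS for r in RANKS]  # 52 cards, e.g. "AS" = ace of spades

def hcp(hand):
    """Standard high-card points: A=4, K=3, Q=2, J=1."""
    return sum({"A": 4, "K": 3, "Q": 2, "J": 1}.get(card[0], 0) for card in hand)

def sample_deals(my_hand, constraints, n):
    """Deal the 39 unseen cards to the other three seats at random,
    keeping only layouts consistent with what the bidding has shown
    so far (rejection sampling)."""
    unseen = [c for c in DECK if c not in my_hand]
    deals = []
    while len(deals) < n:
        random.shuffle(unseen)
        layout = {"partner": unseen[:13], "lho": unseen[13:26], "rho": unseen[26:]}
        if all(check(layout) for check in constraints):
            deals.append(layout)
    return deals

def best_action(actions, deals, evaluate):
    """Pick the candidate bid/play with the best total score over
    the sampled deals."""
    return max(actions, key=lambda a: sum(evaluate(a, d) for d in deals))
```

A constraint such as `lambda L: 12 <= hcp(L["partner"]) <= 14` would model a partner who opened a weak 1NT, as in the Micro Bridge auction above.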
Wikipedia also has a decent article on computer bridge here.
Bridge expert Karen Walker has written about bridge software here.
Jim Loy tested and reviewed bridge software back in 2000. Even though eight years is an eternity in computer-years, you might find it interesting.
Finally, the World Computer Bridge Championship has its own home page.
There is no strict definition of robot. However, most of my colleagues and I define a robot as something that senses and reacts to its environment. A virtual agent is simply an artificial intelligence, and can often be called a bot. Though bot was originally coined as short for robot, at this point it has a different meaning. Bots are artificial-intelligence agents that perform some task automatically. A robot still needs to physically sense an environment.
As for whether computer bridge players can eventually beat the world's best bridge players, the engineering mind in me thinks we should be able to encode all the bridge rules and exceptions and accomplish this somehow. Meanwhile, having heard great experts describe how they figure things out, I realize there is an aspect that you cannot arrive at using a decision tree or even a probabilistic model.

I've always thought that there are two types of players who consistently do well at national events. The first type got into duplicate bridge because it's one big math/logic puzzle. The second type got into it at a young age because of family members in the game. The latter type is particularly interesting. There is so much that they can do in their heads directly related to bridge that they could not calculate if you restated the problem in sticks and balls. Yet, somehow, they can do more of the complex bridge calculations in their minds in fractions of a second than the best of the first type. So maybe while everything can supposedly be stated in rules and exceptions, choosing which one to follow isn't necessarily easy to determine.