Eric Nyberg knows a thing or two about Watson, the IBM supercomputer competing on Jeopardy!, and why the machine has owned the classic quiz show.

Watson, which is built on question-answering technology, has taken Jeopardy! by storm, dominating its opponents, Ken Jennings and Brad Rutter. At the end of the second day of the three-day competition, Watson had $35,734, while Rutter and Jennings had $10,400 and $4,800, respectively. The competition concludes tonight, when Watson, Rutter, and Jennings play one last full game of Jeopardy! and add the results to their cumulative scores.

The winner will receive $1 million. IBM has said it will donate its winnings to charity, while Jennings and Rutter will donate half of theirs.

Nyberg, a Carnegie Mellon University computer science professor, led a team of researchers at the university's Language Technologies Institute that assisted IBM in developing the Open Advancement of Question-Answering Initiative (OAQA) architecture and methodology. Two of Nyberg's CMU students even worked on Watson directly as interns this past summer.

"I'm satisfied with what we've seen on TV," Nyberg said. "I think it's representative of Watson's capabilities."

Nyberg, who once got a chance to compete against Watson himself, says the machine has definite strengths and weaknesses. Despite what people watching at home might believe, Watson isn't perfect. The system, which is made up of 90 IBM Power 750 servers with 15 terabytes of RAM and 2,880 processor cores, is at its best when the clues are easier.

"You have to remember Watson is fundamentally different from humans. If I know an answer, I can buzz in and I have a few seconds to retrieve it. Contestants will do this. Watson will not buzz in unless it has the answer. By the time it has buzzed in, it already has a high-confidence answer. Where Watson is dominating is when the clues are easy; it can get a high-confidence answer quickly," Nyberg said.
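A minimal sketch of that buzz-in policy, assuming a simple confidence threshold over scored candidate answers (the threshold value, function name, and scores here are illustrative assumptions, not IBM's actual implementation):

```python
# Illustrative sketch: buzz in only when the top candidate answer
# already clears a confidence bar. Not Watson's real code.

def decide_buzz(candidate_answers, threshold=0.85):
    """Return the best answer if its confidence clears the threshold, else None.

    candidate_answers: list of (answer_text, confidence) pairs, where
    confidence is a score in [0, 1] from the answer-scoring stage.
    """
    if not candidate_answers:
        return None  # nothing to say, so stay silent

    best_answer, best_confidence = max(candidate_answers, key=lambda pair: pair[1])

    # Unlike a human, who may buzz first and retrieve the answer afterward,
    # this policy only signals once a high-confidence answer is in hand.
    if best_confidence >= threshold:
        return best_answer
    return None


# Easy clue: one clearly dominant candidate, so the system buzzes.
print(decide_buzz([("Chicago", 0.96), ("Toronto", 0.14)]))  # -> Chicago
# Hard clue: no candidate is confident enough, so it stays quiet.
print(decide_buzz([("Toronto", 0.32), ("Chicago", 0.30)]))  # -> None
```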

Whereas humans may take a few seconds to process a question and its clues, if Watson knows the answer, its response is almost automatic. With all of its processing power, Watson can scan two million pages of data in three seconds.

However, Watson is not a perfect machine. "When the clues are hard to understand or it doesn't have good resources, it comes up with answers you and I would never give. It doesn't dominate; it still has weak spots," Nyberg said.

Yesterday, Watson stumbled on the Final Jeopardy! clue and showed its weaknesses. The answer was, "This city's largest airport is named for a World War II hero; its second largest, for a World War II battle." The correct question was, "What is Chicago?" Both Jennings and Rutter responded correctly, while Watson answered, "What is Toronto?" While there are U.S. cities named Toronto, they are not large enough to have two airports.

David Ferrucci, the manager of the Watson project at IBM Research and someone Nyberg has worked with extensively, gave multiple reasons for the odd mistake. He said Watson downgrades the significance of category titles, and since the phrase "U.S. city" wasn't in the clue itself, Watson probably didn't know the answer had to be in America.

Ferrucci also said Watson was likely confused because there is a city named Toronto in the United States, and the Toronto in Canada has an American League baseball team. Ferrucci was pleased, however, with how much Watson wagered.
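A small sketch of why downweighting the category title can produce that kind of error, assuming candidate answers are ranked by a weighted combination of evidence sources (the weights, scores, and source names are invented purely for illustration):

```python
# Illustrative sketch: weighted evidence combination in which the category
# title carries little weight, letting a geographically wrong candidate win.
# All numbers are made up for demonstration.

def combine_score(evidence_scores, weights):
    """Weighted sum of per-source evidence scores for one candidate answer."""
    return sum(weights[source] * score for source, score in evidence_scores.items())


weights = {"clue_text": 1.0, "category_title": 0.1}  # category nearly ignored

candidates = {
    "Toronto": {"clue_text": 0.80, "category_title": 0.05},
    "Chicago": {"clue_text": 0.70, "category_title": 0.95},
}

for name, evidence in candidates.items():
    print(name, round(combine_score(evidence, weights), 3))
# Toronto 0.805, Chicago 0.795: with the category downweighted, "Toronto"
# edges out "Chicago"; raising the category weight toward 1.0 flips the ranking.
```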

As far as Watson's next frontier is concerned, IBM and Nyberg say it could be healthcare. In fact, IBM has already begun working on Watson-based healthcare applications.

Physicians might be able to use a "Watson M.D." when there are questions about strange symptoms or unusual conditions. "You can have Watson sift through textual information about what treatments there are and what kinds of patients have had them. This is important. Most of the information about patients is written in free text, and it's difficult to leverage that without a tool like Watson," Nyberg said.
