At the Artificial General Intelligence conference held at the University of Memphis last month, a group of Rensselaer researchers unveiled “Eddie,” a simulated four-year-old in the online world of Second Life. Eddie is a code-controlled avatar that can hold beliefs and reason at the same level as a four-year-old.
The project, led by Selmer Bringsjord, the head of Rensselaer’s Cognitive Science Department, was developed at the Rensselaer Artificial Intelligence and Reasoning Lab on campus. The project began over a year ago and, with support from IBM, has recently reached the second stage of development with a proof-of-concept and demonstration.
The showcased simulation has two subjects and Eddie standing before a table on which rests a teddy bear and two boxes labeled “A” and “B.” One subject places the teddy bear in Box A, and then asks the second subject to leave. The first subject then moves the teddy bear to Box B and calls the second subject back. Eddie is then asked where the second subject would look for the teddy bear. Eddie, with the reasoning abilities of a four-year-old, will say Box B.
This demonstrates the reasoning of a four-year-old: at that age, children have yet to develop an understanding of the minds of others, so they answer with the bear’s actual location rather than with where the second subject last saw it. Eddie’s mind can easily be improved, however, to know that the second subject would look in Box A.
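The demonstration is a version of the classic false-belief task, and its logic can be sketched in a few lines. The event format and agent names below are illustrative assumptions, not the Eddie implementation: each event records where the bear is moved and which subjects witness the move.

```python
# Minimal sketch of the false-belief task described above (hypothetical,
# not the Eddie codebase). An event is (box, witnesses): the bear is
# moved to `box`, and only the agents in `witnesses` see it happen.

def track_location(events, observer=None):
    """Return the bear's location as known to `observer`.

    With observer=None the function is omniscient (the bear's true
    location); otherwise it returns the last location that observer
    actually witnessed, i.e. that observer's belief."""
    location = None
    for box, witnesses in events:
        if observer is None or observer in witnesses:
            location = box
    return location

events = [
    ("A", {"subject1", "subject2"}),  # bear placed in Box A, both watching
    ("B", {"subject1"}),              # moved to Box B after subject2 leaves
]

# A four-year-old without a model of other minds reports the true location:
print(track_location(events))              # prints B
# An agent modeling subject2's belief reports where she last saw the bear:
print(track_location(events, "subject2"))  # prints A
```

The “improvement” the researchers describe amounts to answering the question from the second subject’s belief state (the `observer` argument here) rather than from the true world state.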
Eddie is controlled by automated software that simulates the keystrokes a player would use to control an avatar in Second Life. The software is coupled with a procedure that translates the conversational English used in Second Life into machine-interpretable code. Research will continue on the Second Life engine for the time being, as it “offers the best combination of ‘engineering access’ and popularity with the public,” Bringsjord explained.
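The article does not describe how the English-to-code translation works, but as a toy illustration of the idea, chat utterances could be pattern-matched into small logical forms that a reasoning system can consume. Every pattern and tuple shape below is invented for this sketch:

```python
import re

# Toy illustration only: the real translation procedure is not described
# in the article, and these patterns are assumptions made for the sketch.
PATTERNS = [
    # "Alice puts the teddy bear in box A" -> an assertion about the world
    (re.compile(r"(\w+) puts the (\w+ ?\w*) in box (\w)", re.I),
     lambda m: ("place", m.group(2), m.group(3).upper())),
    # "Where will Bob look for the teddy bear?" -> a query about a belief
    (re.compile(r"where will (\w+) look for the (\w+ ?\w*)", re.I),
     lambda m: ("query-belief", m.group(1), m.group(2))),
]

def parse_utterance(text):
    """Translate one chat line into a logical form, or None if unrecognized."""
    for pattern, build in PATTERNS:
        match = pattern.search(text)
        if match:
            return build(match)
    return None

print(parse_utterance("Alice puts the teddy bear in box A"))
# prints ('place', 'teddy bear', 'A')
```

A real system would need far more robust natural-language handling than fixed patterns, but the pipeline shape (utterance in, formal representation out) is the point.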
The goal of the research is not to predict human behavior through artificial intelligence, but to create more interesting and useful avatars with mental states that allow for reasoning abilities similar to those of humans. These avatars will be used not only in “entertainment and gaming, but also education and homeland defense,” the research team stated in a recent press release.
Eddie is just the beginning of the project, however—not the final result. The research done with Eddie is the groundwork for what could one day be a virtual reality system similar to the holodeck on Star Trek.
“The idea is to build a form of the holodeck. That’s the ultimate goal,” Bringsjord stated. He is confident that with the support of RPI, IBM, and the Computational Center for Nanotechnology Innovations (CCNI) this can be accomplished. “All we need is the money to support the relevant [research and development].”
The holodeck research would combine the resources of the CCNI and EMPAC. The supercomputing system will allow the execution of previously intractable computations, such as theorem search: determining whether a particular formula is a valid theorem for use in the project. Bringsjord explained, “My colleague [Konstantine Arkoudas] and I believe that there are specific ways to parallelize the search for theorems—and this kind of search is what is going on behind the scenes in the case of Eddie.” The visual and audio capabilities of EMPAC will give the team the extensive immersive environments needed for the holodeck.
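To make the parallelization idea concrete, here is a hedged sketch of the simplest case: checking whether a propositional formula is valid (a theorem) by brute force over its truth table, with the search space split into chunks that independent workers could each take. This illustrates the spirit of parallel theorem search only; Bringsjord and Arkoudas’s actual methods are not described in the article.

```python
from itertools import product

def is_valid(formula, variables, fixed):
    """Check `formula` under every assignment that extends `fixed`.

    `formula` is a function from a {variable: bool} dict to bool.
    Returns False as soon as a counterexample is found in this chunk."""
    free = [v for v in variables if v not in fixed]
    for values in product([False, True], repeat=len(free)):
        env = dict(fixed, **dict(zip(free, values)))
        if not formula(env):
            return False  # counterexample: not a theorem
    return True

# Material implication, and a classical tautology (contraposition):
# (p -> q) -> (~q -> ~p)
imp = lambda a, b: (not a) or b
formula = lambda e: imp(imp(e["p"], e["q"]), imp(not e["q"], not e["p"]))

# Split the search on p; each chunk could run on a separate worker or node.
chunks = [{"p": False}, {"p": True}]
results = [is_valid(formula, ["p", "q"], c) for c in chunks]
print(all(results))  # prints True: valid in every chunk, hence a theorem
```

First-order theorem proving, which Eddie’s reasoning presumably requires, cannot be settled by finite enumeration like this, but the same divide-the-search-space pattern applies to distributing proof search across a supercomputer’s nodes.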
Artificial intelligence research is a rapidly growing field that advances in step with computing technology. The resources recently made available to the RPI community through the CCNI will accelerate Bringsjord and his colleagues’ research. What was nothing more than science fiction 20 years ago is now the end goal of an extensive research project.