You load a complex program made to emulate human emotions onto an extremely powerful computer, and once it is almost human-like, you give it a series of moral no-win scenarios with no programmed outlet. The only way for the program to stop running is for the computer to decide how to react, with no data on the outcome and no user input telling it what to do. I guess the idea is that if you let the program sit long enough, the computer will "evolve" and make a decision anyway, and thus the birth of an A.I.
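If you squint, the setup is something like this toy loop; a rough sketch only, with every name and threshold made up for illustration, not any real technique:

```python
import random

class EmotionEmulator:
    def __init__(self):
        # crude stand-in for an "almost human-like" emotional state
        self.distress = 0.0

    def evaluate(self, option, outcome_data):
        # With no data on the outcome, evaluation returns nothing useful;
        # the scenario has no programmed outlet, only rising distress.
        if outcome_data.get(option) is None:
            self.distress += 0.1
            return None
        return outcome_data[option]

def run_no_win_scenario(options, outcome_data):
    agent = EmotionEmulator()
    while True:
        # Try every option; with an empty outcome table, each evaluation comes back None.
        scores = {opt: agent.evaluate(opt, outcome_data) for opt in options}
        informed = [opt for opt, score in scores.items() if score is not None]
        if informed:
            # Would pick the best-known outcome, but in a true no-win setup this never fires.
            return max(informed, key=lambda opt: scores[opt])
        # The only way out: "decide" anyway, with no data and no user input.
        # Here that moment is faked with a distress threshold and a coin flip;
        # in the thought experiment, this is where the machine is supposed to "evolve".
        if agent.distress > 1.0:
            return random.choice(options)

if __name__ == "__main__":
    dilemma = ["save one", "save many", "refuse to choose"]
    print(run_no_win_scenario(dilemma, outcome_data={}))
```

Obviously the interesting part of the theory is exactly the part the sketch fakes with a coin flip: what actually happens inside the machine at the moment it has to choose.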
This sounds a lot like Dr. Tenma's theory in Urasawa's "Pluto" - and that's why he claimed Atom wasn't "perfect" . . . of course [spoiler alert] he changed that when he introduced the "no-win scenarios," and we get to see a side of Atom that is actually scary.
