Robotic technology has improved tremendously over the last few decades. But the latest innovations suggest that future robots will not only mimic humans in their daily functions but will also have emotions.
Artificial intelligence, the all-too-familiar term, gets closer to the real thing as scientists and engineers work hard to integrate more and more human responses into these machines.
Science fiction movies like WALL-E and AI have already introduced the idea of robots having human-like emotions, and even made us feel happy and sad for them in various situations. In fact, few viewers failed to shed a tear for the little boy-robot in AI.
In reality, there are already many robots that can mimic human emotions, with facial features showing reactions to certain situations. The Massachusetts Institute of Technology has created robots like ‘Leonardo’, ‘Kismet’ and, most recently, ‘Nexi’.
This robot has a very mobile face and expresses emotions almost like a human. Its neck mechanism has been designed with four ‘degrees of freedom’ (DOF) at its base, including a tilting of the head, and its movements occur at human-like speed when it expresses emotions. Nexi also uses its eyebrows, eyelids and its gaze to appear more human-like. To top it all, it has an active 3D infrared camera fitted into its head and picks up sound through four microphones to give the real effect!
The chassis, or outer covering, of Nexi is based on the advanced uBot5 mobile manipulator developed by the Laboratory for Perceptual Robotics at UMass Amherst. Its base balances on two wheels with a ‘Segway-like’ body, and its arms are capable of picking up 10 pounds of stuff. Moreover, its plastic-coated chassis is sensitive to human touch — which gives a new meaning to the phrase, ‘reach out and touch someone’.
Now all the robot needs to do is react and show emotions like the ‘Kansei’ robot from Meiji University’s School of Science and Technology. Nexi’s emotional responses can also be compared to those of its South Korean counterpart, the EveR-2 Muse robot, which has a more human face.
In any case, it is difficult to make a robot actually look human during an emotional reaction, as humans feel and see situations in life, and the job becomes even tougher for robotic engineers and virtual-reality experts when they reach the ‘uncanny valley’: the point at which almost-human responses get mixed up with doll-like movements. It is the kind of effect one can see in The Polar Express and Beowulf, where the scenes are a little ethereal, or creepy to say the least.
‘It turns out that, as human beings, we’ve developed these incredible capacities to interact with each other using language and visual, nonverbal behaviour. Without non-verbal behaviour it doesn’t look good, it looks sick or demented,’ explains Stacy Marsella, a computer scientist at the University of Southern California.
Furthermore, computer scientists have helped the US Army develop virtual training simulations, and in this case the characters have to express themselves with the right facial expressions and body movements so that the trainees can interact with them comfortably.
Artificial or real?
The challenge scientists have at hand is that a robot must recognise human feelings, then mirror them in its reactions and, lastly and fantastically, appear to feel alive! The most daunting task for the ‘machine’ is to understand the feelings a human might be conveying and then respond appropriately. Psychologists call this the ‘theory of mind’, which enables one to perceive the intentions of another, whether a human being or an ‘artificial agent’, as such robots are referred to.
The Massachusetts Institute of Technology, along with some other institutes, is working on artificial agents that have only the first glimpses of this so-called ‘theory of mind’. Creating an artificially intelligent robot that can have a nice friendly chat with a human, however, remains a difficult task. Interestingly, the US Army is looking for an artificially intelligent agent to train soldiers to handle difficult and complicated situations, such as being mediators in meetings with tribal leaders in places like Afghanistan.
Researchers are toying with the idea of trying out their artificially intelligent virtual humans in online computer games to test whether they feel real to their human counterparts. ‘I think that eventually we’ll be able to convince people that they’re interacting with a human,’ Marsella hopes, though he adds that he cannot tell when that might happen.
Another aspect of robotic technology is the integration of robotic parts into human body parts that have been damaged. Remember The Bionic Man? Well, the idea does not seem so far-fetched after all. Work is underway to supply artificial vision through cyber-circuits carrying neural messages. These kinds of devices will eventually shrink to cellular size and be capable of bringing better, enhanced vision to damaged eyes.
At present, the artificial vision systems that restore sight are somewhat ‘clunky’, but Robert Spence, a one-eyed filmmaker, wants to make a real cyborg-style eye enhancement using a bionic eye camera known as the ‘EyeBorg’. The device is under development and will be set on a kind of ‘peg’ in his right eye socket so that it can move in all directions. In other words, the ‘EyeBorg’ will shrink and embed vision technology into the human body, where it will be unnoticeable.
Additionally, walking robots are seen time and again in various parts of the developed world. For instance, a Japanese robot made a fashion debut on the ramp. Even though the robot did not exactly match the swishy stride of a human supermodel, it did manage to walk on two legs in a manner close to a human’s.
As far as animal moves go, the four-legged BigDog can run at up to four miles per hour and carry 340 pounds of rubble while climbing a 35-degree slope. And last, but not least, robots set to explore the vast cosmos and walk alien worlds are already taking over space exploration.
In other words, robots are being designed not only to mimic human functions but also to help with, and perhaps lighten, the load of daily work in all aspects of human existence. But as someone truly said, ‘robots cannot shed tears at the beauty of a Martian sunset’. Can an artificially intelligent, man-made creature experience the joy of holding a newborn for the first time, or the comfort of hugging a loved one? Not ‘really’.
-----------------------------
BY Fatima Sajid
Source:DAWN.COM
©2009 DAWN Media Group. All rights reserved.