Showing posts with label human interface.

Wednesday, June 4, 2014

Exoskeletons: MIT lab designs workload-sharing robotic limbs - Video

Credit: d'Arbeloff Laboratory

Mention "robotic limbs" and one thinks of devices being developed to replace the loss of human limbs.

Mention "exoskeleton" and one thinks of a suit bound to the entire body and governing its movement.

Researchers at the d'Arbeloff Laboratory for Information Systems and Technology at MIT, led by Professor Harry Asada, Ford Professor of Engineering, have been breaking ground in another direction.

They are working in a co-robot world, and they are developing "extras" for what the person already has.

Videos showing people performing tasks tell a story of what future work might look like when an extra set of arms or legs will be of significant help.

"Supernumerary Robotic Limbs" (SRLs) is the formal term to describe robotic limbs that, when worn, augment limbs already in place.

"Imagine that one day humans will have a third arm and a third leg attached to their body. The extra limbs will help them hold objects, support the human body, share a workload, and streamline the execution of a task.

If the movements of such supernumerary limbs are tightly coupled and coordinated with their arms, the human users may come to perceive the extra limbs as an extension of their own body," the lab team suggests on its site.

"The goal of our work is to build a co-robot that becomes a functional extension of the human body."

In such settings, the extra arm or leg attached to the body helps to hold objects, share workloads, and streamline tasks.

Situations might include trying to open a door when you need to keep holding something with both hands or having an extra hand to keep something in place during construction.

The devices would look odd on people walking down a city street or at a mall, but the designs deliver practical relevance for a workforce.

A note from the Lab's Baldin Llorens and Prof Harry Asada, for example, said, "In the demanding manufacturing industry, Human-Robot collaboration has proved to be a strong alternative when it comes to tasks that cannot be fully automated."

To optimize productivity, the robots in their designs serve to complement, not replace, human actions. The human worker perceives the robot not as a machine but as a body extension.


In an aircraft assembly scenario, the Laboratory presents an example where the SRLs are coordinated with the workers to help execute specialized aircraft assembly tasks.

"We focus on the task planning process, communication and coordination between the human worker and the SRL and control implementation."

Evan Ackerman, reporting on their work in IEEE Spectrum, explained what goes into that communication between human and extra limb.

How do these robotic limbs know what to do? Ackerman said, "The SRL watches what you're doing with your arms to decide how to move.

It does that by monitoring two inertial measurement units (IMUs) that the user wears on the wrists. A third IMU sits at the base of the robot's shoulder mount, to track the overall orientation and motion of the SRL."


With the gyro and accelerometer data, the limb can predict a helpful arm position, based on a model learned through demonstration.

If the person raises both arms above the head, the SRLs go above the head too, interpreting the raised arms as a sign that the person is trying to hold something up.
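A minimal sketch of how such IMU-driven mode selection might work (the function names, thresholds, and the two-mode logic here are illustrative assumptions, not the lab's actual controller):

```python
import math

def pitch_from_accel(ax, ay, az):
    """Estimate wrist pitch (radians) from an IMU's gravity vector."""
    return math.atan2(-ax, math.sqrt(ay * ay + az * az))

def choose_srl_mode(left_accel, right_accel, raised_thresh_deg=60.0):
    """Pick an SRL behaviour mode from the two wrist IMUs.

    If both wrists pitch well above horizontal, assume the wearer is
    holding something overhead and raise the robotic limbs too.
    """
    left_deg = math.degrees(pitch_from_accel(*left_accel))
    right_deg = math.degrees(pitch_from_accel(*right_accel))
    if left_deg > raised_thresh_deg and right_deg > raised_thresh_deg:
        return "support_overhead"
    return "idle"
```

In a real system the demonstration-learned model would replace the hard threshold, but the flow is the same: wrist IMU readings in, a behaviour mode out.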

"Using their SRL prototype," said Ackerman, "the researchers are testing different 'behavioural modes' to program the limbs to do what they want."

Credit: d'Arbeloff Laboratory/IEEE

One model has limbs springing from the shoulders for tasks that take place over the head.

Other constructs involve waist-mounted SRLs that can be used as two extra arms, two extra legs, or one of each.

MIT researchers were in Hong Kong at the IEEE International Conference on Robotics and Automation (ICRA) on Monday, said Ackerman, where they presented SRL prototypes.

Saturday, December 17, 2011

Life Robotic: Nextage Industrial Humanoid - YouTube



Robots aren’t new to product assembly, but these bots show a real improvement over the typical industrial machine. Nextage bots work in perfect coordination to complete tasks (in this case, they’re putting together a simple electronic sign).

Each is aware of its surroundings; if one robot is moved, it can orient itself and return to work. The developers, AIST and Kawada of Japan, made the robots look and move like humans so they can easily fit into our work environments. The question, of course, is whether they're skilled enough to actually replace the human workforce in factories.

Monday, December 12, 2011

Cynthia Breazeal: Pioneer of Social Robotics

Mosaic portrait by Charis Tsevis 

Cynthia Breazeal is a pioneer of social robotics and human-robot interaction.

She currently leads the Personal Robots Group at the MIT Media Lab.

The image shows Cynthia and her robot Kismet. Cynthia created Kismet during her doctoral studies in the late 1990s.

Background
The Personal Robots Group focuses on developing the principles, techniques, and technologies for personal robots.

Cynthia and her students have developed numerous robotic creatures ranging from robotic flower gardens, to embedding robotic technologies into familiar everyday artifacts (e.g., clothing, lamps, desktop computers), to creating highly expressive humanoids, including the well-known social robot, Leonardo.

Ongoing research includes the development of socially intelligent robot partners that interact with humans in human-centric terms, work with humans as peers, and learn from people as apprentices.

Other projects have explored how HRI can be applied to enhance human motor learning and cognitive performance.

More recent work investigates the impact of long-term HRI applied to communication, quality of life, health, and educational goals.

The ability of these robot systems to naturally interact, learn from, and effectively cooperate with people has been evaluated in numerous human subjects experiments, both inside the lab and in real-world environments.

Wednesday, December 7, 2011

Share Your Body with Telepresence Robot



Ever wished you could be in two places at once? Now you can: a telepresence robot created by Dzmitry Tsetserukou of Toyohashi University of Technology in Japan and his team lets you share your body with it, steering it with your own movements while receiving information about its body, so you can experience another location remotely. "It's like body-to-body telepresence," says Tsetserukou.

Wearing a special belt, you use your body like a joystick to steer the robot from a distance. Flexible sensors in the belt detect the direction and angle of your lean, so the robot zips forward when you lean forward heavily.

At the same time, the belt can give you a sense of the robot's environment. An on-board laser scanner detects obstacles outside the robot's field of view and conveys their direction by activating vibrating sensors in your belt. For example, when the robot faces a table, the front sensors start to buzz.
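The body-as-joystick mapping and the obstacle-to-vibration feedback could be sketched like this (all parameter values, the four-motor layout, and the function names are hypothetical, not the team's actual firmware):

```python
import math

def lean_to_velocity(lean_deg, lean_dir_deg, max_speed=1.0, dead_zone_deg=5.0):
    """Map the wearer's lean (from the belt's flex sensors) to a velocity command.

    A stronger lean drives the robot faster; small leans inside the dead
    zone are ignored so the robot doesn't creep while you stand still.
    """
    if abs(lean_deg) < dead_zone_deg:
        return (0.0, 0.0)
    speed = max_speed * min(abs(lean_deg) / 30.0, 1.0)  # saturate at 30 degrees
    heading = math.radians(lean_dir_deg)
    return (speed * math.cos(heading), speed * math.sin(heading))

def obstacle_to_motors(obstacle_bearing_deg, motor_bearings=(0, 90, 180, 270)):
    """Choose which belt vibration motor to buzz for an obstacle bearing,
    by picking the motor whose mounting angle is closest on the circle."""
    def angular_dist(a, b):
        d = abs(a - b) % 360
        return min(d, 360 - d)
    return min(range(len(motor_bearings)),
               key=lambda i: angular_dist(obstacle_bearing_deg, motor_bearings[i]))
```

So an obstacle detected dead ahead buzzes the front motor, one behind the robot buzzes the rear motor, and so on.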

Special glasses also allow you to share the robot's view by receiving stereoscopic images from cameras in its eyes.

According to Tsetserukou, existing telepresence robots only allow you to experience what a robot sees and don't tap into the other senses. The team plans to develop their robot to make teleconferencing more immersive and to allow people to play sports remotely.

The robot will be presented at Siggraph Asia, a yearly conference on computer graphics and interactive techniques, from December 13-15 in Hong Kong.

Wednesday, November 16, 2011

Brain Computer Interfaces: Melding Man and Machine

Controlling computers and other electronic gadgets with just a thought seems like technology that is decades away from reality, but scientists are already developing implanted chips that can connect the brain to a computer system.

Brain computer interface technology is a way for a human brain to connect to an external computer system to control electronic devices like computer cursors or robotic arms.

An electrode chip can be implanted into the brain to capture the electric signals from the brain.

The computer will then translate these electric signals to the appropriate actions by applying signal processing algorithms.
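One classical way such a translation can work is a population-vector decoder, in which each recorded neuron is assumed to be cosine-tuned to a preferred movement direction and "votes" along that direction. This is an illustrative textbook method, not the specific algorithm of any system described here:

```python
import numpy as np

# Hypothetical cosine-tuned population: each neuron fires fastest for one
# "preferred" movement direction (evenly spaced around the circle here).
n_neurons = 32
preferred = np.linspace(0.0, 2.0 * np.pi, n_neurons, endpoint=False)

def firing_rates(movement_angle, baseline=10.0, gain=5.0):
    """Firing rate of every neuron for an intended movement direction."""
    return baseline + gain * np.cos(movement_angle - preferred)

def decode_direction(rates, baseline=10.0):
    """Population-vector decode: each neuron votes along its preferred
    direction, weighted by how far its rate sits above baseline."""
    weights = rates - baseline
    x = float(np.sum(weights * np.cos(preferred)))
    y = float(np.sum(weights * np.sin(preferred)))
    return float(np.arctan2(y, x))
```

Feeding the decoded angle into a cursor or robotic-arm controller closes the loop from electrical signals to action.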

The first human to be implanted with a BCI chip was Matthew Nagle, on June 22, 2004.

Nagle was a tetraplegic, paralyzed from the neck down after being stabbed. John Donoghue, a Brown University professor, and his team implanted an electrode in Nagle's brain that allowed him to open his e-mail, control a computer mouse and draw on the screen.

He could also send commands to an external prosthetic hand. The device was removed from him after a year.

The brain implant system, or BrainGate, uses a neural interface with a wired connection running out of a metal nub embedded in the skull. A new team, called BrainGate 2, is working on a wireless interface that would use the same sensors embedded in the brain as the original BrainGate but would use a laser to transmit the data from the brain.

BrainGate 2 is a collaboration among engineers and doctors from Brown, Harvard, and Stanford.

While the BrainGate system uses implanted electrodes, another team from the University of Maryland is looking at a brain cap that will perform the same functions as the implanted electrode but without the invasive procedure.

"We are on track to develop, test and make available to the public -- within the next few years -- a safe, reliable, noninvasive brain computer interface that can bring life-changing technology to millions of people whose ability to move has been diminished due to paralysis, stroke or other injury or illness," said Associate Professor of Kinesiology Jose Contreras-Vidal of the university's School of Public Health.

Monday, October 10, 2011

First two-way interaction between a primate brain and a virtual body

In a first-ever demonstration of a two-way interaction between a primate brain and a virtual body, two monkeys trained at the Duke University Center for Neuroengineering learned to employ brain activity alone to move an avatar hand and identify the texture of virtual objects.

“Someday in the near future, quadriplegic patients will take advantage of this technology not only to move their arms and hands and to walk again, but also to sense the texture of objects placed in their hands, or experience the nuances of the terrain on which they stroll with the help of a wearable robotic exoskeleton,” said study leader Miguel Nicolelis, MD, PhD, professor of neurobiology at Duke University Medical Center and co-director of the Duke Center for Neuroengineering.

Without moving any part of their real bodies, the monkeys used their electrical brain activity to direct the virtual hands of an avatar to the surface of virtual objects and, upon contact, were able to differentiate their textures.

Although the virtual objects employed in this study were visually identical, they were designed to have different artificial textures that could only be detected if the animals explored them with virtual hands controlled directly by their brain’s electrical activity.

The texture of the virtual objects was expressed as a pattern of minute electrical signals transmitted to the monkeys’ brains. Three different electrical patterns corresponded to each of three different object textures.
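A toy illustration of the idea of a distinct stimulation pattern per texture (the burst frequencies and texture names below are invented for illustration; the study's actual patterns differed):

```python
# Hypothetical pulse-train parameters: one burst rhythm per virtual texture,
# so the brain can tell the three apart by temporal pattern alone.
TEXTURE_PATTERNS = {
    "coarse": {"pulse_hz": 200, "burst_hz": 4},   # slow bursts
    "medium": {"pulse_hz": 200, "burst_hz": 8},
    "fine":   {"pulse_hz": 200, "burst_hz": 16},  # rapid bursts
}

def burst_times(texture, duration_s=1.0):
    """Onset times (seconds) of stimulation bursts for a given virtual texture."""
    period = 1.0 / TEXTURE_PATTERNS[texture]["burst_hz"]
    times, t = [], 0.0
    while t < duration_s:
        times.append(round(t, 6))
        t += period
    return times
```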

Because no part of the animal’s real body was involved in the operation of this brain-machine-brain interface, these experiments suggest that in the future patients severely paralyzed due to a spinal cord lesion may take advantage of this technology, not only to regain mobility, but also to have their sense of touch restored, said Nicolelis, who was senior author of the study published in the journal Nature on October 5, 2011.

“This is the first demonstration of a brain-machine-brain interface (BMBI) that establishes a direct, bidirectional link between a brain and a virtual body,” Nicolelis said.

“In this BMBI, the virtual body is controlled directly by the animal’s brain activity, while its virtual hand generates tactile feedback information that is signaled via direct electrical microstimulation of another region of the animal’s cortex.”

“We hope that in the next few years this technology could help to restore a more autonomous life to many patients who are currently locked in without being able to move or experience any tactile sensation of the surrounding world,” Nicolelis said.

“This is also the first time we’ve observed a brain controlling a virtual arm that explores objects while the brain simultaneously receives electrical feedback signals that describe the fine texture of objects ‘touched’ by the monkey’s newly acquired virtual hand,” Nicolelis said.

Read more of this article here

Monday, April 4, 2011

Electronic components made from human blood used in cyborg interfaces

Could electronic components made from human blood be the key to creating cyborg interfaces?

Circuitry that links human tissues and nerve cells directly to an electronic device, such as a robotic limb or artificial eye might one day be possible thanks to the development of biological components.

Writing in the International Journal of Medical Engineering and Informatics, a team in India describes how a "memristor" can be made using human blood.

Memristors were a theoretical electronic component first suggested in 1971 by Berkeley electrical engineer Leon Chua and finally developed in the laboratory by scientists at Hewlett Packard using titanium dioxide in 2008.

A memristor is a passive device, like a resistor, with two terminals but rather than having a fixed electrical resistance, its ability to carry a current changes depending on the voltage applied previously; it retains a memory of the current, in other words.
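The voltage-history-dependent resistance can be illustrated with a toy simulation (a simplified state-variable model with made-up constants, not the HP or blood-memristor device equations):

```python
def step_memristor(state, voltage, dt=1e-3, r_on=100.0, r_off=1600.0, k=10.0):
    """Advance a toy memristor one time step.

    `state` in [0, 1] tracks how much charge has passed; resistance
    interpolates between r_off and r_on, so past voltage changes the
    current the device carries now (the "memory" effect).
    """
    resistance = r_off - (r_off - r_on) * state
    current = voltage / resistance
    state = min(1.0, max(0.0, state + k * current * dt))
    return state, resistance

# Apply a steady positive voltage: resistance drops as charge accumulates.
state, first_r, last_r = 0.0, None, None
for _ in range(1000):
    state, r = step_memristor(state, 1.0)
    if first_r is None:
        first_r = r
    last_r = r
```

Reversing the voltage polarity would push `state` back down and raise the resistance again, which is the behavior the blood device exhibited.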

There are countless patents linking the development of memristors to applications in programmable logic circuits, as components of future transistors, in signal processing and in neural networks. S.P. Kosta of the Education Campus Changa in Gujarat and colleagues have now explored the possibility of creating a liquid memristor from human blood.

In parallel work they are investigating diodes and capacitors composed of liquid human tissues.

They constructed the laboratory-based biological memristor from a 10 ml test tube filled with human blood held at 37 °C, into which two electrodes were inserted; appropriate measuring instrumentation was attached.

The experimental memristor shows that resistance varies with applied voltage polarity and magnitude and this memory effect is sustained for at least five minutes in the device.

Having demonstrated memristor behavior in blood, the next step was to test that the same behavior would be observed in a device through which blood is flowing. This step was also successful.

The next stage will be to develop a micro-channel version of the flow memristor device and to integrate several to carry out particular logic functions. This research is still a long way from an electronic to biological interface, but bodes well for the development of such devices in the future.

"Human blood liquid memristor" in Int. J. Medical Engineering and Informatics, 2011, 3, 16-29

Thursday, May 14, 2009

Female(?) Robot asks for Directions



Robots are getting better at finding their way around unknown areas, and making their own maps as they explore. But robots lost in urban areas don't need to rely on their own faculties to get from place to place, German roboticists have shown.

Their mobile robot simply rolls up to any humans nearby and asks for directions. By using that strategy, their robot has become one of the first to be properly let loose in the real world, not just carefully controlled environments.

Martin Buss's team at the Technical University of Munich dumped their mobile robot outside the university and instructed it to find its way to the Marienplatz in the centre of Munich, some 1.5 kilometres away.

Wednesday, May 6, 2009

Maschinenmensch - The hunt for Singularity

This fictional robot, known as the Maschinenmensch or "false Maria", featured in Fritz Lang's 1927 film Metropolis (Image: Everett Collection / Rex)

Vernor Vinge

Although Kurzweil is the public face of the singularity today, Vernor Vinge coined the term.

He was inspired by a monograph written in 1964 by the statistician and Bletchley Park code-breaker Irving John Good entitled "Speculations concerning the first ultra-intelligent machine". Good argued that for humanity to survive, we must create a machine more intelligent than ourselves. Such a machine would be able to continually improve itself, becoming more and more intelligent, essentially without limit.

The idea was developed by Vinge, a computer scientist and science fiction author, in an essay written in 1993 called "The coming technological singularity". He argued that the singularity would occur in the mid-21st century, and that barring civilisation-wide disasters it is inevitable.

Vinge has also explored the idea in a number of science fiction novels.

Statistician Irving John Good (1965) speculated on the consequences of machines smarter than humans:

"Let an ultra-intelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultra-intelligent machine could design even better machines; there would then unquestionably be an ‘intelligence explosion,’ and the intelligence of man would be left far behind. Thus the first ultra-intelligent machine is the last invention that man need ever make."

Wednesday, April 22, 2009

Go Badger!


Adam Wilson, a University of Wisconsin-Madison biomedical engineering doctoral student, is among a growing group of researchers worldwide who aim to perfect a communication system for users whose bodies do not work, but whose brains function normally. Among those are people who have ALS, brain-stem stroke or high spinal cord injury.