Saturday, 31 October 2015

ECCI



The robot in first place is one of the most advanced in the world, and also one of the spookiest. Complete with artificial limbs and tendons made of plastic, this robot even has its own brain that can learn from its mistakes and correct them. Much like the Terminator, this kind of skeleton robot is very real; let's hope it never turns on us and attacks, of course.

Ecci is the first robot with a network of muscles and tendons, as well as a skeleton that helps it move like a human. Everything is made of a specially developed plastic.
It also has a brain that is capable of correcting its mistakes.

Developed by a team of scientists at the University of Zurich, Ecci takes its name from Eccerobot; "ecce" is the Latin word for "behold".

The robot uses electric motors to drive an interconnected series of tendons. A computer serves as its brain, so that it is able to learn from its mistakes.

For example, if a movement makes it stumble or drop an item, that information is stored and analyzed in the brain so the same mistake is not repeated.

Scientists hope this robot will start a new generation of robots whose technology could be used for artificial limbs in the future.

Rolf Pfeifer, director of the laboratory for artificial intelligence at the University, said, "This opens up many opportunities, but in particular it will help us understand better how the complicated human moving apparatus works. If we can make a robot hand work like a human hand, then it opens up all kinds of possibilities for artificial limbs. It also means a robot that moves like a human could take over some of the jobs that require human hands."

Scientists have been working on the multi-million-pound project for three years, with funding provided partly by private companies and two million euros from EU funds.

It looks like a stripped-down version of Star Wars character C-3PO.
But this robot is science fact not fiction - and one of the most advanced in the world.
Ecci, as it has been named, is the first ever robot to have 'muscles' and 'tendons', as well as the 'bones' they help move. All made of a specially developed plastic.
And most advanced of all, it also has a brain with the ability to correct its mistakes, a trait previously only seen in humans.

If, for example, a movement is causing him to stumble or drop something, the information is studied and analysed to avoid making the same mistake next time.
The creation also has the same vision capacity as humans, despite having only one cyclops-style eye.
The scientists now hope their creation will usher in a whole new generation for robots and could aid development of artificial limbs.

Rolf Pfeifer, who is director of the laboratory for artificial intelligence at the University, said 'It opens up a lot of possibilities but in particular it will help us to understand better how the human moving apparatus works, a complicated task.

'If we can make a robot hand operate like ours then it opens up all sorts of possibilities for artificial limbs. It would also mean a robot that moved like a person could take over some of the jobs done by people where human hands are needed.'
Scientists have worked on the multi million pound project for three years, with funding provided partially by private enterprise alongside two million euros from EU funds.
The team now plans to present a more complete version of Ecci in two months' time.



ASIMO HONDA



The rapid development of AI (artificial intelligence) in recent years has become a major trigger for a "race" to build intelligent robots. Honda, as one of the leading companies, expresses this philosophy through its robotics product, named ASIMO. Through in-depth research, this robot has come close to perfection. Under the theme "The Power Of Dreams", Honda has given its robotics experts a wonderful piece of imagination to turn into reality.

Background: the birth of ASIMO
It has been noted that the dream of building robots has existed for some 7,000 years, but it was only realized in the early 18th century, when the Frenchman Jacques de Vaucanson built three pioneering humanoid automata. Honda turned that same idea into ASIMO, which was launched on the same principle: a robot that can do the same things humans do.
ASIMO's development took 20 years to reach its current form. It began with the curiosity of Honda's engineers, who dreamed about and imagined how a human-like robot could possibly be made. To realize this, the scientists started by studying the way humans walk and building robotic legs first. From the 13 prototypes they made, they finally arrived at an answer to how humans walk.
ASIMO's development began with the E0 model, which appeared in 1986. It was followed by the E1, E2, E3, E4, E5 and E6 models, which were then refined into the P1, P2 and P3 types. Further development from there led to the finished ASIMO we know today.
ASIMO's small, child-like form was a deliberate choice by its designers. That shape lets ASIMO operate freely in human environments and creates a people-friendly impression. Even at that size, ASIMO can operate light switches, open door knobs, and work at desks and tables. Its height is designed to match a human's eye level when sitting in a chair.
ASIMO stands about 120 cm tall, a height Honda's scientists consider appropriate for a robot that is active in a real human environment.
ASIMO's development kept moving toward perfection. On December 13, 2005 the Japanese manufacturer launched a new ASIMO humanoid robot with the ability to adapt to real-life environments such as offices. The new ASIMO can also interact more with humans, for example walking hand in hand with a person, and can walk while carrying objects.
All of this is made possible by its "total control system". The system also lets ASIMO deliver and store important information and provide delivery services. Beyond that, the new ASIMO is designed for higher performance, running at speeds of up to 6 km/h and able to run in a circular pattern. It is also equipped with an "IC Tele-interaction Communication Card" that serves as an identity tag: when someone carries the card, ASIMO can identify and locate that person anywhere around it, through a full 360 degrees.
With this launch, Honda succeeded in developing a humanoid robot that is real and usable (a truly useful humanoid robot). Nevertheless, Honda will continue, more actively and with more focus, to develop the field of AI (artificial intelligence).
The development of ASIMO

E0 (1986)
This was the first ASIMO model, consisting only of a pair of robotic legs; it embodied the Honda scientists' dream of making a humanoid robot.
The robot could only walk slowly, because at the time it was found that balance could be kept only by stepping one leg at a time and shifting the body's center of gravity over the supporting foot so it would not fall. Each step took about 5 seconds, after which the other foot followed.

E1 (1987-1991)
This model improved on E0: E1 was able to walk continuously at a speed of 0.25 km/h, with a clearly defined motion for each leg, so that its balance was assured.

E2 (1987-1991)
This model pioneered a gait that broadly resembles a human's, at a speed of 1.2 km/h. E2 already walked with a dynamic gait on flat surfaces, with its balance point at the middle of the robot's body.

E3 (1987-1991)
This model was a further development of E2: the robot's gait became more advanced and already matched a normal human walking speed of about 3 km/h.

E4 (1991-1993)
With this model the physical form began to change, unlike the three previous models where only the way of walking changed: the knee was lengthened to 40 cm, which raised the walking speed to 4.7 km/h.

E5 (1991-1993)
This model used autonomous propulsion mounted on legs, with a body shaped somewhat like a train locomotive. In this form the robot's "head" became very large.

E6 (1991-1993)
This model achieved the balance control needed for the robot to climb or descend stairs and to cope with slopes. In this model the robot could also step over obstacles in front of it.

Through the development of the three models above, Honda found a way for the robot to walk stably and identified three control techniques, namely:

  1. Floor Reaction Control
  2. Target ZMP Control
  3. Foot Planting Location Control

This walking mechanism was successfully applied to the E5 robot, which was stable and able to walk on two legs alternately, whether on straight paths, inclines, stairs, or slopes.
From there, Honda began to focus on developing the robot into a humanoid form (a humanoid robot).

P1 (1993-1997)
This was a development of the E models into a humanoid form. P1 was the first model with a human shape, having arms and a body.
It stood 1,915 mm tall and weighed 175 kg, and could switch computers and other electrical equipment on and off, open doors, and pick up and carry objects. Research at this stage worked out how to coordinate the movements of the arms and legs.

P2 (1993-1997)
This was the first model whose surprisingly natural movements charmed the public. It is also called the world's first humanoid robot able to manage its systems independently.
It stood 1,820 mm tall and weighed 210 kg, and used wireless technology. Its body housed a computer, motors, a wireless radio, and other equipment. The robot was able to move freely, climb and descend stairs, push a cart, and do other things without any cables or wires connected to it, because it used wireless technology.

P3 (1993-1997)
This model changed only the size and weight, producing a humanoid robot that is interesting and pleasant to look at. It became the first complete, untethered robot to walk freely on two legs.
It stands 1,600 mm tall and weighs 130 kg. The height and weight could be reduced by changing the component materials and reworking the control system. This size is considered better suited to operating in a real human environment.

ASIMO / New ASIMO (2000-present)
This form is regarded as the final form of Honda's robot, more human-friendly and smoother than before. The evolutionary improvements in shape, size, weight and gait make this robot model very attractive to experts.
The advantages of this model are:

  1. Neat and lightweight design
  2. More intelligent walking technology
  3. Knees that are free to move
  4. A beautiful, simple evolution of the earlier models
  5. A more human-friendly design

Recognition technology
In the 2000s model of ASIMO, Honda has added many features that enable ASIMO to interact better with humans. These features fall under 5 categories:

1. Recognition of moving objects
Using visual information from the camera mounted in its head, ASIMO can detect the movement of multiple objects and judge their distance and direction. Common applications of this feature include following a person's movements with its camera, following a person, or greeting a person as he or she approaches (a rough sketch of this idea is given after this list).

2. Recognition of posture and movement
ASIMO can also interpret the positioning and movement of a hand, as well as recognizing postures and movements. It can react to and be directed by not only voice commands but also a wide range of natural human movements. This makes it possible, for example, to recognize when a handshake is offered or when someone waves, so that ASIMO can respond appropriately. It can also recognize indicated directions, such as pointing.

3. Recognition of the environment
ASIMO can recognize the objects and situations in its environment and act in a way that is safe for itself and for people nearby. For example, it recognizes potential hazards such as stairs, and avoids hitting humans or other moving objects.

4. Distinguishing sounds
ASIMO can identify the source of a sound and distinguish between human voices and other sounds. It can respond when its name is called, face people when it is spoken to, and recognize unusual sounds, such as a falling object or a collision, and turn its face towards them. ASIMO can also answer questions, either with a brief nod, a shake of its head, or a verbal response.

5. Face recognition
ASIMO has the ability to recognize faces, even when ASIMO or the person is moving. It can individually recognize approximately 10 different faces and address each person by name once the face has been registered.
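
To make the idea in point 1 concrete, here is a minimal, hypothetical sketch in Python (using OpenCV) of how a robot could detect a moving object with a single head camera and judge its rough direction and distance: frame differencing finds the motion, and a pinhole-camera approximation estimates distance. Honda has not published ASIMO's actual vision code, so the object width and focal length below are assumptions.

# Illustrative sketch only: Honda has not published ASIMO's vision code.
# It shows one simple way a robot could detect a moving object with a
# single head camera and judge its rough direction and distance.
import cv2

KNOWN_WIDTH_M = 0.5      # assumed real-world width of a tracked object
FOCAL_LENGTH_PX = 600.0  # assumed camera focal length in pixels

def track_motion(camera_index=0):
    cap = cv2.VideoCapture(camera_index)
    ok, prev = cap.read()
    if not ok:
        raise RuntimeError("camera not available")
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Frame differencing highlights pixels that changed between frames.
        diff = cv2.absdiff(prev_gray, gray)
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)  # OpenCV 4 API
        for c in contours:
            if cv2.contourArea(c) < 500:      # ignore small noise
                continue
            x, y, w, h = cv2.boundingRect(c)
            # Direction: offset of the object's centre from the image centre.
            offset = (x + w / 2) - frame.shape[1] / 2
            direction = "left" if offset < 0 else "right"
            # Very rough distance estimate from apparent size (pinhole model).
            distance_m = KNOWN_WIDTH_M * FOCAL_LENGTH_PX / w
            print(f"moving object to the {direction}, ~{distance_m:.1f} m away")
        prev_gray = gray
    cap.release()

A real system would fuse this with stereo depth and tracking, but the sketch captures the basic detect-then-judge loop described above.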

Human Support Robot



Toyota has unveiled a new assistant robot designed to help the disabled live more independently. Called the Human Support Robot (HSR), it represents the latest initiative in Toyota's Partner Robot program and is intended to help out around the home by fetching things, opening curtains, and picking up objects that have fallen to the floor.

The HSR can be controlled using a simple graphical user interface via tablet PC. It can also wear a tablet atop its head, which would allow caregivers and family members to communicate with the robot's owner over Skype or other services. But unlike recent telepresence robots including the recently announced iRobot RP-VITA, the HSR has an arm and gripper for doing the simple tasks we often take for granted.

The robot is able to pick stuff up from the floor or atop tables and high counters thanks to its telescopic body, which gives it a height of 2.7 to 4.3 feet, and an arm length of 2.5 feet. When not in use, the robot's single arm is designed to fold in tightly to reduce its body's overall diameter to just 14.5 inches (an important factor when maneuvering in compact Japanese homes). The robot's arm uses little power and moves slowly to prevent accidents and injuries.

Technical specs are still a bit unclear, but the robot weighs 70 lbs and is capable of holding objects that weigh up to 2.6 lbs with its simple two-fingered gripper. Designed for use indoors, the robot travels at a max speed of 1.8 miles per hour. It can overcome bumps in the floor up to 0.3 inches (enough to traverse from hardwood to carpet) and can climb slopes up to 5 degrees. Although not specifically mentioned in the press release, the robot appears to have both a PrimeSense-style depth sensor (as used in the Microsoft Kinect) and stereo cameras in its head, which would allow it to sense depth and visually identify people and objects.

Toyota has been testing the HSR with the cooperation of the Yokohama Rehabilitation Center since 2011, with patients providing feedback on the robot's design. The company will be demonstrating the robot to the public from 26-28 September at Tokyo Big Sight as part of the "bleeding edge development of health care equipment" project. No word on the expected price of the robot (or its battery life), but given that Japanese public health insurance will cover 90% of associated costs (a law designed specifically for robot technology that was passed recently), it seems HSR will have a decent shot at becoming a real consumer product, though it may take another couple of years of development.

A robot that can fetch and carry could be used to help care for sick and elderly people in their own homes.

Toyota, the Japanese car manufacturer, has developed the Human Support Robot, or HSR for short, so it will pick up anything it is told to using a mechanical arm that can grasp objects.

The robot also has a suction cup to help it pick up smaller, more delicate items, such as medication or single sheets of paper.

The Human Support Robot, shown above, has a single mechanical arm that can grasp and pick up objects like a TV remote control, bottles of water and medication, and can even carry family photographs to bedridden patients. Engineers behind the technology hope it could be used to help disabled patients and the elderly in their homes.

ROBOBEAR CAN CARRY PATIENTS

A cute robot with the face of a teddy bear could be the home-carer of the future.


The 'Robear' has a cub-like face but packs enough strength to transfer patients from a wheelchair or a floor-level bed to a bath, for example.

It weighs 309lb (140kg) with extending legs that stop the 'bear' from falling over, and it moves slowly and smoothly thanks to advanced actuators in its mechanical arms.

Robear was developed by the Riken-SRK Collaboration Centre for Human-Interactive Robot Research in Nagoya, Japan.

Robear is capable of lifting a 12 stone 8lb (80kg) person and its built-in sensors detect a person's weight so the arms know how much force is needed.

Kouichi Ikeda, the lead engineer on the project, said it could be particularly useful for people with spinal injuries, back problems and conditions such as arthritis who struggle to stoop down and pick things up.

He said: 'Although it can only do one simple task of picking up, it's already making disabled people quite happy.

'We're just getting started, but eventually we want it to enter people's homes.'

Toyota was showing off the robot, which is 4 feet 4 inches tall (135cm), at a welfare and nursing technology exhibition in Yokohama, south of Tokyo.

The HSR uses several cameras to help it navigate around a room and detect objects to pick them up. Toyota put two of the cameras on the head to make it look like it has eyes.

The robot is controlled using a tablet or smartphone: users simply tap on an object in the video feed from the robot's eyes to tell it what to retrieve.
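
As an illustration of that tap-to-retrieve flow, here is a small, hypothetical Python sketch: the tablet reports the tapped pixel, the robot looks up the depth at that pixel, back-projects it to a 3D point with the pinhole camera model, and then calls grasp and delivery routines. None of the function or method names below are Toyota's real HSR interface; they only stand in for the steps described above, and the camera intrinsics are assumed values.

# Hypothetical sketch of a "tap to retrieve" flow like the one described above.
# None of these names are Toyota's real HSR API; they stand in for the steps:
# tap pixel -> depth lookup -> 3D point -> grasp command.

def pixel_to_point(u, v, depth_m, fx=525.0, fy=525.0, cx=320.0, cy=240.0):
    """Back-project an image pixel and its depth into a 3D camera-frame point
    using the pinhole camera model (intrinsics here are assumed values)."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)

def on_user_tap(u, v, depth_image, robot):
    """Called when the user taps pixel (u, v) on the tablet video feed."""
    depth_m = depth_image[v][u]          # depth reading at the tapped pixel
    if depth_m <= 0:
        return "no depth data at that point, please tap again"
    target = pixel_to_point(u, v, depth_m)
    robot.navigate_near(target)          # hypothetical: drive close to the object
    robot.grasp_at(target)               # hypothetical: plan and close the gripper
    robot.deliver_to_user()              # hypothetical: bring the object back
    return f"fetching object at {target}"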

A screen on top of the head can allow video calls, meaning a nurse in a remote location can check in on a patient without having to be there. They can even control the robot to give a patient their medication.

HSR has soft rubber grips and bumpers to ensure it does not hurt people in the room or damage any furniture.

Currently it can pick up items weighing around 1.2kg (2.6lbs), including bottles of water, television remote controls, food and photographs to bring them closer to patients.

Toyota first developed the basic concept model for HSR in 2012 but a new revamped model is now due to be released to universities and research facilities next year.

With a maximum speed of just half a mile per hour (0.8 km/h), it scoots around on wheels.

Partner Robot Family
Partner Robot, as its name suggests, is a robot that assists people with a combination of caring and intelligence.

Toyota has been working to develop commercially viable "partner robot" by building on its expertise in the field of industrial robotics and applying cutting-edge technology from areas such as the automotive industry and IT.

Personal Assist Robot
Human Support Robot(HSR)
Concept
The Human Support Robot (HSR) is being developed to assist people in their everyday activities. In the future, the HSR will coexist with family members in the home, providing support to improve living conditions and the overall quality of life.

Features
Three key features make it possible for the HSR to operate in indoor environments around people.

1. Compact, Lightweight Body
To better accommodate a wide variety of households, the HSR needs to be lightweight and maneuverable. An articulated arm and telescoping body allow the HSR to cover a large workspace despite its compact footprint.

2. Safe Interaction
Realizing that contact between human and robot is an essential aspect of support in domestic situations, a safety-conscious design was a top priority for the HSR. The robot's arm uses little power and moves slowly to prevent accidents and injuries. Obstacle avoidance and collision detection help the HSR to operate safely in a human-centric environment.

3. Simple Interface
HSR can be controlled intuitively through voice commands or a simple graphical user interface via any number of common handheld touchscreen-enabled devices, such as tablet PCs and smartphones.

Functions
HSR has three basic modes: Pick-up, Fetch, and Manual Control.

Pick-up
The arm has a simple gripper to pick up objects such as pens and TV remotes, while thinner hard-to-grasp objects like paper or cards can be lifted off the floor using a small vacuum installed in the hand.

Fetch
Using voice commands or the touch-screen GUI, the user can command the robot to retrieve objects from boxes and shelves by simply specifying what to fetch.

Manual Control
Tasks that are currently beyond the scope of HSR's autonomous capability can be performed manually via the user interface. Manual control is also useful for remote operation ("telepresence"), which would allow caregivers and family members to communicate with the robot's owner over Skype or other services, by means of a display on top of its head.

Technology
1. Folding Arm
HSR is intended to help out around the home by fetching things, opening curtains, and picking up fallen objects. Along with a telescoping body, the robot's single arm can extend to pick stuff up from the floor or atop tables and high counters. When not in use, the arm is designed to fold in tightly to reduce its body's overall diameter.

2. Flexible Hand
Attached to the HSR's arm is a two-finger gripper, which is soft to the touch. This flexible hand conforms to the shape of objects it grasps, and includes a pressurized suction pad to lift thin items such as cards or paper.

3. Object Recognition & Grasp Planning
Object recognition algorithms allow HSR to understand the size and shape of items it is tasked to pick up or grasp. This information is used to compute an appropriate path for the arm and position for the hand.
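
As a rough illustration (not Toyota's actual algorithm), the sketch below shows how an object's estimated size and shape could be turned into a simple grasp decision: take the bounding box of the object's 3D points, grasp across the narrower horizontal axis, and fall back to the suction pad if the object is wider than the gripper can open. The gripper opening limit is an assumed value.

# Illustrative sketch only (not Toyota's algorithm): turning a recognized
# object's size and shape into a simple grasp decision, as described above.
import numpy as np

GRIPPER_MAX_OPENING_M = 0.08   # assumed maximum opening of a two-finger gripper

def plan_grasp(object_points):
    """object_points: Nx3 array of 3D points belonging to the object."""
    pts = np.asarray(object_points, dtype=float)
    mins, maxs = pts.min(axis=0), pts.max(axis=0)
    size = maxs - mins                      # bounding-box extent (x, y, z)
    center = (mins + maxs) / 2.0
    # Grasp across the narrower of the two horizontal axes so the fingers
    # need to open as little as possible.
    grasp_axis = 0 if size[0] < size[1] else 1
    opening = float(size[grasp_axis])
    if opening > GRIPPER_MAX_OPENING_M:
        return None                         # too wide: fall back to the suction pad
    # Approach from above, hovering 5 cm over the top of the object.
    pre_grasp = center + np.array([0.0, 0.0, size[2] / 2 + 0.05])
    return {"target": center.tolist(),
            "pre_grasp": pre_grasp.tolist(),
            "opening_m": opening}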

4. Environment Recognition and Autonomous Mobility
Onboard sensors keep HSR apprised of its surroundings, so that it can safely navigate inside the home, avoiding obstacles while continuing along the optimum route to its instructed destination.
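
The following is a minimal sketch of the kind of route planning this implies, assuming the robot keeps a 2D occupancy grid of its surroundings: A* search finds a path to the instructed destination while treating occupied cells as obstacles. This is only an illustration, not Toyota's navigation stack.

# Illustrative sketch (not Toyota's navigation stack): planning a route to a
# destination on a 2D occupancy grid while avoiding obstacles.
from heapq import heappush, heappop

def plan_route(grid, start, goal):
    """grid[r][c] == 1 means an obstacle; returns a list of cells or None.
    A* search with a Manhattan-distance heuristic."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    open_set = [(h(start), 0, start, [start])]
    visited = set()
    while open_set:
        _, cost, cell, path = heappop(open_set)
        if cell == goal:
            return path
        if cell in visited:
            continue
        visited.add(cell)
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nxt = (nr, nc)
                heappush(open_set, (cost + 1 + h(nxt), cost + 1, nxt, path + [nxt]))
    return None  # no safe route found

# Example: a small room map where the robot routes around a wall segment.
room = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0]]
print(plan_route(room, (0, 0), (2, 3)))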

5. Remote Functions
Family members and caregivers can access and operate HSR using a network enabled device to perform the following tasks:

  • Remote Control: Perform household tasks (retrieve objects, open curtains, etc.)
  • Remote Monitoring: Watch over a disabled family member or check in on an empty house
  • Remote Communication: Video chat with family members ("telepresence")

Currently, this functionality is limited to use on a local network, but access from remote locations will be available in the near future.
HSR will keep family, friends, and society connected.

Social Contribution
TOYOTA has developed the HSR prototype to assist independent home living for persons with limited arm or leg mobility. Aiming to improve quality of life, TOYOTA has developed the HSR prototype in cooperation with the Japan Service Dog Association to identify the needs and desires of individuals with limited limb mobility, and developed functions focused around picking up dropped objects, retrieving items, and communicating with family members and caregivers.

In 2011, TOYOTA conducted in-home trials using the robot with individuals with limb disabilities in cooperation with the Foundation for Yokohama Rehabilitation Service and incorporated user feedback into development.

Additionally, in response to the aging of Japan's population, TOYOTA will collaborate with research organizations such as universities as well as persons involved in nursing and healthcare to research and develop new functions for the HSR such as remote monitoring and assistance with the aim of practical application in the field of care for the elderly.


Friday, 30 October 2015

MH-2



About The MH2 Walking Robot

The MH2 hexapod robot is designed around a simple mechanical leg design with all metal brackets. This leg design minimizes the number of parts required to make a two DOF (degree of freedom) leg and allows this robot to be steered like a tank. Forward, reverse and in place turning is supported. The robot uses standard sized Hitec servos for the legs. The combo kit includes everything you need to make a functional robot except batteries and optional PS2 controller.

The Mechanics
The body of the robot is made from ultra-tough laser-cut PVC structural components while the legs are made from high-quality aluminum Servo Erector Set brackets. The MH2 includes 12 servos; six HS-422 and six 645MG. The robot is able to adjust its height, walk forward and in reverse, rotate clockwise and counter-clockwise, tilt forward and backward and left and right, and more.

Controlling the Robot
There are three control options for the MH2. The first combo kit, the MH2, comes with the SSC-32 and the Bot Board II + BASIC Atom Pro 28. This kit is programmed using Basic Micro Studio. The second kit, the MH2U, includes the SSC-32 and a BotBoarduino, and is intended to be programmed using the popular Arduino software. By offloading the servo pulse generation and sequence movement timing to the SSC-32, the microcontroller has plenty of power to do some really cool things. The sample code allows the robot to walk using the PS2 controller (sold separately).

The third kit includes a full version of FlowBotics Studio and includes the SSC-32 as well as a serial-to-Bluetooth board and Bluetooth module, which allows for remote control from the computer. This approach offloads the higher-level calculations to the computer. FlowBotics Studio includes a demo project for the MH2 with pre-written walking algorithms.
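
Because the SSC-32 accepts plain text position commands over a serial link, a host computer (for example over the Bluetooth serial board mentioned above) can drive the legs with very little code. The sketch below, in Python with the pyserial library, sends a group move in the SSC-32's commonly documented "#<channel> P<pulse> T<time>" format; the channel numbers, pulse widths, baud rate and port name are placeholders rather than the MH2's actual wiring, so treat it as a starting point only.

# Hedged sketch: driving leg servos through the SSC-32 from a PC over a serial
# (or Bluetooth-serial) link, using the SSC-32's text command format
# "#<channel> P<pulse_us> T<time_ms>". Channels, pulse widths and the port
# name below are placeholders, not the MH2's actual wiring.
import serial

def move_servos(port, positions, time_ms=500, baud=115200):
    """positions: dict of {channel: pulse_width_us}. Sends one group move so
    all servos reach their targets together after time_ms milliseconds."""
    cmd = "".join(f"#{ch} P{pw} " for ch, pw in positions.items())
    cmd += f"T{time_ms}\r"
    with serial.Serial(port, baud, timeout=1) as ssc32:
        ssc32.write(cmd.encode("ascii"))

if __name__ == "__main__":
    # Example: nudge two (hypothetical) hip servos toward a neutral stance.
    move_servos("/dev/ttyUSB0", {0: 1500, 1: 1500}, time_ms=750)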

You never have to be alone again. At least, that's the thinking behind the Japanese shoulder robot.
Created by Yuichi Tsumaki, Fumiaki Ono and Taisuke Tsukuda from Yamagata University in Japan, the MH-2 miniature humanoid robot perches on the shoulder like a bird and enables users to share experiences with friends and family who cannot be present in person.

No details have been given of the robots being outfitted in angel or devil costumes.
The MH-2 robot was recently introduced at the 2012 IEEE International Conference on Robotics and Automation. The device falls under the category of telepresence, a type of tech that enables users to feel as if they are actually present in real life.

The robot is designed to mimic human mannerisms and actions in order to make the connection between the real-life user and a digitally connected friend as realistic as possible.

The at-home friend uses a 360-degree immersive reality display in order to see through the robot's eyes. A Kinect-like device is needed for the at-home user to direct the robot's movements.

The robot has 20 degrees of freedom, or axes of rotation, throughout its form to simulate human actions and movements. Seven degrees of freedom are contained within each of the robot's arms, the same number as a human arm. The gadget has enough movement ability that it can even simulate breathing.

However, all that movement comes at a price. The gadget requires a bulky backpack in order to store the servomechanisms and motors necessary to send instructions to the robot's head, arms and body. Powered by 22 actuators, or motors, joint movements and rotations are controlled by the tug of a wire.

With its human-like movement ability, the MH-2 follows recent developments in the automation industry that are trending toward creating more lifelike robots. The ECCEROBOT, short for Embodied Cognition in a Compliantly Engineered Robot, is the world's first 'anthropomimetic' robot, meaning the robot copies inner human mechanism, rather than simply mimicking outward actions.

The ECCEROBOT may lead the pack of creepy robots, but the Japanese shoulder robot MH-2 is the only one you can cart around with you on your shoulder.

The MH-2 is a telepresence robot like no other we have seen, and believe us, we’ve seen our share of weird robots. This tiny humanoid figure is always there for you, perching on your shoulder, ready to be remotely inhabited by your friends. Conceived by the researchers at Yamagata University in Japan, MH-2 is designed to imitate human behavior accurately enough for you to feel like your friend is actually there with you.

The truth, however, is that this friend of yours is back at home, in the living room, making wild gestures in front of some sort of a motion capture set-up and watching the video captured in real time by the MH-2. Meanwhile, the robot is busy copying all these movements, flailing its limbs around and acting as a physical beacon of your friend’s engagement in the situation. The robot's expressive capabilities are impressive, with the arms having seven degrees of freedom (DoF), while the head has three DoF, and the body has two, plus one more dedicated to imitating breathing movements.

Although both the friend and the robot are guaranteed quite a workout, it’s you who needs to do most of the heavy lifting. To make sure your Miniature Humanoid’s movements are smooth, as many as 22 bulky actuators are required. You need to carry the servomotors in your backpack so that they can pull the strings attached to the robot's joints, causing it to move.

The researchers are already at work trying to make the whole package a little smaller and more convenient, but obviously the MH-2 has never been about convenience in the first place. It’s not a production prototype. It's a bold experiment in human-robot relations.

Powering Options
The robot is compatible with the following batteries and chargers.
  Chargers
    > NiCad & Ni-MH Universal Smart Charger (USC-01)
  Batteries
    > 6.0 Volt Ni-MH 1600mAh Battery Pack (BAT-03)
    > 6.0 Volt Ni-MH 2800mAh Battery Pack (BAT-05)

Important!
To keep costs down we are not providing printed Assembly Guides. They are provided online, so you will need to print them when you order the kits. By providing the Assembly Guides online we can provide more detailed and up to date information than the old hardcopy method allowed.

Thursday, 29 October 2015

Baxter



It sounds more like the name of a favorite pet dog, doesn't it? Its shy and loyal robot character is probably what earned it the name. But even though the face on its 12-inch monitor looks funny, Baxter is one of the most advanced robots in industry today. Built as a production pick-and-place robot, Baxter does not move around much, but thanks to its innovative technology it knows exactly where that bottle cap goes. Although the idea is not new, Baxter was built at a fraction of the price of a traditional industrial robot, so you can expect to see it doing useful work on all kinds of products soon.

For decades, manufacturers have had very few cost-effective options for handling low volume, high mix production jobs. No longer. Meet Baxter the safe, flexible, affordable alternative to outsourced labor and fixed automation. Leading companies across North America have already integrated Baxter into their workforce, and gained a competitive advantage for their business in the process.

Baxter is a proven solution for a wide range of tasks from line loading and machine tending, to packaging and material handling. If you walk the floor of your facility and see lightweight parts being handled near people, you’ve likely just found a great job for Baxter. This smart, collaborative robot is ready to get to work for your company doing the monotonous tasks that free up your skilled human labor to be exactly that.

Baxter Research Robot is a compelling addition to the world of Research and Education. With the same industry-tested hardware of our flagship robot Baxter that is revolutionizing the manufacturing world, Baxter Research Robot is providing a safe, affordable, robust platform for schools and labs to utilize in exciting ways. With market-leading value, a unique inherent safety system and the real-world relevance of the Baxter manufacturing solution, it is quickly becoming a must have tool for leading institutions around the globe.
Baxter Research Robot allows direct programming access to the system via a standard, open-source ROS API interface. Users can run custom programs from a connected development workstation, or locally through access to the on-board CPU. It is entirely safe to operate around humans without the need for safety cages or other guarding equipment, and comes with a suite of example programs, demonstration videos, online documentation and a robust user community to help new users get started quickly in developing new applications.

The Baxter Platform
SAFE, POWERFUL AND AFFORDABLE
For decades, robotics researchers and educators had been forced to make concessions between critical elements such as system cost, safety, range of functionality, and ease of use. In 2013, Rethink’s Baxter Research Robot changed all that and in doing so, changed how this community could leverage robotics in their work.

Robo Builder



A football field is probably not the best place for a robot to find a promising career, unless of course the robot was created specifically for it. It is not entirely clear whether it was the declining performance of human footballers or the inspiration of manga comics that led to the creation of these 3-foot-tall football-playing robots. Either way, the robots are pretty good at it and put on a real show, arguably better than a conventional football match. Imagine a relentless robot striker, an impenetrable robot defence and an electronic goalkeeper all on the same pitch; robot football does not get much better than this.

HAL Exoskeleton

   
The Hybrid Assistive Limb (also known as HAL) is a powered exoskeleton suit developed by Japan's Tsukuba University and the robotics company Cyberdyne. It has been designed to support and expand the physical capabilities of its users, particularly people with physical disabilities. There are two primary versions of the system: HAL 3, which only provides leg function, and HAL 5, which is a full-body exoskeleton for the arms, legs, and torso.

In 2011, Cyberdyne and Tsukuba University jointly announced that hospital trials of the full HAL suit would begin in 2012, with tests to continue until 2014 or 2015. By October 2012, HAL suits were in use by 130 different medical institutions across Japan. In February 2013, the HAL system became the first powered exoskeleton to receive global safety certification. In August 2013, HAL received EC certification for clinical use in Europe as the world's first non-surgical medical treatment robot. In addition to its medical applications, the HAL exoskeleton has been used in construction and disaster response work.

History
The first HAL prototype was proposed by Yoshiyuki Sankai, a professor at Tsukuba University. Fascinated with robots since he was in the third grade, Sankai had striven to make a robotic suit in order “to support humans.” In 1989, after receiving his Ph.D. in robotics, he began the development of HAL. Sankai spent three years, from 1990 to 1993, mapping out the neurons that govern leg movement. It took him and his team an additional four years to make a prototype of the hardware.

The third HAL prototype, developed in the early 2000s, was attached to a computer. Its battery alone weighed nearly 22 kilograms (49 lb) and required two helpers to put on, making it very impractical. By contrast, the later HAL-5 model weighs only 10 kilograms (22 lb) and has its battery and control computer strapped around the waist of the wearer.

Cyberdyne began renting the HAL suit out for medical purposes in 2008. By October 2012, over 300 HAL suits were in use by 130 medical facilities and nursing homes across Japan. The suit is available for institutional rental, in Japan only, for a monthly fee of US$2,000. In December 2012, Cyberdyne was certified ISO 13485 – an international quality standard for design and manufacture of medical devices – by Underwriters Laboratories. In late February 2013, the HAL suit received a global safety certificate, becoming the first powered exoskeleton to do so. In August 2013, the suit received an EC certificate, permitting its use for medical purposes in Europe as the first medical treatment robot of its kind.

Design and mechanics
When a person attempts to move their body, nerve signals are sent from the brain to the muscles through the motor neurons, moving the musculoskeletal system. When this happens, small biosignals can be detected on the surface of the skin. The HAL suit registers these signals through a sensor attached to the skin of the wearer. Based on the signals obtained, the power unit moves the joint to support and amplify the wearer's motion. The HAL suit possesses a cybernic control system consisting of both a user-activated “voluntary control system" known as Cybernic Voluntary Control (CVC) and a “robotic autonomous control system" known as Cybernic Autonomous Control (CAC) for automatic motion support.
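
To make the voluntary-control (CVC) idea more tangible, here is a minimal, hypothetical Python sketch: the raw skin-surface signal is rectified and smoothed into an envelope, and the envelope is scaled into an assist torque for the joint's power unit. This is not Cyberdyne's algorithm; the gain, noise threshold, and smoothing constant are invented for illustration.

# Minimal, hypothetical sketch of the voluntary-control idea described above:
# a surface biosignal is smoothed and scaled into an assist torque for the
# joint's power unit. This is NOT Cyberdyne's algorithm; the gain, threshold
# and filter constant are invented for illustration.

ASSIST_GAIN = 0.8        # Nm of assist per unit of processed signal (assumed)
NOISE_THRESHOLD = 0.05   # readings below this are treated as noise (assumed)
SMOOTHING = 0.2          # exponential smoothing factor (assumed)

class VoluntaryAssist:
    def __init__(self):
        self.envelope = 0.0

    def update(self, raw_biosignal):
        """Called every control cycle with the latest skin-surface reading.
        Returns the torque command for the joint's power unit."""
        rectified = abs(raw_biosignal)
        # An exponential moving average approximates the signal envelope.
        self.envelope += SMOOTHING * (rectified - self.envelope)
        if self.envelope < NOISE_THRESHOLD:
            return 0.0                      # no clear intent: no assist
        return ASSIST_GAIN * self.envelope  # assist in proportion to intent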

Users
HAL is designed to assist the disabled and elderly in their daily tasks, but can also be used to support workers with physically demanding jobs such as disaster rescue or construction. HAL is mainly used by disabled patients in hospitals, and can be modified so that patients can use it for longer-term rehabilitation. In addition, scientific studies have shown that, in combination with specially-created therapeutic games, powered exoskeletons like the HAL-5 can stimulate cognitive activities and help disabled children walk while playing.

During the 2011 Consumer Electronics Show, it was announced that the United States government had expressed interest in purchasing HAL suits. In March 2011, Cyberdyne presented a legs-only HAL version for the disabled, health care professionals and factory workers. In November 2011, HAL was selected to be used for cleanup work at the site of the Fukushima nuclear accident. During the Japan Robot Week exhibition in Tokyo in October 2012, a redesigned version of HAL was presented, designed specifically for the Fukushima cleanup. In March 2013, ten Japanese hospitals conducted clinical tests of the newer legs-only HAL system. In late 2014, HAL exoskeletons modified for construction use entered service with the Japanese construction contractor Obayashi Corporation.

April 16, 2009 Anyone who has seen Aliens will remember the exoskeleton forklift that Ripley wears to fight the alien queen at the end of the movie. Well, Japanese company Cyberdyne has unveiled a robotic suit built on a similar idea: a suit capable of augmenting human motion and strength. The Robot Suit Hybrid Assistive Limb (HAL for short) is a wearable robot that uses a "voluntary control system" first to interpret the wearer's intended movement and then to assist in it.

The suit's "voluntary control system" works by capturing bio-electrical signals detected on the surface of the skin, before the muscles actually move. The system analyzes these signals to determine how much power the wearer intends to generate and calculates how much power assist must be generated by which power units. The power units then generate the necessary torque and the limbs move. All this takes place a split second before the muscles start moving, allowing the relevant robotic joints to move in unison with the wearer’s muscles.

The suit also uses a "robotic autonomous control system" that provides human-like movement based on movements stored in a database. The movements, which are automatically updated based on information that sensors collect from the body, allow HAL to autonomously coordinate each motion. This means HAL can be used even if no bio-electrical signals are detected, due to problems, say, in the central nervous system or the muscles.
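
A hedged sketch of that fallback logic is shown below: when the biosignal envelope is too weak to be useful, the controller switches from the signal-driven assist to playing back a stored movement pattern. The pattern values and threshold are placeholders, not HAL's real data.

# Hedged sketch of the "autonomous control" fallback described above: when no
# usable biosignal is detected, play back a stored joint-angle pattern instead
# of the signal-driven assist. The pattern values are invented placeholders.

STORED_STEP_PATTERN = [0.0, 5.0, 12.0, 20.0, 12.0, 5.0, 0.0]  # knee angles, deg
SIGNAL_FLOOR = 0.05   # assumed level below which the biosignal is unusable

def control_step(biosignal_envelope, voluntary_torque, phase):
    """Choose between voluntary assist (CVC) and stored-pattern playback (CAC).
    Returns a (mode, command) pair: a torque for CVC, a target angle for CAC."""
    if biosignal_envelope >= SIGNAL_FLOOR:
        return ("CVC", voluntary_torque)            # follow the wearer's intent
    angle = STORED_STEP_PATTERN[phase % len(STORED_STEP_PATTERN)]
    return ("CAC", angle)                           # replay the stored movement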

The battery, worn on the back, provides about two hours and 40 minutes of continuous running time, although a newer battery promises more like five hours of use, assisting in daily activities such as standing up from a chair, walking, climbing up and down stairs or lifting heavy objects. At 1.6m tall, the suit weighs 23kg, but the wearer is not expected to carry the burden since the exoskeleton supports its own weight.

With the ability to multiply the wearer’s strength by a factor of between two and 10, depending on the type of robotic suit being worn, Cyberdyne expects HAL to be used in a range of ways and areas, such as rehabilitation and physical training support, helping disabled people, heavy lifting, and assisting in rescue at disaster sites. The company also sees potential in the entertainment industry – perhaps a cage match fight to the death between HAL and the alien queen, if anyone's game.

The Robot Suit HAL is only available to Japanese residents, although an office has been set up to introduce it to the European Union.


LS3 Quadruped



The Legged Squad Support System (LS3) is a DARPA project for a legged robot which could function autonomously as a packhorse for a squad of soldiers or marines. Like BigDog, its quadruped predecessor, the LS3 is ruggedized for military use, with the ability to operate in hot, cold, wet, and dirty environments.

Specifications
The Legged Squad Support System is to "Go where dismounts go, do what dismounts do, work among dismounts". The machine is to carry 400 pounds (180 kg) of squad equipment, sense and negotiate terrain, maneuver nimbly, and operate quietly when required.

The LS3 is approximately the shape and size of a horse. A stereo vision system, consisting of a pair of stereo cameras mounted into the 'head' of the robot, has been integrated alongside a light detecting and ranging (LIDAR) component in order to enable it to follow a human lead and record intelligence gathered through its camera.

History
The initial contract for the Legged Squad Support System was awarded to Boston Dynamics on December 3, 2009. The continued evolution of the BigDog has led to the development of the LS3, also known as AlphaDog. The contract schedule called for an operational demonstration of two units with troops in 2012.

DARPA, which has continued to support the program, carried out the first outdoor exercise on the latest variation of the LS3 in February 2012, with it successfully demonstrating its full capabilities during a planned hike encompassing tough terrain.

Following its initial success, an 18-month plan was unveiled, which saw DARPA complete the overall development of the system and refine its key capabilities, due to start in summer 2012.

On September 10, 2012, two LS3 prototypes were demonstrated in an outdoor test. One of them had done so earlier in the year. The LS3 prototypes completed trotting and jogging mobility runs, perception visualization demonstrations, and a soldier-bounded autonomy demonstration. They were roughly "10 times quieter" than the original platform. Other improvements included a 1 to 3 mph walk and trot over rough, rocky terrain, an easy transition to a 5 mph jog, and a 7 mph run over flat surfaces. Testing will continue approximately every quarter at military bases across the country.

In early December 2012, the LS3 performed walks through woods in Fort Pickett, Virginia. In these tests a human controller gave the robot orders by voice command. Voice commands are seen as a more efficient way of controlling the LS3, because a soldier would otherwise be too preoccupied with a joystick and computer screens to focus on a mission. There are currently ten commands that the system can understand. Saying "engine on" activates it, and saying "follow tight" makes it walk on the same path as the controller. Saying "follow corridor" makes the LS3 generate the path most efficient for itself to follow the human operator. Others include basic orders like "stop" and "engine off." Work continues on making the LS3 more mobile, for example traversing a deep snow-covered hill or avoiding gunfire and bombs on the battlefield. DARPA intends to supply a Marine company with an LS3 by 2014.
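
With such a small, fixed vocabulary, the command handling itself can be very simple. The sketch below is a hypothetical Python dispatcher that maps the recognized phrases listed above to robot behaviours; the handler method names are placeholders, not part of any real LS3 interface.

# Hypothetical sketch of dispatching the small fixed command vocabulary
# mentioned above ("engine on", "follow tight", "follow corridor", "stop",
# "engine off") to robot behaviours. The handler methods are placeholders,
# not part of any real LS3 interface.

def dispatch_command(robot, recognized_text):
    handlers = {
        "engine on": robot.start_engine,
        "engine off": robot.stop_engine,
        "stop": robot.halt,
        "follow tight": robot.follow_leader_exact_path,
        "follow corridor": robot.follow_leader_own_path,
    }
    action = handlers.get(recognized_text.strip().lower())
    if action is None:
        return "command not understood"
    action()
    return f"executing: {recognized_text}"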

From 7-10 October 2013, the LS3 took part in testing, along with other systems, at Fort Benning, Georgia as part of the U.S. Army's Squad Multipurpose Equipment Transport (S-MET) program. The program objective is to find an unmanned robotic platform to transport soldier equipment and charge batteries for their electronic gear. Requirements for the vehicle are to carry 1,000 lb (450 kg) of gear, equal to the amount a nine-man infantry squad would need on a 72-hour mission. Cubic volume is seen as more of a problem for load-carrying unmanned vehicles, as their center of gravity changes when more gear has to be stacked. It has to travel 4 km/h (2.5 mph) for eight-hour marches and speed up in bursts of up to 38 km/h (24 mph) for 200 meters. The proposed S-MET vehicle needs to traverse slopes of up to 30 percent going forward and backward, and descend slopes of up to 60 percent.

The LS3 was used by Marines in July 2014 during Exercise RIMPAC 2014. After five years of development, the system had reached a level of maturity for it to operate with Marines from the 3rd Battalion, 3rd Marine Regiment in a realistic combat exercise. One company nicknamed the machine "Cujo" and used it to resupply various platoons in places difficult to reach by all-terrain vehicles. Operators were surprised at the level of stability and reliability it had walking; although it was able to traverse 70-80 percent of terrain, it did have problems negotiating obliques and contours of hills. When it did fall over, the system was able to right itself most of the time, and even when it needed assistance it only required one person because it is designed to be easily rolled upright. Controls, like joysticks, are similar to those for video games, making them simple to learn. Due to loud noise during movement and difficulty traversing certain terrains, the LS3 was used as a logistical tool rather than a tactical one. Further development is continuing on creating more space for equipment, including heavy weapons.

Cheetah Robot



Do you think you can outrun a robot? If you do, then you need to take a look at this creation from Boston Dynamics. Dubbed the Cheetah robot, it has good reason to bear the title, because it is the fastest legged robot ever created. Although it cannot reach the 100 km/h of the African animal it is named after, the robot can outrun even the fastest human sprinter, at a speed of around 45 km/h. And it can keep that up for a sustained period thanks to an efficient built-in hydraulic pump.

In a leap for robot development, the MIT researchers who built a robotic cheetah have now trained it to see and jump over hurdles as it runs, making this the first four-legged robot to run and jump over obstacles autonomously.
To get a running jump, the robot plans out its path, much like a human runner: As it detects an approaching obstacle, it estimates that object’s height and distance. The robot gauges the best position from which to jump, and adjusts its stride to land just short of the obstacle, before exerting enough force to push up and over. Based on the obstacle’s height, the robot then applies a certain amount of force to land safely, before resuming its initial pace.
In experiments on a treadmill and an indoor track, the cheetah robot successfully cleared obstacles up to 18 inches tall, more than half of the robot’s own height, while maintaining an average running speed of 5 miles per hour.
“A running jump is a truly dynamic behavior,” says Sangbae Kim, an assistant professor of mechanical engineering at MIT. “You have to manage balance and energy, and be able to handle impact after landing. Our robot is specifically designed for those highly dynamic behaviors.”
Kim and his colleagues, including research scientist Hae Won Park and postdoc Patrick Wensing, will demonstrate their cheetah’s running jump at the DARPA Robotics Challenge in June, and will present a paper detailing the autonomous system in July at the conference Robotics: Science and Systems.

See, run, jump
Last September, the group demonstrated that the robotic cheetah was able to run untethered, a feat that Kim notes the robot performed “blind,” without the use of cameras or other vision systems.
Now, the robot can “see,” with the use of onboard LIDAR, a visual system that uses reflections from a laser to map terrain. The team developed a three-part algorithm to plan out the robot’s path, based on LIDAR data. Both the vision and path-planning systems are onboard the robot, giving it complete autonomous control.
The algorithm’s first component enables the robot to detect an obstacle and estimate its size and distance. The researchers devised a formula to simplify a visual scene, representing the ground as a straight line, and any obstacles as deviations from that line. With this formula, the robot can estimate an obstacle’s height and distance from itself.
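
The sketch below illustrates that "ground as a line, obstacles as deviations" idea on a 2D LIDAR profile, fitting a straight line through the scan and reporting the nearest point that rises above a threshold. It is not the MIT team's code, and the threshold is an assumed value.

# Illustrative sketch (not the MIT team's code) of the detection idea described
# above: treat the ground as a straight line and report deviations from it as
# obstacles, with an estimated height and distance.
import numpy as np

DEVIATION_THRESHOLD_M = 0.05   # assumed: smaller bumps are ignored

def detect_obstacle(scan_x, scan_z):
    """scan_x: forward distances of LIDAR returns; scan_z: their heights.
    Returns (distance, height) of the nearest obstacle, or None."""
    x = np.asarray(scan_x, dtype=float)
    z = np.asarray(scan_z, dtype=float)
    # Fit a straight line z = a*x + b through the scan to model the ground
    # (a robust fit would exclude obstacle points; kept simple here).
    a, b = np.polyfit(x, z, 1)
    deviation = z - (a * x + b)
    above = deviation > DEVIATION_THRESHOLD_M
    if not above.any():
        return None
    idx = np.where(above)[0]
    distance = float(x[idx].min())           # nearest point sticking up
    height = float(deviation[idx].max())     # tallest deviation from the ground
    return distance, height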
Once the robot has detected an obstacle, the second component of the algorithm kicks in, allowing the robot to adjust its approach while nearing the obstacle. Based on the obstacle’s distance, the algorithm predicts the best position from which to jump in order to safely clear it, then backtracks from there to space out the robot’s remaining strides, speeding up or slowing down in order to reach the optimal jumping-off point.
This “approach adjustment algorithm” runs on the fly, optimizing the robot’s stride with every step. The optimization process takes about 100 milliseconds to complete, about half the time of a single stride.
When the robot reaches the jumping-off point, the third component of the algorithm takes over to determine its jumping trajectory. Based on an obstacle’s height, and the robot’s speed, the researchers came up with a formula to determine the amount of force the robot’s electric motors should exert to safely launch the robot over the obstacle. The formula essentially cranks up the force applied in the robot’s normal bounding gait, which Kim notes is essentially “sequential executions of small jumps.”
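
The ballistic reasoning behind that force calculation can be sketched as follows: compute the take-off velocity needed to reach the obstacle height plus a margin, then the average force needed to produce that velocity over the stance time, on top of supporting the robot's weight. The mass, stance time, and clearance below are assumptions for illustration, not the researchers' actual numbers.

# Hedged sketch of the ballistic reasoning behind the jump, not the
# researchers' actual formula: find the take-off velocity needed to clear the
# obstacle, then the average extra leg force needed to reach that velocity
# during the stance phase. All numbers are assumed for illustration.
import math

G = 9.81                 # gravity, m/s^2
ROBOT_MASS_KG = 30.0     # assumed robot mass
STANCE_TIME_S = 0.1      # assumed time the feet push against the ground
CLEARANCE_M = 0.10       # assumed safety margin above the obstacle

def launch_force(obstacle_height_m):
    """Average vertical force to apply during stance to clear the obstacle."""
    apex = obstacle_height_m + CLEARANCE_M
    v_takeoff = math.sqrt(2 * G * apex)            # v = sqrt(2*g*h)
    # Impulse needed is m*v; spread it over the stance time, plus weight support.
    return ROBOT_MASS_KG * v_takeoff / STANCE_TIME_S + ROBOT_MASS_KG * G

print(f"{launch_force(0.45):.0f} N to clear an 18-inch (0.45 m) hurdle")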
Optimal is best, feasible is better
Interestingly, Kim says the algorithm does not provide an optimal jumping control, but rather, only a feasible one.
“If you want to optimize for, say, energy efficiency, you would want the robot to barely clear the obstacle, but that’s dangerous, and finding a truly optimal solution would take a lot of computing time,” Kim says. “In running, we don’t want to spend a lot of time to find a better solution. We just want one that’s feasible.”
Sometimes, that means the robot may jump much higher than it needs to, and that’s OK, according to Kim: “We’re too obsessed with optimal solutions. This is one example where you just have to be good enough, because you’re running, and have to make a decision very quickly.”
The team tested the MIT cheetah’s jumping ability first on a treadmill, then on a track. On the treadmill, the robot ran tethered in place, as researchers placed obstacles of varying heights on the belt. As the treadmill itself was only about 4 meters long, the robot, running in the middle, only had 1 meter in which to detect the obstacle and plan out its jump. After multiple runs, the robot successfully cleared about 70 percent of the hurdles.
In comparison, tests on an indoor track proved much easier, as the robot had more space and time in which to see, approach, and clear obstacles. In these runs, the robot successfully cleared about 90 percent of obstacles.
Kim is now working on getting the MIT cheetah to jump over hurdles while running on softer terrain, like a grassy field.
This research was funded in part by the Defense Advanced Research Projects Agency.

Robonaut 2



NASA is training a humanoid space robot to pull double duty as an emergency doctor in space — a surrogate physician that could one day be controlled by experts on Earth to help sick or injured astronauts.

The $2.5 million Robonaut 2, nicknamed R2, is designed to work alongside the astronauts and even take over some of their more tedious duties inside and outside the International Space Station. The new NASA training is adding telemedicine skills to that mix.

In a new video of Robonaut 2's telemedicine training, the automaton performed an ultrasound scan on a mannequin and even used a syringe like it would to administer a real-life injection. The tests were performed using a ground-based version of R2 robot, the mechanical twin of the one currently aboard the space station.

"I would say that within an hour I trained him more than with other students I'm working for a week, so I think that he's learning really fast," Dr. Zsolt Garami, of the Houston Methodist Research Institute, says in the video.

Far from earthly hospitals, astronauts who currently live on the space station, typically in six-month-long stints, must be trained in basic surgery and medical procedures in case of an emergency. But Robonaut 2, which has a camera-equipped head, could administer care to spaceflyers while controlled by doctors on the ground.

So far, tests with Robonaut 2 have shown that human controllers can perform tasks "correctly and efficiently by using R2's dexterity to apply the appropriate level of force and can track their progress using R2's vision system," NASA officials explained in a video description. Garami said the robot might eventually be able to learn to do some tasks on its own.

Robonaut 2's telemedical skills could be useful on Earth, too, allowing doctors to conduct complex medical procedures on humans in remote locations, according to NASA.

The space-bound Robonaut 2 launched to the International Space Station as just a torso with arms in 2011 during the final flight of the space shuttle Discovery. That Robonaut 2 will get legs, a set of high-tech limbs with seven joints each, sometime later this year, NASA officials have said.

Finally, after 15 years of preparation, the first human-like robot was set to be blasted into space on Monday, November 1, 2010.

Riding along with the astronauts aboard the space shuttle Discovery, Robonaut 2, as the robot is named, will join the astronauts and cosmonauts who spend months at a time aboard the International Space Station.

Robonaut 2 is the successor to the first-generation Robonaut, which was developed by NASA with a variety of partners, including DARPA. Robonaut 2 itself has been developed by NASA together with General Motors since 2007. "This will be the first humanoid robot in space," NASA said.

Upon arriving at the space station, the robot, valued at US$2.5 million, will be tested to confirm its ability to do various things in zero-gravity conditions. The robot will help with tasks that are risky for humans.

"The challenge that we received when starting the project Robonaut is to build something that has the ability to accomplish tasks deft as it is done by humans," said Rob Ambrose, Acting Chief of the Automation, Robotics, and Simulation Division, Johnson Space Center, NASA, such as MSNBC quoted from the site.

From the beginning, the robot was designed to complete the various tasks entrusted to it safely, working side by side with humans. For that reason, all of the metal in its body is covered with soft materials.

It has fingers and palms similar to a human's, so it can grip or hold an envelope the way a person would. The robot's sensors are also programmed to keep it as safe as possible.

When the robot senses an unexpected object while working, for example if it bumps into the head of a human astronaut, it is programmed to stop its movement immediately. If it feels pressure or an impact of sufficiently large force, it will immediately shut itself down.

In the early days of testing, the robot will be controlled via a computer-like console and given light duties that are tedious for humans, such as cleaning handrails or air filters.

Next, the robot will be given increasingly difficult tasks, including maneuvering during spacewalks outside the space station and carrying out technical repairs.

"This project demonstrates the expectations of future generations of robots both in space and on earth. Not as a substitute for a human, but as a companion who can help a variety of activities, "said NASA.

After that, NASA has no intention of bringing Robonaut 2 back to Earth. In other words, for the robot, Monday's launch will be a one-way ticket to its 'home' in space.