Dual-arm robot could recreate touch in artificial limbs
Published 3 months ago by News Editor

An innovative bimanual robot developed by researchers in the UK demonstrates tactile sensitivity close to human-level dexterity, using AI to inform its actions.
The new Bi-Touch system, designed by researchers at the University of Bristol and based at the Bristol Robotics Laboratory, allows robots to carry out manual tasks by sensing what to do from a digital helper.
The research shows how an AI agent can interpret its environment through tactile and proprioceptive feedback and then control the robot’s behaviour, enabling precise sensing, gentle interaction and effective object manipulation to accomplish robotic tasks.
This development could revolutionise industries such as fruit picking and domestic service, and could eventually recreate touch in artificial limbs.
Lead author Yijiong Lin from the Faculty of Engineering, said: “With our Bi-Touch system, we can easily train AI agents in a virtual world within a couple of hours to achieve bimanual tasks that are tailored towards the touch.
“And more importantly, we can directly apply these agents from the virtual world to the real world without further training.
“The tactile bimanual agent can solve tasks even under unexpected perturbations and manipulate delicate objects in a gentle way.”
Bimanual manipulation with tactile feedback is key to human-level robot dexterity.
However, the topic is less explored than single-arm settings, partly due to the limited availability of suitable hardware and the complexity of designing effective controllers for tasks with relatively large state-action spaces.
The researchers were able to develop a tactile dual-arm robotic system using recent advances in AI and robotic tactile sensing.
The team built up a virtual world (simulation) that contained two robot arms equipped with tactile sensors.
The researchers then designed reward functions and a goal-update mechanism to encourage the robot agents to learn the bimanual tasks. They also developed a real-world tactile dual-arm robot system to which the trained agents could be applied directly.
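The article does not spell out the exact reward terms or goal-update rule. As a rough Python sketch, a reward function and goal-update mechanism for a bimanual lifting task might look something like the following; the specific terms, weights and thresholds are illustrative assumptions, not the ones used in Bi-Touch:

```python
# Hypothetical sketch of a reward function and goal-update mechanism for a
# bimanual lifting task. All terms and constants are illustrative assumptions.

def reward(object_height, goal_height, left_contact, right_contact, contact_force):
    """Reward shaping: reach the goal height while keeping both tactile
    sensors in gentle contact with the object."""
    height_error = abs(goal_height - object_height)
    r_goal = -height_error                                          # move the object towards the goal
    r_contact = 1.0 if (left_contact and right_contact) else -1.0   # keep both arms in contact
    r_gentle = -0.1 * max(0.0, contact_force - 5.0)                 # penalise squeezing too hard (N)
    return r_goal + r_contact + r_gentle

def update_goal(goal_height, object_height, step_size=0.005, tolerance=0.002):
    """Goal-update mechanism: once the current goal is reached, raise it
    slightly so the agent keeps lifting."""
    if abs(goal_height - object_height) < tolerance:
        return goal_height + step_size
    return goal_height

# Example: the object has reached the goal, so the goal is nudged upwards.
goal = update_goal(goal_height=0.10, object_height=0.099)
print(goal, reward(0.099, goal, True, True, contact_force=3.0))
```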
The robot learns bimanual skills through Deep Reinforcement Learning (Deep-RL), one of the most advanced techniques in the field of robot learning.
Deep-RL teaches robots to do things by letting them learn from trial and error, akin to training a dog with rewards and punishments.
For robotic manipulation, the robot learns to make decisions by attempting various behaviours to do designated tasks, for example, lifting up objects without dropping or breaking them.
When the robot succeeds, it gets a reward, and when it fails, it learns what not to do.
With time, the robot figures out the best ways to grab things using these rewards and punishments.
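To make this trial-and-error loop concrete, the toy Python sketch below uses tabular Q-learning on a made-up "how hard to grip" problem. The real Bi-Touch system uses deep networks and tactile images; this example only illustrates the reward-and-punishment mechanic, and the actions and rewards are invented for illustration:

```python
import random

# Toy illustration of learning from rewards and punishments (tabular Q-learning).
ACTIONS = ["grip_soft", "grip_medium", "grip_hard"]
q_values = {a: 0.0 for a in ACTIONS}

def toy_reward(action):
    # Reward a gentle but secure grasp; punish dropping or crushing the object.
    if action == "grip_medium":
        return 1.0    # object lifted without damage
    if action == "grip_soft":
        return -1.0   # object slips and is dropped
    return -2.0       # object is crushed

alpha, epsilon = 0.1, 0.2   # learning rate and exploration rate
for episode in range(500):
    # Explore occasionally, otherwise exploit the best-known action.
    if random.random() < epsilon:
        action = random.choice(ACTIONS)
    else:
        action = max(q_values, key=q_values.get)
    r = toy_reward(action)
    # Move the value estimate for this action towards the observed reward.
    q_values[action] += alpha * (r - q_values[action])

print(q_values)  # grip_medium ends up with the highest value
```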
The AI agent is visually blind, relying only on proprioceptive feedback – the body’s ability to sense movement, action and location – and on tactile feedback.
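To picture this vision-free setup, here is a hypothetical sketch of how such an observation might be assembled from joint and tactile readings alone; the field names and sizes are assumptions for illustration, not the paper's exact inputs:

```python
import numpy as np

def build_observation(joint_angles_left, joint_angles_right,
                      tactile_left, tactile_right):
    """Concatenate proprioceptive (joint) and tactile readings into a single
    observation vector for the policy; no camera input is used."""
    return np.concatenate([
        np.asarray(joint_angles_left),     # 6 joint angles, left arm (assumed)
        np.asarray(joint_angles_right),    # 6 joint angles, right arm (assumed)
        np.asarray(tactile_left).ravel(),  # flattened tactile image, left fingertip
        np.asarray(tactile_right).ravel(), # flattened tactile image, right fingertip
    ])

obs = build_observation(np.zeros(6), np.zeros(6),
                        np.zeros((8, 8)), np.zeros((8, 8)))
print(obs.shape)  # (140,)
```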
The researchers successfully enabled the dual-arm robot to safely lift items as fragile as a single Pringle crisp.
Co-author Professor Nathan Lepora added: “Our Bi-Touch system showcases a promising approach with affordable software and hardware for learning bimanual behaviours with touch in simulation, which can be directly applied to the real world.
“Our developed tactile dual-arm robot simulation allows further research on more different tasks as the code will be open-source, which is ideal for developing other downstream tasks.”
Yijiong concluded: “Our Bi-Touch system allows a tactile dual-arm robot to learn solely from simulation, and to achieve various manipulation tasks in a gentle way in the real world.
“And now we can easily train AI agents in a virtual world within a couple of hours to achieve bimanual tasks that are tailored towards the touch.”
Image: Yijiong Lin