Intelligently manipulating objects, especially in collaboration with humans, is still a widely researched topic in robotics. This project aims to create intelligent methods for natural and intuitive human-robot interaction (HRI), with the goal of equipping robots with the intelligence required to understand a given task in such a way as to actively support a human in completing it. The focus will be on natural, intuitive tool handover in a work environment, such as an operating theatre, a factory floor or the extreme environment of a nuclear power plant.
While humans conduct handover tasks amongst each other with great ease, existing robotic systems still fall short in many respects when attempting to carry out this HRI task. The project aims to create new AI-based approaches that instil robots with the capabilities to interactively negotiate object 'affordances' between humans and robots and to dynamically adapt to shifting viewpoints during the handover for reliable HRI.
Explaining the science
The field of human-robot interaction (HRI) has seen a dramatic development over the last decade, with a wide range of researchers moving the field forward considerably. HRI has gained in importance with an observed need for robots that can operate in the vicinity of humans or even come into physical contact with them; application areas include robot-assisted minimally invasive surgery, cooperative robotics in manufacturing (Industry 4.0), remote handling of objects in extreme environments such as those in the nuclear industry, and rehabilitation and care of the disabled and elderly.
Purposefully handling objects comes very naturally to humans – we instinctively understand how to pick up a tool and rearrange it in our hands so that we can use the tool to conduct a specific task. We are also adept at anticipating when and how an object will be received. Even the way in which a hand approaches an object can tell us a lot about how it will be used. The term affordance is used in this context; it describes the physical action possibilities of an object from the perspective of a (prospective) user, e.g. the handle of a cup can be used to easily pick up the cup and drink from it; a screwdriver is held by its handle to turn screws.
Current research in computer vision and robotics has focused on automatically detecting either robot-centric object affordances (i.e. the robot is the prospective user), to underpin intelligent object handling by robotic devices, or human-centric object affordances (i.e. the human is the prospective user), to infer the possible uses of everyday objects. However, the interplay between human-centric and robot-centric affordances of the same object has been largely overlooked. This research takes a different perspective: both the robot-centric and the human-centric models have to be considered when an object is handed over.
This project plans to equip robots with the ability to integrate and interactively negotiate human-centric and robot-centric affordances of objects (and in particular, of tools) and to anticipate (i.e. before the actual handover) and dynamically adapt to shifting perspectives during the handover, so that a natural and intuitive collaboration between robot and human can be achieved. The robot needs to obtain an understanding of the object affordance from the human user's perspective and to make use of the object's complementary or collaboration affordances when picking up the object, so as to facilitate the handover to the human user, who aims to complete a specific task using the object. For example, the robot should grasp a scalpel by the side of the blade so that the surgeon can grasp the handle during handover and perform the required incision.
The work introduces the term ‘coffordances’ to describe these collaboration affordances. Moreover, the project will investigate how different cues (e.g. touch, spoken dialogue, body movements) can be integrated both before and during the handover task to improve the robot's understanding of the situation, and to trigger corrective adjustments.
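As a concrete illustration of this idea, a coffordance can be thought of as a pairing of robot-centric and human-centric grasp regions on the same tool. The following is a minimal sketch, not the project's actual representation; all class and field names are assumptions introduced purely for illustration.

```python
# Illustrative sketch only: a hypothetical, minimal representation of a
# 'coffordance' as paired grasp regions on one tool. Names are assumptions,
# not the project's API.
from dataclasses import dataclass

@dataclass(frozen=True)
class Affordance:
    """A graspable region of a tool and the action that grasp supports."""
    region: str   # named part of the tool, e.g. 'handle', 'blade_side'
    action: str   # action the grasp enables, e.g. 'cut', 'hand_over'

@dataclass(frozen=True)
class Coffordance:
    """Complementary robot-centric and human-centric affordances of a tool."""
    tool: str
    robot: Affordance   # where and why the robot grasps
    human: Affordance   # where and why the human will grasp

def robot_grasp_region(c: Coffordance) -> str:
    """The robot grasps so that the human's intended region stays free."""
    assert c.robot.region != c.human.region, "grasp regions must not conflict"
    return c.robot.region

# Example from the text: the robot holds the scalpel by the side of the
# blade so the surgeon can take the handle and cut straight after handover.
scalpel = Coffordance(
    tool="scalpel",
    robot=Affordance(region="blade_side", action="hand_over"),
    human=Affordance(region="handle", action="cut"),
)
print(robot_grasp_region(scalpel))  # -> blade_side
```

The key design point the sketch captures is that the two affordances are negotiated jointly: a robot grasp is only valid if it leaves the human-centric grasp region unobstructed.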
This project aims to answer a number of pertinent research questions relating to the challenge of natural and intuitive handover of tools in the context of human-robot interaction:
- How can we teach a robot an understanding of the affordances of a tool (the properties that show the possible actions a user can take with a tool), both from the perspective of a human collaborator and from its own perspective, with the aim of facilitating handover ('coffordances')?
- How can a robot learn local, dynamic adaptations of affordances during handover, and what interactive resources do humans (e.g. spoken language, body movement), objects (e.g. features extracted from robot vision and tactile/force sensing) and the environment (e.g. co-location in 3D space) provide for this?
- How can robots be enabled to understand human intentions through both verbal and physical interactions to achieve natural handover from robot to human?
- How do humans adapt their actions to joint, affordance-aware actions with robots?
- Given such a tightly coupled task between human and robot action, how can a robot continuously estimate when its own intentions within the handover activity are communicated to the user, in line with recent work on the legibility of robotic action?
- How do we solve the challenging vision and planning problems of making a robot arm-hand-eye system dynamically adapt its grasping with respect to objects being handed over by a human arm?
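To make the last question more tangible, dynamic grasp adaptation can be framed as a perception-plan-act loop in which the robot re-targets its grasp each time the human-held object is re-observed. The sketch below is a deliberately simplified stand-in: the pose stream and the fixed approach offset are made up for illustration, whereas a real system would use vision and tactile sensing together with full motion planning.

```python
# Illustrative sketch: re-targeting a grasp as a human-held object moves.
# The pose stream and fixed offset are hypothetical stand-ins for real
# perception and grasp planning.
from typing import Iterable, List, Tuple

Vec3 = Tuple[float, float, float]

def retarget_grasp(object_poses: Iterable[Vec3],
                   grasp_offset: Vec3 = (0.0, 0.0, -0.05)) -> List[Vec3]:
    """For each observed object position, compute an updated grasp target
    by applying a fixed approach offset (here: 5 cm below the object)."""
    targets = []
    for (x, y, z) in object_poses:          # one pose per perception cycle
        ox, oy, oz = grasp_offset
        targets.append((x + ox, y + oy, z + oz))  # replanned grasp target
    return targets

# Simulated stream: the human's hand drifts sideways during the handover,
# so the grasp target is continuously updated to track the object.
stream = [(0.40, 0.00, 0.30), (0.42, 0.03, 0.30), (0.45, 0.06, 0.29)]
print(retarget_grasp(stream)[-1])
```

The point of the loop structure is that grasp planning is never a one-shot decision: every new observation of the object (and, by extension, every new cue from the human) can trigger a corrective adjustment before contact is made.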
Assistive technologies for the operating theatre
The operating theatre is commonly a stressful working environment, often requiring clinicians to make quick and, at times, life-saving decisions on how to continue with a surgical procedure. Quick and appropriate reactions from all involved to make the required tools available are essential. Ideally, the transfer of instruments is anticipated by the attending support staff rather than requested through additional commands.
An intelligent robotic device could make a difference here, providing instruments when and where required, while understanding the affordances/coffordances of the instrument to be delivered so that the human surgeon can grasp the instrument and seamlessly conduct the intended procedure. This element of the proposed research programme will exploit the applicants' expertise in robot-assisted minimally invasive surgery, robot-assisted ergonomics and HRI, and will bring on board the expertise of the involved surgeons and of experts in tool handover in surgical environments.
This work will make use of the surgery simulation facilities at QMUL (BARTS Health), providing the opportunity to obtain pilot data in a realistic setting without the need to apply for the ethical approval required for real surgical situations. The work will also involve collaboration with Professor Christian Heath (KCL), an expert in work and interaction. With input from interaction experts and clinicians, grant applications will be prepared in the area of robotics technologies to aid surgery.
Assistive technologies for nuclear waste decommissioning
Despite advancements in robot-assisted remote operation and tele-operation, a large number of work tasks still need to be done manually, e.g. in legacy plants contaminated by alpha-emitting plutonium dust. To clean up the UK’s nuclear legacy, it is estimated that one million entries by human workers into hazardous zones will be needed, wearing plastic suits with air-hoses or respirators and capable of working for only two hours per day. Fine tool manipulation with up to seven layers of protective gloves is extremely hard. For the foreseeable future, humans will be required to enter areas of high ionising radiation to carry out tasks that cannot be done by robotic devices.
This project will investigate new approaches in which robots join the human workers so that the workers can complete the required tasks swiftly and, thus, reduce their time in the potentially harmful radioactive environment. The project will focus on developing assistant robots that are particularly adept at handing over required tools in a way that lets the human user focus on the task at hand.
This element of the work programme is strongly linked to ongoing research within the National Centre for Nuclear Robotics (NCNR), which is led by Birmingham and of which QMUL is a partner. The project will be strongly supported by Prof Rustam Stolkin (Birmingham, Coordinator of NCNR). The intention is to apply for follow-on funds jointly with Birmingham and other suitable NCNR partners – especially Innovate UK and SBRI funds for higher-TRL industrialisation, which increasingly call for practical robotic devices to assist humans working in difficult and hazardous conditions.
Assistive technologies for the manufacturing sector
Extending from work conducted in the framework of the EU project FourByThree, methods will be developed that allow the robot not only to consider the coffordances of an object being passed on, but also to understand the user's need to assume an ergonomically optimised posture at the handover point and during the subsequent tasks. Specifically, one approach will explore how to create robots that assist the worker by re-orientating the workpiece so that the worker’s posture is ergonomically optimal at all times.
The PI has many years of expertise in the area of robot-assisted ergonomics. With work-related injuries due to poor posture on the rise, this element of the work will have a beneficial impact in virtually all manufacturing sectors. In addition to manufacturing assembly, a challenging task is the disassembly of complex multi-material objects for recycling in the circular economy, which currently can only be done by expert humans. This strand of work will be in collaboration with the Faraday Institution project on robotic disassembly and recycling of electric vehicle batteries (Stolkin, Birmingham).