What if, before leaving on vacation, we took a trip into the future by reading an article that proposes ways to help responders better manage a CBRN event?
What is it about?
What are the options?
Increasing physical capabilities (human-machine symbiosis, increased strength);
Increasing cognitive capabilities (assistance systems, improved situational awareness);
Increasing human sensory capabilities (vision, hearing, touch, taste and smell).
All these avenues have been studied and summarized in the article "Augmented CBRNE Responder – Directions for Future Research", Regal G. et al., International Conference Proceeding Series (ICPS), AH2022, May 26–27, 2022.
CBRN responders work in difficult and dangerous conditions and perform very complex tasks combining detection, identification, decontamination and demining, as well as communication with other units, civilians, etc. Whether in civil situations (accidents, terrorism…) or military ones, the responder must quickly identify the hazardous substance and assess the situation as a whole in order to best manage the steps needed to resolve the problem.
So what are the opportunities to augment human capabilities in the area of chemical, biological, radiological, nuclear and explosive (CBRNE) events?
The use of unmanned robots, on the ground or in the air, to access dangerous and/or contaminated areas began as simple remote control. It is now evolving toward genuine collaboration between humans and semi-autonomous robots: this is the case, for example, of the ATEROS system, in which autonomous robots explore dangerous and contaminated areas and provide geo-referenced information on a CBRN incident, such as radiation intensity, to the teleoperator through an augmented-reality (AR) head-up display (HUD).
For firefighters and other first responders, we are moving toward drones capable of locating and assisting people in danger beyond the operators' field of vision, creating a true human-machine symbiosis. Such drones can also help transport heavy equipment and improve real-time situational awareness through mobile sensors.
In recent years, we have seen a shift from remote/teleoperated systems to autonomous systems where robots have moved from being a tool to being a member of the response team, filling a dedicated team role.
3D mapping is also under development; the open question is how to deliver such maps to CBRN responders through real-time visualization.
Exoskeletons can be passive (for example, using springs or elastic bands) or active (wearable robotics, hydraulics, etc.). Currently, most exoskeletons are used in the industrial field. The harsh environment and wide range of tasks currently preclude wider use of exoskeletons in firefighting but studies are underway.
Assistance to medical personnel through an (online) e-health system has also been proposed for CBRN responders.
To support human cognitive abilities, decision support through Artificial Intelligence (AI) systems is promising: what data is needed, how is it captured (by humans, drones, autonomous ground vehicles…), what information can be derived from it, and how should it be provided to the stakeholders? All of these questions are currently under study.
Research is focused on the coupled use of virtual guidance and augmented reality.
In the field of detection and identification, there is a strong reliance on optical support and step-by-step guidance of sampling probes.
For decontamination, optical guidance and visualization of traces on a decontaminated object could facilitate the process.
AI-based CBRN assistance systems and software could be used to automatically define mission planning, such as how long people can stay in contaminated areas, how often to change teams, etc.
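The stay-time and rotation questions mentioned above reduce to simple dose arithmetic. The following is a minimal illustrative sketch of our own (not code from the article): it assumes a radiological scenario with a constant measured dose rate and a per-responder dose budget, both hypothetical parameters.

```python
import math

# Illustrative sketch (not from the article): how an AI-assisted planner
# might compute the maximum stay time in a contaminated zone from a
# measured dose rate and a per-responder dose budget, then derive the
# number of team rotations a task requires.

def max_stay_minutes(dose_rate_msv_per_h: float, dose_budget_msv: float) -> float:
    """Maximum stay time in minutes before the dose budget is reached."""
    if dose_rate_msv_per_h <= 0:
        raise ValueError("dose rate must be positive")
    return 60.0 * dose_budget_msv / dose_rate_msv_per_h

def rotation_schedule(total_task_minutes: float, stay_minutes: float) -> int:
    """Number of team rotations needed to cover the task duration."""
    return math.ceil(total_task_minutes / stay_minutes)

# Example: a 2 mSv/h field and a 1 mSv budget allow 30 minutes per
# responder; a 90-minute task then needs 3 team rotations.
print(max_stay_minutes(2.0, 1.0))   # 30.0
print(rotation_schedule(90, 30.0))  # 3
```

A real planner would of course fold in spatially varying dose rates, protective-equipment limits and fatigue, but the core budget logic stays the same.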
Improved situational awareness
Robotic systems are being developed that can assist first responders' situational awareness by providing a more coherent picture of an event site through sensors. Contextual geo-information systems, together with data such as wind direction or short-term weather forecasts, can provide valuable information about the environment.
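As a small sketch of how wind direction could be folded into such a picture, the hypothetical snippet below (our own illustration, not the article's system) flags which geo-referenced sensor stations lie in a cone downwind of a suspected release point.

```python
import math

# Hedged illustration: flag geo-referenced readings downwind of a release
# point, one way a contextual geo-information system could use wind data.

def is_downwind(release, point, wind_dir_deg, half_angle_deg=30.0):
    """True if `point` lies within a cone downwind of `release`.

    wind_dir_deg is the direction the wind blows TOWARD, in degrees
    clockwise from north. Coordinates are flat (east_m, north_m) tuples.
    """
    dx, dy = point[0] - release[0], point[1] - release[1]
    bearing = math.degrees(math.atan2(dx, dy)) % 360  # bearing release -> point
    diff = abs((bearing - wind_dir_deg + 180) % 360 - 180)  # smallest angle
    return diff <= half_angle_deg

release = (0.0, 0.0)
readings = {"north_station": (0.0, 500.0), "east_station": (500.0, 0.0)}
wind_toward = 0.0  # wind blowing toward north
flags = {name: is_downwind(release, p, wind_toward) for name, p in readings.items()}
print(flags)  # {'north_station': True, 'east_station': False}
```

Overlaying such flags on a map is one simple way sensor data and meteorology can be merged into a single operational picture.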
The human senses
Increased vision, hearing and touch
The integration of augmented reality with head-up displays in protective equipment is a promising avenue. Tests for the piloting of drones controlled simply by the operator’s gaze are currently being studied. We can even add a 3D vision. A 3D reconstruction of a building, coupled with a thermal camera, can be very useful to firefighters in case of fire. In previous works, an acoustic augmented reality guidance system has been developed to intuitively guide blind and visually impaired people by using 3D virtual sounds as landmarks. This system could be adapted to the movement of first responders. A vibrotactile glove has been proposed that provides feedback on the presence and distance of obstacles in environments where vision is reduced. Tactile vests that provide feedback on the position and distance of team members have also been tested.
This auditory and tactile information could take over from visual information, which can sometimes hinder a soldier engaged in combat.
Increased taste and smell
In the past, CBRN responders have used their sense of smell to detect substances with characteristic odors, such as mustard gas. But, "most of the time, if you smell something, your mask is defective and then you have a serious problem." So odors must be analyzed and recognized by other means.
The use of electrical stimulation of olfactory receptors by combining stimulating electrodes with sensors, to provide a sensation of “smelling” the substances, could be an option in the future but it still exists only in “concept” form.
With the “Sensabulle” system, an alert can be given through bubbles that release a smell when they burst.
In summary, the avenues identified are:
Increasing physical capabilities through collaboration between robots and drones: human-machine symbiosis;
Increasing physical strength by adapting existing exoskeletons for use in harsh environments;
Increasing cognitive abilities through e-assistance systems;
Augmenting the senses of vision, hearing and touch: using them for feedback on environmental factors can improve analysis of the situation.