The project deals with the development of several technologies whose goal is to monitor and support people with particular needs in an indoor (domestic) environment. These technologies can be considered within the larger framework of Ambient Assisted Living (AAL). The project aims to develop different technologies (including wearable sensors and treatment devices) and to design and realize tools and methodologies that enable efficient communication with a central control system, that process and integrate the acquired data to detect relevant events, and that manage the communication with providers of health-care services and with family and professional caregivers.
The research activities of ISSIA in this project mainly concern the visual observation of the environment and of the people inside it. The Computer Vision and Robotics group is working on a network of sensors (visual, infrared, and 3D cameras) that continuously monitors the people inside the environment of interest. A set of properly placed sensors of various kinds is intended to enrich the information about the observed subject, as well as to widen, in space and time, the coverage that a single sensor could provide. The sensors can be placed on mobile platforms to increase the flexibility of their placement: mobile sensors can be sent to points that provide better coverage and more favorable points of view. Therefore, the research activities also include the development of a semi-autonomous mobile platform that can receive a goal from the central control system and navigate autonomously to reach that specific position in the environment.
Network of fixed and mobile agents
TYPE AND DATE
NATIONAL PROJECT (PON)
STARTING DATE – Jul 1, 2011
ENDING DATE – Apr 30, 2015
TOTAL COST – € 9.636.472,00
TOTAL FUNDING – € 7.260.151,10
COST – € 289.240,00
FUNDING – € 245.854,00
The research topics involved in the project can be summarized in the following list:
Map building: A spatial representation of the environment must be created and shared with the central control system. It must be used to communicate missions to the mobile platforms and to enable their autonomous navigation to the assigned observation points in the environment.
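As a rough illustration of the map-building step, the sketch below maintains a simple occupancy grid, a common spatial representation that could be shared with a central control system. The class and cell-counting scheme are hypothetical simplifications, not the project's actual implementation:

```python
# Minimal occupancy-grid sketch (hypothetical, illustrative only):
# the environment is discretized into cells whose occupancy evidence is
# accumulated from repeated sensor observations.

class OccupancyGrid:
    def __init__(self, width, height):
        self.width, self.height = width, height
        self.hits = [[0] * width for _ in range(height)]    # "occupied" observations
        self.visits = [[0] * width for _ in range(height)]  # total observations

    def update(self, x, y, occupied):
        """Record one observation of cell (x, y)."""
        self.visits[y][x] += 1
        if occupied:
            self.hits[y][x] += 1

    def occupancy(self, x, y):
        """Estimated probability that cell (x, y) is occupied."""
        if self.visits[y][x] == 0:
            return 0.5  # never observed: stay uncertain
        return self.hits[y][x] / self.visits[y][x]


grid = OccupancyGrid(5, 5)
for _ in range(4):
    grid.update(2, 3, occupied=True)   # a wall repeatedly detected
grid.update(1, 1, occupied=False)      # free space observed once

print(grid.occupancy(2, 3))  # 1.0
print(grid.occupancy(1, 1))  # 0.0
print(grid.occupancy(0, 0))  # 0.5 (never observed)
```

A real system would use probabilistic log-odds updates and sensor models, but the idea of a shared, cell-based map that mission goals can be expressed against is the same.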
Autonomous navigation: Each mobile platform must be able to plan a trajectory to reach the assigned goals in the environment. During each mission, the mobile platform must account for any change that dynamically occurs in the environment, and must sense and deal with obstacles and human beings detected along the path that prevent the execution of the planned trajectory.
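The planning-and-replanning loop described above can be sketched with a grid-based shortest-path search. This is an illustrative breadth-first search on a 4-connected grid (the function name and grid encoding are assumptions), not the planner actually used on the platforms:

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search on a 4-connected grid; grid[y][x] == 1 is an
    obstacle. Returns the cell sequence from start to goal, or None."""
    h, w = len(grid), len(grid[0])
    prev = {start: None}          # also serves as the visited set
    queue = deque([start])
    while queue:
        cur = queue.popleft()
        if cur == goal:
            path = []             # walk the predecessor chain back to start
            while cur is not None:
                path.append(cur)
                cur = prev[cur]
            return path[::-1]
        x, y = cur
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < w and 0 <= ny < h and grid[ny][nx] == 0 \
                    and (nx, ny) not in prev:
                prev[(nx, ny)] = cur
                queue.append((nx, ny))
    return None

# Plan on an open 4x4 grid, then replan when an obstacle is sensed
# along the original path (the dynamic-change case described above).
grid = [[0] * 4 for _ in range(4)]
path = plan_path(grid, (0, 0), (3, 0))
grid[0][1] = 1  # obstacle detected at cell x=1, y=0
new_path = plan_path(grid, (0, 0), (3, 0))
```

Here the second call simply re-runs the search with the updated map; a fielded system would use an incremental planner and a local obstacle-avoidance layer, but the replan-on-change pattern is the same.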
Visual observation: Every visual sensor, fixed or mounted on a mobile platform, must be able to process the acquired video sequence to achieve several results: to detect the presence of human beings moving in the observed scene, to track their movements inside the covered field of view, and to process and analyze each movement to identify a finite set of activities of interest. At this level of the system, each sensor acts individually, exploiting, if available, the information extracted by the coordination system and spread through the network of sensors.
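The per-sensor detection step can be illustrated with simple frame differencing on grayscale frames: pixels that change between consecutive frames are grouped into a bounding box around the moving subject. This toy sketch (synthetic 5×5 frames, hypothetical function names) stands in for the real video-processing pipeline:

```python
def detect_motion(prev_frame, frame, threshold=30):
    """Return (x, y) coordinates of pixels whose intensity changed by more
    than `threshold` between two grayscale frames (lists of lists)."""
    moving = []
    for y, (row_prev, row_cur) in enumerate(zip(prev_frame, frame)):
        for x, (p, c) in enumerate(zip(row_prev, row_cur)):
            if abs(c - p) > threshold:
                moving.append((x, y))
    return moving

def bounding_box(pixels):
    """Axis-aligned box around detected pixels: (x_min, y_min, x_max, y_max)."""
    xs = [x for x, _ in pixels]
    ys = [y for _, y in pixels]
    return min(xs), min(ys), max(xs), max(ys)

# Synthetic frames: a bright 2x2 "person" patch appears in the second frame.
prev_frame = [[0] * 5 for _ in range(5)]
frame = [[0] * 5 for _ in range(5)]
for y in (1, 2):
    for x in (2, 3):
        frame[y][x] = 255

pixels = detect_motion(prev_frame, frame)
print(bounding_box(pixels))  # (2, 1, 3, 2)
```

Feeding the box from each frame into a tracker gives the per-sensor trajectory that the activity-recognition stage would then analyze.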
Coordination: The network of sensors must identify and exploit the correlations between the information extracted by each sensor. The system as a whole must be able to re-identify human beings moving between the fields of view of different visual sensors, as well as to integrate the data provided by the simultaneous observations of sensors with partially overlapping fields of view. The final goal is to allow the network to act as a single logical sensor whose final output exploits all the available information to better evaluate the human activities observed inside the environment.
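One standard way to integrate simultaneous observations from overlapping fields of view is inverse-variance weighting: each sensor's position estimate contributes in proportion to its confidence. The sketch below shows this fusion rule for 1-D estimates; it is a textbook technique used here for illustration, not the project's specific coordination algorithm:

```python
def fuse_estimates(estimates):
    """Inverse-variance weighted fusion of (value, variance) estimates of
    the same quantity, e.g. a person's position seen by several cameras
    with overlapping fields of view. Returns (fused_value, fused_variance)."""
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total  # fused variance is below every input variance

# Two cameras observe the same person; the more confident camera
# (lower variance) dominates the fused position.
fused, var = fuse_estimates([(2.0, 0.1), (2.6, 0.4)])
print(fused, var)  # 2.12 0.08
```

The fused variance is smaller than either input, which is exactly the benefit the "single logical sensor" view aims for: the network's combined output is more reliable than any individual observation.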
This video shows several robot functionalities in a real-world domestic-like environment. In particular, a square 5×5 m domestic-like environment is considered, where three stationary cameras and one mobile robot are employed. The aim is to demonstrate that critical tasks, such as people tracking and following, can be successfully performed in a completely distributed manner.
SOURCE OF FUNDING
Centro di progettazione, Design & Tecnologie dei Materiali (CETMA), Italy
National Research Council of Italy, Italy
Cupersafety sas di Sacchetti Saverio &C., Italy
Dida Network Srl, Italy
Agenzia Spaziale per le nuove tecnologie, l’energia e lo sviluppo sostenibile (ENEA), Italy
Isopharma Cosmetics Srl, Italy
Item Oxygen Srl, Italy
Laboratorio dr. Pignatelli Srl, Italy
MATRIX Spa, Italy
Software Engineering Research & Practices Srl, Italy
TecnoMarche – Parco Scientifico e Tecnologico delle Marche, Italy