Automatic Recognition System for Newborn Cries
Coordinated by: Corneliu Burileanu
Other persons involved in the project:
Mircea Sorin Rusu (Softwin)
The SPLANN project aims to design and develop an automatic infant cry recognition system, linking neonatal knowledge with signal processing and pattern recognition methods. The goal is to obtain patent-protected technologies with a high degree of future applicability in health care, child care and computer science, and with real chances of being successfully exploited on the market.
The SPLANN consortium consists of:
- a company with experience in software development, IT solutions implementation, biometrics research and R&D project management - SOFTWIN (Coordinator);
- a faculty with expertise in signal processing and speech recognition - the Faculty of Electronics, Telecommunications and Information Technology at the University “Politehnica” of Bucharest (Partner 1);
- a hospital with a high-standard maternity ward, a neonatal intensive care unit and a long tradition of scientific research - the Emergency Clinical Hospital “Sf. Pantelimon” (Partner 2).
Neonatology research highlights the importance of distinguishing among infants’ needs. Consistent, prompt recognition of an infant’s needs ensures proper care, adequate nutrition, increased sleep quality and improved parent-infant interaction, while relieving stress for parents and helping caregivers (at home or in hospital environments). The project aims to improve the quality of child care by providing a means of recognizing the infant’s state.
Infant cry research is a broad field with many challenges, including cry analysis, cry detection, cry pattern construction and recognition, and the association of cries with moods, needs and pathologies.
SPLANN aims to contribute to the progress of this field with:
- a large database of accurately labelled infant cries;
- an in-depth study of infant cry meanings, carried out by neonatology and signal processing professionals;
- a novel method of assessing an infant’s needs based on its cry.
To analyze the acoustic cry signal, previous techniques use frequency and amplitude information. They extract cry features through statistical operations (mean, standard deviation, skewness, etc.) or algorithms such as Mel-Frequency Cepstral Coefficients (MFCC). Pattern recognition methods include genetic algorithms, neural networks, Hidden Markov Models and Support Vector Machines.
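As a minimal illustration of the statistical feature extraction mentioned above (not the project's own implementation), the sketch below splits a signal into fixed-length frames and computes the mean, standard deviation and skewness of each frame; the frame length of 160 samples is an arbitrary illustrative choice.

```python
import math

def frame_features(samples, frame_len=160):
    """Split a 1-D signal into fixed-length frames and compute, per frame,
    the statistical descriptors mentioned above: mean, standard deviation
    and skewness. Returns a list of (mean, std, skewness) tuples."""
    feats = []
    for start in range(0, len(samples) - frame_len + 1, frame_len):
        frame = samples[start:start + frame_len]
        n = len(frame)
        mean = sum(frame) / n
        var = sum((x - mean) ** 2 for x in frame) / n
        std = math.sqrt(var)
        # Skewness: third standardized moment (0 for a symmetric frame).
        skew = (sum((x - mean) ** 3 for x in frame) / n) / std ** 3 if std else 0.0
        feats.append((mean, std, skew))
    return feats
```

A real system would typically apply windowing and overlap between frames, and would add spectral descriptors such as MFCCs alongside these time-domain statistics.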
The SPLANN project will use SOFTWIN’s proprietary algorithms for signal processing, feature extraction and pattern recognition. These algorithms take an innovative approach, constructing feature vectors from the shape of the signal. Classification uses similarity analysis methods such as the Levenshtein distance and the Dynamic Time Warping (DTW) algorithm. With Partner 1’s expertise, further innovative algorithms will be developed during the project.
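To make the similarity analysis concrete, here is a textbook dynamic-programming implementation of DTW between two one-dimensional feature sequences; it is a generic sketch of the standard algorithm, not SOFTWIN's proprietary variant.

```python
def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D sequences.
    Finds the minimum cumulative |a[i] - b[j]| cost over all
    monotonic alignments of the two sequences."""
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = cheapest alignment of a[:i] with b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # stretch a
                                 cost[i][j - 1],      # stretch b
                                 cost[i - 1][j - 1])  # advance both
    return cost[n][m]
```

Because DTW allows local stretching of the time axis, two cries with the same melodic contour but different durations can still match closely, which is why such elastic measures suit cry pattern comparison better than a rigid point-by-point distance.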
Current market competition for infant cry recognition exists only at the international level (e.g., the Cry Translator and WhyCry products).
The bottlenecks identified in the state of the art include low noise tolerance, high response times and inconsistent cry interpretation. Recognition rates are low or are not supported by rigorous clinical trials on large cry databases.
SPLANN aims to eliminate these bottlenecks and provide better products, offering methods with:
- a high recognition rate (minimum 95%), supported by clinical tests on an extended, accurately labelled cry database;
- a high level of consistency in cry interpretation;
- a low response time (maximum 3-5 seconds);
- high noise tolerance.
An API (Application Programming Interface) will incorporate the resulting libraries for acquisition, signal processing and pattern recognition. The API will ease the integration of the infant cry recognition methods into applications developed for different platforms or into stand-alone devices. Such applications can run on computers or smartphones, paired with wireless or internal microphones placed near the infant to capture its sounds. When the infant cries, after brief processing, the application shows the infant’s state/need, along with other information.
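A hypothetical sketch of how such an API could be shaped is given below; the names (`CryRecognizer`, `analyze`, the model path) are illustrative assumptions, not the actual SPLANN interface, and the pipeline stages are only stubbed out as comments.

```python
class CryRecognizer:
    """Illustrative API surface for a cry recognition library:
    load a trained model once, then analyze short audio buffers."""

    def __init__(self, model_path):
        # In a real library this would load pretrained cry patterns.
        self.model_path = model_path

    def analyze(self, samples, sample_rate):
        """Return a (label, confidence) pair for a raw PCM buffer.
        A real implementation would run the three stages described
        above: acquisition, signal processing / feature extraction,
        and pattern matching against known cry classes."""
        # Placeholder result standing in for the recognition output.
        return ("unknown", 0.0)
```

An application would then call something like `CryRecognizer("cry_patterns.bin").analyze(buffer, 16000)` each time the microphone detects a cry, and display the returned state/need to the caregiver.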