
Current Trends in Sensors and Methods for Food and Fluid Intake Recognition and Quantification

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Intelligent Sensors".

Deadline for manuscript submissions: 30 April 2024 | Viewed by 3665

Special Issue Editor


Dr. Ernest N. Kamavuako
Guest Editor
Centre for Robotics Research, Department of Engineering, Faculty of Natural and Mathematical Sciences, King’s College London, London, UK
Interests: biological sensors/signals and their applications in prostheses; fluid quantification and monitoring; cardiology

Special Issue Information

Dear Colleagues,

The world’s ageing population will profoundly affect the support ratio, defined as the number of workers per retiree. By 2050, the support ratio (the number of persons aged 20 to 64 divided by the number aged 65 or over) in several developed countries, especially in Europe, is expected to fall below two, making population ageing an emerging global challenge. Owing to this rapidly ageing population, caregivers involved in eldercare face the risk of a greatly increased burden. The European Commission (EC) is therefore taking proactive measures to tackle the future challenges posed by an ageing population by prioritizing initiatives that contribute to building a healthy and active population. Diet and nutrition are critical determinants of healthy ageing, and optimal hydration is an important part of this, as a balance between fluids and electrolytes is necessary for cells to survive and function normally. Addressing this challenge urgently requires validated methods for assessing food and fluid intake to support clinical practice and research on interventions to prevent malnutrition and dehydration. Several sensors and methods have been proposed in the literature to this end.
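To make the threshold above concrete, the support ratio can be written as a simple quotient (notation ours, restating the definition given in this call):

\[
\text{Support ratio} = \frac{\text{number of persons aged 20 to 64}}{\text{number of persons aged 65 or over}}
\]

A value below two therefore means fewer than two people of working age for every person of retirement age.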

This Special Issue aims to capture the latest trends in sensors and methods for the detection, quantification, and monitoring of food and fluid intake. Topics of interest include, but are not limited to, the following: wearable sensors, inertial sensors, sensor fusion, video-based sensors, physiological sensors and signals, surface sensors, smart containers, and swallowing research where food or fluid intake is involved.

Dr. Ernest N. Kamavuako
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • action detection
  • fluid intake monitoring
  • food intake monitoring
  • ageing
  • sensors and signals
  • swallowing
  • drinking and eating

Published Papers (2 papers)


27 pages, 3376 KiB  
Review
Technology to Automatically Record Eating Behavior in Real Life: A Systematic Review
by Haruka Hiraguchi, Paola Perone, Alexander Toet, Guido Camps and Anne-Marie Brouwer
Sensors 2023, 23(18), 7757; https://doi.org/10.3390/s23187757 - 08 Sep 2023
Viewed by 1450
Abstract
To monitor adherence to diets and to design and evaluate nutritional interventions, it is essential to obtain objective knowledge about eating behavior. In most research, measures of eating behavior are based on self-reporting, such as 24-h recalls, food records (food diaries) and food frequency questionnaires. Self-reporting is prone to inaccuracies due to inaccurate and subjective recall and other biases. Recording behavior using nonobtrusive technology in daily life would overcome this. Here, we provide an up-to-date systematic overview encompassing all (close-to) publicly or commercially available technologies to automatically record eating behavior in real-life settings. A total of 1328 studies were screened and, after applying defined inclusion and exclusion criteria, 122 studies were included for in-depth evaluation. Technologies in these studies were categorized by what type of eating behavior they measure and which type of sensor technology they use. In general, we found that relatively simple sensors are often used. Depending on the purpose, these are mainly motion sensors, microphones, weight sensors and photo cameras. While several of these technologies are commercially available, there is still a lack of publicly available algorithms that are needed to process and interpret the resulting data. We argue that future work should focus on developing robust algorithms and validating these technologies in real-life settings. Combining technologies (e.g., prompting individuals for self-reports at sensed, opportune moments) is a promising route toward ecologically valid studies of eating behavior.

31 pages, 2115 KiB  
Review
Vision-Based Methods for Food and Fluid Intake Monitoring: A Literature Review
by Xin Chen and Ernest N. Kamavuako
Sensors 2023, 23(13), 6137; https://doi.org/10.3390/s23136137 - 04 Jul 2023
Cited by 5 | Viewed by 1849
Abstract
Food and fluid intake monitoring are essential for reducing the risk of dehydration, malnutrition, and obesity. The existing research has been preponderantly focused on dietary monitoring, while fluid intake monitoring, on the other hand, is often neglected. Food and fluid intake monitoring can be based on wearable sensors, environmental sensors, smart containers, and the collaborative use of multiple sensors. Vision-based intake monitoring methods have been widely exploited with the development of visual devices and computer vision algorithms. Vision-based methods provide non-intrusive solutions for monitoring. They have shown promising performance in food/beverage recognition and segmentation, human intake action detection and classification, and food volume/fluid amount estimation. However, occlusion, privacy, computational efficiency, and practicality pose significant challenges. This paper reviews the existing work (253 articles) on vision-based intake (food and fluid) monitoring methods to assess the size and scope of the available literature and identify the current challenges and research gaps. This paper uses tables and graphs to depict the patterns of device selection, viewing angle, tasks, algorithms, experimental settings, and performance of the existing monitoring systems.
