1. Introduction
Physical exercise is an important part of the daily routine: it helps to reduce the risk of many diseases and to improve physical and mental health, as well as quality of life [1]. People are increasingly busy in modern times; it is therefore time to think about appropriate technology for monitoring their activities and caring for the elderly [2]. An intelligent interface between the human body and the monitoring mechanism could solve this type of problem. In recent years, many technologies have been implemented for extracting and collecting physiological data, such as heart rate, blood pressure, and temperature, using various types of sensors before, during, and after physical exercise [3,4,5]; however, these techniques are not adequate for regular monitoring. An intelligent model using image processing techniques is required to measure exercise intensity during physical exercise.
In recent years, many studies have proposed methodologies to extract physiological data and monitor physical exercise. Two approaches are widely used to extract physiological features, whether during exercise or at rest: contact sensor technology and contactless technology. Each approach has its pros and cons, and the two are often used in parallel. In contact sensor techniques, sensors are attached to the body and the results are interpreted using machine learning or statistical approaches. In non-contact techniques, various types of cameras, including infrared and thermal cameras, are used to capture video or images, which are then processed with image processing techniques. Whatever the image extraction technique, it must be followed by computer vision, machine learning, or deep learning techniques to interpret the result. The facial expression in different states of tiredness may depend on a person's age or may differ between male and female users. An intelligent interface should be especially useful for elderly people who cannot control an exercise machine themselves during physical activity. Some do not even want to wear a device on the body; in this situation, a contactless model offers a better way of monitoring.
Exercise monitoring uses various techniques corresponding to the physiological parameters monitored during exercise. Naik [6] presented a review on sports video analysis for detecting players and predicting trajectories for strategy planning. Likewise, Ahad [7] reviewed video-based prediction of various parameters of elderly patients, with all the reviewed articles focused on healthcare. A similar review was presented by Debnath, covering vision-based approaches for physical rehabilitation, and Horak [8] reviewed school students' activity monitoring using computer vision techniques. Although many review articles have been published, to our knowledge none covers only non-contact, video-based techniques for exercise monitoring. This review covers physical exercise monitoring using non-contact techniques. Each published paper is analyzed in terms of technology, dataset, methodology, and results. The experimental subjects include sportspeople, the general population, special patients (where relevant), and the elderly, so a wide range of human subjects is considered. We hope this article will give future researchers valuable guidelines for their further research directions.
2. Data Sources
The original research studies were sourced from IEEE Xplore, ScienceDirect, Web of Science, SpringerLink, PubMed, PsycINFO, the ACM Digital Library, and Human Kinetics journals. The search criteria were: (“Physical exercise” OR “Physical activity” OR Exercise OR “Physiological features” OR Fitness) AND (Monitoring OR Monitor OR measurement) AND (non-invasive OR Contactless OR non-contact OR contact free). The publication years of the research articles were restricted to 2000 to 2022. Original research articles published in journals, conference proceedings, book chapters, and review papers were selected for the review.
Table 1 presents the Inclusion and Exclusion criteria for the article selection.
The methods of the systematic review were developed in accordance with the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. The completed PRISMA-P [9] checklist is available as a supplementary file to this protocol. Article collection and selection were performed in three stages; the article selection process using the PRISMA flow diagram is shown in
Figure 1.
Stage 1: Articles were selected according to the title. Duplicate articles retrieved from the various databases were removed, and articles that were clearly irrelevant based on the title alone were removed from the list.
Stage 2: The abstracts of the articles were reviewed, along with the methodology and results. Articles with irrelevant methodology or poor results were removed, and articles with near-identical methodologies were filtered out.
Stage 3: The articles selected from Stage 2 were read in full and their results were analyzed thoroughly; the extracted findings were then compared across studies.
Taxonomy
The taxonomy has three levels: Analysis Parameters, Target Group, and Intelligent Output (see Figure 2). Level 1 concerns data collection, i.e., how and what type of raw data are collected to monitor physical exercise. Level 2 covers the input and output information for the target group; in particular, how the data collected in Level 1 are processed to obtain meaningful information. Level 3 involves intelligent processing of the collected data to obtain meaningful information; machine learning and deep learning techniques are implemented at this level.
3. Discussion about Articles
In this section, the papers are classified according to the data extracted while performing physical exercise. Sub-sections are defined according to the physiological feature extracted, and some literature review articles related to this topic are included. All the included articles are listed in
Table 2.
3.2. Heart Rate Variability
Heart rate variability (HRV) is the physiological phenomenon of variation in the time interval between heartbeats. It is an important parameter for monitoring intensity during exercise, and many approaches have been proposed to measure HRV using contact and non-contact sensor technologies [16,46,48,95].
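As an illustration (ours, not from any of the reviewed papers), two standard HRV metrics, SDNN and RMSSD, can be computed from a series of beat-to-beat (RR) intervals as follows:

```python
from statistics import pstdev

def hrv_metrics(rr_ms):
    """Compute two standard HRV metrics from RR intervals in milliseconds.

    SDNN:  (population) standard deviation of all RR intervals.
    RMSSD: root mean square of successive RR-interval differences.
    """
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    sdnn = pstdev(rr_ms)
    rmssd = (sum(d * d for d in diffs) / len(diffs)) ** 0.5
    return sdnn, rmssd

# RR intervals around a ~75 bpm heart rate (illustrative values)
sdnn, rmssd = hrv_metrics([800, 810, 790, 820, 805])
```

Whether the RR intervals come from an ECG or from a camera-based pulse signal, the downstream metrics are computed the same way.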
A novel approach was proposed by Hung to measure HRV by continuous monitoring of Pupil Size Variability (PSV) during physical activity; the authors also measured blood pressure variability using the same approach. It was hypothesized that pupil size variability is highly correlated with both physiological parameters, which are also indicators of exercise intensity. Electrocardiogram, respiration effort, finger arterial pressure, and pupil images were recorded from ten subjects before and after five minutes of exercise on a treadmill. The pupil images were captured by a 1/3 Charge-Coupled Device (CCD) camera connected to an 8-bit monochrome video frame-grabber, set to a resolution of 512 × 512 pixels and a capture speed of 2 frames per second. All the signals, including pupil images, were acquired by a data acquisition unit [50] during exercise. The experiment comprised six phases of 5-min recording sessions. The findings suggested that PSV may be a valid indicator of cardiovascular variability.
3.3. Blood Pressure
Blood pressure level can indicate the effort level when an individual is performing exercise, helping to estimate the workload [96]. Blood pressure is also important for detecting patterns that could correspond to cardiac diseases; thus, it is significant for monitoring health outcomes during exercise [97].
3.3.1. Photoplethysmography and Pulse Arrival Time
The Pulse Arrival Time (PAT) is the time between the peak of the electrocardiogram (ECG) and the arrival of the pulse detected by photoplethysmography (PPG). Shirbani [51,52] investigated the correlation between image-based photoplethysmography pulse arrival time (iPPG-PAT) and diastolic blood pressure (DBP) during one minute of seated rest and three minutes of isometric handgrip exercise. Video of the face was recorded using a standard web camera, and estimates were compared to a ground-truth device. It was found that beat-to-beat iPPG-PAT and DBP were negatively correlated.
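To make the PAT computation concrete, here is a minimal sketch (our illustration, not the authors' code) that pairs each ECG R-peak time with the next detected pulse arrival in the PPG signal:

```python
def pulse_arrival_times(r_peaks_s, ppg_feet_s):
    """Pair each ECG R-peak with the first PPG pulse foot that follows it
    and return the beat-to-beat PAT series (seconds).

    r_peaks_s / ppg_feet_s: sorted event times in seconds, as produced by
    any upstream peak detector (the detector itself is out of scope here).
    """
    pats, j = [], 0
    for r in r_peaks_s:
        # advance to the first pulse foot strictly after this R-peak
        while j < len(ppg_feet_s) and ppg_feet_s[j] <= r:
            j += 1
        if j == len(ppg_feet_s):
            break
        pats.append(ppg_feet_s[j] - r)
    return pats
```

The resulting beat-to-beat PAT series is what gets correlated against the DBP measurements.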
3.3.2. Photoplethysmography and Pulse Transmit Time
Pulse Transit Time (PTT) is the time a pulse wave takes to travel between two different arterial points, an important cue for estimating blood pressure. Several studies have examined the correlation between PTT and blood pressure. The authors of [53] introduced a contactless approach to estimate blood pressure using PTT, with seven healthy subjects. Image-based PTT (iPTT) and image-based PPG (iPPG) were recorded using a high-speed camera at 426 Hz during physical exercise on a stationary bicycle. The exercises were carried out at three different stages: rest, peak exercise, and recovery. The study found a strong positive correlation between iPTT and iPPG during exercise, concluding that measuring BP using PTT is reliable. It was shown that skin color changes due to blood pulsation, and such changes could be identified by processing the three color components of facial images.
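As a sketch of the underlying idea (not the implementation in [53]), the transit time between pulse waveforms measured at two body sites can be estimated as the lag that maximizes their cross-correlation:

```python
def transit_time(sig_proximal, sig_distal, fs):
    """Estimate pulse transit time as the lag (in seconds) maximising the
    cross-correlation between two pulse waveforms sampled at fs Hz.

    Brute-force correlation for clarity; real pipelines would use FFT-based
    correlation and sub-sample interpolation.
    """
    n = len(sig_proximal)
    best_lag, best_score = 0, float("-inf")
    for lag in range(n):
        score = sum(sig_proximal[i] * sig_distal[i + lag]
                    for i in range(n - lag))
        if score > best_score:
            best_lag, best_score = lag, score
    return best_lag / fs
```

With a 426 Hz camera as in [53], a single-frame lag already corresponds to about 2.3 ms of transit time, which illustrates why a high frame rate is needed.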
3.4. Body Temperature
Body temperature rises due to physical exertion and relates to other physiological parameters; therefore, it can also be considered when monitoring physical exercise intensity [60]. Thermal imaging is a technology that allows body temperature to be measured without any contact sensors.
Thermal imaging captures images of an object based on the infrared radiation it emits [56,62,86]. Its use in medical and sports environments is widespread [54]. However, body location matters for assessment quality, as thermal emission may differ depending on the body part being analyzed [57]. Ludwig [54] presented a critical comparison between the main methods used to obtain body temperature from images and also proposed an alternative, finding that the temperature obtained within a region of interest (ROI) covering a well-defined area can be considered the most reliable.
James [55] investigated the validity and reliability of skin temperature measurement using a Telemetry Thermistor (TT) system and a thermal camera during exercise in a hot and humid environment. A similar study [56] compared data loggers (skin-adhesive), thermal imaging, and wired electrodes for measuring skin temperature during exercise in a comparable environment. The authors concluded that data loggers and thermal imaging can be used as alternative measures of skin temperature during exercise, especially at higher temperatures and humidity.
Fernandes [98] explored the temperature of several body parts before, during, and after moderate aerobic exercise using infrared thermography, concluding that skin temperature distribution during exercise differs significantly depending on the body part.
The changes in body temperature during speed endurance workouts in highly trained male sprinters were analyzed by Korman [58] using a thermal camera. The aim was to compare body temperature across four phases of the session: pre-exercise, warm-up, sprint-specific drills, and the endurance exercise itself. Significant differences were found between temperatures measured on the athletes' backs and body profiles, as well as significant changes before and after exercise. However, the thermography results were not compared to ground-truth temperature measurements.
3.5. Energy Expenditure
Energy expenditure is highly correlated with exercise intensity and is thus essential to the planning, prescription, and monitoring of physical exercise programs [63,67,69]. Although its estimation is challenging, oxygen uptake has been considered the best direct measure of energy expenditure [64,66]. Indirectly, it can be estimated through heart rate and body acceleration [67], which motivates the development and application of the non-contact technology presented in the following sections.
3.5.1. Thermal Imaging
The introduction of thermal imaging in exercise monitoring also allowed for the development of reliable contactless techniques for energy expenditure measurement. Thermal cameras capture infrared radiation emitted by any object in the mid- or long-wavelength infrared spectrum, depending on the sensor type. The pixel values in the images are converted to temperature values and finally mapped to energy expenditure.
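The pixel-to-temperature conversion is typically a linear mapping over the camera's configured measurement span; a simplified sketch (the span values are illustrative, not from any specific camera):

```python
def pixels_to_celsius(raw, t_min=20.0, t_max=40.0, bit_depth=8):
    """Linearly map raw pixel intensities to temperatures within the
    camera's configured measurement span (t_min..t_max degrees C).

    Real radiometric cameras additionally correct for emissivity and
    ambient reflection; this sketch shows only the linear scaling step.
    """
    full_scale = (1 << bit_depth) - 1
    return [t_min + (p / full_scale) * (t_max - t_min) for p in raw]
```

Once each pixel carries a temperature, region statistics (e.g. mean skin temperature over an ROI) become the features that are mapped to energy expenditure.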
Jensen [63] validated a thermal imaging method to estimate energy expenditure, using oxygen uptake as a reference. Fourteen endurance-trained subjects completed an incremental exercise test on a treadmill. Heart rate, gas exchange, and mean accelerations of the ankle, thigh, wrist, and hip were measured throughout the exercise. A linear correlation was found between the energy expenditure calculated from the optical flow of the thermal images and the oxygen uptake values. Contactless measurement of energy expenditure during exercise was also presented by Gade [64]. The authors used thermal video analysis to automatically extract the cyclic motion pattern of walking and running. The results indicated a linear correlation between the proposed method and oxygen uptake.
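A linear correlation of this kind is typically exploited by fitting a least-squares line that maps the motion feature to the measured energy value; a minimal sketch (variable names are ours, not from the papers):

```python
def fit_linear(x, y):
    """Ordinary least-squares fit y = a*x + b, e.g. mapping a per-window
    thermal optical-flow magnitude (x) to reference energy expenditure
    values derived from oxygen uptake (y)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    return a, my - a * mx
```

After calibration on subjects with gas-exchange ground truth, the fitted line predicts energy expenditure from camera features alone.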
3.5.2. RGB Depth
One notable technology used for image capture is the RGB-Depth camera, which captures objects in three dimensions. Tao [69] presented a framework for vision-based energy expenditure estimation using a depth camera, validated against oxygen uptake measures. The method was found suitable for monitoring in a controlled environment, offering a way of measuring energy expenditure that is robust to pose variation, individual-independent, real-time, and remote.
Deep learning has been considered one of the best tools to estimate, classify, and analyze quantitative data, and its application in sports has been increasing in recent years. The method is suitable for controlled environments, where the system first detects the presence of a human and then tracks the human body. This is followed by CNN-based feature extraction, activity recognition, and, finally, prediction of the calories expended [69].
A fully contactless and automatic method based on computer vision algorithms was presented by Koporec [67]. RGB-Depth images are captured using a Microsoft Kinect during exercise; Histogram of Oriented Optical Flow (HOOF) descriptors are extracted from the depth images and used to predict heart rate, which feeds a regression model that finally estimates energy consumption.
3.6. Respiratory Rate
Breathing is the process of taking air into the body so oxygen can be absorbed, and then expelling carbon dioxide. Physical exertion not only increases the frequency of breathing but also demands the exchange of a higher volume of air. Respiratory rate is directly linked to exertion and, thus, to energy expenditure as well [73,75,76,77,99].
3.6.1. Video-Based Image Processing
The ventilatory threshold is an important variable for investigating physical exertion. It is related to the anaerobic threshold, an event characterized by ventilation increasing at a faster rate than the body can absorb oxygen. In recent decades, many methods have been proposed to measure respiratory rate during exercise using contactless technology. Aoki [73] proposed a technique to measure respiratory rate during pedaling using optical techniques. A dot-matrix optical element was arranged in front of the participant's face, and laser light was emitted and captured by a CCD camera. After low-pass digital filtering, a sinusoidal wave oscillating at the respiratory frequency was obtained. The results showed high correlation with data obtained using a gas analyzer.
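The final step of such pipelines, recovering the respiratory frequency from a filtered periodic signal, can be sketched as a brute-force spectral scan over the typical respiratory band (the band limits and step size below are our assumptions, not Aoki's):

```python
import math

def respiratory_rate_bpm(signal, fs):
    """Estimate respiratory rate (breaths/min) as the dominant frequency
    of a periodic motion signal, scanning the 0.1-1.0 Hz respiratory band
    with a brute-force DFT (an FFT would be used in practice)."""
    n = len(signal)
    mean = sum(signal) / n
    centred = [s - mean for s in signal]
    best_f, best_power = 0.0, -1.0
    f = 0.1
    while f <= 1.0:
        re = sum(c * math.cos(2 * math.pi * f * i / fs)
                 for i, c in enumerate(centred))
        im = sum(c * math.sin(2 * math.pi * f * i / fs)
                 for i, c in enumerate(centred))
        power = re * re + im * im
        if power > best_power:
            best_f, best_power = f, power
        f = round(f + 0.01, 2)
    return best_f * 60
```

For example, a clean 0.25 Hz breathing motion sampled at 10 Hz yields an estimate of 15 breaths per minute.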
3.6.2. RGB Depth
Aoki and colleagues not only proposed and evaluated the use of CCD cameras for respiratory rate measurement but also authored a series of publications exploring new contactless sensors such as the Kinect. In 2015, the use of the Kinect camera for respiratory rate measurement was validated [46]. RGB-depth images were captured and processed to obtain sinusoidal waves during exercise on a stationary cycle ergometer; the frequency of the waves represented the respiratory rate. Later, the authors investigated whether the new method could provide good estimates of the ventilatory threshold (VT). The experimental setup was maintained but applied to an incremental test designed to identify this variable. The authors found that respiratory rate can be measured at increments above 160 W, and ventilatory threshold values can be estimated within ±10 W of the VT calculated by a gas analyzer [72].
3.7. Muscle Fatigue
Fatigue is either a subjective symptom of malaise and aversion to activity, or objectively impaired performance [79]. It can be assessed by self-report scales or performance-based measures [80]. Fatigue can be physical or mental, and both types are important to assess due to their high correlation with health-related parameters [81,82].
Facial expression is effective in assessing physical and mental fatigue [83,85]. Irani [84] proposed an approach to measure fatigue by tracking facial features during exercise. The main hypothesis was that, approaching a fatigued state, the points of interest in the image would vibrate more, which could be identified in a power spectral analysis of the signal. The model was tested in maximal and submaximal dumbbell lifting tests against force measures obtained by a dynamometer. The results showed that the temporal onset of fatigue could be readily identified from the facial points of interest using this method.
Deep learning and thermal imaging were fused to automatically detect exercise-induced fatigue from the face [100]. Different devices captured RGB, near-infrared, and thermal images, while the pre-trained CNNs AlexNet [101] and Visual Geometry Group-16/19 (VGG16/VGG19) [102] were used to classify different facial regions as fatigued or rested. The authors found that AlexNet applied to the region around the mouth gave the best classification of the fatigued state.
3.8. Other Approaches
3.8.1. Muscle Oxygenation
Muscles need an oxygen supply to work; thus, aerobic muscle performance increases muscle oxygenation. This parameter is related to heart rate and blood pressure during exercise. It is also closely related to muscle fatigue and thus might bring important information to research on physical exercise intensity [70].
3.8.2. Facial Expression
The human face is a window for expressing feelings arising from physical or mental conditions. Pain, tiredness, and illness due to exertion are reflected in facial expressions; therefore, monitoring exercise intensity by analyzing facial expression is an interesting idea.
Khanal [44,89] explored various methods for automatic classification of exercise intensity using computer vision techniques, with subjects performing sub-maximal incremental exercise on a cycle ergometer. Facial expression was analyzed by extracting 70 facial feature points, and exercise intensity was classified according to the distances between points and the stage of the incremental exercise. Intensity was classified into two, three, and four classes using k-Nearest Neighbors (kNN), Support Vector Machines (SVM), and discriminant analysis. The results showed that facial expression is a good basis for identifying exercise intensity levels. A regression-based facial color analysis to estimate heart rate at particular instants was also presented by Khanal et al. [89], where an autoregression model predicts heart rate from facial color changes.
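The distance-based classification step can be illustrated with a tiny hand-rolled kNN (the feature vectors and labels below are made up for illustration; the original work used library classifiers on 70-point geometry):

```python
def knn_predict(train_x, train_y, query, k=3):
    """Classify a feature vector (e.g. distances between facial points) by
    majority vote among its k nearest training samples. Squared Euclidean
    distance is sufficient for ranking neighbours."""
    ranked = sorted(
        (sum((a - b) ** 2 for a, b in zip(x, query)), label)
        for x, label in zip(train_x, train_y)
    )
    votes = [label for _, label in ranked[:k]]
    return max(set(votes), key=votes.count)
```

In the reviewed setting, the training labels would be intensity classes (e.g. low/moderate/high) derived from the stage of the incremental protocol.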
A different way of using facial expression to analyze physical effort was presented by Uchida [85]. Facial images were analyzed at different levels of resistance training. The authors evaluated changes in facial expression using the Facial Action Coding System (FACS) and facial muscle activity using surface electromyography. The association between these parameters was mild but statistically significant.
Miles [103] presented an analysis of the reliability of facial feature tracking across incremental exercise on a cycle ergometer. The results differed according to the part of the face analyzed, with higher reliability for the lower face. A non-linear relationship between facial movement and power output was also determined, and facial movement was associated with power output, heart rate, RPE, blood lactate, and positive and negative affect at the corresponding exercise intensities, including the two blood lactate thresholds and maximal aerobic power (MAP). These results show the potential of facial feature tracking as a non-invasive way of obtaining psychophysiological measures to assess exercise intensity.
Still regarding the use of facial features to evaluate exertion, the mouth and eyes are particularly informative, as muscle actions there convey a wide variety of facial expressions and emotions. Tracking the movement of these parts could therefore be a key idea for analyzing exercise intensity [45]. Recently, the eye-blink rate and mouth open-close rate were tracked using the Viola-Jones algorithm for image processing [104] during sub-maximal exercise on a cycle ergometer. The eye-blink rate was identified with 96% accuracy, and the higher the exercise intensity, the greater the eye and mouth movement.
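Once per-frame eye states are available (e.g. from a detector such as Viola-Jones), the blink rate reduces to counting open-to-closed transitions; a sketch of that final step (the detector itself is out of scope):

```python
def blink_rate_per_min(eye_open, fps):
    """Count blinks as open->closed transitions in a per-frame eye-state
    sequence (True = eye open) and convert to blinks per minute."""
    blinks = sum(1 for prev, cur in zip(eye_open, eye_open[1:])
                 if prev and not cur)
    minutes = len(eye_open) / fps / 60.0
    return blinks / minutes
```

The same transition-counting logic applies to the mouth open-close rate.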
4. Conclusions
In the last two decades, the use of image and video processing to monitor physical exercise has evolved and brought attention to contactless technology. Most of the research work presented in this review contributed to the development of methods capable of assessing important variables related to physical exertion, but there is still a gap in the implementation of such methods and technology. This review is intended to provide a current and useful summary of the recent technology available for contactless devices and its application in sports sciences.
Most of the research studies presented in this review focused on only one type of sensor to extract physiological parameters. Measurement accuracy could be improved by considering multi-sensor technology. With improvements in wireless sensing, exercise monitoring can be expanded to multiple physiological parameters using the same modalities. Recent computer vision technology is led by deep learning, which can also help to upgrade exercise monitoring technology.
One noticeable limitation across the reviewed articles is the lack of universal, multimodal technologies using multiple low-cost sensors. A low-cost multi-sensing system using deep learning should be a notable direction for future work in this area. There is also potential to use big data technology to monitor exercise in real time using universal models instead of individual models.
Author Contributions
Conceptualization, S.R.K., D.P. and V.F.; methodology, S.R.K., D.P. and V.F.; software, S.R.K.; formal analysis, S.R.K., J.S. and V.F.; investigation, S.R.K., D.P., V.F. and J.B.; resources, S.R.K., J.S. and A.R.; writing—original draft preparation, S.R.K.; writing—review and editing, S.R.K., J.S., A.R., J.B. and V.F.; project administration, J.B. and V.F.; funding acquisition, J.B., V.F. and J.S. All authors have read and agreed to the published version of the manuscript.
Funding
This work was supported by the RD Project “Continental Factory of Future, (CONTINENTAL FoF)/POCI-01-0247-FEDER-047512”, financed by the European Regional Development Fund (ERDF), through the Program “Programa Operacional Competitividade e Internacionalizacao (POCI)/PORTUGAL 2020”, under the management of aicep Portugal Global—Trade Investment Agency.
Institutional Review Board Statement
Not applicable.
Informed Consent Statement
Not applicable.
Data Availability Statement
Not applicable.
Acknowledgments
The authors would like to thank all the participants who were involved in the experiments.
Conflicts of Interest
The authors declare no conflict of interest.
References
- Paterson, D.H.; Warburton, D.E.R. Physical activity and functional limitations in older adults: A systematic review related to Canada’s Physical Activity Guidelines. Int. J. Behav. Nutr. Phys. Act. 2010, 7, 38. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Borg, R.L. Music In the Education of Children. J. Res. Music. Educ. 1962, 10, 79–80. [Google Scholar] [CrossRef]
- Faulkner, J.; Eston, R.G. Perceived exertion research in the 21st century: Developments, reflections and questions for the future. J. Exerc. Sci. Fit. 2008, 6, 1. [Google Scholar]
- Huang, D.H.; Chiou, W.K.; Chen, B.H. Judgment of perceived exertion by static and dynamic facial expression. In Proceedings of the 19th Triennial Congress of the IEA, Melbourne, Australia, 9–14 August 2015. [Google Scholar]
- Mei, M.; Leat, S.J. Quantitative assessment of perceived visibility enhancement with image processing for single face images: A preliminary study. Investig. Ophthalmol. Vis. Sci. 2009, 50, 4502–4508. [Google Scholar] [CrossRef]
- Naik, B.T.; Hashmi, M.F.; Bokde, N.D. A Comprehensive Review of Computer Vision in Sports: Open Issues, Future Trends and Research Directions. Appl. Sci. 2022, 12, 4429. [Google Scholar] [CrossRef]
- Ahad, M.A.R.; Antar, A.D.; Shahid, O. Vision-based Action Understanding for Assistive Healthcare: A Short Review. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) Workshops, Long Beach, CA, USA, 16–20 June 2019. [Google Scholar]
- Hõrak, H. Computer Vision-Based Unobtrusive Physical Activity Monitoring in School by Room-Level Physical Activity Estimation: A Method Proposition. Information 2019, 10, 269. [Google Scholar] [CrossRef] [Green Version]
- Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Med. 2009, 6, e1000097. [Google Scholar] [CrossRef] [Green Version]
- Knobloch, K.; Hoeltke, V.; Jakob, E.; Vogt, P.M.; Phillips, R. Non-invasive ultrasonic cardiac output monitoring in exercise testing. Int. J. Cardiol. 2008, 126, 445–447. [Google Scholar] [CrossRef]
- Pour Ebrahim, M.; Sarvi, M.; Yuce, M.R. A Doppler Radar System for Sensing Physiological Parameters in Walking and Standing Positions. Sensors 2017, 17, 485. [Google Scholar] [CrossRef] [Green Version]
- Jöbsis, F.F. Noninvasive, infrared monitoring of cerebral and myocardial oxygen sufficiency and circulatory parameters. Science 1977, 198, 1264–1267. [Google Scholar] [CrossRef]
- Xu, G.; Mao, Z.; Wang, B. Noninvasive detection of gas exchange rate by near infrared spectroscopy. In Proceedings of the Seventh International Conference on Photonics and Imaging in Biology and Medicine, Wuhan, China, 8–10 August 2009. [Google Scholar] [CrossRef]
- Astaras, A.; Kokonozi, A.; Michail, E.; Filos, D.; Chouvarda, I.; Grossenbacher, O.; Koller, J.M.; Leopoldo, R.; Porchet, J.A.; Correvon, M.; et al. Pre-clinical physiological data acquisition and testing of the IMAGE sensing device for exercise guidance and real-time monitoring of cardiovascular disease patients. In XII Mediterranean Conference on Medical and Biological Engineering and Computing 2010; Springer: Berlin/Heidelberg, Germany, 2010; pp. 240–243. [Google Scholar] [CrossRef]
- Capraro, G.; Kobayashi, L.; Etebari, C.; Luchette, K.; Mercurio, L.; Merck, D.; Kirenko, I.; van Zon, K.; Bartula, M.; Rocque, M. ‘No Touch’ Vitals: A Pilot Study of Non-contact Vital Signs Acquisition in Exercising Volunteers. In Proceedings of the 2018 IEEE Biomedical Circuits and Systems Conference (BioCAS), Cleveland, OH, USA, 17–19 October 2018; pp. 1–4. [Google Scholar] [CrossRef]
- Li, K.H.C.; White, F.A.; Tipoe, T.; Liu, T.; Wong, M.C.; Jesuthasan, A.; Baranchuk, A.; Tse, G.; Yan, B.P. The Current State of Mobile Phone Apps for Monitoring Heart Rate, Heart Rate Variability, and Atrial Fibrillation: Narrative Review. JMIR Mhealth Uhealth 2019, 7, 11606. [Google Scholar] [CrossRef] [PubMed]
- Kumar, M.; Veeraraghavan, A.; Sabharwal, A. DistancePPG: Robust non-contact vital signs monitoring using a camera. Biomed. Opt. Express 2015, 6, 1565–1588. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Gambi, E.; Agostinelli, A.; Belli, A.; Burattini, L.; Cippitelli, E.; Fioretti, S.; Pierleoni, P.; Ricciuti, M.; Sbrollini, A.; Spinsante, S. Heart Rate Detection Using Microsoft Kinect: Validation and Comparison to Wearable Devices. Sensors 2017, 17, 1776.
- ** for disturbance rejection of very-low-frequency heart rate variability. Biomed. Signal Process. Control 2016, 30, 31–42.
- Stricker, R.; Muller, S.; Gross, H.-M. Non-contact video-based pulse rate measurement on a mobile service robot. In Proceedings of the 23rd IEEE International Symposium on Robot and Human Interactive Communication, IEEE RO-MAN 2014, Edinburgh, Scotland, 25–29 August 2014; pp. 1056–1062.
- Li, S.; Li, X.; Lv, Q.; Zhang, D. WiFit: A Bodyweight Exercise Monitoring System with Commodity Wi-Fi. In Proceedings of the 2018 ACM International Joint Conference and 2018 International Symposium on Pervasive and Ubiquitous Computing and Wearable Computers, Singapore, 8–12 October 2018.
- Shirbani, F.; Blackmore, C.; Kazzi, C.; Tan, I.; Butlin, M.; Avolio, A.P. Sensitivity of Video-Based Pulse Arrival Time to Dynamic Blood Pressure Changes. In Proceedings of the 2018 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Honolulu, HI, USA, 18–21 July 2018.
- Shirbani, F.; Moriarty, A.; Hui, N.; Cox, J.; Tan, I.; Avolio, A.P.; Butlin, M. Contactless video-based photoplethysmography technique comparison investigating pulse transit time estimation of arterial blood pressure. In Proceedings of the 2021 43rd Annual International Conference of the IEEE Engineering in Medicine & Biology Society (EMBC), Guadalajara, Jalisco, Mexico, 26–30 July 2021; pp. 5650–5653.
- Jeong, I.C.; Finkelstein, J. Introducing Contactless Blood Pressure Assessment Using a High Speed Video Camera. J. Med. Syst. 2016, 40, 77.
- Ludwig, N.; Formenti, D.; Gargano, M.; Alberti, G. Skin temperature evaluation by infrared thermography: Comparison of image analysis methods. Infrared Phys. Technol. 2014, 62, 1–6.
- James, C.A.; Richardson, A.J.; Watt, P.W.; Maxwell, N.S. Reliability and validity of skin temperature measurement by telemetry thermistors and a thermal camera during exercise in the heat. J. Therm. Biol. 2014, 45, 141–149.
- McFarlin, B.; Venable, A.; Williams, R.; Jackson, A. Comparison of techniques for the measurement of skin temperature during exercise in a hot, humid environment. Biol. Sport 2015, 32, 11–14.
- de Andrade Fernandes, A.; dos Santos Amorim, P.R.; Brito, C.J.; Sillero-Quintana, M.; Marins, J.C.B. Regional Skin Temperature Response to Moderate Aerobic Exercise Measured by Infrared Thermography. Asian J. Sport. Med. 2016, 7, e29243.
- Korman, P.; Straburzynska-Lupa, A.; Kusy, K.; Kantanista, A.; Zielinski, J. Changes in body surface temperature during speed endurance work-out in highly-trained male sprinters. Infrared Phys. Technol. 2016, 78, 209–213.
- Vardasca, R. Infrared Thermography in Water Sports. In Application of Infrared Thermography in Sports Science; Springer International Publishing: Cham, Switzerland, 2017.
- Sawka, M.N.; Wenger, C.B.; Young, A.J.; Pandolf, K.B. Physiological Responses to Exercise in the Heat. In Nutritional Needs in Hot Environments: Applications for Military Personnel in Field Operations; Marriott, B.M., Ed.; National Academies Press (US): Washington, DC, USA, 1993; Volume 3.
- Lükens, J.; Boström, K.J.; Puta, C.; Schulte, T.L.; Wagner, H. Using ultrasound to assess the thickness of the transversus abdominis in a sling exercise. BMC Musculoskelet. Disord. 2015, 16, 203.
- Manullang, M.C.T.; Lin, Y.-H.; Lai, S.-J.; Chou, N.-K. Implementation of Thermal Camera for Non-Contact Physiological Measurement: A Systematic Review. Sensors 2021, 21, 7777.
- Jensen, M.M.; Poulsen, M.K.; Alldieck, T.; Larsen, R.G.; Gade, R.; Moeslund, T.B.; Franch, J. Estimation of Energy Expenditure during Treadmill Exercise via Thermal Imaging. Med. Sci. Sports Exerc. 2016, 48, 2571–2579.
- Gade, R.; Larsen, R.G.; Moeslund, T.B. Measuring energy expenditure in sports by thermal video analysis. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Honolulu, HI, USA, 21–26 July 2017; pp. 187–194.
- Ndahimana, D.; Kim, E. Measurement methods for physical activity and energy expenditure: A review. Clin. Nutr. Res. 2017, 6, 68–80.
- Koehler, K.; Drenowatz, C. Monitoring Energy Expenditure Using a Multi-Sensor Device—Applications and Limitations of the SenseWear Armband in Athletic Populations. Front. Physiol. 2017, 8, 983.
- Koporec, G.; Vučković, G.; Milić, R.; Perš, J. Quantitative Contact-Less Estimation of Energy Expenditure from Video and 3D Imagery. Sensors 2018, 18, 2435.
- Tao, L. SPHERE-Calorie; University of Bristol: Bristol, UK, 2017.
- Tao, L.; Burghardt, T.; Mirmehdi, M.; Damen, D.; Cooper, A.; Hannuna, S.; Camplani, M.; Paiement, A.; Craddock, I. Calorie Counter: RGB-Depth Visual Estimation of Energy Expenditure at Home. In Computer Vision—ACCV 2016 Workshops, Revised Selected Papers, Taipei, Taiwan, 20–24 November 2016; Springer: Cham, Switzerland, 2016.
- Ellwein, L.; Samyn, M.M.; Danduran, M.; Schindler-Ivens, S.; Liebham, S.; LaDisa, J.F. Toward translating near-infrared spectroscopy oxygen saturation data for the non-invasive prediction of spatial and temporal hemodynamics during exercise. Biomech. Model. Mechanobiol. 2017, 16, 75–96.
- Lucero, A.A.; Addae, G.; Lawrence, W.; Neway, B.; Credeur, D.P.; Faulkner, J.; Rowlands, D.; Stoner, L. Reliability of muscle blood flow and oxygen consumption response from exercise using near-infrared spectroscopy. Exp. Physiol. 2017, 103, 90–100.
- Aoki, H.; Nakamura, H. Non-Contact Respiration Measurement during Exercise Tolerance Test by Using Kinect Sensor. Sports 2018, 6, 23.
- Aoki, H.; Ichimura, S.; Kiyooka, S.; Koshiji, K. Non-contact measurement method of respiratory movement under pedal stroke motion. In Proceedings of the 2007 29th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Lyon, France, 22–26 August 2007; IEEE: New York, NY, USA, 2007; pp. 374–377.
- Aoki, H.; Sakaguchi, M.; Fujimoto, H.; Tsuzuki, K.; Nakamura, H. Noncontact Respiration Measurement under Pedaling Motion with Upright Bicycle Ergometer Using Dot-matrix Pattern Light Projection. In Proceedings of the TENCON 2010 IEEE Region 10 Conference, Fukuoka, Japan, 21–24 November 2010; pp. 1761–1765.
- Anbu, A.; Selvakumar, K. Non-contact breath cycle analysis for different breathing patterns using RGB-D videos. Smart Health 2022, 25, 100297.
- Persinger, R.; Foster, C.; Gibson, M.; Fater, D.C.W.; Porcari, J.P. Consistency of the Talk Test for exercise prescription. Med. Sci. Sports Exerc. 2004, 36, 1632–1636.
- Sun, G.; Matsui, T.; Watai, Y.; Kim, S.; Kirimoto, T.; Suzuki, S.; Hakozaki, Y. Vital-SCOPE: Design and Evaluation of a Smart Vital Sign Monitor for Simultaneous Measurement of Pulse Rate, Respiratory Rate, and Body Temperature for Patient Monitoring. J. Sens. 2018, 2018, 4371872.
- Aguilar, J.G. Respiration Tracking Using the Wii Remote Game-Controller. In User Centred Networked Health Care; IOS Press: Amsterdam, The Netherlands, 2011; Volume 169, pp. 455–459.
- Sharpe, M.; Wilks, D. Fatigue. BMJ 2002, 325, 480–483.
- Krupp, L.B. Fatigue in multiple sclerosis: Definition, pathophysiology and treatment. CNS Drugs 2003, 17, 225–234.
- Hulme, K.; Safari, R.; Thomas, S.; Mercer, T.; White, C.; Van der Linden, M.; Moss-Morris, R. Fatigue interventions in long term, physical health conditions: A scoping review of systematic reviews. PLoS ONE 2018, 13, e0203367.
- Karlsen, K.; Larsen, J.P.; Tandberg, E.; Jørgensen, K. Fatigue in patients with Parkinson’s disease. Mov. Disord. 1999, 14, 237–241.
- Haque, M.A.; Irani, R.; Nasrollahi, K.; Thomas, M.B. Facial video based detection of physical fatigue for maximal muscle activity. IET Comput. Vis. 2016, 10, 323–329.
- Irani, R.; Nasrollahi, K.; Moeslund, T.B. Contactless measurement of muscle fatigue by tracking facial feature points in video. In Proceedings of the IEEE International Conference on Image Processing (ICIP), Paris, France, 27–30 October 2014; pp. 4181–4185.
- Uchida, M.C.; Carvalho, R.; Tessutti, V.D.; Pereira Bacurau, R.F.; Coelho-Junior, H.J.; Portas Capelo, L.; Prando Ramos, H.; dos Santos, M.C.; Teixeira, L.F.M.; Marchetti, P.H. Identification of muscle fatigue by tracking facial expressions. PLoS ONE 2018, 13, e0208834.
- Lopez, M.B.; Del-Blanco, C.R.; Garcia, N. Detecting exercise-induced fatigue using thermal imaging and deep learning. In Proceedings of the 2017 Seventh International Conference on Image Processing Theory, Tools and Applications (IPTA), Montreal, QC, Canada, 28 November–1 December 2017.
- Grassi, B.; Marzorati, M.; Lanfranconi, F.; Ferri, A.; Longaretti, M.; Stucchi, A.; Vago, P.; Marconi, C.; Morandi, L. Impaired oxygen extraction in metabolic myopathies: Detection and quantification by near-infrared spectroscopy. Muscle Nerve 2007, 35, 510–520.
- Khanal, S.R.; Barroso, J.; Sampaio, J.; Filipe, V. Classification of physical exercise intensity by using facial expression analysis. In Proceedings of the 2018 Second International Conference on Computing Methodologies and Communication (ICCMC), Erode, India, 15–16 February 2018; pp. 765–770.
- Khanal, S.R.; Sampaio, J.; Exel, J.; Barroso, J.; Filipe, V. Using Computer Vision to Track Facial Color Changes and Predict Heart Rate. J. Imaging 2022, 8, 245.
- Prawiro, E.A.P.J.; Hu, C.C.; Chan, Y.S.; Chang, C.H.; Lin, Y.H. A heart rate detection method for low power exercise intensity monitoring device. In Proceedings of the 2014 IEEE International Symposium on Bioelectronics and Bioinformatics (IEEE ISBB 2014), Chung Li, Taiwan, 11–14 April 2014.
- Wiles, J.D.A.; Allum, S.R.; Coleman, D.A.; Swaine, I.L. The relationships between exercise intensity, heart rate, and blood pressure during an incremental isometric exercise test. J. Sports Sci. 2008, 26, 155–162.
- Comon, P. Independent component analysis, A new concept? Signal Process. 1994, 36, 287–314.
- Pal, M.; Roy, R.; Basu, J.; Bepari, M.S. Blind source separation: A review and analysis. In Proceedings of the 2013 International Conference Oriental COCOSDA held jointly with 2013 Conference on Asian Spoken Language Research and Evaluation (O-COCOSDA/CASLRE), Gurgaon, India, 25–27 November 2013.
- Vojciechowski, A.S.; Natal, J.Z.; Gomes, A.R.S.; Rodrigues, E.V.; Villegas, I.L.P.; Korelo, R.I.G. Effects of exergame training on the health promotion of young adults. Fisioter. Mov. 2017, 30, 59–67.
- Hung, K.; Zhang, Y. Preliminary investigation of pupil size variability: Toward non-contact assessment of cardiovascular variability. In Proceedings of the 2006 3rd IEEE/EMBS International Summer School on Medical Devices and Biosensors, Cambridge, MA, USA, 4–6 September 2006.
- Inder, J.D.; Carlson, D.J.; Dieberg, G.; McFarlane, J.R.; Hess, N.C.; Smart, N.A. Isometric exercise training for blood pressure management: A systematic review and meta-analysis to optimize benefit. Hypertens. Res. 2016, 39, 88–94.
- Palatini, P. Blood Pressure Behaviour During Physical Activity. Sports Med. 1988, 5, 353–374.
- Fernandes, A.A.; Gomes Moreira, D.; Brito, C.J.; da Silva, C.D.; Sillero-Quintana, M.; Mendonca Pimenta, E.; Bach, A.J.E.; Silami Garcia, E.; Bouzas Marins, J.C. Validity of inner canthus temperature recorded by infrared thermography as a non-invasive surrogate measure for core temperature at rest, during exercise and recovery. J. Therm. Biol. 2016, 62, 50–55.
- Aliverti, A. The respiratory muscles during exercise. Breathe 2016, 12, 165–168.
- Lopes, A.T.; Aguiar, E.; Souza, A.F.; Oliveira-Santos, T. Facial expression recognition with Convolutional Neural Networks: Coping with few data and the training sample order. Pattern Recognit. 2017, 61, 610–628.
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet classification with deep convolutional neural networks. In Proceedings of the 25th International Conference on Neural Information Processing Systems, Lake Tahoe, NV, USA, 3–6 December 2012; pp. 1097–1105.
- Simonyan, K.; Zisserman, A. Very deep convolutional networks for large-scale image recognition. arXiv 2014, arXiv:1409.1556.
- Miles, K.H.; Clark, B.; Périard, J.D.; Goecke, R.; Thompson, K.G. Facial feature tracking: A psychophysiological measure to assess exercise intensity? J. Sports Sci. 2018, 36, 934–941.
- Viola, P.; Jones, M. Robust Real-time Object Detection. In Proceedings of the Second International Workshop on Statistical and Computational Theories of Vision, Vancouver, BC, Canada, 13 July 2001; pp. 1–25.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).