Abstract
Depth cameras can improve non-contact patient monitoring systems in the Neonatal Intensive Care Unit (NICU). Camera placement is secondary to the equipment used for patient care; therefore, a method was developed to correct for non-ideal camera placement. The perspective transformation method for correcting the camera viewpoint was tested on 28 patients and achieved a mean absolute percentage error (MAPE) of 5.58% for camera angles up to 38.58° away from the optimal camera placement.
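Viewpoint correction of this kind maps pixel coordinates through a planar homography. A minimal numpy sketch of applying such a transform (the function name and matrices are illustrative, not taken from the thesis):

```python
import numpy as np

def warp_points(H, pts):
    """Apply a 3x3 homography H to an (N, 2) array of pixel coordinates."""
    homog = np.hstack([pts, np.ones((len(pts), 1))])  # to homogeneous coords
    mapped = homog @ H.T
    return mapped[:, :2] / mapped[:, 2:3]             # back to Cartesian

pts = np.array([[10.0, 20.0], [30.0, 40.0]])
# The identity homography leaves points unchanged
assert np.allclose(warp_points(np.eye(3), pts), pts)
```

In practice the homography would be estimated from corresponding points between the actual and the ideal (overhead) viewpoint, e.g. with OpenCV's `cv2.getPerspectiveTransform`, and applied to the whole depth frame with `cv2.warpPerspective`.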
Since depth data can be more privacy-preserving than RGB or RGB-D data, Region-of-Interest (ROI) selection using depth cameras can enable the automatic blurring of identifiable features. An ROI selection method was developed and tested for extracting a respiratory rate signal. Evaluated against manually selected ROIs, the method achieved an average Sørensen–Dice coefficient of 0.62 and a Jaccard index of 0.46.
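The two overlap metrics reported above are standard measures of agreement between binary masks; a minimal numpy sketch (the example masks are illustrative):

```python
import numpy as np

def dice(a, b):
    """Sørensen–Dice coefficient between two binary masks."""
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def jaccard(a, b):
    """Jaccard index (intersection over union) between two binary masks."""
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union

auto = np.array([[1, 1, 0], [1, 0, 0]], dtype=bool)    # automatic ROI
manual = np.array([[1, 0, 0], [1, 1, 0]], dtype=bool)  # manual ROI
print(dice(auto, manual), jaccard(auto, manual))  # dice ≈ 0.667, jaccard = 0.5
```

Dice weights the intersection twice, so it is always at least as large as the Jaccard index for the same pair of masks, consistent with the 0.62 versus 0.46 figures above.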
The signal extracted from the automatically selected ROI was compared to a signal from a simpler method, improving the percentage of acceptable estimates (those with a mean absolute error of less than 5 breaths per minute) from 3.60% to 13.47% in the frequency domain and from 6.12% to 8.97% in the time domain.
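The acceptability criterion can be expressed as the fraction of estimates within 5 breaths per minute of the reference; a small sketch under that assumption (the function name and sample values are illustrative, and the thesis applies the criterion to the mean absolute error per segment rather than per individual estimate):

```python
import numpy as np

def acceptable_fraction(est, ref, threshold=5.0):
    """Fraction of respiratory-rate estimates whose absolute error is
    below `threshold` breaths per minute."""
    err = np.abs(np.asarray(est) - np.asarray(ref))
    return np.mean(err < threshold)

est = [40.0, 47.0, 55.0, 61.0]  # estimated rates (breaths/min)
ref = [42.0, 45.0, 48.0, 60.0]  # reference rates (breaths/min)
print(acceptable_fraction(est, ref))  # 3 of 4 within 5 bpm -> 0.75
```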
Clinical interventions and routine care in the NICU can disrupt the process of data collection, and commonly need to be excluded from recordings when studying patients in the NICU. Detecting these periods automatically can decrease the time needed for hand-annotating segments of recordings and may further be used for intervention classification in the future. An intervention detection method based solely on depth data was developed using a vision transformer model. Multiple variables were investigated, and the performance was compared to the state of the art in the field. The best-performing model utilized ~85M trainable parameters, was trained and evaluated on data that had been perspective transformed and HHA encoded, and achieved a sensitivity of 85.6%, a precision of 89.8%, and an F1-score of 87.6%.
Download
You can view the thesis as a PDF here. The DOI is 10.22215/etd/2023-15371. The Git repository containing the code and scripts used for analysis can be found here.
Citation
@thesis{hajj-ali-DepthbasedPatientMonitoring-2023,
type = {Master of Applied Science},
title = {Depth-Based {{Patient Monitoring}} in the {{NICU}} with {{Non-Ideal Camera Placement}}},
author = {Hajj-Ali, Zein},
year = {2023},
school = {{Carleton University}},
location = {{Ottawa, Ontario}},
doi = {10.22215/etd/2023-15371},
url = {https://curve.carleton.ca/7e6b31bc-7351-4a66-a4ac-3955dea2e7b5}
}
Copyright Information
© 2023 Zein Hajj-Ali.