Created: August 2019 (very early stage)
Author(s): Dr. Youngjun Cho (Assistant Professor, Department of Computer Science, University College London, UCL)
This project supports the ACII 2019 tutorial on Thermal Imaging-based Physiological and Affective Computing
Full source code: https://github.com/deepneuroscience/TIPA
Example dataset: Link
Temporary TIPA opensource project website: http://youngjuncho.com/TIPA
[1] Cho, Y. and Bianchi-Berthouze, N., 2019. Physiological and Affective Computing through Thermal Imaging: A Survey. arXiv preprint arXiv:1908.10307 [cs]. http://arxiv.org/abs/1908.10307
[2] Cho, Y., Julier, S.J., Marquardt, N. and Bianchi-Berthouze, N., 2017. Robust tracking of respiratory rate in high-dynamic range scenes using mobile thermal imaging. Biomedical optics express, 8(10), pp.4480-4503. https://doi.org/10.1364/BOE.8.004480
[3] Cho, Y., Julier, S.J. and Bianchi-Berthouze, N., 2019. Instant Stress: Detection of Perceived Mental Stress Through Smartphone Photoplethysmography and Thermal Imaging. JMIR mental health, 6(4), p.e10140. https://doi.org/10.2196/10140
[4] Cho, Y., Bianchi-Berthouze, N. and Julier, S.J., 2017. DeepBreath: Deep learning of breathing patterns for automatic stress recognition using low-cost thermal imaging in unconstrained settings. In 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII) (pp. 456-463). IEEE. https://doi.org/10.1109/ACII.2017.8273639
[5] Cho, Y., Bianchi-Berthouze, N., Marquardt, N. and Julier, S.J., 2018. Deep Thermal Imaging: Proximate Material Type Recognition in the Wild through Deep Learning of Spatial Surface Temperature Patterns. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, ACM. https://doi.org/10.1145/3173574.3173576
Let's start!
import sys
from platform import python_version
# sys.path.insert(0,'./TIPA_library/')
from TIPA_library.main.data_preparation import *
from TIPA_library.main.thermal_image_processing import *
from TIPA_library.utils import timshow as tim
from TIPA_library.utils import rvs
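Optionally, you can confirm the interpreter you are running (python_version is imported above):
print(python_version())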
By default, the TIPA project uses the TIPA frame protocol shown below.
FLIR ONE (SDK) users can simply use the example code at the link below. https://github.com/deepneuroscience/DeepThermalImaging/tree/master/example%20code%20for%20FLIR%20One%20sdk
Figure 1. TIPA (Thermal Imaging-based Physiological and Affective computing) Project Dataframe protocol
# The matrix size has to be known in advance, e.g. 320 x 240
# data = data_preparation_TIPA_protocol('./data/example_data.dat',320,240)
data = data_preparation_TIPA_protocol('./data/example_data_in_front_of_building.dat',320,240)
# print(data.time_stamp)
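As a quick sanity check, you can inspect the loaded data; a minimal sketch, using the attribute names that appear later in this notebook (thermal_matrix, time_stamp):
# thermal_matrix is indexed as [row, column, frame]; time_stamp holds one timestamp per frame
print(data.thermal_matrix.shape)
print(data.time_stamp[0], data.time_stamp[-1])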
For those using other thermal cameras,
simply load your raw thermal matrices with your known frame rate
(you may need to convert the ASCII format, e.g. *.asc, into a matrix).
Contact youngjun[dot]cho[at]ucl.ac.uk for any queries.
# data = data_preparation_raw_matrix(matrix, framerate)
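If your camera exports one ASCII frame per file (e.g. *.asc), a minimal sketch of building such a matrix with NumPy is shown below; the folder layout, glob pattern and frame rate are illustrative assumptions:
import glob
import numpy as np
framerate = 8.7  # your camera's frame rate in Hz (example value)
# one whitespace-delimited temperature matrix per file (assumed layout)
frames = [np.loadtxt(f) for f in sorted(glob.glob('./data/asc_frames/*.asc'))]
matrix = np.stack(frames, axis=2)  # rows x columns x frames
# data = data_preparation_raw_matrix(matrix, framerate)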
You can simply examine temperatures on your thermal image with your cursor.
from ipywidgets import interact, interactive, fixed, interact_manual
import ipywidgets as widgets
import matplotlib.pyplot as plt
%matplotlib notebook
def update(fig):
    fig.canvas.draw()

def interactive_thermal_matrix_view(matrix):
    fig = plt.figure()
    ax = fig.add_subplot(1, 1, 1)
    ax.imshow(matrix)  # hover the cursor to read per-pixel temperatures (%matplotlib notebook)
    update(fig)
frame_number = 1
interactive_thermal_matrix_view(data.thermal_matrix[:,:,frame_number])
You can simply change the value on the slider (frame_number), e.g. frame_number = 100.
from ipywidgets import interactive, FloatRangeSlider, Output, VBox
%matplotlib inline
interactive_plot = interactive(data.interactive_imshow_cond, frame_number=(0, data.thermal_matrix.shape[2]-1))
output = interactive_plot.children[-1]
output.layout.height = '320px'
interactive_plot
# VBox([range_slider,interactive_plot])
In short, you can change the value on the slider (frame_number), e.g. frame_number = 100,
and you can adjust the thermal (temperature) range of interest.
from ipywidgets import interactive, FloatRangeSlider, Output, VBox, Layout
%matplotlib inline
# layout = Layout(width='500px')
range_slider = widgets.FloatRangeSlider(
    value=[0, +50],
    min=0., max=+60., step=0.1,
    description='thermal range of interest',
    readout_format='.1f',
    # layout=layout
)
# range_slider
interactive_plot = interactive(data.interactive_imshow_cond2, frame_number=(0, data.thermal_matrix.shape[2]-1), thermal_range=range_slider)
output = interactive_plot.children[-1]
output.layout.height = '320px'
interactive_plot
# VBox([range_slider,interactive_plot])
In short, you can change the value on the slider (frame_number), e.g. frame_number = 100,
and this will automatically select your thermal (temperature) range of interest.
We will come back to this in the preprocessing section.
Cho, Y., Julier, S.J., Marquardt, N. and Bianchi-Berthouze, N., 2017. Robust tracking of respiratory rate in high-dynamic range scenes using mobile thermal imaging. Biomedical optics express, 8(10), pp.4480-4503. https://doi.org/10.1364/BOE.8.004480
from ipywidgets import interactive, FloatRangeSlider, Output, VBox, Layout
%matplotlib inline
interactive_plot = interactive(data.interactive_imshow_cond3, frame_number=(0, data.thermal_matrix.shape[2]-1))
output = interactive_plot.children[-1]
output.layout.height = '320px'
interactive_plot
# VBox([range_slider,interactive_plot])
Here, we focus on the case where an automatic ROI tracking method is used. For the ROI selection step, an ROI can be chosen either manually or automatically (here we select it manually).
Figure 2. Computational pipeline commonly applied in studies on thermal imaging-based physiological computing: it consists of three main steps, ROI selection, automatic ROI tracking and spatial interpretation. a) with automatic ROI tracking, b) without automatic ROI tracking (in this case a head-fixation mount is used)
[Reference]
Youngjun Cho and Nadia Bianchi-Berthouze. 2019. Physiological and Affective Computing through Thermal Imaging: A Survey. arXiv:1908.10307 [cs], http://arxiv.org/abs/1908.10307
A) Non-optimal (linear) quantization
Mapping temperatures to thermal images with a selected temperature range of interest, which is traditionally fixed from the first thermogram frame (e.g. 30°C to 40°C)
B) Optimal quantization
Mapping temperatures to thermal images adaptively against environmental temperature effects.
Go back to Section 3 and compare Tool C with Tool D (where optimal quantization is applied)
Cho, Y., Julier, S.J., Marquardt, N. and Bianchi-Berthouze, N., 2017. Robust tracking of respiratory rate in high-dynamic range scenes using mobile thermal imaging. Biomedical optics express, 8(10), pp.4480-4503. https://doi.org/10.1364/BOE.8.004480
Figure 3. High thermal dynamic range scenes: a fixed thermal range of interest is not suitable for preserving the morphological facial shape under varying ambient temperature: [top] examples of thermogram shots collected from a person walking outdoors (for 6 minutes), [bottom] temperature histograms
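For intuition, here is a minimal illustrative sketch of the two ideas (not the TIPA implementation used below): linear quantization clips temperatures to a fixed range before scaling to 8 bits, whereas an adaptive scheme re-estimates the range from each frame (here simply via percentiles):
import numpy as np
def linear_quantize(temp_mat, t_min=30.0, t_max=40.0):
    # fixed thermal range of interest, chosen once (e.g. from the first frame)
    clipped = np.clip(temp_mat, t_min, t_max)
    return np.uint8(255 * (clipped - t_min) / (t_max - t_min))
def adaptive_quantize(temp_mat, low_pct=1, high_pct=99):
    # re-estimate the range per frame so ambient temperature changes do not wash out facial detail
    t_min, t_max = np.percentile(temp_mat, [low_pct, high_pct])
    clipped = np.clip(temp_mat, t_min, t_max)
    return np.uint8(255 * (clipped - t_min) / max(t_max - t_min, 1e-6))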
frame_number = 1
temp_mat = copy.deepcopy(data.thermal_matrix[:,:,frame_number])  ## The object must be deep-copied; do not use mat = data.thermal_matrix
output = nonoptimal_quantization(temp_mat, 30, 40, True)
%matplotlib inline
tim.timshow(output)
frame_number = 1
temp_mat = copy.deepcopy(data.thermal_matrix[:,:,frame_number])  ## The object must be deep-copied; do not use mat = data.thermal_matrix
output = optimal_quantization(temp_mat, True)
%matplotlib inline
tim.timshow(output)
print('1. Select your ROI and press Enter')
print('2. Press ESC to exit')
# arguments (as used here): thermal matrix, quantization mode ('optimal' or 'non-optimal'), tracker type (e.g. 'MEDIANFLOW', 'TLD'), followed by further options
ROI_seq, t_video = thermal_tracker(data.thermal_matrix, 'optimal', 'MEDIANFLOW', False, False)
# ROI_seq, t_video = thermal_tracker(data.thermal_matrix, 'optimal', 'TLD', False, False)
# ROI_seq, t_video = thermal_tracker(data.thermal_matrix, 'non-optimal', 'MEDIANFLOW', False, False, True, 0, 30)
# ROI_seq, t_video = thermal_tracker(data.thermal_matrix)
data.tracked_matrix=t_video
from ipywidgets import interactive, FloatRangeSlider, Output, VBox
%matplotlib inline
interactive_plot = interactive(data.interactive_imshow_cond4, frame_number=(0, data.thermal_matrix.shape[2]-1))
output = interactive_plot.children[-1]
output.layout.height = '320px'
interactive_plot
# VBox([range_slider,interactive_plot])
Here, we use only a classical method: averaging.
# print(ROI_seq)
ft_vector = np.zeros((5, data.thermal_matrix.shape[2]))
for i in range(data.thermal_matrix.shape[2]):
    # mean temperature within the tracked ROI of each frame (stored in row 1 of ft_vector)
    ft_vector[1, i] = np.mean(data.thermal_matrix[int(ROI_seq[0, i]):int(ROI_seq[0, i] + ROI_seq[2, i]), int(ROI_seq[1, i]):int(ROI_seq[1, i] + ROI_seq[3, i]), i])
Now you have to program your own code to refine your signals (e.g. bandpass filtering, frequency analysis, etc.); a minimal sketch is given after the interactive plot below.
from ipywidgets import interactive, FloatRangeSlider, Output, VBox, Layout, fixed
%matplotlib inline
layout = Layout(width='500px')
range_slider = widgets.FloatRangeSlider(
    value=[10, +50],
    min=0., max=data.time_stamp[-1], step=1,
    description='range',
    readout_format='.1f',
    layout=layout
)
# range_slider
def interactive_timeplot(time, signal, fig_w, range_bar):
    fig = plt.gcf()
    yourDPI = fig.get_dpi()
    # plt.figure(figsize=(fig_w/yourDPI, (fig_w/yourDPI)/3))
    plt.plot(time, signal)
    plt.axis([range_bar[0], range_bar[1], -1 + min(signal[int(range_bar[0]):int(range_bar[1])]), 1 + max(signal[int(range_bar[0]):int(range_bar[1])])])
    # plt.axis([time[int(range_bar[0])], time[int(range_bar[1])], min(signal[int(range_bar[0]):int(range_bar[1])]), max(signal[int(range_bar[0]):int(range_bar[1])])])
    # plt.show()
m_interactive_timeplot = interactive(interactive_timeplot, time= fixed(data.time_stamp), signal=fixed(ft_vector[1,:]), fig_w=fixed(500), range_bar=range_slider)
m_interactive_timeplot
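As one possible refinement (a sketch only, assuming SciPy is available and that the sampling rate is roughly uniform), you could band-pass the ROI signal around typical breathing frequencies and read the dominant frequency from its spectrum:
import numpy as np
from scipy import signal
fs = 1.0 / np.mean(np.diff(data.time_stamp))  # approximate sampling rate from the timestamps
# band-pass around typical breathing frequencies (~0.1-0.85 Hz, an assumed band)
sos = signal.butter(4, [0.1, 0.85], btype='bandpass', fs=fs, output='sos')
filtered = signal.sosfiltfilt(sos, ft_vector[1, :])
# dominant frequency via Welch's method, converted to breaths per minute
f, pxx = signal.welch(filtered, fs=fs, nperseg=min(256, filtered.size))
print('estimated respiration rate: %.1f breaths/min' % (60 * f[np.argmax(pxx)]))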
Close the extra OpenCV window.
cv2.destroyAllWindows()
Note: the code for the spectrogram analysis is currently work in progress.
TIPA_library.utils: overlap_windows, overlap_matrix, gausswin, compute_frequency_grid, rvs
If your targeted signature is a respiratory or cardiac (pulse) signal, you could later use this code (respiration variability spectrogram, RVS).
A key reference for this:
Cho, Y., Bianchi-Berthouze, N. and Julier, S.J., 2017. DeepBreath: Deep learning of breathing patterns for automatic stress recognition using low-cost thermal imaging in unconstrained settings. In 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII) (pp. 456-463). IEEE. https://doi.org/10.1109/ACII.2017.8273639
You need to use your own data here, as the given samples have a non-fixed sampling rate.
rvs_output = rvs.rvs(8, ft_vector)  # (incomplete)
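Because the example recording has a non-fixed sampling rate, one option (a sketch, assuming NumPy) is to interpolate the ROI signal onto a uniform time grid at a fixed target rate before any spectrogram/RVS analysis:
import numpy as np
target_fs = 8.0  # assumed target sampling rate in Hz
uniform_t = np.arange(data.time_stamp[0], data.time_stamp[-1], 1.0 / target_fs)
uniform_signal = np.interp(uniform_t, data.time_stamp, ft_vector[1, :])
# uniform_signal can now be fed into fixed-rate analyses such as the RVS code above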