Thermal Imaging-based Physiological and Affective computing (TIPA)

TIPA Opensource project

Created: August 2019 (very early stage)
Author(s): Dr. Youngjun Cho (Assistant Professor, Department of Computer Science, University College London, UCL)

This project supports the ACII 2019 tutorial on Thermal Imaging-based Physiological and Affective Computing.

Full source code: https://github.com/deepneuroscience/TIPA
Example dataset: Link
Temporary TIPA opensource project website: http://youngjuncho.com/TIPA

Key Reference

[1] Cho, Y. and Bianchi-Berthouze, N., 2019. Physiological and Affective Computing through Thermal Imaging: A Survey. arXiv preprint arXiv:1908.10307. http://arxiv.org/abs/1908.10307

Further Technical References

[2] Cho, Y., Julier, S.J., Marquardt, N. and Bianchi-Berthouze, N., 2017. Robust tracking of respiratory rate in high-dynamic range scenes using mobile thermal imaging. Biomedical optics express, 8(10), pp.4480-4503. https://doi.org/10.1364/BOE.8.004480

[3] Cho, Y., Julier, S.J. and Bianchi-Berthouze, N., 2019. Instant Stress: Detection of Perceived Mental Stress Through Smartphone Photoplethysmography and Thermal Imaging. JMIR mental health, 6(4), p.e10140. https://doi.org/10.2196/10140

[4] Cho, Y., Bianchi-Berthouze, N. and Julier, S.J., 2017. DeepBreath: Deep learning of breathing patterns for automatic stress recognition using low-cost thermal imaging in unconstrained settings. In 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII) (pp. 456-463). IEEE. https://doi.org/10.1109/ACII.2017.8273639

[5] Cho, Y., Bianchi-Berthouze, N., Marquardt, N. and Julier, S.J., 2018. Deep Thermal Imaging: Proximate Material Type Recognition in the Wild through Deep Learning of Spatial Surface Temperature Patterns. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, ACM. https://doi.org/10.1145/3173574.3173576

Let's start!

Basic tutorial

1. Import TIPA libraries

In [1]:
import sys
from platform import python_version
# sys.path.insert(0,'./TIPA_library/')

from TIPA_library.main.data_preparation import *
from TIPA_library.main.thermal_image_processing import *
from TIPA_library.utils import timshow as tim
from TIPA_library.utils import rvs 

2. Loading a raw sequence of thermal 2d matrices

By default, the TIPA project uses the TIPA frame protocol shown below.

If you use the FLIR ONE SDK, you can simply adapt the example code at the link below: https://github.com/deepneuroscience/DeepThermalImaging/tree/master/example%20code%20for%20FLIR%20One%20sdk

frame_protocol.png

    Figure 1. TIPA (Thermal Imaging-based Physiological and Affective computing) Project Dataframe protocol

Example Dataset

We provide an example dataset.

Download the dataset - Link
Unzip it and move the files to the ./data directory:
./data/example_data.dat
./data/example_data_in_front_of_building.dat

In [2]:
# The matrix size has to be known in advance, e.g. 320 x 240
# data = data_preparation_TIPA_protocol('./data/example_data.dat',320,240)
data = data_preparation_TIPA_protocol('./data/example_data_in_front_of_building.dat',320,240)
# print(data.time_stamp)
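Before moving on, it can help to sanity-check the loaded object. The quick sketch below only uses the attributes that appear later in this tutorial (data.thermal_matrix and data.time_stamp); the axis order and time units follow the TIPA frame protocol.

In [ ]:
# Quick sanity check of the loaded data
print(data.thermal_matrix.shape)                 # (rows, columns, number of frames)
print(data.time_stamp[0], data.time_stamp[-1])   # first and last timestamps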

If you are using another thermal camera,
simply load the raw thermal matrices with your known frame rate
(you may need to convert an ASCII format, e.g. *.asc, to a matrix; a sketch of this is given after the cell below).
Contact youngjun[dot]cho[at]ucl.ac.uk for any queries.

In [3]:
# data = data_preparation_raw_matrix(matrix, framerate)
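For illustration, here is a minimal sketch of how a folder of ASCII frames might be converted into such a matrix before calling data_preparation_raw_matrix. The folder name, the one-frame-per-file layout and the whitespace delimiter are assumptions, so adapt the parsing to your camera's export format.

In [ ]:
# Minimal sketch: convert a folder of ASCII frames (*.asc, one whitespace-delimited matrix per file)
# into a rows x columns x frames matrix (folder name and layout are hypothetical)
import glob
import numpy as np

framerate = 10.0                                             # your camera's frame rate in Hz (example value)
files = sorted(glob.glob('./data/my_camera/*.asc'))          # hypothetical folder
frames = [np.loadtxt(f) for f in files]                      # each file: one rows x columns temperature matrix
matrix = np.stack(frames, axis=2)                            # stack along the frame axis, as used in this tutorial

# data = data_preparation_raw_matrix(matrix, framerate)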

3. Manual Inspection of the loaded thermal matrices


Tool 1: interactive_thermal_matrix_view

You can simply examine temperatures on your thermal image with your cursor.

In [4]:
from ipywidgets import interact, interactive, fixed, interact_manual
import ipywidgets as widgets
import matplotlib.pyplot as plt

%matplotlib notebook
def update(fig):
    # Redraw the canvas so the cursor readout stays up to date
    fig.canvas.draw()

def interactive_thermal_matrix_view(matrix):
    # Display a single thermal frame; hover the cursor to read temperature values
    fig = plt.figure()
    ax = fig.add_subplot(1, 1, 1)
    ax.imshow(matrix)

    interact(update, fig=fixed(fig))


frame_number = 1
interactive_thermal_matrix_view(data.thermal_matrix[:,:,frame_number])
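If you prefer a quick non-interactive check, you can also read a temperature value directly from the matrix (the pixel position below is just an example):

In [ ]:
# Read a single temperature value at an example pixel position
row, col = 100, 100
print(data.thermal_matrix[row, col, frame_number])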

Tool 2: Interactive ImShow Cond1 for multiple frames

Simply change the value on the slider (frame_number), e.g. frame_number = 100.

In [5]:
from ipywidgets import interactive, FloatRangeSlider, Output, VBox
%matplotlib inline

interactive_plot = interactive(data.interactive_imshow_cond,  frame_number=(0, data.thermal_matrix.shape[2]-1))
output = interactive_plot.children[-1]
output.layout.height = '320px'
interactive_plot
# VBox([range_slider,interactive_plot])

a.png

Tool 3: Interactive ImShow Cond2 for multiple frames, with a function to select your thermal range of interest

In short, change the value on the slider (frame_number), e.g. frame_number = 100,
and adjust the thermal (temperature) range of interest.

In [6]:
from ipywidgets import interactive, FloatRangeSlider, Output, VBox, Layout
%matplotlib inline

# layout = Layout(width='500px')
range_slider = widgets.FloatRangeSlider(
    value=[0, +50],
    min=0., max=+60., step=0.1,
    description='thermal range of interest',
    readout_format='.1f',
#     layout=layout
)
# range_slider

interactive_plot = interactive(data.interactive_imshow_cond2,  frame_number=(0, data.thermal_matrix.shape[2]-1), thermal_range=range_slider)
output = interactive_plot.children[-1]
output.layout.height = '320px'
interactive_plot
# VBox([range_slider,interactive_plot])

b.png

Tool 4: Interactive ImShow Cond3

for multiple frames, with optimal quantization to automatically select your thermal range of interest

In short, change the value on the slider (frame_number), e.g. frame_number = 100,
and the thermal (temperature) range of interest will be selected automatically.
We will come back to this in the preprocessing section (Section 4.1).

[Key reference]

Cho, Y., Julier, S.J., Marquardt, N. and Bianchi-Berthouze, N., 2017. Robust tracking of respiratory rate in high-dynamic range scenes using mobile thermal imaging. Biomedical optics express, 8(10), pp.4480-4503. https://doi.org/10.1364/BOE.8.004480

In [7]:
from ipywidgets import interactive, FloatRangeSlider, Output, VBox, Layout
%matplotlib inline


interactive_plot = interactive(data.interactive_imshow_cond3,  frame_number=(0, data.thermal_matrix.shape[2]-1))
output = interactive_plot.children[-1]
output.layout.height = '320px'
interactive_plot
# VBox([range_slider,interactive_plot])

c.png

4. Computational Pipeline

Here, we focus on the case where an automatic ROI tracking method is used. For the ROI selection step, an ROI can be chosen either manually or automatically (here we select it manually).

figure2.png
Figure 2. Computational pipeline commonly applied in studies on thermal imaging-based physiological computing, consisting of three main steps: ROI selection, automatic ROI tracking and spatial interpretation. a) with automatic ROI tracking, b) without automatic ROI tracking (in this case a head-fixation mount is used)

[Reference] Cho, Y. and Bianchi-Berthouze, N., 2019. Physiological and Affective Computing through Thermal Imaging: A Survey. arXiv preprint arXiv:1908.10307. http://arxiv.org/abs/1908.10307


4.1. Preprocessing (Quantization)

[Two standard methods]

A) Non-optimal (linear) quantization
Mapping temperatures to thermal images using a selected temperature range of interest, which is traditionally fixed from the first thermogram frame (e.g. 30°C to 40°C).

B) Optimal quantization
Mapping temperatures to thermal images adaptively, to counter environmental temperature effects.

Go back to Section 3 and compare Tool 3 with Tool 4 (where optimal quantization is applied).

[Key reference]

Cho, Y., Julier, S.J., Marquardt, N. and Bianchi-Berthouze, N., 2017. Robust tracking of respiratory rate in high-dynamic range scenes using mobile thermal imaging. Biomedical optics express, 8(10), pp.4480-4503. https://doi.org/10.1364/BOE.8.004480

figure3.png
Figure 3. High thermal dynamic range scenes: a fixed thermal range of interest is not suitable for preserving the morphological facial shape under varying ambient temperatures. [top] Examples of thermogram shots collected from a person walking outdoors (for 6 minutes); [bottom] temperature histograms.

4.1.1. Non-optimal (linear) quantization

In [39]:
import copy   # in case copy is not already available via the TIPA star imports

frame_number = 1
temp_mat = copy.deepcopy(data.thermal_matrix[:,:,frame_number])   # the frame must be deep-copied; a plain assignment would only reference the original matrix
output = nonoptimal_quantization(temp_mat, 30, 40, True)

%matplotlib inline
tim.timshow(output)
your fixed thermal range of interest is [30.000000, 40.000000]
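For intuition, the sketch below shows roughly what a fixed-range (linear) mapping does: temperatures within [30, 40] °C are mapped linearly to 8-bit grey levels and everything outside saturates. This is illustrative only and is not necessarily how nonoptimal_quantization is implemented.

In [ ]:
# Illustrative linear quantization: map temperatures in [t_min, t_max] to 8-bit grey levels
import numpy as np

def linear_quantization_sketch(thermal_frame, t_min=30.0, t_max=40.0):
    clipped = np.clip(thermal_frame, t_min, t_max)                 # temperatures outside the range saturate
    return np.uint8(255 * (clipped - t_min) / (t_max - t_min))

# img8 = linear_quantization_sketch(data.thermal_matrix[:, :, 1])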

4.1.2. Optimal quantization

In [40]:
frame_number = 1
temp_mat = copy.deepcopy(data.thermal_matrix[:,:,frame_number])   # again, deep-copy the frame rather than referencing the original matrix
output = optimal_quantization(temp_mat, True)

%matplotlib inline
tim.timshow(output)
optimal thermal range is [24.259115, 33.570000]
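As a crude stand-in for adaptive range selection (not the optimal quantization algorithm of [2]; use optimal_quantization above for that), one could pick the range from each frame's own temperature distribution, e.g. with percentiles:

In [ ]:
# Illustrative adaptive range selection via percentiles (a crude substitute, not the method of [2])
import numpy as np

def percentile_quantization_sketch(thermal_frame, low=1, high=99):
    t_min, t_max = np.percentile(thermal_frame, [low, high])       # range adapts to each frame
    clipped = np.clip(thermal_frame, t_min, t_max)
    return np.uint8(255 * (clipped - t_min) / (t_max - t_min))

# img8 = percentile_quantization_sketch(data.thermal_matrix[:, :, 1])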

4.2. Automatic ROI Tracking

This consists of 1) ROI selection, 2) quantization, and 3) ROI tracking.

Note: advanced trackers could be implemented here; for now we only use widely used motion tracking methods such as Median Flow and TLD (an outline is sketched below).
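For reference, the sketch below shows how such a tracker can be driven directly with OpenCV on 8-bit frames; it only outlines the idea and is not the implementation of thermal_tracker. The per-frame min-max normalization and the bounding-box format are assumptions, and opencv-contrib is required (the Median Flow constructor lives under cv2 or cv2.legacy depending on the version).

In [ ]:
# Illustrative ROI tracking with OpenCV's Median Flow on normalized 8-bit frames (not thermal_tracker itself)
import cv2
import numpy as np

def to_uint8(thermal_frame):
    # Per-frame min-max normalization to an 8-bit image for the tracker (illustrative only)
    f = thermal_frame.astype(np.float32)
    f = (f - f.min()) / max(float(f.max() - f.min()), 1e-6)
    return np.uint8(255 * f)

def track_roi_sketch(thermal_matrix, bbox):
    # bbox: (x, y, w, h) selected on the first frame
    create = getattr(cv2, 'TrackerMedianFlow_create', None) or cv2.legacy.TrackerMedianFlow_create
    tracker = create()
    tracker.init(cv2.cvtColor(to_uint8(thermal_matrix[:, :, 0]), cv2.COLOR_GRAY2BGR), bbox)

    boxes = [bbox]
    for i in range(1, thermal_matrix.shape[2]):
        frame = cv2.cvtColor(to_uint8(thermal_matrix[:, :, i]), cv2.COLOR_GRAY2BGR)
        ok, box = tracker.update(frame)
        boxes.append(box if ok else boxes[-1])   # keep the previous ROI when tracking fails
    return boxes

# boxes = track_roi_sketch(data.thermal_matrix, (140, 100, 60, 60))   # hypothetical initial ROI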

In [14]:
print('1. Select your ROI and press Enter')
print('2. Press ESC to exit')

ROI_seq, t_video =thermal_tracker(data.thermal_matrix, 'optimal', 'MEDIANFLOW', False, False)
# ROI_seq, t_video =thermal_tracker(data.thermal_matrix, 'optimal', 'TLD', False, False)
# ROI_seq, t_video =thermal_tracker(data.thermal_matrix, 'non-optimal', 'MEDIANFLOW', False, False, True, 0, 30)     
# ROI_seq, t_video =thermal_tracker(data.thermal_matrix) 
data.tracked_matrix=t_video
1. Select your ROI and press Enter
2. Press ESC to exit
frame: 20
tracking error occurred at 37 frame
tracking error occurred at 38 frame
tracking error occurred at 39 frame
frame: 40
tracking error occurred at 40 frame
frame: 60
frame: 80
...
frame: 2200
In [42]:
from ipywidgets import interactive, FloatRangeSlider, Output, VBox
%matplotlib inline

interactive_plot = interactive(data.interactive_imshow_cond4,  frame_number=(0, data.thermal_matrix.shape[2]-1))
output = interactive_plot.children[-1]
output.layout.height = '320px'
interactive_plot
# VBox([range_slider,interactive_plot])

d.png

4.3. Spatial Interpretation

Here, we use only a classical method: averaging the temperatures within the tracked ROI.

In [43]:
# print(ROI_seq)
# For every frame, average the temperatures inside the tracked ROI (stored in row 1 of the feature vector)
ft_vector = np.zeros((5, data.thermal_matrix.shape[2]))
for i in range(data.thermal_matrix.shape[2]):
    ft_vector[1, i] = np.mean(data.thermal_matrix[int(ROI_seq[0, i]):int(ROI_seq[0, i] + ROI_seq[2, i]),
                                                  int(ROI_seq[1, i]):int(ROI_seq[1, i] + ROI_seq[3, i]), i])

    
Plot the extracted signals

Now you need to write your own code to refine the signals (e.g. band-pass filtering, frequency analysis, etc.); one possible refinement is sketched after the plot below.

In [49]:
from ipywidgets import interactive, FloatRangeSlider, Output, VBox, Layout, fixed
%matplotlib inline


layout = Layout(width='500px')
range_slider = widgets.FloatRangeSlider(
    value=[10, +50],
    min=0., max= data.time_stamp[-1], step=1,
    description='range',
    readout_format='.1f',
    layout=layout
)
# range_slider

def interactive_timeplot(time, signal, fig_w, range_bar):
    fig = plt.gcf()
    yourDPI = fig.get_dpi()
#     plt.figure(figsize=(fig_w/yourDPI,(fig_w/yourDPI)/3))

    # Convert the selected time range (in the units of time_stamp) into sample indices
    start = np.searchsorted(time, range_bar[0])
    end = max(np.searchsorted(time, range_bar[1]), start + 1)

    plt.plot(time, signal)
    plt.axis([range_bar[0], range_bar[1],
              min(signal[start:end]) - 1, max(signal[start:end]) + 1])
    
m_interactive_timeplot = interactive(interactive_timeplot, time= fixed(data.time_stamp), signal=fixed(ft_vector[1,:]), fig_w=fixed(500), range_bar=range_slider)
m_interactive_timeplot

e.png
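As a starting point for that refinement, the sketch below band-pass filters the ROI-mean temperature signal in a typical respiration band and estimates the dominant frequency. The SciPy dependency, the cut-off frequencies and the fixed sampling rate fs are assumptions (the example data has a non-fixed sampling rate, so resample your signal first).

In [ ]:
# Illustrative refinement: band-pass filter + dominant frequency estimate (assumes a fixed sampling rate fs)
import numpy as np
from scipy.signal import butter, filtfilt

def refine_and_estimate_rate(signal, fs, low=0.1, high=0.85):
    # Band-pass in a typical respiration band (~0.1-0.85 Hz, i.e. roughly 6-51 cycles per minute)
    b, a = butter(2, [low / (fs / 2.0), high / (fs / 2.0)], btype='band')
    filtered = filtfilt(b, a, signal - np.mean(signal))

    # Dominant frequency via the FFT magnitude spectrum
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fs)
    return filtered, freqs[np.argmax(spectrum)]

# filtered, f_peak = refine_and_estimate_rate(ft_vector[1, :], fs=8)   # fs = 8 Hz is a hypothetical value
# print('dominant frequency: %.2f Hz (%.1f cycles/min)' % (f_peak, f_peak * 60))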

Close the extra OpenCV window.

In [18]:
cv2.destroyAllWindows()

Note: we are currently working on code for spectrogram analysis in TIPA_library.utils (overlap_windows, overlap_matrix, gausswin, compute_frequency_grid, rvs). If your targeted signature is a respiratory or cardiac pulse, you could later use this code (respiration variability spectrogram, RVS).
A key reference for this: Cho, Y., Bianchi-Berthouze, N. and Julier, S.J., 2017. DeepBreath: Deep learning of breathing patterns for automatic stress recognition using low-cost thermal imaging in unconstrained settings. In 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII) (pp. 456-463). IEEE. https://doi.org/10.1109/ACII.2017.8273639


You will need to use your own data, as the given samples have a non-fixed sampling rate.
rvs_output = rvs.rvs(8, ft_vector) (incomplete)
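Until the RVS code is available, a generic spectrogram (not the respiration variability spectrogram of [4]) can give a rough view of how the dominant frequency evolves over time; the SciPy call, the window length and the fixed sampling rate below are assumptions.

In [ ]:
# Generic spectrogram of the ROI-mean signal (a rough substitute for RVS; assumes a fixed sampling rate)
import numpy as np
from scipy.signal import spectrogram

fs = 8                                       # hypothetical fixed sampling rate (Hz)
f, t, Sxx = spectrogram(ft_vector[1, :] - np.mean(ft_vector[1, :]), fs=fs,
                        nperseg=int(20 * fs), noverlap=int(18 * fs))

plt.pcolormesh(t, f, Sxx, shading='auto')
plt.ylim(0, 2)                               # focus on the respiration / heart-rate frequency range
plt.xlabel('time (s)')
plt.ylabel('frequency (Hz)')
plt.show()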


Lastly, we warmly welcome potential contributors!

In [ ]: