Tail temperature tracking
Once you have finished tracking your points of interest in your video with DeepLabCut, you can import the resulting CSV file into Matlab for further analysis.
Start by opening the output CSV file to understand the content of each column. Each line corresponds to a frame of the video. The first column contains the frame number, the second column the x coordinates of the first tracked body part, the third column its y coordinates, and the fourth column the likelihood score that DeepLabCut assigns to its tracking of that body part. The next three columns repeat the same pattern for the second tracked body part (the fifth column is the x coordinates of the second body part), and so on.
As an example application, use the script `TailTemperatureTrackingV2.m` to analyse the tracking data and extract the tail temperature. The code aims to track the temperature of a point located on the tail, at roughly one quarter of the tail length from the tail base. We could use DeepLabCut to track this "body part" directly and read the pixel value from the thermal camera to get its temperature, but the tail is only 1 or 2 pixels wide and the tracking error is of the same order of magnitude, so the tracked point often falls outside the tail. The extracted temperature would then fluctuate between the real tail temperature and the background temperature, which would make any further analysis unreliable.
The first step is to import the CSV file using the Matlab function `uigetfile`.
After running this line, you will be asked to select the CSV file.
After that, we use `csvread` to extract the content of this file and store it in a matrix that we name DLC.
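A minimal sketch of this import step could look as follows (the file-selection prompt and the header offset are assumptions, not necessarily what `TailTemperatureTrackingV2.m` does):

```matlab
% Let the user pick the DeepLabCut output file
[fileName, pathName] = uigetfile('*.csv', 'Select the DeepLabCut CSV file');

% DeepLabCut CSV files typically start with a few header rows (scorer, bodyparts, coords).
% csvread(file, R, C) starts reading at row R and column C (zero-based), so an offset of
% 3 rows skips them; adjust if your file has a different header.
DLC = csvread(fullfile(pathName, fileName), 3, 0);
```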
In the case where the list of tracked body parts is as follows:
- Nose
- RightEar
- LeftEar
- Neck
- BodyCenter
- TailBase
- TailCenter
Then the X coordinates of the Nose are in DLC(:,2) (second column),
the Y coordinates of the Nose are in DLC(:,3),
and the X coordinates of the RightEar are in DLC(:,5), not DLC(:,4), because DLC(:,4) contains the likelihood values for the Nose tracking. The Y coordinates of the RightEar are therefore in DLC(:,6), its likelihood in DLC(:,7), and so on.
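Since the pattern is regular (three columns per body part after the frame number), the coordinates can be pulled out programmatically. The sketch below is illustrative; the variable names X, Y and L are assumptions, not the names used in `TailTemperatureTrackingV2.m`:

```matlab
% Extract x, y and likelihood columns for each tracked body part
bodyParts = {'Nose','RightEar','LeftEar','Neck','BodyCenter','TailBase','TailCenter'};
for k = 1:numel(bodyParts)
    X.(bodyParts{k}) = DLC(:, 3*(k-1) + 2);   % x coordinates of body part k
    Y.(bodyParts{k}) = DLC(:, 3*(k-1) + 3);   % y coordinates of body part k
    L.(bodyParts{k}) = DLC(:, 3*(k-1) + 4);   % DeepLabCut likelihood of body part k
end

% Example: the tail base position at frame f is (X.TailBase(f), Y.TailBase(f))
```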
After extracting the X and Y coordinates of every body part, we can start processing them to extract additional information. This part only involves a geometrical interpretation of these coordinates.
WARNING: The Y coordinates that DeepLabCut writes to the CSV file refer to a Y axis pointing downwards: the larger the Y value, the lower the pixel lies in the image. For instance, (x=0, y=0) corresponds to the pixel in the top-left corner of the image.
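In practice this matches Matlab's image indexing, where the row index grows downwards; just remember that a tracked point (x, y) addresses an image as (row, column) = (y, x). A hedged example, where thermalFrame is an assumed variable holding one frame of the thermal video:

```matlab
f = 1;                                  % example frame index
x = X.TailBase(f);
y = Y.TailBase(f);

% Row index first (y), column index second (x); round to the nearest pixel.
% Note that Matlab indices are 1-based while DeepLabCut coordinates start at 0,
% so an offset of 1 may be needed depending on how strict you want to be.
pixelValue = thermalFrame(round(y), round(x));
```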
In order to extract a reliable value of the tail temperature, one that does not depend on perfectly accurate pixel tracking, we use a geometrical trick. We take the segment between the tail base and the tail center, and determine the straight line perpendicular to this segment that passes through its midpoint.
We extract the temperature profile along this line and try to determine where the line intersects the tail. The intersection usually shows up as a temperature peak with a sharper derivative than where the line crosses the body, because the tail is hot and thin.
The `Radius` variable determines how long this line should be. Keep in mind that if it is too long, we are more likely to cross the body or areas outside the maze.
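The sketch below illustrates the idea for a single frame. It parametrizes the perpendicular line with a unit normal vector rather than with an explicit slope (the provided script works with the slope, see the next paragraph), and all variable names except `Radius` are assumptions:

```matlab
f  = 1;                                        % example frame index
P1 = [X.TailBase(f),   Y.TailBase(f)];         % tail base
P2 = [X.TailCenter(f), Y.TailCenter(f)];       % tail center
M  = (P1 + P2) / 2;                            % midpoint of the segment
u  = (P2 - P1) / norm(P2 - P1);                % unit vector along the tail
n  = [-u(2), u(1)];                            % unit vector perpendicular to the tail

Radius = 15;                                   % half-length of the profile line, in pixels
t  = linspace(-Radius, Radius, 2*Radius + 1);  % sampling positions along the perpendicular
xs = M(1) + t * n(1);
ys = M(2) + t * n(2);

% Temperature profile along the perpendicular line (interp2 interpolates between pixels)
profile = interp2(double(thermalFrame), xs, ys);

% The tail appears as a narrow, sharp peak in the profile; its maximum is one possible
% estimate of the tail temperature at this frame
tailTemp = max(profile);
```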
In the code provided, you will find a section with an `if` structure. It deals with the slope of the line we have just determined. Depending on how the line is oriented, we run into limit cases (when the line is close to horizontal or vertical) that cause numerical trouble. The `if` structure ensures that, whatever the slope value, we always use a formulation that is not affected by these limit cases.
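The following is a hedged illustration of the kind of branching this refers to, not the exact code of `TailTemperatureTrackingV2.m`: when the perpendicular line is closer to horizontal we step along x and compute y, and when it is closer to vertical we step along y and compute x, so we never divide by a near-zero term:

```matlab
dx = X.TailCenter(f) - X.TailBase(f);          % tail direction, x component
dy = Y.TailCenter(f) - Y.TailBase(f);          % tail direction, y component
Mx = (X.TailBase(f) + X.TailCenter(f)) / 2;    % midpoint of the segment
My = (Y.TailBase(f) + Y.TailCenter(f)) / 2;

if abs(dy) > abs(dx)
    % The tail is closer to vertical, so its perpendicular is closer to horizontal:
    % step along x and compute y from the slope of the perpendicular line (-dx/dy)
    xs = Mx + (-Radius:Radius);
    ys = My + (-dx/dy) * (xs - Mx);
else
    % The tail is closer to horizontal, so its perpendicular is closer to vertical:
    % step along y and compute x, avoiding a division by a near-zero dy
    ys = My + (-Radius:Radius);
    xs = Mx + (-dy/dx) * (ys - My);
end
```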
We store the value of tail temperature determined at each frame in the variable `TemperatureCurve`.
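As a quick sanity check, `TemperatureCurve` can be plotted against time (the frame rate below is an assumed value; use the one of your thermal video):

```matlab
frameRate = 30;                                       % frames per second (assumed)
timeAxis  = (1:numel(TemperatureCurve)) / frameRate;  % time of each frame, in seconds

plot(timeAxis, TemperatureCurve);
xlabel('Time (s)');
ylabel('Tail temperature (camera units)');
```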