
Bachelor Informatica

Improving Thermal Images for the Quality of 3D Models in Agisoft Metashape

Joey Lai

June 15, 2020

Supervisor(s): drs. J. Waagen, dr. R.G. Belleman

Informatica
Universiteit van Amsterdam


Abstract

As thermal cameras and drones become more widely available, aerial thermography is increasingly used. Archaeologists have known since 1970 that aerial thermography has the potential to reveal archaeological traits. The sensors in thermal cameras detect infrared radiation to create thermal images. Despite its advantages, the technique has a limitation: the same color range can indicate different temperature ranges across a set of thermal images. This is problematic when photogrammetric suites such as Agisoft Metashape are used to create 3D models.

In this thesis, we have developed a workflow that uses a normalization technique on a set of thermal images to assign the same gray values to the same temperature to create a better 3D model in Agisoft Metashape. We have made two main suggestions to further improve the normalized thermal images. Firstly, we proposed noise filtering that uses a Gaussian kernel to remove noise after the acquisition of thermal images. Secondly, we suggested an image processing technique that uses a global histogram equalization for the enhancement of contrast.

From our results, we have seen that thermal images that associate the same gray value with the same temperature are important for the dense point cloud quality in Agisoft Metashape, which in turn results in a better 3D model of the temperature of an area. Noise filtering with a Gaussian kernel on our thermal images is an essential preprocessing step to improve the quality of the dense point cloud and the number of tie points before the normalization of thermal images, while contrast enhancement using histogram equalization is not very influential.


Contents

1 Introduction
  1.1 Background/Context
  1.2 Purpose and Goals
  1.3 Outline

2 Related Work
  2.1 Intuitive Colorization of Temperature in Thermal Cameras
  2.2 Normalization of Infrared Facial Images under Variant Ambient Temperatures

3 Theoretical Background
  3.1 Forward-Looking Infrared Camera
  3.2 Infrared Imaging
    3.2.1 Image
    3.2.2 Grayscale image
    3.2.3 Radiometric images
    3.2.4 Converting thermal radiation to temperature
  3.3 Image Processing Methods
    3.3.1 Normalization
    3.3.2 Noise Filtering
    3.3.3 Contrast enhancement techniques

4 Methodology
  4.1 Workflow
    4.1.1 Normalization/rescaling
    4.1.2 Optional noise filtering
    4.1.3 Optional contrast enhancement
    4.1.4 Flowchart
  4.2 Agisoft Metashape
  4.3 Procedure: Performance Measures
    4.3.1 Comparing initial photos before processing
    4.3.2 Number of tie points
    4.3.3 Average reprojection error
    4.3.4 Quality of the dense point cloud
  4.4 Setup
    4.4.1 Data collection
    4.4.2 System software and hardware
    4.4.3 Equipment

5 Experiments and Discussions
  5.1 Original and Normalized Thermal Images
    5.1.1 Discussion
  5.2 Metashape Evaluation
    5.2.1 Number of tie points
    5.2.2 Average reprojection error
    5.2.3 Quality of the dense point clouds

6 Conclusion
  6.1 Summary and Conclusion
  6.2 Limitation
  6.3 Future Work


CHAPTER 1

Introduction

1.1

Background/Context

As thermal cameras and drones become more widely available, aerial thermography is increasingly used [1]. Archaeologists have known since 1970 that aerial thermography has the potential to reveal archaeological traits [2]. They make use of aerial thermography as a geophysical examination method to record thermal infrared radiation and show the variations in temperature across a scene. In an archaeological context, Cool mentions that "anthropogenic traits and their neighboring soil matrices often embody and reradiate infrared light at different rates" [1]. This phenomenon causes features to appear as anomalies in thermographic images. As a result, these images can be used as maps that plot the location of underground archaeological traits [1]. These images are also called thermograms or thermal images. Most of the time, thermal data are collected using a forward-looking infrared camera [3].

Forward-looking infrared (FLIR) cameras record thermal images as grayscale images. The recording consists of assigning an intensity value per pixel, which is saved in a single-channel grid. This grid is then shown as a grayscale image. These cameras assign black pixels to the coldest regions and white pixels to the hottest regions. The resulting mapping is a relative scale that depends on the minimum and maximum temperature within each frame. As a result, most thermal cameras do not consistently assign identical gray values to the same temperatures. This behavior can make comparing thermal images problematic, because the same range of colors can be used to represent completely different temperature ranges [1].

This problem is amplified when some thermal images contain materials with a temperature that is relatively high or low compared to the environment, such as the targets commonly used in thermographic mapping. These targets are often made of aluminum and are placed in the survey areas so that drones know where the thermal images need to be taken. Furthermore, these targets often decrease the contrast in the thermal images, making it difficult to distinguish objects. The images in Figure 1.1 showcase these problems: the gray values do not correspond to the same temperature in every thermal image. This is mainly because each thermal image can contain different objects with different temperature values.


(a) Thermal image 1 with an aluminum target

(b) Thermal image 2 (c) Thermal image 3

Figure 1.1: Sample thermal images of the same terrain taken in succession.

FLIR software tools have the capability to calibrate thermal images to adjust the relative scale of grayscale images [4]. However, this is not perfect as shown in the images in Figure 1.2. Even after the FLIR calibration, the gray values of the thermal image in Figure 1.2(a) do not correspond to the same temperature in the thermal images in Figure 1.2(b,c).

(a) Thermal image 1 with an aluminum target

(b) Thermal image 2 (c) Thermal image 3

Figure 1.2: Sample thermal images of the same terrain taken in succession calibrated with the FLIR software.

These effects after the acquisition of thermal images can be troublesome for specific archaeological purposes. One important archaeological purpose is the processing of thermal images in photogrammetric suites such as Agisoft Metashape in order to create 3D spatial data of a terrain [5]. According to the manual of Agisoft Metashape, "the software product performs photogrammetric processing of digital images (aerial and close-range photography) and generates 3D spatial data to be used in Geographic information system (GIS) applications, cultural heritage documentation, and visual effects production as well as for indirect measurements of objects with various scales" [6].

One standard job for a photogrammetry processing project in Metashape using thermal images is to construct textured orthophotos and even a 3D model of the temperature in an area [7]. The data processing method with Agisoft Metashape consists of three steps relevant to this thesis. The first step is called alignment. At this stage, Metashape searches for feature points on the images and matches them across thermal images into tie points. A tie point (TP) is a distinct area that is visually recognizable in the overlapping area among two or more images [6]. The second step consists of the creation of a dense point cloud using these tie points. This dense point cloud is a set of 3D points used to construct models [8]. The last step involves the generation of a polygonal mesh and 3D models using the dense point cloud. The resulting models can be textured for the photorealistic digital reproduction of the scene and exported in various formats compatible with other software to process the textured models [7].

Since tie points are created based on feature points found on the thermal images, it may be important that the feature points in two or more thermal images look the same [7]. Poor input, as well as thermal images that do not associate the same gray value to the same temperature, could consequently have a negative effect on the alignment results. This would result in a poor dense point cloud and a low-quality 3D model [7].


Therefore, it is important that thermal images associate the same gray value with the same temperature. This would improve the performance of the first step of Agisoft Metashape with the creation of correct tie points. Additionally, improving the contrast of thermal images could also have a positive effect on the alignment step.

Furthermore, thermal images are often corrupted with noise during image retrieval [9]. Noise creates undesirable effects such as artifacts [10]. Hence, the elimination of noise is regularly an essential preprocessing step in image processing applications.

1.2

Purpose and Goals

We have introduced several reasons why thermal images may not work well in Agisoft Metashape to create a 3D model of the temperature of an area. From these reasons, we can formulate several research questions. The main research question in this thesis is as follows:

1. How important are thermal images that associate the same gray value to the same temperature for the 3D model in Agisoft Metashape, compared to thermal images that do not?

Thus, the purpose of this thesis is to find a workflow such that the resulting thermal images are easier to compare and yield a better 3D model in Agisoft Metashape. Furthermore, we analyze noise filtering and contrast enhancement to determine whether the workflow should also have optional processing steps for the removal of noise and the enhancement of contrast in thermal images. The additional research questions we can ask on this topic are the following:

2. How does applying noise filtering before the processing of the thermal images in the workflow affect the 3D model in Agisoft Metashape?

3. How does contrast enhancement after the processing of the thermal images in the workflow influence the 3D model in Agisoft Metashape?

1.3

Outline

The thesis is organized as follows: Chapter 2 discusses existing work in the field and considers what image processing techniques will be used in this thesis. In Chapter 3, we will explain the theoretical background of thermal images and specific image processing techniques. Chapter 4 presents the workflow to process the set of thermal images and how Agisoft Metashape works. The results of the workflow are presented and discussed in Chapter 5. Finally, in Chapter 6, we will summarize our findings and give our conclusions. Furthermore, we will discuss the ethics related to this thesis in the same chapter.


CHAPTER 2

Related Work

Various work has already been done to resolve the problem of different temperature representations in different thermal images [11], [12]. In this section, we will highlight the work done by others and consider what techniques we will use in this thesis.

2.1

Intuitive Colorization of Temperature in Thermal Cameras

Petter Sundin investigated a more intuitive way of colorizing images originating from thermal cameras by creating a fixed color-to-temperature connection [11]. This master thesis was performed at FLIR Systems, which claimed that the colorization used today is well suited for experienced users, but that non-experienced users often lack the knowledge needed to fully understand what the camera displays. The work was carried out to accomplish a mapping between a specific temperature and a specific color. According to Petter Sundin, this would make the colorization much more intuitive than it is today. Furthermore, this thesis suggested histogram equalization to increase the contrast in these images.

The master thesis of Petter Sundin succeeded in intuitively mapping colors for different temperatures, but it had issues with the difference in brightness that caused colors to appear different even though they had the same hue. Furthermore, this solution only solved the problem if end-users are comparing the thermal images themselves. The resulting images are not designed to be used in further processing procedures.

This intuitive colorization of temperature values is not suitable for Agisoft Metashape because Agisoft Metashape performs further processing on the resulting images. However, we can make use of the approach of utilizing the temperature values to create new gray values. Furthermore, we can implement the histogram equalization to increase the contrast in thermal images.

2.2

2.2 Normalization of Infrared Facial Images under Variant Ambient Temperatures

According to Lu, Yang, Wu, et al. thermal images taken of faces are severely affected by a variety of factors, such as environment temperatures, eyeglasses, hairstyles, and so on [12]. The authors proposed a novel study on the normalization of infrared facial images, especially resulting from variant ambient temperature [12]. They suggested three normalization methods to eliminate the effect of variant ambient temperatures in order to increase the performance of infrared face recognition systems. The results from their experiments demonstrated that the infrared facial recognition systems performed better with temperature normalization than without.

However, because the normalization methods described in this paper only deal with the issue of ambient temperature, they are not suited for the normalization of aerial thermal images taken from the sky. Aerial thermal images are affected by different external factors such as emissivity, object distance, relative humidity, and so on.


In our thesis, we will make use of the normalization technique to assign the same gray value to the same temperature using the formulas described in Infrared Thermography: Errors and Uncertainties written by Minkina and Dudzik to deal with the different external factors [13].


CHAPTER 3

Theoretical Background

In this chapter, we will go in-depth into the theoretical background of thermal images. Moreover, we will discuss various image processing techniques that are commonly used to solve or lessen the problems we have stated in Chapter 1 of this paper.

3.1

Forward-Looking Infrared Camera

Forward-looking infrared cameras, also known as FLIR cameras, are commonly utilized on military and nonmilitary aircraft or drones [14]. These cameras use a thermographic sensor to record the intensity of infrared radiation. The sensors placed in forward-looking infrared cameras detect infrared radiation, for a large part determined by heat emitted from a source, to create an image constructed for output. This infrared radiation is also known as thermal radiation. Thermal radiation, also known as heat, is the emission of electromagnetic waves from all objects that have a temperature greater than absolute zero [15].

3.2

Infrared Imaging

3.2.1

Image

An image can be described as a 2-dimensional function, f(x, y), where x and y are the x and y coordinates. The result of the function f at any pair of coordinates (x, y) is called the pixel value of the image at that point. Therefore, an image of n columns and m rows can be represented as the following matrix:

f(x, y) =
\begin{bmatrix}
f(0, 0) & f(1, 0) & \cdots & f(n-1, 0) \\
f(0, 1) & f(1, 1) & \cdots & f(n-1, 1) \\
\vdots & \vdots & \ddots & \vdots \\
f(0, m-1) & f(1, m-1) & \cdots & f(n-1, m-1)
\end{bmatrix}

3.2.2

Grayscale image

A grayscale image, in the case of thermography, is an image where every pixel contains a sample value representing the amount of light. Grayscale images are made of different shades of gray. Darker gray values represent the weakest intensity, while lighter gray values are associated with a stronger intensity.


3.2.3

Radiometric images

Thermal images captured by FLIR cameras are often stored as radiometric Joint Photographic Experts Group (RJPG) grayscale images [1]. In an RJPG image, every pixel not only contains the pixel value but also holds the raw temperature value. Furthermore, RJPG images hold important metadata such as emissivity, atmospheric temperature, object distance, and other data essential for calibration.

3.2.4

Converting thermal radiation to temperature

Converting a raw value obtained from a binary thermal image file into an estimated temperature is possible by using the standard equation used in infrared thermography1. The conversion makes use of the PlanckR1, PlanckB, PlanckF, PlanckO, and PlanckR2 calibration constants that are specific to each thermal camera. The formula is as follows:

\text{temperature in Celsius} = \frac{\text{PlanckB}}{\ln\left(\dfrac{\text{PlanckR1}}{\text{PlanckR2} \cdot (\text{raw value} + \text{PlanckO})} + \text{PlanckF}\right)} - 273.15
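As an illustration, the conversion above can be sketched in Python. The calibration constants used here are made-up placeholder values; real values must be read from the camera's metadata.

```python
import math

def raw_to_celsius(raw, planck_r1, planck_r2, planck_b, planck_f, planck_o):
    """Standard infrared-thermography conversion from a raw sensor
    value to degrees Celsius, using camera-specific Planck constants."""
    return planck_b / math.log(
        planck_r1 / (planck_r2 * (raw + planck_o)) + planck_f
    ) - 273.15

# Hypothetical constants for illustration only; each camera stores its own.
t = raw_to_celsius(13500, planck_r1=17000.0, planck_r2=0.05,
                   planck_b=1430.0, planck_f=1.0, planck_o=-6000.0)
```

Note that the same raw value yields a different temperature under different constants, which is why the raw values of two cameras (or two images with different situational parameters) are not directly comparable.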

However, before the conversion of the raw value into a temperature value, it is important to account for the transmission loss through the atmosphere. Minkina and Dudzik have discussed the transmission loss through the atmosphere for thermal images in their book Infrared Thermography: Errors and Uncertainties [13]. They provided several equations for the calculation of the raw value transmission through the air using the basic laws of radiative heat transfer. These equations use the situational parameters stored in the metadata of the thermal images. These situational parameters can differ per thermal image; for example, the distance between the thermal camera and an object can be inconsistent.

3.3

Image Processing Methods

3.3.1

Normalization

Normalization can have a variety of meanings in statistics and its applications, as stated by Dodge and Commenges [16]. In the most simplistic situations, normalization of numbers means modifying values measured on different scales to a more general scale. To assign the same gray values to the same temperature values, a rescaling method is needed to normalize the data. Feature scaling is one such rescaling method, which normalizes the range of independent variables of the data to a specific range.

Min-max scaling, also known as min-max normalization, is the simplest approach to feature scaling and rescales values between 0 and 1 [16]. The general min-max normalization formula for rescaling a range of values between 0 and 1 is:

x' = \frac{x - \min(x)}{\max(x) - \min(x)}

where x is the original value of the dataset and x' is the new normalized value after rescaling.

In order to rescale values into a range of a to b with min-max normalization, the formula becomes:

x' = a + \frac{(x - \min(x)) \cdot (b - a)}{\max(x) - \min(x)}

where a and b are respectively the new minimum and maximum value of the range.
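As a sketch, rescaling into an arbitrary range [a, b] can be written with NumPy as follows; the example temperature values are invented for illustration.

```python
import numpy as np

def rescale(x, a=0.0, b=255.0):
    """Min-max normalize the values in x into the range [a, b]."""
    x = np.asarray(x, dtype=float)
    return a + (x - x.min()) * (b - a) / (x.max() - x.min())

temps = np.array([12.5, 18.0, 21.3, 35.0])  # example temperatures in Celsius
gray = rescale(temps, 0, 255)               # minimum maps to 0, maximum to 255
```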

3.3.2

Noise Filtering

In Infrared Thermal Imaging, Vollmer and Möllmann argue that "the most important part of any image processing is the preprocessing of the raw data" [17]. It is often necessary to reduce noise and eliminate pixel artifacts in the images by applying filters. Furthermore, filters are also frequently used to smooth an image [18].

An important goal of noise filtering is the need to preserve edges and contours such that no relevant information in the image gets lost. The filters that are often used in thermal images are mean filters, median filters, or Gaussian filters [17].

Mean filter

The mean filter is one of the easiest low-pass filters to implement [18]. It is a sliding-window filter that substitutes every pixel value with the average of all values in the window. The window or kernel of the mean filter is commonly shaped as a square, but it can be of any shape, as stated by Gonzales and Woods [18]. For example, a mean filter with a 3x3 kernel computes the average of each pixel value and the values of the pixels surrounding it, and looks like the kernel in Figure 3.1:

\text{kernel} = \frac{1}{9}
\begin{bmatrix}
1 & 1 & 1 \\
1 & 1 & 1 \\
1 & 1 & 1
\end{bmatrix}

Figure 3.1: Example of a mean filter kernel.

\begin{bmatrix}
1 & 1 & 1 \\
1 & 10 & 1 \\
1 & 1 & 1
\end{bmatrix}
\xrightarrow{\text{mean filter}}
(10 + 1 + 1 + 1 + 1 + 1 + 1 + 1 + 1)/9 = 18/9 = 2

Figure 3.2: Example of applying a mean filter to an arbitrary pixel of value 10 and its neighborhood.

Median filter

Like the mean filter, the median filter is also a sliding-window filter. Instead of replacing the center value in the window with the mean of all values in the kernel, it replaces the center value in the window with the median of all the pixel values in the kernel. Similar to the mean filter, the kernel is also usually sized as a square, but this is not always the case [18]. The median filtering of a random pixel with a 3x3 kernel resembles a procedure where the values of the neighborhood and the pixel itself are ordered, and the fifth value would be defined as the result. An example of the median filtering of a pixel value with its neighborhood is shown in Figure 3.3.

  4 3 5 6 7 9 8 2 100  −−−−−−−−→ median filter (2, 3, 4, 5, 6, 7, 8, 9, 100) = 6

Figure 3.3: Example of applying a median filter to an arbitrary pixel with value 7 and its neighborhood.
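A minimal sketch of both sliding-window filters, reproducing the worked example from Figure 3.2. Edge pixels are simply left unchanged here; real implementations typically pad the image borders instead.

```python
import numpy as np

def window_filter(img, reducer, size=3):
    """Slide a size x size window over img and replace each interior
    pixel with reducer() applied to its neighborhood (edges kept)."""
    img = np.asarray(img, dtype=float)
    out = img.copy()
    r = size // 2
    for y in range(r, img.shape[0] - r):
        for x in range(r, img.shape[1] - r):
            out[y, x] = reducer(img[y - r:y + r + 1, x - r:x + r + 1])
    return out

noisy = np.array([[1, 1, 1],
                  [1, 10, 1],
                  [1, 1, 1]])
mean_out = window_filter(noisy, np.mean)      # center becomes 18/9 = 2.0
median_out = window_filter(noisy, np.median)  # center becomes 1.0
```

The example also shows why median filtering is popular for impulse noise: the outlier value 10 is discarded entirely rather than averaged in.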

(16)

Gaussian filter

A Gaussian filter is a kernel that is used as a 2D convolution operator to blur images and reduce detail and noise [19]. In this context, the Gaussian filter resembles the mean filter, but the shape of the Gaussian kernel is different. In the mean filter, all pixels in the sliding window have the same weight. The Gaussian filter, on the contrary, uses a Gaussian function to calculate the values of the kernel. The Gaussian kernel created from a Gaussian function produces a weighted average of each pixel and its neighboring values. In this sliding window, the central pixels have a higher weight than the pixels on the boundary.

As a result of this, the Gaussian filter has a gentler smoothing effect than a mean filter with the same size. This is also the reason why a Gaussian filter preserves edges better than the mean filter. The Gaussian function for a 1D kernel is as follows:

G(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \cdot e^{-\frac{x^2}{2\sigma^2}}

where x is the distance from the current pixel in the horizontal or vertical axis and σ corresponds to the standard deviation. The product of two 1D Gaussian functions can be used to create a 2D Gaussian function [19]. This is done by applying the 1D Gaussian function in both horizontal and vertical directions on a digital image. A 2D Gaussian function is defined by the following formula:

G(x, y) = \frac{1}{2\pi\sigma^2} \cdot e^{-\frac{x^2 + y^2}{2\sigma^2}}

where x is the distance from the current pixel along the horizontal axis, y is the distance from the current pixel along the vertical axis, and σ, as before, is the standard deviation. An example of a Gaussian filter is illustrated in Figure 3.4:

\text{kernel} = \frac{1}{273}
\begin{bmatrix}
1 & 4 & 7 & 4 & 1 \\
4 & 16 & 24 & 16 & 4 \\
7 & 24 & 41 & 24 & 7 \\
4 & 16 & 24 & 16 & 4 \\
1 & 4 & 7 & 4 & 1
\end{bmatrix}

Figure 3.4: Approximation of a 5x5 Gaussian kernel with a standard deviation of 1 pixel.

The standard deviation and the kernel size determine the amount of smoothing. A larger standard deviation increases the degree of smoothing because it reduces the relative weight of the central pixels. The size of the kernel, on the other hand, determines how many neighborhood values contribute to the filter.
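A sketch of how such a kernel can be constructed: sample the 2D Gaussian function on an integer grid around the center and normalize the weights so they sum to 1. Integer tables such as the 1/273 kernel in Figure 3.4 are rounded approximations of these weights.

```python
import numpy as np

def gaussian_kernel(size, sigma):
    """Build a normalized size x size Gaussian kernel by evaluating
    the 2D Gaussian function on an integer grid around the center."""
    r = size // 2
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    k = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    return k / k.sum()  # normalize so the weights sum to 1

k = gaussian_kernel(5, 1.0)
# The center weight is the largest; the corner weights are the smallest,
# which is why the Gaussian filter smooths more gently than the mean filter.
```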

3.3.3

Contrast enhancement techniques

Image histogram

A histogram of a digital image is a graphical representation of the pixel value distribution [20]. It shows the number of pixels for each pixel value in a digital image. The horizontal and vertical axes of an image histogram show the different pixel values and the total number of pixels with a specific pixel value, respectively. The horizontal axis of an image histogram is ordered from the darkest to the lightest values. In Figure 3.5, we can see the image histogram of an image; here, the histogram displays the total number of pixels for each gray value in the image.

(17)

Figure 3.5: Histogram of a digital image2.

We instantly observe a huge peak in the center of the histogram in Figure 3.5. The image contains many pixels of medium gray value but few with low or high pixel values. This behavior can result in an image with low contrast, which can be bad for the performance in Agisoft Metashape [7]. Histogram equalization can solve this problem by spreading out the most intensive values in a histogram to improve the contrast and quality of an image [21].

Histogram equalization

Histogram equalization is a point operator that tries to make the histogram of an image flat such that the pixel values are better distributed. Mordvintsev and Abid have stated that ”Equalization implies mapping one distribution (the given histogram) to another distribution (a wider and more uniform distribution of intensity values)” [22]. In Figure 3.6, we can see the result of applying histogram equalization on an image:

Figure 3.6: Improving the contrast of an image with histogram equalization3.

Histogram equalization does not always work well. It only performs well if the histogram of a digital image is confined to a distinct region. Therefore, histogram equalization will not perform well on images whose histograms already contain both low and high pixel values [22].
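A minimal NumPy sketch of global histogram equalization via the cumulative distribution function (CDF); the 2x2 input image is an invented toy example with pixel values confined to a narrow band.

```python
import numpy as np

def equalize(img, levels=256):
    """Global histogram equalization: map gray values through the
    normalized cumulative distribution of the image histogram."""
    img = np.asarray(img, dtype=np.uint8)
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]  # CDF value of the first occupied bin
    # Stretch the CDF so the occupied bins spread over the full range.
    lut = np.clip(np.round((cdf - cdf_min) / (cdf[-1] - cdf_min)
                           * (levels - 1)), 0, levels - 1).astype(np.uint8)
    return lut[img]

low_contrast = np.array([[100, 101], [102, 103]], dtype=np.uint8)
result = equalize(low_contrast)  # the four occupied values spread over 0..255
```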

Adaptive histogram equalization

In addition to the standard histogram equalization, there is also an adaptive histogram equalization technique [23]. The difference between traditional and adaptive histogram equalization comes from the fact that standard histogram equalization operates on every pixel of an image. Adaptive histogram equalization, on the contrary, works on smaller sections of an image. For each of these separate sections, adaptive histogram equalization will create a specific histogram. A standard histogram equalization is then applied to each of these computed histograms to redistribute the pixels in the corresponding region of the image. As a result, adaptive histogram equalization is capable of enhancing the local contrast in each area of an image [24]. This adaptive histogram equalization is shown in Figure 3.7:

2Image source: https://docs.opencv.org/2.4/doc/tutorials/imgproc/histograms/histogram_equalization/histogram_equalization.html
3Image source: https://opencv-python-tutroals.readthedocs.io/en/latest/index.html

Figure 3.7: Adaptive histogram equalization illustrated. Three different sections are highlighted where adaptive histogram equalization can be applied in one image to increase the local contrast. Source: Vswitchs - Own work, CC0.


CHAPTER 4

Methodology

To answer our main research question in this thesis:

• How important are thermal images that associate the same gray value to the same temperature for the 3D model in Agisoft Metashape, compared to thermal images that do not?

we will develop a workflow such that the thermal images associate the same gray values with the same temperature values. In order to analyze noise filtering and contrast enhancement in the workflow, we will include optional processing steps for the removal of noise and the enhancement of contrast in thermal images. The performance of the workflow will be evaluated using the results obtained from Agisoft Metashape.

4.1

Workflow

To create the new gray values that represent the same temperature in different thermal images, we make use of normalization or rescaling of the temperature values. Before the normalization of thermal images, we convert all raw temperature values of the pixels to temperature values in Celsius. The reason for this is that the situational parameters of thermal images, such as the object distance (distance from the lens to an object) or Planck values (constants specific to a camera), are subject to change in different images as explained in subsection 3.2.4. This means that the same raw temperature value in two different thermal images does not necessarily mean that they convert to the same temperature.

Therefore, we first calibrate the raw temperature values for the transmission loss with the formulas described by Minkina and Dudzik in their book Infrared Thermography: Errors and Uncertainties1. Afterward, we transform the calibrated raw temperature values to Celsius using the standard equation used in infrared thermography mentioned in subsection 3.2.4, extracting the Planck constants from the thermal images.

4.1.1

Normalization/rescaling

The conversion of a temperature value to a gray value in a single image depends on the minimum and maximum temperature that exists in that image. In order to assign the same gray value to the same temperature in multiple thermal images, we collect the temperature values of all thermal images. From this collection, we retrieve the global minimum and maximum value. Afterward, we can rescale the temperature values in all thermal images into a range between an arbitrary collection of values [minGrayValue, maxGrayValue] using the min-max normalization described in subsection 3.3.1.

x' = a + \frac{(x - \text{globalMinTempValue}) \cdot (b - a)}{\text{globalMaxTempValue} - \text{globalMinTempValue}}

where x is the temperature value of a pixel, x' is the new gray value, a is the minimum gray value of the range, and b is the maximum gray value of the range.
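The global rescaling step can be sketched as follows; the function and the two tiny example images are illustrative, not the thesis's actual implementation. The key point is that one global minimum/maximum is used for the whole set, so the same temperature receives the same gray value in every image.

```python
import numpy as np

def normalize_set(temp_images, a=0.0, b=255.0):
    """Rescale a set of per-pixel temperature maps to gray values using
    one global minimum/maximum, so the same temperature maps to the
    same gray value in every image of the set."""
    lo = min(t.min() for t in temp_images)
    hi = max(t.max() for t in temp_images)
    return [a + (t - lo) * (b - a) / (hi - lo) for t in temp_images]

img1 = np.array([[10.0, 20.0]])  # toy temperature maps in Celsius
img2 = np.array([[20.0, 30.0]])
g1, g2 = normalize_set([img1, img2])
# 20 degC receives the same gray value in both images.
```

Had each image been rescaled with its own minimum and maximum, 20 degC would map to 255 in img1 but to 0 in img2, which is exactly the inconsistency the workflow removes.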

4.1.2

Optional noise filtering

We will implement noise filtering before the normalization process because the filtering of noise is often done first, before other image processing techniques [17]. There are several filters available for noise filtering, such as the mean filter or the median filter. However, these filters perform worse at preserving edges and details in thermal images than a Gaussian filter [17]. Thus, we will use a Gaussian filter in our workflow.

The kernel used for the Gaussian filter will be a simple 3x3 kernel with a standard deviation of 0.8, as recommended for thermal images by Vollmer and Möllmann in the book Infrared Thermal Imaging [17]. The resulting Gaussian kernel is shown in Figure 4.1:

\text{kernel} = \frac{1}{16}
\begin{bmatrix}
1 & 2 & 1 \\
2 & 4 & 2 \\
1 & 2 & 1
\end{bmatrix}

Figure 4.1: Approximation of a 3x3 Gaussian kernel with a standard deviation of 0.8.

4.1.3

Optional contrast enhancement

After the global normalization, there is a high probability that the contrast in several images is very low due to outliers. Outliers extend the global range of temperatures, which affects the grayscale range. Therefore, we propose an optional global histogram equalization on all normalized thermal images. This is done by applying a standard histogram equalization to a histogram containing all gray values of all thermal images. The reason why we do not apply this separately to every image, or use an adaptive histogram equalization, is that both operations result in thermal images that do not associate the same gray value with the same temperature. This is mainly because those operations are applied locally rather than to all gray values of every thermal image. Spreading out the pixel values in local areas or in separate thermal images would reintroduce the problem of associating different gray values with the same temperature in different thermal images.

4.1.4

Flowchart

The overall workflow for the normalization of thermal images is summarized in the flowchart given in Figure 4.2:


Figure 4.2: Workflow for the normalization of thermal images. This workflow consists of optional noise filtering and contrast enhancement.

4.2 Agisoft Metashape

As mentioned in Chapter 1, Agisoft Metashape is software that allows users to process images from thermal cameras into spatial data in the form of dense point clouds and 3D models of the temperature in an area. The processing [7] of thermal images with Metashape covers the following primary steps:

1. Loading images into Metashape.

2. Inspecting the loaded thermal images and removing unnecessary images.

3. Aligning thermal images.

4. Building a dense point cloud.

5. Building a polygonal mesh.

6. Generating a 3D texture model.

7. Exporting results.

Aligning the thermal images is done by the feature matching algorithm of Agisoft Metashape. It creates tie points based on overlapping areas that are visually recognizable in two or more thermal images. A tie point is essentially a 3D point whose 3D position is computed from the corresponding 2D keypoints (interesting points determined by the feature matching algorithm) detected in the matching thermal images. This computation makes use of the camera's internal and external parameters and the positions of the keypoints in the images [8]. Figure 4.3 illustrates this computation:

Figure 4.3: 3D point is computed using the external and internal parameters of an image of the keypoints. Source: https://support.pix4d.com/hc/en-us/articles/202559369-Reprojection-error.
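This kind of two-view computation can be illustrated with a standard direct linear transformation (DLT) triangulation. The sketch below is generic and not Metashape's actual implementation; the 3x4 projection matrices stand in for the internal and external camera parameters mentioned above.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a 3D point from its 2D keypoints in two images.

    P1, P2: 3x4 camera projection matrices (intrinsics times extrinsics).
    x1, x2: (u, v) pixel coordinates of the matched keypoint in each image.
    """
    # Each keypoint contributes two linear constraints on the homogeneous 3D point
    A = np.stack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, vt = np.linalg.svd(A)   # null vector of A is the homogeneous solution
    X = vt[-1]
    return X[:3] / X[3]           # dehomogenize to (x, y, z)
```

With more than two matching images, the same scheme simply stacks two rows per view before the SVD.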

The tie points are gathered in a sparse point cloud: the representation of the positions of all tie points in a common coordinate system. This sparse point cloud only serves as a visualization and is not used in further processing [7].

In Metashape, we can generate and visualize a dense point cloud, which is computed based on the tie points of the alignment step [7]. The dense point cloud is a set of 3D points that Metashape uses to reconstruct a polygon mesh and 3D model.

4.3 Procedure: Performance Measures

To evaluate the performance of the workflow, we will compare the resulting sets of thermal images with the original thermal images using Agisoft Metashape. Agisoft Metashape provides its own automatic processing report, which includes important measures such as the number of tie points and the average reprojection error. Furthermore, we can compare and discuss the quality of the dense point clouds created from the different sets of thermal images. The resulting sets of thermal images will be collected using several different configurations of our workflow.

In total, we will test 4 sets of thermal images acquired from our workflow against the original thermal images:

1. Normalized images.

2. Normalized images with Gaussian filtering.

3. Normalized images with histogram equalization.

4. Normalized images with Gaussian filtering and histogram equalization.

4.3.1 Comparing initial photos before processing

Before we process the normalized images in Agisoft Metashape, we can also compare the set of normalized images with the original photos. The resulting images can be analyzed to verify whether the workflow is indeed able to assign the same gray value to the same temperature. Moreover, we can evaluate whether the Gaussian filter used during the noise filtering reliably preserves edges and important details of thermal images.

4.3.2 Number of tie points

As explained in section 4.2, Agisoft Metashape searches for feature points or overlapping areas in the images and matches them across images into tie points during the alignment step. A lack of tie points could result in unstable alignment results, as explained in the manual of Agisoft Metashape [7]. Measuring the number of tie points is thus a good way to measure the performance of our workflow.

4.3.3 Average reprojection error

After the alignment step, when the 3D coordinates of all tie points are computed, Agisoft Metashape reprojects the 3D points back into all the images in which they appear [8]. The reprojection error is the distance in pixels between the marked and the reprojected tie point in one image. Figure 4.4 shows how Metashape calculates this reprojection error:

Figure 4.4: 3D point is computed by the points marked on the images and reprojected back to measure the error. Source: https://support.pix4d.com/hc/en-us/articles/202559369-Reprojection-error.

The average reprojection error should be no more than 1 pixel and serves as a good performance measure of the alignment step [8]. Therefore, reprojection errors provide a quantitative measure of accuracy for our workflow.
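Given the marked keypoints and the positions of the reprojected tie points, the average reprojection error is straightforward to compute. A small sketch (not Metashape's internal code; names are illustrative):

```python
import numpy as np

def average_reprojection_error(marked, reprojected):
    """Mean Euclidean pixel distance between marked keypoints and the
    reprojected positions of the corresponding 3D tie points."""
    marked = np.asarray(marked, dtype=np.float64)
    reprojected = np.asarray(reprojected, dtype=np.float64)
    errors = np.linalg.norm(marked - reprojected, axis=1)  # one error per observation
    return float(errors.mean())
```

An average below the 1-pixel threshold indicates a good alignment result.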

4.3.4 Quality of the dense point cloud

Because a dense point cloud is used as the basis for creating the 3D models, its quality can serve as a good indicator for the performance of our workflow. The dense point cloud gives a good view of the quality of the 3D mapping of the tie points, not obscured by the subsequent polishing algorithms of the mesh and texture steps. It provides a very accurate basis for surface and volume measurements [8].


The analysis of the quality of the dense point clouds is made with the help of a digital archaeologist of the Amsterdam Centre for Ancient Studies and Archaeology (ACASA). We do not measure the quality of the 3D models (mesh or texture models) themselves because they are only useful for presenting and visualizing the model. They are designed to look nice rather than to be accurate, so it is not recommended to use them for measurements [8].

This analysis is a relative comparison with the dense point cloud of the original thermal images, because no perfect reference model exists against which "deviations" or "errors" of our resulting thermal images could be calculated.

4.4 Setup

4.4.1 Data collection

In order to evaluate our workflow, we use 157 thermal images of a specific terrain taken with a drone. The thermal images were obtained by ACASA archaeologists using 4D Research Lab facilities. The set of photos was collected in Halos, Greece, during a 15-minute flight session. The flight was at an altitude of 10 meters with an atmospheric temperature of 28 degrees Celsius and a humidity of 53%.

4.4.2 System software and hardware

• Software name: Agisoft Metashape Professional
• Software version: 1.6.2 build 10247
• OS: Windows 64 bit
• RAM: 31.85 GB
• CPU: Intel(R) Core(TM) i7-8750H CPU @ 2.20GHz
• GPU(s): GeForce RTX 2070 with Max-Q Design

4.4.3 Equipment

• Drone type: DJI M210
• Thermal camera type: Zenmuse XT2
• Resolution pixels: 640 x 512
• Framerate: 30 Hz
• Radiometric yes/no: yes
• Resolution thermal, spectral, radiometric: 0.5 k, 8-15 nm, 24 bit
• Optical camera type: Zenmuse XT2 optical (integrated)


CHAPTER 5

Experiments and Discussions

5.1 Original and Normalized Thermal Images

(a) Image 1 with an aluminum target (b) Image 2

Figure 5.1: Original sample thermal images of the same terrain taken in succession.

(a) Image 1 with an aluminum target (b) Image 2

Figure 5.2: Sample thermal images of the same terrain using normalization.

(a) Image 1 with an aluminum target (b) Image 2

Figure 5.3: Sample thermal images of the same terrain using normalization and Gaussian filtering.

(a) Image 1 with an aluminum target (b) Image 2

Figure 5.4: Sample thermal images of the same terrain using normalization and histogram equalization.

(a) Image 1 with an aluminum target (b) Image 2

Figure 5.5: Sample thermal images of the same terrain using normalization, Gaussian filtering and histogram equalization.


5.1.1 Discussion

The resulting normalized images in Figures 5.2-5.5 show that the workflow can assign the same gray values to similar temperature values in different thermal images. This result is mainly due to the rescaling method, which recomputes the gray values from the temperature values using the global maximum and minimum temperature over all thermal images. It is clear from the results that the normalized thermal images can be compared with each other much better than the original thermal images in Figure 5.1.
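The rescaling method discussed above can be sketched as follows, assuming each thermal image is available as an array of temperature values; the function name is illustrative.

```python
import numpy as np

def normalize_thermal_set(temperature_images):
    """Map temperature values to 8-bit gray values with one global scale.

    Because the global minimum and maximum over ALL images are used,
    the same temperature receives the same gray value in every image.
    """
    t_min = min(float(np.min(t)) for t in temperature_images)
    t_max = max(float(np.max(t)) for t in temperature_images)
    span = (t_max - t_min) or 1.0  # avoid division by zero for a flat set
    return [np.round((np.asarray(t, dtype=np.float64) - t_min) / span * 255.0).astype(np.uint8)
            for t in temperature_images]
```

Rescaling each image with its own minimum and maximum instead would reproduce the original problem: the same gray value would again indicate different temperatures in different images.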

Applying the Gaussian filter before the global normalization makes almost no difference in the details of the resulting thermal images compared to no noise filtering; a slight difference can only be perceived when we zoom in on the images. We can thus see that the Gaussian filter is indeed able to retain edges and important details in thermal images. Enabling histogram equalization also showed that the contrast in the thermal images is much better than before.

5.2 Metashape Evaluation

5.2.1 Number of tie points

Figure 5.6: Number of tie points by set of input images. Workflow configuration consists of optional Gaussian filtering (GF) or histogram equalization (HE).

Discussion

Looking at Figure 5.6, we can see that the set of normalized thermal images yields the same number of tie points as the original thermal images. However, the workflow with Gaussian filtering yields the highest number of tie points. This result could be explained by the blur effect of the Gaussian filter: because the filter reduces fine detail in the images, Agisoft Metashape's feature matching algorithm may be able to match more tie points across all thermal images. On the other hand, the number of tie points decreases whenever we apply histogram equalization.

It is important to note that a higher number of tie points does not necessarily mean that the 3D model from Agisoft Metashape is always better. However, a higher number of tie points does indicate a more stable alignment result in Agisoft Metashape, as explained in subsection 4.3.2 [7].


5.2.2 Average reprojection error

Figure 5.7: Average reprojection error by set of input images. Workflow configuration consists of optional Gaussian filtering (GF) or histogram equalization (HE).

Discussion

In Figure 5.7, we observe almost no difference in the average reprojection error between the original and normalized thermal images. Applying either Gaussian filtering or histogram equalization shows neither a large positive nor a negative effect. This is not a bad outcome, because all sets of thermal images produce an average reprojection error within the quality threshold of 1 pixel stated in subsection 4.3.3. Therefore, we can establish that our workflow produces thermal images that perform like the original images based on the average reprojection error.


5.2.3 Quality of the dense point clouds

(a) Top view of the dense point cloud.

(b) Bottom right section of the dense point cloud.

Figure 5.8: Dense point cloud of the original thermal images. The dense point cloud contains a lot of holes and the morphology of the terrain is badly modeled.


(a) Top view of the dense point cloud.

(b) Bottom right section of the dense point cloud.

Figure 5.9: Dense point cloud of the normalized thermal images. The dense point cloud does not have a lot of holes and the morphology of the terrain is modeled well.


(a) Top view of the dense point cloud.

(b) Bottom right section of the dense point cloud.

Figure 5.10: Dense point cloud of the normalized thermal images with Gaussian filtering. The dense point cloud does not have a lot of holes and the morphology of the terrain is modeled well. Furthermore, more details are present than in Figure 5.9(b).


(a) Top view of the dense point cloud.

(b) Bottom right section of the dense point cloud.

Figure 5.11: Dense point cloud of the normalized thermal images with contrast enhancement. The dense point cloud has some holes and the morphology of the terrain is modeled well.


(a) Top view of the dense point cloud.

(b) Bottom right section of the dense point cloud.

Figure 5.12: Dense point cloud of the normalized thermal images with Gaussian filtering and histogram equalization. The dense point cloud has some holes and the morphology of the terrain is modeled well.


Discussion

The images of the dense point cloud in Figure 5.8 show that the original thermal images produce a lot of errors. The dense point cloud created from the original thermal images contains many holes on the surface. Furthermore, the morphology of the terrain is modeled badly.

We can see in Figure 5.9 that the dense point cloud created from thermal images that associate the same gray value to the same temperature has a much better quality. There are fewer holes present, and the morphology of the terrain is modeled properly in the dense point cloud, which results in visible depth.

After applying Gaussian filtering in the workflow, we discover a considerable improvement in Figure 5.10. The dense point cloud in Figure 5.10(b) is more detailed than in Figure 5.9(b): more contrast and more distinct shapes are present. This result could be explained by the higher number of tie points created during the alignment step, observed in subsection 5.2.1. A higher number of tie points could thus result in a much more detailed dense point cloud.

However, when we include histogram equalization, the dense point clouds in Figures 5.11 and 5.12 do not significantly improve in detail compared to no histogram equalization. This could be explained by the fact that the global histogram of the thermal images is not restricted to a narrow range, since the workflow already rescales the new gray values between the minimum and maximum gray values. In subsubsection 3.3.3, we stated that histogram equalization does not perform well on images that contain both very low and very high pixel values.


CHAPTER 6

Conclusion

6.1 Summary and Conclusion

In this thesis, we have developed a workflow that uses a normalization technique on thermal images to assign the same gray values to the same temperature to create a better 3D model in Agisoft Metashape. We made two main suggestions to further improve the normalized thermal images. Firstly, we proposed noise filtering in subsection 4.1.2 that uses a Gaussian kernel to remove noise after the acquisition of thermal images. Secondly, we suggested an image processing technique in subsection 4.1.3 that uses a global histogram equalization for the enhancement of contrast.

In practice, the normalized thermal images from our workflow are capable of producing the same number of tie points as the original thermal images using the feature matching algorithm of Agisoft Metashape, without influencing the average reprojection error. However, applying noise filtering before normalizing the thermal images yields the highest number of tie points. This produces a more stable result during the alignment step of Agisoft Metashape.

On top of that, normalizing the thermal images has been found useful for improving the quality of dense point clouds: they contain fewer errors than when we use the original thermal images. Furthermore, when we include noise filtering in our workflow, we see a considerable improvement. This is important in Agisoft Metashape because the 3D models are created from these dense point clouds.

In general, noise filtering with a Gaussian kernel has a positive effect on the results of our workflow, except for the average reprojection error. On the other hand, contrast enhancement using histogram equalization does not significantly improve the 3D model: it neither increased the number of tie points nor improved the dense point cloud.

To answer our first research question in section 1.2, the results of the workflow presented in this thesis illustrate how important thermal images that associate the same gray value with the same temperature are in Agisoft Metashape, compared to thermal images that do not. Normalized thermal images produce a higher quality dense point cloud than the original thermal images, which in turn results in a better 3D model of the temperature of an area.

Regarding the second and third research questions in section 1.2, noise filtering with a Gaussian kernel on our thermal images is an essential preprocessing step to improve the quality of the dense point cloud and the number of tie points before the normalization of thermal images, while contrast enhancement using histogram equalization is not very influential.

6.2 Limitation

Despite the advantages of our workflow, it also comes with its limitations. Because our workflow applies a rescaling normalization to a set of images, we need to perform the workflow again whenever we include a new thermal image. This makes real-time processing of thermal images unattractive, considering the overhead that comes with our workflow. Furthermore, this overhead increases whenever we include noise filtering or contrast enhancement operations in the workflow. Lastly, the workflow assumes that the thermal images are taken within a short interval, since the temperature of objects can change across seasons.

6.3 Future Work

In this thesis, experiments were performed with a 3x3 kernel and a single standard deviation value for the Gaussian filter. It could be interesting to explore different kernel sizes and standard deviation values in future analyses of noise filtering on thermal images. Furthermore, comparing different filters for noise reduction is also an option to consider. As for our test data, we have only used one data set for the evaluation of our workflow; testing our workflow with more data sets could give a more reliable evaluation of our method. Lastly, our normalized thermal images have only been tested in Agisoft Metashape. Analyzing the results of photogrammetric suites other than Agisoft Metashape with our normalized thermal images as input can also be researched.

6.4 Ethics

Military usage

As mentioned in section 3.1, forward-looking infrared cameras are commonly used for military purposes. Thermal imaging holds a lot of potential for armed organizations according to Akula, Ghosh, and Sardana [25]. Due to its numerous advantages, thermal imaging has a wide range of uses in the military and security. It is commonly utilized by the army and navy for monitoring borders and enforcing the law [25]. Moreover, thermal imaging is widely used in military aviation to discover and target hostile organizations [25].

Therefore, it is important to think about the consequences our workflow could have in a military context. From our results, we can see that the normalized images from our workflow can increase the quality of 3D models in photogrammetric suites such as Agisoft Metashape. This increase in quality could be interesting not only to archaeologists, but also to the military. It is important that this work is not misused for military purposes.


Bibliography

[1] A. Cool, “Thermography,” The Encyclopedia of Archaeological Sciences, 2018.

[2] J. Casana, A. Wiewel, A. Cool, A. C. Hill, K. D. Fisher, and E. J. Laugier, “Archaeological aerial thermography in theory and practice,” Advances in Archaeological Practice, vol. 5, no. 4, pp. 310–327, 2017.

[3] H. Thomas, “Some like it hot: The impact of next generation flir systems thermal cameras on archaeological thermography,” Archaeological Prospection, vol. 25, no. 1, pp. 81–87, 2018.

[4] FLIR Tools, https://www.flir.com/products/flir-tools/.

[5] A. Starovoytov, I. Chernova, and O. Chernova, “Software and hardware systems for collecting and analyzing archaeological data,” International Multidisciplinary Scientific GeoConference: SGEM, vol. 19, no. 2.1, pp. 1059–1064, 2019.

[6] L. Agisoft, Metashape, 2019.

[7] L. Agisoft, “Agisoft Metashape user manual,” Professional Edition, Version, vol. 1, p. 71, 2019.

[8] Pix4Dmapper Pro, computer software, Pix4D SA, Switzerland.

[9] N. Högasten, M. Ingerhed, M. Nussmeier, E. A. Kurth, T. R. Hoelter, K. Strandemar, P. Boulanger, and B. Sharp, Pixel-wise noise reduction in thermal images, US Patent 9,208,542, 2015.

[10] A. K. Boyat and B. K. Joshi, “A review paper: Noise models in digital image processing,” arXiv preprint arXiv:1505.03489, 2015.

[11] P. Sundin, Intuitive colorization of temperature in thermal cameras, 2015.

[12] Y. Lu, J. Yang, S. Wu, Z. Fang, and Z. Xie, Normalization of infrared facial images under variant ambient temperatures. InTech, 2011.

[13] W. Minkina and S. Dudzik, Infrared Thermography: Errors and Uncertainties. Wiley Press, 2009.

[14] J. Perš, M. Kristan, M. Perše, and S. Kovačič, “Observing human motion using far-infrared (flir) camera - some preliminary studies,” in Proceedings of the Thirteenth International Electrotechnical and Computer Science Conference ERK 2004, Portorož, Slovenia, Citeseer, 2004, pp. 187–190.

[15] R. Gade and T. B. Moeslund, “Thermal cameras and applications: A survey,” Machine vision and applications, vol. 25, no. 1, pp. 245–262, 2014.

[16] Y. Dodge and D. Commenges, The Oxford dictionary of statistical terms. Oxford University Press on Demand, 2006.

[17] M. Vollmer and K.-P. Möllmann, Infrared thermal imaging: fundamentals, research and applications. John Wiley & Sons, 2017.

[18] R. C. Gonzalez and R. E. Woods, Digital image processing, 2002.


[20] E. Sutton, “Histograms and the zone system,” Illustrated Photography, 2016.

[21] P. Garg and T. Jain, “A comparative study on histogram equalization and cumulative histogram equalization,” International Journal of New Technology and Research, vol. 3, no. 9, 2017.

[22] A. Mordvintsev and K. Abid, “OpenCV-Python tutorials documentation,” retrieved from https://media.readthedocs.org/pdf/opencv-python-tutroals/latest/opencv-python-tutroals.pdf, 2014.

[23] D. J. Ketcham, R. W. Lowe, and J. W. Weber, “Image enhancement techniques for cockpit displays,” Hughes Aircraft Co., Culver City, CA, Display Systems Lab, Tech. Rep., 1974.

[24] R. Hummel, “Image enhancement by histogram transformation,” University of Maryland, College Park, Computer Science Center, Tech. Rep., 1975.

[25] A. Akula, R. Ghosh, and H. Sardana, “Thermal imaging and its application in defence systems,” in AIP conference proceedings, American Institute of Physics, vol. 1391, 2011, pp. 333–335.
