
Development of a tracking algorithm for the Ocean Battery bladder

Nicolas Will (s3785408)

Abstract— Ocean Grazer B.V. is at the forefront of the energy transition with its methods to harvest energy from wave motion, wind, and the sun. The company requires a quantitative method to assess critical folding and stretching in order to validate its intermediate energy storage solution, the Ocean Battery. The critical element is the bladder, which may stretch and fold during the storage and release process. MATLAB is used to analyse the stretching in the bladder. The real-life video footage contained a large amount of noise in the form of small particles in the water and bright air bubbles, so an initial cleaning and denoising of the inputs is required in MATLAB before the stretching can be assessed. Multiple frames are used to form an average image as a starting point. A median filter is applied to the grey image, followed by a flexible threshold to convert the input into a binary image. Further processing of the data via a two-step object identification reduced the noise in the image while keeping the relevant information of the bladder. The cleanness of the image improved significantly after the first application of the average image alone. To check the effectiveness of the cleaning effort, a 3D representation of the input image is used for visualisation.

After the cleaning of the image, the second part of the program estimates the stretching of the input object. For that, three different objects created in SolidWorks are used. The first object has a parabolic shape, the second a strong bend, and the last a w-shaped bend.

All three shapes are stretched by 2%, 5%, and 10%. The program is able to identify the upper outer line of the object and create an approximation of the detected

Disclaimer: This report has been produced in the framework of an educational program at the University of Groningen, Netherlands, Faculty of Science and Engineering, Industrial Engineering and Management (IEM) Curriculum. No rights may be claimed based on this report. Citations are only allowed with explicit reference to the status of the report as a product of a student project.

This project was conducted in collaboration with Ocean Grazer B.V. as part of the final Integration Project for a Bachelor's program in the last semester (6th) of the academic year 2020-2021. The Ocean Grazer company is dedicated to providing green energy storage and production solutions for a shift towards sustainable energy production. N. Will is an undergraduate student in the program BSc. Industrial Engineering and Management at the Faculty of Science and Engineering at the University of Groningen, Province of Groningen, Netherlands (e-mail: n.will@student.rug.nl). The Integration Project is supervised by

Prof. dr. ir. B. Jayawardhana, professor at the Faculty of Science and Engineering for Mechatronics and Control of Nonlinear Systems at the University of Groningen, and

Dr. M. Muñoz Arias, lecturer in Industrial Engineering and Management at the Faculty of Science and Engineering at the University of Groningen.

upper line. The identified line is used to calculate the length of the object. The operator inputs the original length of the object to calculate the stretching within the material.

Lastly, the stresses from the stretching are calculated and compared to the yield strength of the bladder material. The results indicate the success of the denoising and cleaning of the image. The stretching calculations yielded accurate results for a stretch of 2%, but the accuracy decreased with increasing stretch and shape complexity.

Index Terms— Object Tracking, Image Processing, Material Stretching, Green Energies, Ocean Grazer B.V.

I. INTRODUCTION

Climate change and recent nuclear power plant catastrophes are just two reasons for a shift towards green energies. Studies have shown that the shift has also taken place among consumers, with 85 per cent indicating that they want more renewable energy and 49 per cent indicating that they are willing to pay more for renewable energies [1]. This has led to an increase in the construction of offshore wind parks and in the electricity output capacity of those parks [2]. Estimations project that wind energy will make up 30 per cent of European energy consumption by 2030, as indicated in Fig. 1 [3].

Green energies such as wind, solar and wave are dependent on external conditions, resulting in fluctuating electricity outputs [4], [5]. During periods of intense wind, the system can produce more electricity than needed, while during doldrums little to no electricity is produced. The fluctuation in electricity output complicates both the storage of electricity in periods of high output and its sale. An effective method for electricity storage needs to be developed that can hold the generated power in an intermediate position with as few losses as possible [6]. High fluctuation can lead to a reduction of grid voltage stability, especially during increased surpluses [6]. Fluctuation also occurs in the electricity price, making it uneconomic to sell the electricity at just any time [7], [8]. The Ocean Battery was introduced to bypass this problem and to store the energy and release it once it is needed (Fig. 2). This guarantees a stable output. The Ocean Battery is placed at the bottom


Fig. 1. Anticipated share of wind energy of EU members by 2030 retrieved from [3].

of the ocean/sea and uses the hydrostatic pressure of the ocean to generate electricity (Fig. 2).

Fig. 2. Representation of the Ocean Battery concept retrieved from [9].

A schematic drawing of the Ocean Grazer Battery is provided in Fig. 3 to showcase the concept [9]. A working fluid is stored in a rigid wall structure (A) at atmospheric pressure. If additional energy needs to be stored, the working fluid is pumped into the flexible bladder (B) and stored as potential energy. The now filled bladder has a higher pressure than the rigid structure, determined by the depth of the Ocean Battery.

The deeper the Ocean Battery is placed, the higher the hydrostatic pressure within the flexible bladder. Once energy is needed, the working fluid flows from the bladder through a hydro turbine (C) back into the rigid body to generate electricity [9]. One crucial part of this process is the inflation and deflation of the flexible bladder, as continued repetitions are needed to generate electricity over decades. Folding and stretching of the bladder are a potential threat to the lifetime of the battery.

Simulations and models have already indicated that minimal to no folding occurs with the current bladder design [10]. The development of a bladder tracking algorithm based on video material of the inflation/deflation process would further validate those simulations with empirical evidence. Currently, it is not clear if the current setup of the Ocean Battery produces enough data from its on-board cameras to develop such a tracking algorithm.

Thus it is the goal of this project to investigate whether the current data and information are sufficient to create such a program. The project acts as a first step in this empirical validation process and uses created models of different shapes to mimic different deformations of the bladder. The created shapes are used to overcome the current lack of video material of the Ocean Battery bladder, as the official video material is not yet available.

Fig. 3. Representation of the Ocean Battery concept to store energy in the flexible bladder retrieved from [9].

Object tracking and recognition have a history in computer science and engineering, as scientists realised their value in the early 2000s for motion detection, pattern recognition and boundary detection [11]. At present, object tracking, shape recognition and computer vision have many applications. Computer vision can be used in inspection and maintenance procedures to detect multiple types of damage in structures, roads, construction and rail track systems without the supervision of human operators [12], [13], [14], [15]. Algorithms and computer programs utilise neural networks, artificial intelligence and deep learning to recognise faces and analyse medical scans and real-time activities [16], [17], [18]. State-of-the-art programs use R-CNNs (region-based convolutional neural networks), which significantly reduce the running time and costs of the program. The technique combines state-of-the-art accuracy and economic factors by predicting object boundaries and scores at the same time [16].

A different method to analyse images is based on colour tones [19]. For that, RGB (red, green, and blue) images are imported into MATLAB and converted into grey scales [20], [21]. An image consists, depending on the resolution, of hundreds of thousands of individual pixels, each storing information such as the colour of that particular pixel given in terms of its R, G and B


components [22]. The conversion to grey scale reduces the information needed per pixel while maintaining the properties relevant to analysing the shapes of the objects displayed in the image [22]. MATLAB can convert the images to grey scale with a single command (rgb2gray). The grey image requires less memory in the computer program while still containing the relevant properties to filter and clean the image.
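The conversion can be sketched in a few lines. The following Python fragment is an illustrative stand-in for MATLAB's rgb2gray, using the same standard luminance weights, and shows the reduction from three values per pixel to one:

```python
import numpy as np

def rgb_to_grey(img):
    """Reduce an RGB image (H x W x 3, values 0-255) to a single
    greyness value per pixel using the standard luminance weights,
    the same weighting MATLAB's rgb2gray applies."""
    weights = np.array([0.2989, 0.5870, 0.1140])
    return img @ weights

# Tiny example: a pure-red pixel and a pure-white pixel.
img = np.array([[[255.0, 0.0, 0.0], [255.0, 255.0, 255.0]]])
grey = rgb_to_grey(img)   # red maps to ~76, white stays near 255
```

The single grey value per pixel is what all subsequent filtering and thresholding steps operate on.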

The goal of the project is to combine knowledge from the field of computer science and object recognition with the application of the Ocean Battery. Much research has been done in the fields of object recognition and tracking and the estimation of rectangular shapes, but this project aims to combine them for the Ocean Battery system. The validation of the existing models and concepts facilitates the energy transition through a successful launch of the Ocean Battery green energy storage device.

The paper is structured in the following way. Firstly, the materials and methods relevant to the project are presented (II). The methods are followed by the results of the project (III), followed by a discussion of the results (IV). Chapter V discusses potential improvements and recommendations for the project to facilitate the tracking of the bladder. Lastly, a conclusion is provided to summarise the main findings (VI).

II. MATERIALS AND METHODS

The main materials are the images and video footage of the preliminary bladder of the Ocean Battery. The video had the file type .AVI and the images used for the MATLAB operations had the file type .PNG. Additionally, models of bladder-like objects were created in SolidWorks to overcome the current lack of materials.

The first model had a parabolic shape, the second one a w-shaped bend and the last a sharp bend (Fig. 25, Fig. 27, Fig. 26). All three shapes have a unique starting length. To simulate stretching in the material, three models per shape were created with an increased length (2%, 5% and 10% increase) and the same shape as the original model. The original and stretched lengths of all three shapes can be observed in Table I. The software used for this project is MATLAB R2020b. Table II summarises all variables and functions used in the project.

The first step is to clean and denoise the images for further operations and to correctly identify a stretch or bend within the applied pattern in the bladder.

∆ = tf / 25 (1)

TABLE I
SOLIDWORKS SHAPES USED FOR STRETCHING CALCULATIONS

Shape (Fig.)        Original length [mm]   Stretched length [mm] (increase [%])
Parabolic (25)      314.2                  320.44 (2%), 329.87 (5%), 345.58 (10%)
W-shape bend (27)   100                    102 (2%), 105 (5%), 110 (10%)
Strong bend (26)    90                     92 (2%), 95 (5%), 100 (10%)

TABLE II
THE TABLE SUMMARISES ALL SYMBOLS AND FUNCTIONS USED WITHIN THE EQUATIONS

Symbol     Meaning
i          Rows of matrix
row        Total amount of rows in matrix
j          Columns of matrix
col        Total amount of columns in matrix
n          Neighbourhood level for median filter
pt         Total pixel size
ps         Individual pixel
p          Greyness value of pixel
t          Threshold factor
max        Maximum value of matrix
If         Image after the application of median filter
∆          Time step for frames in video
tf         Total frames of video
Ia         Image compiled of average greyness values
Ig         Grey image
l          Length of matrix
sqtwolog   Function that utilises minimax performance
mIf        Maximal greyness value of image
x̄          Mean greyness of matrix
N          Number of elements within matrix
s          Standard deviation of matrix
thselect   MATLAB command
nIf        Normalisation factor of matrix
tr         Threshold value used for binary transformation
fIf        Normalisation factor
b          Binary image
α          Threshold factor for binary image
nf         Normalisation factor two
lo         Original length of bladder
ls         Length of sample size
lt         Total length of current bladder
lp         Pixel length of the sample size
ln         Length of the bladder after potential stretching
cl         Change in length (pixel)
length     MATLAB command to determine length of a matrix
nl         New length of bladder (pixel)
r          Ratio of new length to old length of the bladder
d          Change in length of the bladder


Ia(i, j) = Ia(i, j) + (1/∆) · Ig(i, j) (2)

The method used for this first cleaning step of the image depends on the input to the script. If the input is a video file, as is planned for the future once the footage is available, the first step is to create an average image based on multiple frames of the video. A function extracts the duration and the frame rate of the video to calculate the total frames (tf) within the video. Secondly, it determines a time step (∆), which is used for the sampling of the video (1). For that, the total frames of the video are divided by 25, as the camera used in the Ocean Battery has a frame rate of 25 frames per second. The maximum time step is fixed to 50, as long videos would increase the total frames of the video and thus increase ∆ significantly. The value of 50 is selected as this represents twice the frame rate and thus two seconds.

The next step is to determine an initial frame for future operations and convert it to the greyscale. Lastly, an average image of the greyscale is created by the function.

The program extracts each frame with the time step ∆ and converts it to the greyscale. The greyness of each cell is added to a zero matrix of the same size. The greyness value of the cells is divided by the time step ∆ and added to the average greyscale of all used frames for the new image (2). The idea behind cleaning the image with average greyscales is that the noise within the video and frames consists mainly of small particles in the water and bright air bubbles. These move faster than the in/deflated pattern. Thus the average greyness value of a cell does not depend on the noise, but only on the bright pattern of the bladder. The method can be found in Fig. 4.
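As a sketch of this averaging step (in Python rather than MATLAB, with NumPy arrays standing in for the already-extracted grey frames; reading the .AVI file itself is assumed to be handled by a video reader):

```python
import numpy as np

def average_image(grey_frames, fps=25, max_step=50):
    """Average grey frames sampled with the time step of Eq. (1):
    delta = total frames / fps, capped at max_step (two seconds at
    25 fps). Fast-moving noise such as particles and air bubbles
    averages out, while the static bright pattern remains."""
    tf = len(grey_frames)
    delta = min(max(tf // fps, 1), max_step)
    sampled = grey_frames[::delta]
    acc = np.zeros_like(sampled[0], dtype=float)
    for frame in sampled:          # accumulate in the spirit of Eq. (2)
        acc += frame
    return acc / len(sampled)

# 50 synthetic 2x2 grey frames with one bright noise pixel in the first frame.
frames = [np.full((2, 2), 10.0) for _ in range(50)]
frames[0][0, 0] = 255.0
avg = average_image(frames)
```

In this toy example the single bright outlier is diluted across the 25 sampled frames, while pixels carrying the constant pattern keep their value exactly.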

If the input is only one image, there is no initial cleaning, as no average can be calculated. The image is simply assigned to a variable and converted to the greyscale. For both types of input it is possible to select a region of interest (ROI) if major disturbances are already visible. The selection of a ROI is only sensible if it does not reduce the usability of the image (Fig. 24).

pt = Σ(i=1..row) Σ(j=1..col) ps(i, j) (3)

The next step is the selection of an appropriate neighbourhood level n for each pixel to apply the median filter to the image. The appropriate neighbourhood level depends on the total pixel size (pt) of the image (3). It is desired to select a small neighbourhood to achieve adequate cleanness of the image [26]. Median filters and multi-level median filters are commonly used tools in image processing as they preserve image features, are

Fig. 4. Pseudocode to extract multiple frames from a video with the timestep ∆. The frames create an average frame based on all sample frames within the video.

robust and have little computation time [23], [24], [25]. A median filter is most appropriate here as the noise is not Gaussian distributed, but rather arbitrary, depending on the environmental conditions. Next, the grey image is denoised of all remaining disturbances and noise.
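A minimal hand-rolled median filter (a Python sketch; MATLAB provides this as medfilt2) illustrates why such a filter suppresses isolated bright specks:

```python
import numpy as np

def median_filter(img, n=1):
    """Replace each pixel by the median of its (2n+1) x (2n+1)
    neighbourhood (neighbourhood level n), replicating edge pixels
    at the border."""
    padded = np.pad(img, n, mode='edge')
    out = np.empty(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(padded[i:i + 2*n + 1, j:j + 2*n + 1])
    return out

# An isolated bright noise pixel in an otherwise dark image.
img = np.zeros((5, 5))
img[2, 2] = 255.0
filtered = median_filter(img, n=1)
```

An isolated outlier is a minority in every window that contains it, so the median discards it entirely, whereas a linear average would smear it into the neighbouring pixels.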

t = thselect(max(If), 'sqtwolog') (4)

sqtwolog = √(2 · log(l(If))) (5)

mIf = max(max(If)) (6)

x̄ = (1/N) · Σ(i=1..N) pij (7)


b(i, j) = 1 if p(i, j) ≥ α, 0 otherwise (8)

s = √( (1/(N − 1)) · Σ(i=1..N) (pij − x̄)² ) (9)

The denoising process is based on the threshold principle, where each pixel is assessed against a threshold value α. If a pixel has a value higher than or equal to the threshold it is placed in a new binary matrix as a 1, otherwise as a zero (8) [22]. To use an appropriate and flexible threshold level, the maximum greyness level (6), the mean greyness level (7) and the standard deviation (9) are calculated. Additionally, a threshold factor t is calculated with the sqtwolog rule (4), (5). This method ensures the selection of a flexible threshold that depends on the image characteristics and the noise within the image.

nIf = t · s (10)

tr = mIf − nIf (11)

nf = x̄ / (0.7 · tr) (12)

α = tr · nf (13)

b = If > α (14)

Lastly, the threshold factor t is multiplied by the standard deviation (10). To incorporate the fluctuation within the image, this product is subtracted from the maximum greyness value (11) and a normalisation factor is calculated (12). The normalisation factor is the mean greyness level (x̄) divided by 0.7 times the threshold value (tr). The factor of 0.7 is chosen with regard to the value of one sigma, which covers 68% of all data, so that extreme outliers are excluded from the data points. The final threshold is the threshold value multiplied by the normalisation factor (13), as otherwise extreme values could again negatively influence the denoising process. In the end, each pixel is compared to the threshold value and placed in a binary matrix (14) [22]. This process can be seen in Fig. 5.
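Collecting Eqs. (4)-(14) in one place, the flexible threshold can be sketched as follows (a Python rendering of the MATLAB steps; l(If) is taken here as the total number of elements, and the sqtwolog rule is written out explicitly):

```python
import numpy as np

def flexible_threshold(If):
    """Compute the flexible threshold alpha of Eqs. (4)-(13) and the
    binary image of Eq. (14) from a grey image If."""
    flat = If.ravel().astype(float)
    t = np.sqrt(2.0 * np.log(flat.size))         # sqtwolog, Eqs. (4), (5)
    mIf = flat.max()                             # maximum greyness, Eq. (6)
    xbar = flat.mean()                           # mean greyness, Eq. (7)
    s = flat.std(ddof=1)                         # standard deviation, Eq. (9)
    nIf = t * s                                  # Eq. (10)
    tr = mIf - nIf                               # Eq. (11)
    nf = xbar / (0.7 * tr)                       # normalisation, Eq. (12)
    alpha = tr * nf                              # final threshold, Eq. (13)
    return alpha, (If > alpha).astype(np.uint8)  # binary image, Eq. (14)

# Dark background (greyness 10) with a bright pattern (greyness 200):
If = np.full(100, 10.0)
If[:10] = 200.0
alpha, b = flexible_threshold(If)
```

With these illustrative numbers the threshold lands between background and pattern, so pattern pixels map to 1 and background pixels to 0.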

This approach to cleaning the images is not the only possible solution. The first advantage of this method is its simplicity. The method is suitable for any coloured or grey image and can therefore be used for various camera and resolution settings. Secondly, the method uses very little computation time and memory, which significantly reduces the running time of the program. Additionally, very little information is needed to clean the image. Other methods, such as the Birgé-Massart method, utilise wavelets, wavelet coefficients and transformations or case-dependent constants to denoise an image [27].

However, disadvantages can arise if bright noise disturbs the cleaning process. This would lead to more extreme outliers and thus worse pattern recognition. A second problem can arise if the desired pattern and the surroundings have a similar colour, making it impossible to distinguish noise from the pattern based on colour. It would then not be possible to clean the image based on greyscale or colour; instead it would be necessary to clean the image based on patterns or objects.

Solutions could be edge detection or square detection.

The preliminary video footage received did not require more sophisticated cleaning techniques, as a clear difference between the pattern and the background was visible.

Fig. 5. Pseudocode of the cleaning process with a median filter and a threshold. Idea retrieved from [38].

The next step to further clean and denoise the image is the removal of large objects that could not be removed by the median filter and the threshold value.


First, all objects within the image are identified with the bwboundaries command (Fig. 6) [39]. Cell B saves the outer boundary coordinates for an arbitrary number of objects. Cell B and the coordinates are used by the first function (Fig. 7) to identify the maximum and minimum row and column values for each object. The last function deletes a detected object if it is identified as noise (Fig. 8). For that, it first identifies the maximum and minimum row and column values and creates a new image from them. It plots the detected object and the original image and asks the operator whether that particular object is noise. If the operator identifies it as noise, the object is deleted. Moreover, it is possible to automatically delete all objects with an area below a given value. This reduces the manual work for the operator.
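The automatic part of this step, deleting every connected object whose area falls below a given value, can be sketched without any toolbox (a Python sketch using 4-connected flood fill; the interactive operator query is left out):

```python
import numpy as np
from collections import deque

def remove_small_objects(b, min_area):
    """Delete connected components (4-connectivity) of a binary image
    whose pixel area is below min_area."""
    b = b.copy()
    seen = np.zeros(b.shape, dtype=bool)
    for si in range(b.shape[0]):
        for sj in range(b.shape[1]):
            if b[si, sj] and not seen[si, sj]:
                # Flood-fill one component and collect its pixels.
                comp, queue = [], deque([(si, sj)])
                seen[si, sj] = True
                while queue:
                    i, j = queue.popleft()
                    comp.append((i, j))
                    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ni, nj = i + di, j + dj
                        if (0 <= ni < b.shape[0] and 0 <= nj < b.shape[1]
                                and b[ni, nj] and not seen[ni, nj]):
                            seen[ni, nj] = True
                            queue.append((ni, nj))
                if len(comp) < min_area:       # classified as noise
                    for i, j in comp:
                        b[i, j] = 0
    return b

# Example: a 2x3 blob (area 6, the pattern) and a single noise pixel.
img = np.zeros((6, 6), dtype=int)
img[1:3, 1:4] = 1
img[5, 5] = 1
cleaned = remove_small_objects(img, min_area=3)
```

The single-pixel object is deleted while the larger blob survives untouched, mirroring the automatic area-based deletion described above.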

Fig. 6. Determine the boundaries of the detected objects within the image and store them in cell B. Code idea retrieved from [39].

Fig. 7. Function to identify the position of the detected object within the image.

Fig. 8. Function to delete the detected object within the image if it is noise.

The cleaned image is plotted in a 3D environment both with the surf function and by plotting every point of the matrix in a 3D environment [29]. Both methods create a 3D plot based on the greyness level of the input. A higher


value for the greyness corresponds to a brighter colour, which is closer to the camera. This method of creating a 3D representation of the input image is therefore only possible with real footage that contains different colours and greyness levels. The created binary SolidWorks files do not support this method.

After the current cleaning method, it is possible to repeat the cleaning process explained in Fig. 6, Fig. 7 and Fig. 8 with the outer edges of the image. This can yield better results as noise can overlap with the pattern.

The edge function in MATLAB is able to create the outer edges of detected objects.

The image is now cleaned and represented in a 3D environment. To determine the bending and stretching within the image, only the upper outer pattern is used. To identify the upper line pattern, a program loops through the columns of the matrix of the image.

Once it identifies a column with a sum greater than or equal to one, it determines the exact row of the first nonzero cell. This row and column information is saved in a new matrix and the program selects the next column, as only the outer line and thus only the first nonzero cell is needed. The program displays the final image of the outer pattern and connects all points with an inbuilt function. Lastly, all zero columns are deleted to create a new matrix that only consists of columns containing information on the upper line pattern. A detailed flowchart is presented in Fig. 9. The deformation in all current examples is uniform, therefore it is possible to observe the bending and stretching in the entire material based on one line pattern only. It is most efficient to use either the uppermost or the lowest line pattern.

Fig. 9. Flowchart to firstly select only the upper line of the identified pattern, secondly connect the identified points to a continuous line and lastly delete all zero columns. The zeros are deleted as they do not contain relevant information regarding the shape of the pattern.

The effectiveness of selecting only the upper line pattern depends on the input and the cleaning process, so some noise may still be present in the image and within the upper line pattern. The spline function is used to approximate the observed data and to smooth outliers in the dataset. The spline function creates a vector of interpolated values based on the input data for the x and y coordinates (in this case row and column values) [28].

The next step is to calculate the length (in pixels) of the detected and smoothed upper line. For that, the identified line is separated into samples with a defined distance in between. Each sample point represents one of the three points of a right triangle: point A (i) is connected to point C (i + 1), and a third point B is chosen which guarantees a right triangle. The hypotenuse AC is calculated. This is repeated for all sample points and the total length is calculated.
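Because point B only completes the right triangle, the hypotenuse AC is simply the Euclidean distance between consecutive sample points, and the total length is their sum. A Python sketch:

```python
import numpy as np

def polyline_length(x, y):
    """Sum the hypotenuses AC over consecutive sample points
    A = (x_i, y_i) and C = (x_{i+1}, y_{i+1}); the legs of each right
    triangle are the differences in x and y."""
    dx = np.diff(np.asarray(x, dtype=float))
    dy = np.diff(np.asarray(y, dtype=float))
    return float(np.sum(np.hypot(dx, dy)))

# A single 3-4-5 right triangle gives a length of exactly 5 pixels.
length_px = polyline_length([0, 3], [0, 4])
```

Applied to the smoothed upper line, this yields the current bladder length in pixels.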

∆s = lo / ls (15)

lp = (Σ lt) · (lo)⁻¹ · ∆s (16)


Next, the operator is asked to input the original length of the bladder lo, and the sample size ∆s is calculated in metres by dividing the original length of the bladder by the total number of sample points (15). The current sample size ∆s and the length of the bladder are in metres, while the distances within the image are given in pixels. Thus the real distances in metres need to be converted to a pixel distance (lp). For that, the current bladder length lt (in pixels) is multiplied by the inverse of the original bladder length lo (m⁻¹) and the sample size ∆s (m). This yields a distance in pixels to use for the image (16).
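Eqs. (15) and (16) amount to two lines of arithmetic. A small sketch (variable names mirror the symbols of Table II; the numbers are made up for illustration):

```python
def pixel_sample_distance(lo, ls, lt):
    """Eq. (15): sample distance delta_s = lo / ls (metres, for ls
    sample points along a bladder of original length lo).
    Eq. (16): pixel distance lp = lt * delta_s / lo, with lt the
    current bladder length in pixels."""
    delta_s = lo / ls
    lp = lt * delta_s / lo
    return delta_s, lp

# Hypothetical values: a 1 m bladder, 100 sample points, a 500 px image line.
delta_s, lp = pixel_sample_distance(lo=1.0, ls=100, lt=500)
```

With these numbers each metre-domain sample of 0.01 m corresponds to 5 pixels in the image.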

Lastly, the stretching within the image needs to be determined. For that, the new sample distance lp is used.

Every sample point from the identified outer line is used to again create right triangles consisting of three points (A(j, i), B(j + 1, i) and C(j + 1, i + 1)). If the distance AC is smaller than the current pixel distance lp, the next sample point is chosen until the distance AC is greater. All distances AC are stored in a vector. Once the program has looped through all sample points, the length of the previous bladder length vector (lt) is compared to the length of the vector with the new distances (ln).

cl = l(lt) − l(ln) (17)

nl = sum(ln) + cl · sum(ln) / l(ln) (18)

r = nl / sum(ln) (19)

d = (r · lo) − lo (20)

After the distances are obtained, it is possible to first calculate the difference in length between the vector with the original bladder length (lt) and the new vector (ln) (17). If the original vector (lt) is longer than the new vector ln, a stretch is implied: fewer sample points are needed because every sample point now has a greater distance to the next, so the material has stretched. To calculate the new length (nl), the sum of ln is added to the product of the difference in length cl and the average sample size of ln (18). This length nl is still in pixels, but it becomes unit-less once the ratio between the new length nl and the previous length ln is determined (19). This ratio r is multiplied by the original bladder length lo, and lo is subtracted to obtain the change in length (d) of the bladder in metres (20). The process can be seen in Fig. 10.
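Read together, Eqs. (17)-(20) form one short computation. The sketch below (in Python; the sample vectors are illustrative) follows them term by term:

```python
def bladder_stretch(lt_vec, ln_vec, lo):
    """lt_vec: distances sampled along the original line; ln_vec:
    distances sampled along the (possibly stretched) line; lo: the
    original length in metres. Returns the change in length d."""
    cl = len(lt_vec) - len(ln_vec)        # Eq. (17): difference in samples
    s_ln = sum(ln_vec)
    nl = s_ln + cl * s_ln / len(ln_vec)   # Eq. (18): corrected pixel length
    r = nl / s_ln                         # Eq. (19): unit-less ratio
    return r * lo - lo                    # Eq. (20): change in metres

# Illustrative numbers: 110 original samples vs 100 new samples of 1 px each
# along a bladder of original length 2 m implies a 0.2 m elongation.
d = bladder_stretch([1.0] * 110, [1.0] * 100, lo=2.0)
```

Here r = 1.1, so the 2 m bladder is reported as having stretched by 0.2 m.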

TABLE III
RESULTS OF THE STRETCHING/BENDING CALCULATIONS

Symbol   Material properties [unit]       Explanation
E        6 [MPa] [33]                     Young's modulus of the material
ε        [−]                              Engineering strain of the material
σ        [MPa]                            Stress within the material
σY       17 [MPa] [31], [32]              Yield strength of the material
εY       100% - 600% [−] [31], [32]       Maximum engineering strain

Fig. 10. Flowchart to determine the stretch in the material. Firstly, the current length of the bladder is determined. Secondly, the operator provides the program with the original length of the bladder and a sample step size in pixels is determined. Lastly, the stretch in the material is determined.

The last step in the project is to relate the stretching calculated by the program to the material properties and potential failures. For that, first the engineering strain (ε) and second the stress within the material (σ) due to the elongation are calculated. The material properties of EPDM (ethylene propylene diene monomer rubber) are known; they are summarised in Table III together with the properties yet to be determined [31], [32], [33].

ε = (ln − lo) / lo (21)

σ = E · ε (22)


First, the engineering strain is calculated based on the original and new bladder lengths (21). Second, the stress present in the material is calculated by multiplying the determined strain (ε) by the Young's modulus (E) (22).
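Eqs. (21) and (22) in code form (a Python sketch, with E = 6 MPa taken from Table III):

```python
def strain_and_stress(lo, ln, E=6.0):
    """Eq. (21): engineering strain eps = (ln - lo) / lo (unit-less);
    Eq. (22): stress sigma = E * eps, in MPa when E is given in MPa."""
    eps = (ln - lo) / lo
    return eps, E * eps

# A 2% stretch of a 100 mm bladder sample gives eps = 0.02 and
# a stress of about 0.12 MPa, well below the 17 MPa yield strength.
eps, sigma = strain_and_stress(lo=100.0, ln=102.0)
```

Comparing sigma against the yield strength of Table III is the final failure check described above.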

III. RESULTS

This chapter presents the results of the project in the same order as the materials and methods section (see II). First, the results of the cleaning process with the average image based on the video input (III-A). Second, the cleaned image after the application of the median filter and the flexible threshold (see III-B). Third, the boundaries of the objects and the identification of the objects within the image (III-C). Fourth, the results of the 3D representation of the detected objects and the identification of the upper outer line pattern (III-D and III-E). Fifth, the representation of the detected upper outer line after the application of a spline function (III-F). Sixth, the results of the stretching calculations of the bladder (III-G).

The first part of the program, the cleaning process and the 3D representation, is demonstrated on video footage of an exemplary Ocean Battery bladder to test the program with realistic noise and disturbances in the water. The 3D representation of the cleaned image is likewise based on the real footage of the exemplary Ocean Battery bladder, as the 3D representation is determined by the different greyness levels of cells within the matrix. The created SolidWorks shapes are already black and white, so no cleaning is required. However, the cleaning was also successfully tested on them. The second part of the program, which focuses on the detection of the upper outer line pattern, the approximation of the detected line and the stretching of the material, is based on the SolidWorks shapes, whose exact dimensions are known and can be altered if wanted, while the exact dimensions of the exemplary bladder are not precisely known.

A. Average image from video footage

The video input for the program had significant amounts of noise and disturbances. An example image is provided in Fig. 11. This image is the foundation of the cleaning process and will be used as a comparison for the effectiveness of the cleaning.

Fig. 11. Image of the original video footage, which was used to assess the effectiveness of the cleaning process of the program.

The coloured video input is fed into the first function (Fig. 4) and an average grey image is created (Fig. 12).

It is visible that this image is an average of all used grey frames of the video, as the corners of the image are smooth and the majority of the noise has already been removed. The majority of the small air bubbles are removed and the bright pattern is still clearly visible.

Fig. 12. Average image created out of 19 frames of the coloured input video.

B. Median filter and threshold

After the creation of an average image, a median filter is applied (Fig. 13). The result looks very similar to the average image, as both techniques utilise a similar approach: the median filter assigns a cell the median greyness value of the cells in its neighbourhood n. After the application of the flexible threshold, a small separated part in the top left can be observed (Fig. 14).


Fig. 13. Image after the application of a median filter with neighborhood n of three.

Fig. 14. Binary image after the application of a flexible thresh- old.

C. Object Identification

The third step in the cleaning process targets the leftover noise by first determining its outer boundaries, followed by the identification of its exact position in the image and lastly the removal of the detected object. The deletion process is not automatic: the program displays two images to the operator, who decides whether the presented object is noise. Fig. 15 visualises an example of a detected object that is noise, while Fig. 16 shows an object that is not noise. After all detected objects have been assigned to noise or not noise, a binary, cleaned image is shown (Fig. 17).

Fig. 15. Output for the operator to decide if the detected object is noise. In this case it is noise.

Fig. 16. Output for the operator to decide if the detected object is noise. In this case it is not noise.

Fig. 17. Result of the cleaning process after all objects have been removed.


If the result after the third step is not satisfactory, it is possible to add an intermediate cleaning step. For that, the outer edges of the binary image (Fig. 17) are extracted with the edge function in MATLAB, and step three is repeated on the edge image alone. Working on the outer edges can be beneficial if noise is still present in the image, especially around or on the line pattern: noise touching the pattern is otherwise identified as part of the pattern and therefore not cleaned. The edge function establishes a finer boundary and thereby occasionally makes it possible to differentiate between the line pattern and additional noise (Fig. 46).

D. 3D representation of the bladder

The resulting 3D representation of the cleaned image is based on the greyness value of each cell. Again, the effectiveness of the cleaning process is assessed by comparing the original image in a 3D environment (Fig. 18) with the cleaned image in 3D. Two different methods are presented for the 3D representation: firstly, the surf function (Fig. 19) and secondly a method that assesses each point of the matrix and represents it in a 3D environment (Fig. 20) [29].
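The point-based representation can be sketched in Python (a minimal stand-in for the MATLAB method of [29]), assuming, as in the text, that the greyness value of each cell serves as the height coordinate:

```python
# Sketch (not the thesis code): turn a cleaned grey image into 3D
# points by using the greyness value of each cell as height z.

def image_to_points(img, background=0):
    """Return (x, y, z) tuples for every non-background cell,
    with x the column, y the row and z the greyness value."""
    return [(j, i, v)
            for i, row in enumerate(img)
            for j, v in enumerate(row)
            if v != background]
```

Skipping background cells is what removes the flat "floor" around the pattern that dominates the surf-style plot of the uncleaned image.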

Fig. 18. 3D representation of the original image.


Fig. 19. 3D representation of the cleaned image (surf).

Fig. 20. 3D representation of the cleaned image (points) [29].

E. Identification of upper line pattern

This and the following sections are discussed on the basis of the created parabolic shape with a 2% increase in length. The results of the other shapes and lengths (with increases of 2%, 5% and 10%) can be found in Fig. 28 - Fig. 36. The stretching of the material is determined from the upper outer line pattern, as the deformation of the shape is uniform. Fig. 21 visualises the detected upper line of the parabolic shape. Some points are disconnected from the rest and small holes are present within the image. These points will be connected to create the desired continuous upper outer line.
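One plausible way to detect the upper outer line can be sketched in Python (not the thesis' MATLAB code): scan each column from the top and keep the first foreground pixel; columns without any foreground pixel produce the holes mentioned above.

```python
# Sketch (not the thesis code): per column, take the topmost
# foreground pixel of a binary image (row 0 is the top of the image).

def upper_line(binary):
    """Return one (column, row) point per column, or None where the
    column contains no foreground pixel (a hole in the line)."""
    rows, cols = len(binary), len(binary[0])
    line = []
    for j in range(cols):
        top = next((i for i in range(rows) if binary[i][j]), None)
        line.append((j, top) if top is not None else None)
    return line
```

The `None` entries are exactly the gaps that the subsequent connection step has to bridge before a continuous line exists.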


Fig. 21. Detected outer upper line of the parabolic shape.

F. Approximated data from upper line pattern

With all points connected, an approximated line can be created for the observed line pattern. Fig. 22 visualises the result: the circles are the real data (detected points in the image), the line is the approximation of the real data, and the stars are the sampled points. The sample is chosen as every tenth measurement point and is used later on to calculate the stretching in the material. The results of the other shapes and lengths are presented in Fig. 37 - Fig. 45.

Fig. 22. Approximation of the detected upper line of the parabolic shape.

TABLE IV
RESULTS OF THE STRETCHING/BENDING CALCULATIONS

Shape (increase in length) | Increase in length [mm] (SolidWorks model) | Measured increase [mm] (MATLAB) | Mismatch to total length [%]
Parabolic (2%)     |  6.24 |  9.056 |     0.88
Parabolic (5%)     | 15.67 | 18.802 |     0.95
Parabolic (10%)    | 31.38 | 19.742 | (-) 3.37
W-shape bend (2%)  |  2    |  2.062 |     0.06
W-shape bend (5%)  |  5    |  0.676 | (-) 4.12
W-shape bend (10%) | 10    |  2.916 | (-) 6.44
Strong bend (2%)   |  2    |  2.195 |     0.21
Strong bend (5%)   |  5    |  7.5   |     2.63
Strong bend (10%)  | 10    |  7.297 | (-) 2.70

G. Stretching results

Lastly, the actual stretching within the object is calculated. Since the approximated data coincides with the actual data, the real data is used. Table IV summarises the results of the stretching calculations for all three objects. The second column of the table represents the actual increase in length of the shape created in SolidWorks in mm. The third column represents the increase calculated by the MATLAB program, and column four represents the percentage mismatch: the difference between the actual increase and the measured increase, relative to the new total length of the object. The mismatch is indicated as positive if the program overestimates the stretch; a negative mismatch represents an underestimation of the stretch within the object.
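The mismatch column can be checked numerically; the reading below (difference between measured and actual increase over the new total length, positive for an overestimate) reproduces the tabulated values, using the original model lengths from the appendix (parabolic 314.2 mm, strong bend 90 mm, w-shape 100 mm):

```python
# Worked check of Table IV's mismatch column (not the thesis code).

def mismatch_percent(original_len, actual_inc, measured_inc):
    """Mismatch of the measured stretch, in percent of the new
    total length; positive when the program overestimates."""
    return 100 * (measured_inc - actual_inc) / (original_len + actual_inc)

# Parabolic shape stretched by 2%: actual 6.24 mm, measured 9.056 mm.
print(round(mismatch_percent(314.2, 6.24, 9.056), 2))   # → 0.88
# W-shape bend stretched by 5%: actual 5 mm, measured 0.676 mm.
print(round(mismatch_percent(100.0, 5.0, 0.676), 2))    # → -4.12
```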

Table V summarises the findings of the calculated stresses and elongations within the created SolidWorks models. The second column represents the engineering strain in the created model based on the increase in length. The third column represents the stress within the created model with a Young's modulus of 6 MPa. Lastly, column four represents the elongation in the material due to the created stretch.
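The values in Table V are reproduced when the strain is taken as the measured increase (MATLAB column of Table IV) over the original model length, with stress following Hooke's law; a short Python check, using the parabolic model's original length of 314.2 mm:

```python
# Worked check of Table V (not the thesis code): engineering strain,
# stress via Hooke's law, and elongation in percent.

E = 6.0  # MPa, Young's modulus of EPDM as used in the thesis

def strain_stress_elongation(original_len, measured_inc):
    """Return (engineering strain, stress in MPa, elongation in %)."""
    strain = measured_inc / original_len
    return strain, E * strain, 100 * strain

# Parabolic (2%): measured increase 9.056 mm on a 314.2 mm model.
s, sigma, e = strain_stress_elongation(314.2, 9.056)
print(round(s, 3), round(sigma, 2), round(e, 2))  # → 0.029 0.17 2.88
```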


TABLE V
RESULTS OF THE ENGINEERING STRAIN AND STRESS CALCULATIONS

Shape (increase in length) | Engineering strain [-] | Stress [MPa] | Elongation [%]
Parabolic (2%)     | 0.029 | 0.17 | 2.88
Parabolic (5%)     | 0.060 | 0.36 | 5.98
Parabolic (10%)    | 0.063 | 0.38 | 6.28
W-shape bend (2%)  | 0.021 | 0.12 | 2.06
W-shape bend (5%)  | 0.007 | 0.04 | 0.68
W-shape bend (10%) | 0.029 | 0.17 | 2.92
Strong bend (2%)   | 0.024 | 0.15 | 2.44
Strong bend (5%)   | 0.083 | 0.50 | 8.33
Strong bend (10%)  | 0.081 | 0.49 | 8.11

IV. DISCUSSION

The discussion follows the same structure as the Materials and Results sections: the first part discusses the insights from the image cleaning (IV-A), the second part focuses on the learnings from the stretching calculations (IV-B), followed by the limitations of the study (IV-C) and its significance (IV-D).

A. Image cleaning

The result of the average image strengthens the assumption that the noise and the disturbances within the water move faster than the pattern itself. The average image significantly reduces the noise in the image while the pattern remains visible with the relevant information still present (Fig. 12). This initial cleaning of the video footage will facilitate future operations immensely.

Another outcome of the average image is that the improvement from the median filter is incremental compared to the average image, as both utilise a similar method to clean the image (Fig. 13). Moreover, it is possible to successfully convert the image to a binary image with the use of a flexible threshold. The selected threshold correctly displays the pattern, while no disturbances are visible around it (Fig. 14). Fig. 14 still contains one object in the top left corner that is identified as noise.

The function to identify and delete noise, based on the input of the operator, can correctly detect and delete the object (Fig. 17). The non-automatic cleaning procedure is more time-consuming for the operator but guarantees that only correct objects are removed. The arbitrary character of the noise makes it difficult to automatically remove all noise consistently without damaging the main object in the picture; thus it was decided with one of the stakeholders to introduce the manual denoising process. Another consideration is to further clean the image with the edge function within MATLAB. This technique is useful if only one image is available as input, so that the majority of the noise cannot already be removed by creating an average image. Additionally, it can be used if the image still contains a lot of noise after the first object detection, or if the noise overlaps with the pattern. The last step of the image cleaning is the 3D representation of the cleaned image.

Firstly, both methods are able to significantly improve on the 3D representation of the original image: both show no noise around the pattern and the pattern is visible. Fig. 20, however, captures the shape of the bladder more realistically, so this method is preferred for the 3D representation. One slight disadvantage of that method is its computation time, which is larger than that of the surf function. The computation time for this small image is between two and ten seconds; an increase in pixels could further increase the computation time, which would be undesired.

B. Stretching calculations

The connected points of the detected upper line are satisfactory and represent the shape of the objects unambiguously (Fig. 21). The outer line is visible and will be used for the subsequent calculations. The same observation can be made for the created models of the strong bend (Fig. 31 - Fig. 33) and the w-shaped bend (Fig. 34 - Fig. 36). The approximation of the data with the spline function is nearly identical to the original data with little to no outliers (Fig. 22). More complex shapes, such as the strong bend and the w-shaped bend, yield similar results (Fig. 40 - Fig. 45). This emphasises that the pattern recognition in the previous section III-E was already precise.

Thus far the recognition of the objects yielded similar results independently of the stretching and the different shapes. The results for the stretching calculations, summarised in Table IV, however, present a different outcome. In general, it can be observed that the program is able to estimate stretching of a small magnitude (2%) precisely. All three shapes have a mismatch of less than 1%, and especially the estimation of the stretch for the complex w-shape deviates by only 0.062 mm.

Moreover, the estimations for a stretch of 2% are overestimations for all three shapes. An overestimation is more desirable than an underestimation, as it introduces a factor of safety for the material. Another insight is that the program deviates further from the actual stretching as the stretching increases: the largest mismatch between the actual and the measured stretch is always present at the largest stretch (10%). Additionally, the estimation of the 10% stretch for all three shapes is an underestimation of the actual stretch. This is not desired and could become problematic for higher stretches. A final observation is an increase in the deviation of the measurements for more complex shapes (complex here meaning more bends and sharper angles). The parabolic model is the simplest shape and has the most accurate measurements (0.88%, 0.95% and (-) 3.37%). The most complicated w-shape has the least accurate measurements (0.06%, (-) 4.12% and (-) 6.44%), while the strong bend lies between the two.

Lastly, Table V summarises the results of the calculations for the engineering strain, stress and elongation of the created shapes due to the stretching. The material properties of EPDM, such as the yield strength (17 MPa) and the maximum elongation (100% - 600%), are known and summarised in Table I. The maximum stretching of the material does not exceed 10%, so the stress within the material should not exceed the yield strength Y. This is confirmed by the results, with a maximum stress of 0.5 MPa for the strong bend. All calculations support that no critical stretching occurs within the material, which was expected.

C. Limitations

The project acted as a first step to validate the theoretical simulations and expectations with empirical data from the prototype of the Ocean Grazer. One of the first questions was whether accurate object tracking is possible with the current setup of the Ocean Battery. The project was influenced by two major limitations. The first was the lack of real video footage from within the Ocean Battery bladder to observe its movement during the inflation and deflation process. The absence of real data led to the creation of artificial models with idealised and uniform deformation of the bladder. Moreover, the calculations for the stretching within the material were conducted in a 2D environment and not in the 3D environment seen in Fig. 22. Lastly, the lack of real footage meant that only the first part of the program, the cleaning of the image, was tested on real data. The second type of limitation is due to technical and time constraints within the project. The cleaned image is represented in a 3D environment by differentiating between greyness levels within the image, where a higher greyness level indicates a closer object. It is a common technique to leverage different greyness levels and brightness scales in object tracking and recognition [37]. Despite this, the technique is prone to be negatively influenced by bright light sources. Another limitation concerns the perceived shift of objects, lines and distances due to the camera angle towards the detected object: an object that is not straight in the camera view may be detected as smaller or bigger depending on the position of the camera relative to the object, which creates this visual shift. This is predominantly important for the stretching calculations of the bladder but was not incorporated in the methods. A final limitation is the method and equations used to convert the length of the bladder from meters to pixel length and back to meters (15 - 20). This round-trip conversion is prone to errors in the calculations and is more sensitive to irregularities. The calculated results for the stretching therefore always also depend on the deviation between the pixel length and the measurements in meters.
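The round-trip conversion can be illustrated with a hypothetical sketch (the thesis' own equations 15 - 20 are not reproduced here): a scale factor is fixed from one known length, so any error in the pixel measurement propagates into every converted result.

```python
# Hypothetical sketch of the meters-to-pixels round trip the text
# flags as error-prone; names and structure are assumptions, not
# the thesis' equations.

def scale_px_per_m(known_length_m, known_length_px):
    """Fix the scale from one reference length seen in the image."""
    return known_length_px / known_length_m

def to_pixels(length_m, scale):
    return length_m * scale

def to_meters(length_px, scale):
    return length_px / scale
```

At a scale of, say, 2000 px/m, a single-pixel detection error already corresponds to half a millimetre, which illustrates why the stretching results are sensitive to this conversion.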

D. Significance

The results and insights from this project will help the Ocean Grazer B.v. as the first step towards a tracking algorithm of their Ocean Battery bladder. The project highlighted that it is already possible to create a 3D representation of the bladder with current video footage.

On the other hand, it also pointed out the current limitations in precisely calculating the stretch in the material.

The recommendations presented in the next section (V) can be of special interest to the Ocean Grazer, as they describe specific methods to further improve the tracking of the bladder in the future. Lastly, the results also showed that the program is able to determine the stretching precisely at smaller magnitudes. This is important to the company, as the Ocean Grazer expects no or only incremental stretching of the bladder.

V. RECOMMENDATIONS

To further improve the tracking of the bladder and to facilitate future projects, a few recommendations follow to improve the quality and the accuracy of the tracking. A first recommendation is to add a second camera (view) at locations of special interest. It is already known which regions of the bladder are most likely to experience folding or stretching in the system.

This is predominantly true for the corners of the bladder.

It is possible to add a second camera that captures the bladder from a different side and perspective. On the one hand, the addition of a second camera would help to capture the entire folding and stretching process. This can be of great help if stretching occurs in regions that are not in the field of view of the first camera, since the second camera would be able to capture it. On the other hand, it would make it possible to create a 3D representation of the bladder not solely from the greyness scale of the images and videos, but from a combination of the two camera views.

To further improve the focus of the cameras it is possible to attach quick response codes (QR codes) to the regions of special interest. Since the regions of interest of the bladder are known beforehand, it is possible to add four QR codes at the boundaries of such a region.

It is already known that software programs are capable of detecting QR codes within noisy and arbitrary images with detection accuracies above 90% in fractions of a second [34]. This is also the case for blurry QR codes, with detection accuracies above 90%, and for deformed QR codes, with detection accuracies above 80% [35].

Preselecting the region of interest can significantly reduce the computation time of the code and already eliminate disturbances in the image that are outside of this region.

Another method to further improve the recognition of the pattern is to use more sophisticated cameras. Many state-of-the-art programs compute 3D shapes based on depth perception from RGB or greyscale images [36], [37]. Better images of the bladder itself could facilitate both the 3D representation of the bladder and future stretching calculations. The preliminary images from the inside of the Ocean Battery had a bright tone over the entire image and pattern; even for human eyes it was hard to distinguish between the white pattern and the normally black background.

Lastly, an entirely new approach for the pattern detection could be the application of infrared ink to the pattern [30]. It would be possible to mix the IR ink into the pattern on the inside of the bladder and have a special camera detect it. That would eliminate the need to clean all images and video footage, as only the IR pattern would be visible and not the noise. This could, on the one hand, reduce the computation time of the code; on the other hand, it could yield sharper images of the pattern and thus more accurate results.

VI. CONCLUSION

The suggested method to clean real-life footage of the Ocean Battery is successful, especially for video inputs.

The first step, generating an average image based on all frames to filter out the noise, is the most influential step in cleaning the frames. It is possible to achieve a very accurate 3D representation of the bladder based on the greyness values. However, this method is limited to coloured or grey images; black and white images as input do not yield the same results. Currently this is not problematic, as the future inputs (videos from the Ocean Battery) will be coloured and not binary. The stretching calculations, based on different shapes created in SolidWorks, yielded two different results. Firstly, it is possible to determine a stretch of 2% within the uniform object with a maximum percentage mismatch of less than 1%; more complex shapes with multiple edges yielded the worst results. Secondly, the program overestimated the 2% stretch for all three shapes, while it consistently underestimated the 10% stretch. The results of the stretching calculations are influenced by limitations such as the conversion of meters to pixel length and the lack of real footage of the bladder.

Finally, several improvements are suggested to increase the accuracy of the tracking in the future, such as the application of infrared ink to the pattern or QR codes to define the region of interest.

ACKNOWLEDGMENT

Thanks are due to Prof. dr. ir. B. Jayawadhana and Dr. M. Muñoz Arias for their continued assistance and great feedback on my progress and results throughout the entire Bachelor Integration Project. I always had the opportunity to run any questions and problems by them, also on short notice. Moreover, I want to thank Drs. W. Prins for his weekly individual meetings and his constructive and critical feedback on my ideas, which helped me a lot to develop and strengthen my project.

The same goes for Prof. dr. A. Vakis and the entire Ocean Grazer research group, for the weekly group meetings and my progress update presentations.

Lastly, I would like to thank Marijn van Rooij for letting me visit the Ocean Grazer workshop in Eemshaven to familiarise myself with the prototype of the Ocean Battery.

REFERENCES

[1] Vestas, "Global consumer wind study 2012," 1st edition, Vestas Wind Systems A/S, Copenhagen, DK, Sep. 2012. [Online]. Available: file:///C:/Users/Nicolas/Downloads/Global-Consumer-Wind-Study-2012 Vestas%20(2).pdf

[2] K&L Gates, SNC Lavalin and Atkins, "Offshore wind handbook," 2nd version, Seattle, WA, USA, Oct. 2019. [Online]. Available: https://files.klgates.com/files/uploads/documents/2019 offshore wind handbook.pdf

[3] A. Nghiem and I. Pineda, "Wind energy in Europe: Scenarios for 2030," WindEurope, Brussels, BE, Sep. 2017. [Online]. Available: https://windeurope.org/wp-content/uploads/files/about-wind/reports/Wind-energy-in-Europe-Scenarios-for-2030.pdf

[4] P. Sorensen, N. A. Cutululis, A. Vigueras-Rodríguez, L. E. Jensen, J. Hjerrild, M. H. Donovan and H. Madsen, "Power fluctuations from large wind farms," IEEE Transactions on Power Systems, vol. 22, no. 3, pp. 958–965, Jul. 2007. DOI: 10.1109/TPWRS.2007.901615

[5] P. Pinson, L. Christensen, H. Madsen, P. E. Sørensen, M. H. Donovan and L. E. Jensen, "Regime-switching modelling of the fluctuations of offshore wind generation," Journal of Wind Engineering and Industrial Aerodynamics, vol. 96, no. 12, pp. 2327–2347, Dec. 2008. DOI: 10.1016/j.jweia.2008.03.010

[6] H. Zhao, Q. Wu, S. Hu, H. Xu and C. N. Rasmussen, "Review of energy storage system for wind power integration support," Applied Energy, vol. 137, pp. 545–553, Jan. 2015. DOI: 10.1016/j.apenergy.2014.04.103

[7] J. C. Reboredo and A. Ugolini, "The impact of energy prices on clean energy stock prices. A multivariate quantile dependence approach," Energy Economics, vol. 76, pp. 136–152, Oct. 2018. DOI: 10.1016/j.eneco.2018.10.012

[8] J. N. Tsitsiklis and Y. Xu, "Pricing of fluctuations in electricity markets," European Journal of Operational Research, vol. 246, no. 1, pp. 199–208, Oct. 2015. DOI: 10.1016/j.ejor.2015.04.020

[9] OceanGrazer.com, "Offshore hybrid renewable energy harvest and storage device." [Online]. Available: https://oceangrazer.com/

[10] A. Kuncorojati, "Fold Structure Detection Analysis during Inflation and Deflation Process on Bladder Reservoir of Ocean Grazer using 3D Finite Element Method," M.S. thesis, Industrial Engineering and Management, University of Groningen, Groningen, NL, 2020. [Online]. Available: http://fse.studenttheses.ub.rug.nl/id/eprint/21973

[11] O. Omidvar, "Shape recognition," in Progress in Neural Networks, vol. 6, Intellect Ltd, 1999, pp. 320.

[12] Y. J. Cha, W. Choi, G. Suh, S. Mahmoudkhani and O. Büyüköztürk, "Autonomous structural visual inspection using region-based deep learning for detecting multiple damage types," Computer-Aided Civil and Infrastructure Engineering, vol. 33, no. 9, pp. 731–747, Nov. 2017. DOI: 10.1111/mice.12334

[13] M. Quintana, J. Torres and J. Menéndez, "A simplified computer vision system for road surface inspection and maintenance," IEEE Transactions on Intelligent Transportation Systems, vol. 17, no. 3, pp. 608–619, Mar. 2016. DOI: 10.1109/TITS.2015.2482222

[14] S. Xu, J. Wang, X. Wang and W. Shou, "Computer vision techniques in construction, operation and maintenance phases of civil assets: A critical review," Proceedings of the 36th International Symposium on Automation and Robotics in Construction (ISARC), pp. 672–679, 2019. DOI: 10.22260/ISARC2019/0090

[15] F. Maire, "Vision based anti-collision system for rail track maintenance vehicles," 2007 IEEE Conference on Advanced Video and Signal Based Surveillance, London, UK, Sep. 2007. DOI: 10.1109/AVSS.2007.4425305

[16] S. Ren, K. He, R. Girshick and J. Sun, "Faster R-CNN: Towards real-time object detection with region proposal networks," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 39, no. 6, pp. 1137–1149, Jun. 2017. DOI: 10.1109/TPAMI.2016.2577031

[17] G. Litjens, T. Kooi, B. E. Bejnordi, A. A. A. Setio, F. Ciompi, M. Ghafoorian, J. A. van der Laak, B. van Ginneken and C. I. Sánchez, "A survey on deep learning in medical image analysis," Medical Image Analysis, vol. 42, pp. 60–88, Dec. 2017. DOI: 10.1016/j.media.2017.07.005

[18] X. He, S. Yan, Y. Hu, P. Niyogi and H. J. Zhang, "Face recognition using Laplacianfaces," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 3, pp. 328–340, Apr. 2005. DOI: 10.1109/TPAMI.2005.55

[19] S. V. Chhaya, S. Khera and P. Kumar, "Basic geometric shape and primary color detection using image processing on Matlab," International Journal of Research in Engineering and Technology, vol. 4, no. 5, pp. 505–509, May 2015.

[20] R. Hussin, M. R. Juhari, N. W. Kang, R. C. Ismail and A. Kamarudin, "Digital image processing techniques for object detection from complex background image," Procedia Engineering, vol. 41, pp. 340–344, 2012. DOI: 10.1016/j.proeng.2012.07.182

[21] V. Goel, S. Singhal, T. Jain and S. Kole, "Specific color detection in images using RGB modelling in MATLAB," International Journal of Computer Applications, vol. 161, no. 8, pp. 38–42, Mar. 2017. DOI: 10.5120/ijca2017913254

[22] Y. Wan and Q. Xie, "A novel framework for optimal RGB to grayscale image conversion," 2016 8th International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC), IEEE, vol. 2, pp. 345–348, Dec. 2016. DOI: 10.1109/IHMSC.2016.201

[23] Y. Hui and H. Ji, "Research on Image Median Filtering Algorithm and Its FPGA Implementation," 2009 WRI Global Congress on Intelligent Systems, IEEE, vol. 3, pp. 226–230, Aug. 2009. DOI: 10.1109/GCIS.2009.130

[24] M. Storath and A. Weinmann, "Fast Median Filtering for Phase or Orientation Data," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 40, no. 3, pp. 639–652, Mar. 2018. DOI: 10.1109/TPAMI.2017.2692779

[25] C. Solomon and T. Breckon, Fundamentals of Digital Image Processing: A practical approach with examples in Matlab. John Wiley & Sons Ltd, 2011.

[26] G. Chen, T. D. Bui and A. Krzyżak, "Image denoising with neighbour dependency and customized wavelet and threshold," Pattern Recognition, vol. 38, no. 1, pp. 115–124, Jan. 2005. DOI: 10.1016/j.patcog.2004.05.009

[27] S. Sidhik, "Comparative study of Birge–Massart strategy and unimodal thresholding for image compression using wavelet transform," Optik, vol. 126, no. 24, pp. 5952–5955, Dec. 2015. DOI: 10.1016/j.ijleo.2015.08.127

[28] MathWorks (Help Center), Spline function. [Online]. Available: https://nl.mathworks.com/help/matlab/ref/spline.html

[29] Z. Taylor (2021), Find 3D Normals and Curvature (Version 1.4.0.0) [Source code]. [Online]. Available: https://www.mathworks.com/matlabcentral/fileexchange/48111-find-3d-normals-and-curvature

[30] G. Narita, Y. Watanabe and M. Ishikawa, "Dynamic projection mapping onto deforming non-rigid surface using deformable dot cluster marker," IEEE Transactions on Visualization and Computer Graphics, vol. 23, no. 3, pp. 1235–1248, Mar. 2017. DOI: 10.1109/TVCG.2016.2592910

[31] AZO Materials, "Ethylene Propylene Rubbers – Properties and Applications of Ethylene Propylene Diene (EPDM) and Ethylene Propylene Copolymers (EPM)," 2003. [Online]. Available: https://www.azom.com/article.aspx?ArticleID=1822

[32] MatWeb (Material Property Data), "Ethylene Propylene Rubber (EPM, EPDM)," n.d. [Online]. Available: http://www.matweb.com/search/datasheet.aspx?matguid=f8e33-55cc2c541fbb0174960466819c0ckck=1

[33] EPDM Rubber, "EPDM Rubber," n.d. [Online]. Available: https://ancoviello20.wixsite.com/epdm

[34] L. Belussi and N. Hirata, "Fast QR code detection in arbitrarily acquired images," 2011 24th SIBGRAPI Conference on Graphics, Patterns and Images, IEEE, pp. 281–288, Aug. 2011. DOI: 10.1109/SIBGRAPI.2011.16

[35] P. Gaur and S. Tiwari, "Recognition of 2D barcode images using edge detection and morphological operation," International Journal of Computer Science and Mobile Computing, vol. 3, no. 4, pp. 1277–1282, Apr. 2014. DOI: 10.47760/ijcsmc

[36] A. Crivellaro, M. Rad, Y. Verdie, K. M. Yi, P. Fua and V. Lepetit, "Robust 3D object tracking from monocular images using stable parts," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 40, no. 6, pp. 1465–1479, May 2017. DOI: 10.1109/TPAMI.2017.2708711

[37] S. Giancola, J. Zarzar and B. Ghanem, "Leveraging shape completion for 3D Siamese tracking," Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, pp. 1359–1368, Jun. 2019.

[38] MathWorks (Help Center), Penalized threshold for wavelet 1-D or 2-D denoising. [Online]. Available: https://nl.mathworks.com/help/wavelet/ref/wbmpen.html

[39] MathWorks (Help Center), Identifying Round Objects. [Online]. Available: https://nl.mathworks.com/help/images/identifying-round-objects.html

APPENDIX I

Fig. 23. Legend for the flowcharts. Objects are explained with role and function within the flowchart.

Fig. 24. Pseudo-code to determine a region of interest (ROI) for an arbitrary image.


Fig. 25. Created shape in SolidWorks: Original parabolic bladder (314.2mm).

Fig. 26. Created shape in SolidWorks: Original strong bend bladder (90mm).

Fig. 27. Created shape in SolidWorks: Original w-shape bladder (100mm).

Fig. 28. Detected outer upper line of the bladder from the initial video.

Fig. 29. Detected outer upper line of the created parabolic shape (stretched by 5%).


Fig. 30. Detected outer upper line of the created parabolic shape (stretched by 10%).

Fig. 31. Detected outer upper line of the created shape with a strong bend (stretched by 2%).

Fig. 32. Detected outer upper line of the created shape with a strong bend (stretched by 5%).

Fig. 33. Detected outer upper line of the created shape with a strong bend (stretched by 10%).
