
7. Verification and Validation

7.2 Verification process

The verification process is expected to answer the question, “Are we building the system right?”. To answer this question, a prototype implementation was developed for the main use cases of the camera driver. The main use cases of the component are driver startup and shutdown, initialization and termination of the image sensor’s hardware components, sensor activation, queue scan, and frame grabbing. Testing was performed to verify the main functionality of the driver and to identify the challenges and consequences in testing when migrating from C to C++. To verify these functionalities, unit tests, integration tests, and system tests were performed using the internal ASML tools ATTEST, Devbench, and Testbench, respectively. The unit, integration, and system tests are explained in detail in the following subsections.

7.2.1. Unit testing

A unit test is an automated piece of code that isolates a unit of work in a system and verifies its correctness. A unit of work is usually, but not limited to, a single method or class. The ASML toolset ATTEST, a unit testing framework based on Google Test and Google Mock, is used to create and run the unit tests. The following steps are used to test the functions of the camera driver classes:

a) Define the System Under Test (SUT)
b) Generate test doubles
c) Create test cases

Define the System Under Test (SUT)

The first step in applying a unit test is to define the SUT. In this project, a class is considered a SUT, and a set of test cases was created for the member functions of each class.
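To make the later examples concrete, the sketch below shows what such a SUT could look like in C++. The class name COBInitialization is taken from the example discussed further below, but the ICOBHardware interface, the member names, and the BIST handling shown here are hypothetical simplifications for illustration, not the actual ASML driver code.

// Hypothetical sketch of a SUT; names and members are illustrative only.
#pragma once

#include <memory>
#include <stdexcept>

// Collaborator interface the SUT depends on; keeping it abstract is what
// allows a test double to be substituted in a unit test.
class ICOBHardware {
public:
    virtual ~ICOBHardware() = default;
    virtual bool run_built_in_self_test() = 0;  // BIST
    virtual void power_up() = 0;
};

enum class COBState { TERMINATED, INITIALIZED };

// The SUT: initializes the COB hardware through the injected interface.
class COBInitialization {
public:
    explicit COBInitialization(std::shared_ptr<ICOBHardware> hw)
        : hw_(std::move(hw)) {}

    void initialize_cob() {
        if (!hw_->run_built_in_self_test()) {
            throw std::runtime_error("COB BIST failed");
        }
        hw_->power_up();
        state_ = COBState::INITIALIZED;
    }

    COBState state() const { return state_; }

private:
    std::shared_ptr<ICOBHardware> hw_;
    COBState state_ = COBState::TERMINATED;
};

Injecting the hardware dependency through an abstract interface is what makes it possible to replace it with a test double in the next step.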


Generate test doubles

A unit test generator from the ATTEST toolset is used to generate the makefiles and mocks needed to develop a unit test for the SUT. For C code, the tool generates all the test doubles, or mocks, needed to break the dependencies of the SUT. For C++ code, however, some dependencies cannot be generated fully by the unit test generator. Another ATTEST tool, the test double generator, is used to overcome this limitation; it takes a C++ header file as input and generates the test doubles needed by the SUT.
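Since ATTEST is based on Google Mock, a generated test double is essentially a Google Mock class derived from the dependency’s interface. The sketch below illustrates what such a double could look like for the hypothetical ICOBHardware interface introduced above; the actual generator output may differ.

// Hypothetical example of a test double for the ICOBHardware interface
// sketched earlier, written with standard Google Mock macros.
#include <gmock/gmock.h>

class MockCOBHardware : public ICOBHardware {
public:
    MOCK_METHOD(bool, run_built_in_self_test, (), (override));
    MOCK_METHOD(void, power_up, (), (override));
};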

Create test cases

After the above steps are followed and all the dependency mocks are generated, the next step is to develop the test cases. For demonstration purposes, consider the class COBInitialization, which is responsible for initializing the COB hardware. Both good weather and bad weather test scenarios were tested. A good weather scenario is one in which the sequence of events to initialize the COB succeeds, whereas a bad weather scenario is one in which the sequence fails. Table 5 describes the two test cases for the good and bad weather scenarios.

Table 5: Test cases for COB initialization

Test case name                            Description
test_initialize_cob_board_good_weather    Given state = TERMINATED and BIST is TRUE
                                          When initialize_cob is invoked
                                          Then state = INITIALIZED
test_initialize_cob_board_bad_weather     Given state = TERMINATED and BIST is FALSE
                                          When initialize_cob is invoked
                                          Then an exception is thrown
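Expressed with Google Test and Google Mock, the two test cases in Table 5 could look roughly like the sketch below. It builds on the hypothetical COBInitialization and MockCOBHardware sketches above and illustrates the test structure only; it is not the actual ATTEST-generated test code.

#include <gtest/gtest.h>
#include <gmock/gmock.h>
#include <memory>
#include <stdexcept>

using ::testing::Return;

// Good weather: BIST passes, so initialize_cob must move the SUT to INITIALIZED.
TEST(COBInitializationTest, test_initialize_cob_board_good_weather) {
    auto hw = std::make_shared<MockCOBHardware>();
    EXPECT_CALL(*hw, run_built_in_self_test()).WillOnce(Return(true));
    EXPECT_CALL(*hw, power_up()).Times(1);

    COBInitialization sut(hw);                       // Given: state = TERMINATED
    sut.initialize_cob();                            // When:  initialize_cob is invoked
    EXPECT_EQ(sut.state(), COBState::INITIALIZED);   // Then:  state = INITIALIZED
}

// Bad weather: BIST fails, so initialize_cob must throw and leave the SUT terminated.
TEST(COBInitializationTest, test_initialize_cob_board_bad_weather) {
    auto hw = std::make_shared<MockCOBHardware>();
    EXPECT_CALL(*hw, run_built_in_self_test()).WillOnce(Return(false));

    COBInitialization sut(hw);                                // Given: state = TERMINATED
    EXPECT_THROW(sut.initialize_cob(), std::runtime_error);   // Then:  exception thrown
    EXPECT_EQ(sut.state(), COBState::TERMINATED);
}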

Figure 23 shows the result of the two test cases after they were developed and built successfully.

Figure 23: Test case results for COB initialization

In a similar fashion, the other classes of the camera driver were tested. At least one test case was developed and run for each class. The test output for each of the test cases is available in the development repository.

7.2.2. Integration testing

After the main use cases of the camera driver were developed, an integration test was applied to verify that the different modules within the camera driver are well integrated with each other and that the camera driver is well integrated with its client. As shown in Figure 24, the camera driver is regarded as a black box in the test environment.

The client uses the driver’s external interfaces to execute the internal behavior of the camera driver.

The camera driver was run in simulation mode 2, which means that its behavior was tested in a virtual machine that has no real hardware.

Figure 24: The SUT and its test environment

Moreover, the behavior of an existing firmware component, which the camera driver requires, is stubbed. The integration test was applied on Devbench, a virtual machine that has no machine-specific hardware, for each of the main use cases of the driver. As an illustration, the integration test for the step-wise initialization use case is described here. In the step-wise initialization, multiple hardware components have to be initialized in sequence. All the existing test cases on the Devbench were covered, and some of these test cases are listed in Table 6.

Table 6: Existing sample test cases used in integration test

Test case number    Test case
Test_case_1         Test HPM initialization
Test_case_2         Test HSSL initialization
Test_case_3         Test DMA initialization
Test_case_4         Test SBD scan initialization
Test_case_5         Test HPM Board initialization
Test_case_6         Test COB initialization
Test_case_7         Test Relay Board initialization
Test_case_8         Test Sensor Board initialization

Devbench has a graphical user interface to run the complete step-wise initialization of the driver, including the test cases listed in Table 6. The following steps were used to run the tests on the Devbench:

a) Create and activate Devbench from the project Clearcase view
b) Synchronize the driver code to the Devbench platform
c) Set the simulation mode of the Image sensor subsystem driver to DEVBENCH
d) Start the TwinScan application in the Devbench
e) In the Devbench, run the image sensor subsystem driver as a stand-alone application (ADT tool). This will open a GUI with the main functionality options of the driver
f) From the GUI, click the step-wise initialization button and a new dialog box will be opened to test the driver initialization

The GUI of the step-wise initialization provides options to test the driver initialization steps one by one or all at once. Either way, when a step-wise test case is run, the corresponding box turns green if the test case is successful; otherwise, it turns yellow and the program terminates with an error. Figure 25 shows the results before and after running the step-wise initialization test cases.

Figure 25: Integration test results before and after step-wise initialization

7.2.3. System testing

After successfully testing the main use cases of the camera driver in the Devbench, the driver behavior should be tested in an environment that involves real hardware. By doing this, we verify the correctness of the driver commands sent to the image sensor hardware components and the integration of the driver not only with its client but also with the rest of the TwinScan software components. The main use case of the camera driver is to capture wavefronts of the TwinScan at the wafer stage. To test this use case, a test platform called Testbench is used. The Testbench is similar to the Devbench, but it involves real hardware modules and other required software components that are not stubbed. The following steps were used to test the frame grabbing use case:

a) Create a patch from the project stream view
b) Install the created patch on the Testbench and configure hardware setups if necessary
c) Start the TwinScan application in the Testbench
d) Initialize all the drivers in the TwinScan machine
e) Grab a frame using one of the configured sensor types

The test was successful, and we were able to capture a frame using one of the available sensor types in real time. Figure 26 shows one of the captured frames.

Figure 26: System test result for the frame grabbing use case