
This chapter deals with the validation of the meta-models that were built. Two past assessments done by Altran are taken as reference cases to validate the meta-models and our assessment tool.

A CMMI assessment and a TMMi assessment for a specific product quality characteristic are carried out to validate the tool.

6.1 Case Study for CMMI

Figure 6.1 shows a CMMI assessment that Altran carried out for the Reliability characteristic. As can be seen in the figure, 12 Process Areas were chosen for the assessment, some from CMMI-SVC and some from CMMI-DEV. The Process Areas are listed in the first row of the Excel sheet. They were chosen because the assessment team decided that these are the Process Areas that affect Reliability. The Process Areas are Requirements Management (REQM), Process and Product Quality Assurance (PPQA), Configuration Management (CM), Service Delivery (SD), Incident Resolution and Prevention (IRP), Service System Development (SSD), Service Continuity (SC), Requirements Development (RD), Technical Solution (TS), Product Integration (PI), Verification (VER) and Validation (VAL). As can be seen in the bottom right corner of the figure, the Overall Score is 3.24. Except for REQM, all scores of the practices of the individual Process Areas have been blurred out to protect customer-sensitive data. The Process Score or Capability of REQM is 3.24.

Figure 6.1: The Excel sheet of the CMMI assessment

Figure 6.2: Editors' run-time instances needed for a CMMI Reliability assessment

Figure 6.2 shows the run-time instances of all three meta-models, i.e. the different instances that were required to complete a CMMI assessment and thus validate the new tool. Altran had previously performed a CMMI assessment for the Reliability product characteristic; this assessment was redone using the tool and produced the same result as the company had obtained with its Excel workbook.

The two panes in the upper part of the figure show the standards that had to be defined in order to perform the assessment. Both are .standards files and depict the CMMI standard and the ISO 25010 quality model.

The bottom left pane contains the Assessment Definition, in which the assessment is defined for a particular client. Since the client needed the Reliability characteristic assessed, that characteristic is recorded there, and the assessment team selected only those CMMI processes they judged to affect it. This is a .assessment file.
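To make the relationship between these artefacts concrete, the sketch below models an Assessment Definition as plain data. The class and field names are hypothetical and are not taken from the meta-models themselves; only the idea that a definition pairs one quality characteristic with the subset of process areas chosen for it comes from the description above.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ProcessArea:
    # A process area taken from a .standards file, e.g. REQM from CMMI.
    abbreviation: str
    name: str

@dataclass
class AssessmentDefinition:
    # Conceptual stand-in for a .assessment file: one quality characteristic
    # plus the process areas the assessment team judged to affect it.
    client: str
    characteristic: str                      # e.g. "Reliability" from ISO 25010
    selected_areas: List[ProcessArea] = field(default_factory=list)

# Hypothetical definition mirroring the CMMI Reliability case:
reliability_definition = AssessmentDefinition(
    client="(client name withheld)",
    characteristic="Reliability",
    selected_areas=[
        ProcessArea("REQM", "Requirements Management"),
        ProcessArea("PPQA", "Process and Product Quality Assurance"),
        # ... the remaining process areas chosen by the assessment team
    ],
)
```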

The bottom right pane shows the run-time instance of the actual meta-model, i.e. the final CMMI assessment based on the defined assessment. The final assessment score, highlighted by the red rectangular box, is 3.24, the same as the score Altran obtained using Excel in Figure 6.1. Individual process scores can also be inspected via the drop-down option; for example, the Assessment Process Score of the Requirements Management process is 3.23, the same as the Capability Score of REQM in the Excel sheet in Figure 6.1.
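The scoring itself appears to reduce to simple averaging: practice scores roll up into a Process Score, and the process scores roll up into the overall Assessment Score (Section 6.2 states the latter explicitly for TMMi). The snippet below is a minimal sketch of that arithmetic under this assumption; the practice values are invented placeholders, since the real ones are blurred in Figure 6.1, and the function names are not the tool's.

```python
from statistics import mean
from typing import Dict, List

def process_score(practice_scores: List[float]) -> float:
    """Capability of one process area, assumed here to be the mean of its practice scores."""
    return mean(practice_scores)

def overall_score(scores_by_area: Dict[str, float]) -> float:
    """Overall Assessment Score: the mean over all assessed process areas."""
    return mean(scores_by_area.values())

# Placeholder practice scores only -- the real values are confidential.
scores_by_area = {
    "REQM": process_score([3.0, 3.5, 3.2]),
    "PPQA": process_score([3.4, 3.1]),
    # ... one entry per remaining process area in the assessment
}
print(f"Overall Score: {overall_score(scores_by_area):.2f}")
```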

6.2 Case Study for TMMi

Figure 6.3 shows a TMMi assessment that Altran carried out for the Reliability characteristic. As can be seen in the figure, eight Process Areas from the TMMi standard were chosen for the assessment.

The Process Areas are listed in the first row of the Excel sheet. They were chosen because the assessment team decided that these are the Process Areas that affect Reliability. The Process Areas are: Test Policy and Strategy (TPS), Test Planning (TP), Test Design and Execution (TDE), Test Lifecycle Integration (TLE), Non-Functional Testing (NFT), Test Organization (TO) and Peer Reviews (PR). The Overall Score from the assessment is 3.30. Each Process Area has its own individual process score, and the average of the eight process scores gives the assessor the Overall Score. Except for TPS, all scores of the practices of the individual Process Areas have been blurred out to protect customer-sensitive data. The Process Score or Capability of TPS is 3.33.

For the TMMi assessment, Figure 6.4 shows a four-pane view of the run-time instances based on the three meta-models that were made.

The two panes in the upper part of the figure show the standards that had to be defined in order to perform the assessment. Both are .standards files and depict the TMMi standard and the ISO 25010 quality model. As indicated by the red arrows, the data from the top two panes together define the assessment in the bottom left pane. This is a .assessment file, which in turn is used to perform the actual assessment, as indicated by the red arrow pointing to the pane to its right. The actual assessment is a .actual file, which is based on the actual meta-model.
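Read as a pipeline, the red arrows describe a simple dependency chain between the three file types. The sketch below only illustrates that flow; the function names do not exist in the tool and are placeholders for the arrows in Figure 6.4.

```python
# Illustrative data flow only; the file extensions come from the text.

def define_assessment(standard, quality_model, characteristic):
    """Combine two .standards instances into one .assessment definition."""
    return {
        "characteristic": characteristic,   # e.g. "Reliability" from ISO 25010
        "process_areas": [],                # the areas judged to affect it
    }

def perform_assessment(definition, practice_scores):
    """Score the defined processes to produce the .actual run-time instance."""
    return {"definition": definition, "scores": practice_scores}

# .standards (TMMi) + .standards (ISO 25010) -> .assessment -> .actual
actual = perform_assessment(
    define_assessment("TMMi", "ISO 25010", "Reliability"), {})
```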

In the assessment instance, the Assessment Score and the Assessment Process Score are highlighted by red rectangular boxes, so the final results can be compared directly with Figure 6.3. The assessment tool gives more exact values instead of rounding them off, which is why its Assessment Score is 3.29 whereas the Overall Score in the Excel sheet is 3.30. The individual process scores can also be compared with the Excel sheet in Figure 6.3: in the actual run-time instance the process TPS has an Assessment Process Score of 3.33, the same value as in Figure 6.3. Consequently, the tree editor can replace the Altran Excel sheet for completing assessments.
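One way to reconcile the 3.29 and 3.30 figures is that the underlying mean lies just above 3.295: Excel rounds it up to two decimals, while a display that keeps the first two decimals without rounding up shows 3.29. The value in the snippet below is invented purely to illustrate this; the real mean is not disclosed.

```python
import math

exact_mean = 3.2958                                # hypothetical mean of the process scores

excel_display = round(exact_mean, 2)               # 3.3  -> shown as 3.30 in the sheet
tool_display = math.floor(exact_mean * 100) / 100  # 3.29 -> shown without rounding up

print(excel_display, tool_display)                 # 3.3 3.29
```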

Figure 6.3: The Excel sheet of the TMMi assessment

Figure 6.4: Editors' run-time instances needed for a TMMi Reliability assessment

In this chapter we have seen that the assessment tool has thus been validated by providing the same results as those achieved using the Altran Excel templates for CMMI and TMMi assessments.
