Recommendations for a Web Management System: A case study on the improvement of its structuring component with user-centered design

Academic year: 2021

1. Introduction

Today, many websites are built on Content Management Systems (CMS). According to Patel, Rathod & Prajapati (2011), these websites are easy to maintain because the underlying CMS makes it possible to separate content from presentation. CMS are server-based, as the content of the websites is stored in databases. In this way, users can manage their content in a user-friendly environment, and changes are directly visible on the website. The presentation and functions of the content are defined in a theme, which can be created by developers only. Themes are to a certain extent expandable, as CMS offer a large collection of plug-ins. Plug-ins can affect not only the theme, but also the user environment of the CMS itself. CMS are modular systems, because plug-ins can be built on top of other plug-ins.

A study by Farooq, Hussain, Abbas & Hussain (2012) has shown that the three most efficient CMS of today are Joomla, Drupal and Wordpress. Although these systems share the core functions of CMS, there are some differences between them. Of the three, Wordpress is the most popular one, with a CMS market share of 59.5% and a total web market share of 26.4% (“Usage statistics and market share of WordPress for websites”, 2016). Wordpress was first designed as a basic blogging tool, but over the years it has become an advanced CMS with a large collection of plug-ins (Farooq et al., 2012). Patel et al. (2011) note that users can also choose from many available themes. Although plug-ins and themes are created by developers, Wordpress offers good documentation support, so that users know which plug-ins and themes they have to install. Wordpress naturally has an easy-to-use interface. In comparison to Wordpress, Joomla was designed as a more advanced CMS for the development of interactive multilingual websites, supporting the creation of online communities and e-commerce applications in a short time. However, Patel et al. (2011) state that the most advanced and powerful CMS is Drupal: a platform for the creation of robust but flexible websites that can hold complex data. Because the most advanced websites can be built with it, its user environment also requires the highest amount of technical skills.

Emiel de Graaf, University of Amsterdam

Supervisor: Jacobijn Sandberg, FNWI

Abstract: Today, many websites are built on Content Management Systems (CMS). However, with a CMS, users do not have full control over the structure of the page. To overcome these structural limitations, the digital-marketing company Web First is developing a so-called Web Management System (WMS). This WMS has the same core functions as a CMS, but with the big difference that it has an added structuring component. In collaboration with Web First, empirical research on the final functionality of the WMS was done. To this end, the structuring component had to be improved, for which a user-centered approach was applied. For the second part of this study, we researched the effect of this approach on the improvement of the structuring component. Usability was evaluated with three usability tests, in which usability problems were identified with Video Data Analysis. Six participants took part in each test, one of whom participated in a pilot test. Between these tests, there was enough time to address the most severe problems. In the end, most serious problems were resolved, so good recommendations for the functionality of our WMS could be made. However, it appeared that the decrease in less severe problems stagnated at around 30 problems. We suggest that adopting a user-centered approach can resolve almost all of the serious problems, which ensures the evolution of good design. However, to solve the less severe problems as well, designers must make the right decisions during the development process. They therefore have to know when it is better to ignore the user, for example when solving a usability problem might increase complexity. Many designers do not have this knowledge; in that case, we suggest fully adopting the user-centered approach and settling for good design.
This research contributes to the Human-Computer Interaction (HCI) discipline, as it can be used as a case study on the application of the user-centered approach. Furthermore, an indication of the effect of UCD is given. Finally, the recommendations for our WMS contribute to its future development.

It has become clear that the three CMS differ in complexity but share the same core functions. These functions give average users many possibilities with respect to the creation and maintenance of their websites. However, structural restrictions unfortunately still remain. Although plug-ins give users more control over the theme, most plug-ins do not fit with custom-developed themes and other plug-ins. Therefore, if users want to create or change structure that themes cannot achieve, they have no choice but to hire a developer. For this reason, the digital-marketing company Web First is developing a so-called Web Management System (WMS). This WMS has the same core features as a CMS, but with the big difference that it has an added structuring component. This component is at the heart of the WMS and gives users, in contrast to a normal CMS, full control over the structure of their websites. With the structuring component, websites are arranged in blocks, in which users can add their own content. Users have full control over the arrangement of these blocks, because blocks can be infinitely divided into more blocks. In this way, a WMS makes it possible to overcome the structural limitations that a CMS will always have. Users become less dependent on developers, as themes of the WMS only contain the style of the website and no longer the structure.

Because WMS are also modular systems, the functions of the structuring component can easily be extended in the future. However, the structuring component is still at an early stage and there are many problems with regard to its usability. Accordingly, we do not yet know what the final functionality of the WMS will be. For this reason, empirical research on the final functionality of the WMS is done in collaboration with Web First.

Therefore, the structuring component is improved iteratively, using a user-centered approach. Although user-centered design (UCD) is applied in many studies (Vredenburg, Mao, Smith & Carey, 2002; Kangas & Kinnunen, 2005), some researchers emphasize several weaknesses of this approach (Norman, 2005). For this reason, this study also researches the effect of UCD on the improvement of the structuring component. An important question is whether usability problems decrease with the application of the user-centered approach. This paper contributes to the Human-Computer Interaction (HCI) discipline, as it can be used as a case study on the application of UCD. Furthermore, this study contributes to the future development of our WMS, as recommendations for its final functionality are made.

2. Theoretical background

This section provides a theoretical framework for the application of the user-centered approach.

2.1. User-centered design

Although UCD is one of the most dominant approaches in interface and application design (Norman, 2005), research by Gulliksen et al. (2003) has shown that there is a lack of agreement on the definition of UCD. For this reason, Gulliksen et al. combined existing definitions with the outcome of several studies to develop a new definition, which reads as follows: UCD is “a process focusing on usability throughout the entire development process and further throughout the system life cycle” (p. 401), and is based on twelve key principles. For this study, three of these principles will be used as guidelines for the application of UCD:

• User focus: the needs and the behavior of the user should guide the development of the product.

• Evolutionary systems development: the development of the product is both iterative and incremental; design solutions should be iteratively evaluated and improved.

• Evaluate use in context: design must be tested against baselined criteria, which must be defined before the development process.

According to Norman (2000), there are many reasons to apply UCD. First of all, it is important to know who the users of your product are. Bad designs all over the world show that it is important not to forget the user; by listening to the users of your product, these bad designs can be prevented. Furthermore, users get the feeling that they are heard when they are included in the development process, and this also leads to users knowing what to expect. However, Norman states that there are also disadvantages to UCD. One of them is that listening to every single complaint might increase complexity. Furthermore, UCD focuses on static interfaces, with the result that no attention is paid to consistency between system components. Finally, when systems get too complex, it is sometimes impossible to prevent users from first having to learn how to deal with interfaces and applications. In the discussion, both positive and negative aspects are elaborated.

2.2. Usability testing

As discussed, usability is iteratively evaluated in UCD. Emanuel (2013) suggests that this is done with usability tests, with which usability problems and possible solutions can be revealed. Usability tests can be as simple as letting representative users do some typical tasks, but they can also get much more complex with methods such as questionnaires, additional interviews and eye tracking. Most usability tests are qualitative in nature, although quantitative methods such as the measurement of completion time are also available. Given its scope, this paper only focuses on qualitative data such as problems, explanations and suggestions, which results in more detailed data.

According to Emanuel (2013), some researchers question the reliability and validity of usability testing. However, there are several ways to ensure powerful research. Examples are taking away personal bias, treating subjects equally, and selecting representative participants. Furthermore, reliability and validity can be improved with techniques such as pilot tests, member checks, peer review of data and specialized evaluation techniques.

Concerning the right number of participants for a usability test, a study by Nielsen (2000) implies that only five participants are sufficient to discover 85% of the usability problems. Although 15 participants in one test would discover all the usability problems, Nielsen states that it is better to perform three tests with five participants. In this way, solutions for the usability problems of the first two tests can be evaluated after these tests. In addition, problems that were not detected in the first test and still exist will simply be detected in the last two tests. Because of the iterative nature and the qualitative approach of this study, five participants per test is justifiable. These participants can be anyone, as long as they represent the envisaged users of the product (Emanuel, 2013). For this study, it is important to describe the population, including aspects such as age, sex, education and skill level that the researcher deems important. In our research, important demographic data were obtained with questionnaires.
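Nielsen's estimate follows from a simple problem-discovery model, sketched below in Python; the discovery rate p ≈ 0.31 is Nielsen's reported average across studies, not a value measured in this study.

```python
# Problem-discovery model behind Nielsen's "five users" claim: each
# participant is assumed to uncover a fraction p of all usability
# problems, independently of the other participants.
def problems_found(n: int, p: float = 0.31) -> float:
    """Expected share of all usability problems found by n participants."""
    return 1 - (1 - p) ** n

# Five participants already surface roughly 85% of the problems,
# while fifteen approach full coverage:
print(round(problems_found(5), 2))   # about 0.84
print(round(problems_found(15), 2))  # close to 1.0
```

This also illustrates why three tests of five participants beat one test of fifteen: the remaining 15% of problems are carried over and rediscovered in the later iterations, after the first batch of fixes has been evaluated.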

2.3. Moderating usability tests

According to Bergstrom (2013), there are different methods to collect data from usability tests. These methods can roughly be divided into think-aloud methods and probing methods. In the literature, the think-aloud method is probably the most popular one because of its ability to reveal participants' thoughts as they work through issues. In comparison to probing, it does not interfere with the natural thought process and flow of actions. The think-aloud method has some disadvantages, as it can interfere with usability metrics such as accuracy and time on task. However, this is not important for our research, because we do not rely on quantitative data. For this reason, the think-aloud method is recommended. Furthermore, there are differences between the retrospective and the concurrent think-aloud method. The main difference is that the retrospective method lets the participants talk aloud after the usability test (as they retrace their steps), while the concurrent method lets the participants talk aloud during the test. The retrospective method has the advantage that the cognitive flow is not disrupted by the talking itself. However, its disadvantages are the loss of memory and an increased duration of the usability tests. Van Someren, Barnard & Sandberg (1994) suggest that another disadvantage is the occurrence of post-hoc rationalizing: participants reconstruct events as more structured than they were. This can happen intentionally, as participants do not like to admit that they failed and instead pretend that their actions reflect rational behavior. But mostly, the reconstruction happens unintentionally; the participants simply do not know that they are doing it. For these reasons, the concurrent think-aloud method is applied in this paper.

Although a lot of research on the different data collection methods has been done, less research has been done on data analysis techniques. In a study of Kjeldskov, Skov & Stage (2004), two techniques were compared: the traditional Video Data Analysis (VDA) technique and the newer Instant Data Analysis (IDA) technique. With VDA, usability tests are first recorded; after that, the videos are examined to identify usability problems. IDA is a much faster technique, as problems are identified immediately after the usability tests without the use of videos. However, IDA therefore requires the extra help of a data logger and relies on the memory of the researchers. Furthermore, Kjeldskov and his team showed that with VDA 92% of the critical problems were discovered, while with IDA 85% of the critical problems were discovered. Finally, VDA provides a richer and more diverse snapshot of some of the identified problems, as explanations and suggestions are recorded. For this reason, our research uses the VDA technique.

2.4. Interpretation of problems

For the interpretation of the usability problems, Nielsen (1994b) suggests that rating their severity is a reliable method. Such severity ratings are useful in the user-centered approach, as they can be used to prioritize problems and allocate the right amount of resources. The severity of a usability problem depends on three factors: the frequency of its occurrence, its impact and its persistence. However, it is common to combine the three aspects in a single severity rating. Although Nielsen showed that there is a positive correlation between the rating and the frequency of usability problems, there are also serious usability problems that occur just once. This must be taken into account when assigning severity scores.

3. Design

3.1. Design

For this study, the user-centered approach was used to evaluate and improve the structuring component, so that in the end recommendations for the final functionality of our WMS could be made. As mentioned before, one of the key principles of UCD is evolutionary system development: UCD is iterative and incremental. Therefore, the usability of the structuring component was evaluated at three different stages. At each stage, usability problems were identified with a usability test. These problems were combined with any explanations and suggestions on how to resolve them. Because time was limited, we did not try to solve all the problems; instead, we tried to come up with solutions for the most severe ones. Eventually, the first test served as a baseline, as its usability problems were compared with those of the final test to measure the effect of UCD. Furthermore, the course of the usability problems was discussed. After the improvement of the structuring component, recommendations for the final functionality of our WMS could be made.

3.2. Participants

For each test, 6 participants were used, one of whom participated in a pilot test. This pilot test was done at least one day before the start of the other tests. A pilot test is the same as a normal usability test, but as described in “Running a usability test” (n.d.), there are a few differences. First of all, a pilot test can be used to test the test itself; in this way, ambiguities can be prevented. Secondly, any bugs that may affect the test can be discovered. Finally, the facilitator has the opportunity to practice his attitude toward the participants.

Before the selection of the participants, the target group was defined as “average and experienced web-users who want to build a website”. After that, representative participants were selected with purposive sampling. This ensured the detection of a wide variety of problems as different types of participants were selected. As shown in Appendix A, 39% of the participants were men, and 61% were women. Only a few participants had used a CMS or an online web-builder before. None of them were totally inexperienced software users.

3.3. Materials

3.3.1. Demographic data

Before the start of a usability test, demographic data about the participants were collected with a questionnaire. Of the 16 questions, 5 were general questions about name, age, gender, occupation and education. The other 11 were questions about skills related to software, web usage and site building. These questions were asked in Dutch and answered on a 5-point Likert-type scale. The demographic data are shown in Appendix A.

3.3.2. Setting

Each usability test took place in a closed room with only one participant and one facilitator inside. The participant was seated in front of a personal computer on which the current version of the WMS was running. Both the screen and the voices of the participant and the facilitator were recorded with OBS Classic 0.657; for the voices, a standard microphone was used.

3.3.3. Usability test procedure

Before the start of a usability test, the facilitator welcomed the participant and handed out the questionnaire, which had to be completed first. After that, the facilitator explained the purpose of the WMS and the procedure of the usability test. The participant was told that he/she could not do anything wrong, as only the structuring component was examined. Furthermore, the facilitator asked the participant to think aloud during the test. When the introduction was completed, the participant could start with the task.

In the first part of the task, participants could get comfortable with the think-aloud method through a few introductory subtasks. During the test, the facilitator adopted an active approach, pushing the participants to think aloud and asking them to explain their behavior if it was unclear. The facilitator stayed objective, as only neutral questions were asked, without steering a participant in a specific direction. Participants were not helped with completing their tasks unless they got stuck; in that case, just enough help was given to get the participant back on track. After the participant completed the task, additional questions were asked to get overall feedback on the structuring component.

3.3.4. The Web Management System

In this study, our WMS can be seen as the central system, because its structuring component was iteratively evaluated and improved. In Figure 2, the first version of our WMS is shown. This version included the basic functions of a CMS: just as with a CMS, users could create pages and manage the content on these pages, and the style of this content could be defined in a theme. With the structuring component, users could build their website in blocks and add content to these blocks. In this way, users could manage both the structure and the content of their website. All this happened in the back-end of the WMS, so when users wanted to see the results, they had to activate a page renderer. In the first version of our WMS, two types of blocks were available: 'Rows' and 'Columns'. Rows could only be placed below each other inside a Column, while Columns could only be placed next to each other inside a Row. This way, users could infinitely divide their website both horizontally and vertically into blocks. After adding a block, users had the opportunity to add content to it. In the first version of our WMS, the only content available were so-called Articles, which were coded in HTML. These Articles could be managed with a related Article plug-in. Our WMS was already a modular system, as other plug-ins could be added in the future.

3.3.5. The task

The task for the usability test was developed in collaboration with Web First and was representative of the actual use of the structuring component. The task was divided into two parts. In the first part, the participants were familiarized with the concurrent think-aloud method. This part consisted of three small subtasks:

• Create a new page with the name 'Page' - do not change the URL

• Rename the page to 'Home' and set the page as homepage

• Show the result of the website

In the second part, the structuring component was examined as participants had to replicate a website, which was shown in an image (see Figure 1). We created a theme for the users in which the style of the website was defined. As discussed, WMS give users complete freedom over the structure of their website. Therefore, the participants had to structure the website by building the blocks that are indicated with a red border in Figure 1. For the structuring of the page, the participants only had to add the blocks inside the outer container. As shown in Figure 1, a red label was drawn in each of these blocks. These labels referred to so-called Articles, which contained the content of the website. These Articles were created in advance, so the participants only had to link the right Article to the right block. The participants successfully completed the task when the website corresponded with the image that they had to replicate.

3.3.6. Additional questions

After each usability test, a number of additional questions were asked. These questions were used to find explanations of existing problems and suggestions on how to improve them. Furthermore, the participant was asked to give an overall opinion about the structuring component.

3.4. Data analysis

For the analysis of the usability problems, the Video Data Analysis technique was used. For each participant, his/her video was reviewed and problems were written down. Problems were defined in a short but precise way. After all the problems of a usability test were analyzed, they were grouped by category. Similar problems were combined and reformulated when needed. Thereafter, a list of distinct problems remained. For each of these problems, two things were noted: the frequency (1-5) and the impact (1-5). The impact was determined by the facilitator, who also took persistence into account. For this reason, impact and persistence were combined into one 'impact' score.

Both the frequency and the impact resulted in a severity score on Nielsen's rating scale (1994b). Because, as discussed above, impact was a combination of two factors, we weighted this score twice as heavily as the frequency. This ensured that frequency did not outweigh impact. Nielsen's scale consists of 5 points:

0. “I do not agree that this is a usability problem at all

1. Cosmetic problem only – need not be fixed unless extra time is available on project

2. Minor usability problem – fixing this should be given low priority

3. Major usability problem – important to fix, so should be given high priority

4. Usability catastrophe – imperative to fix this before product can be released” (p. 49)

According to Nielsen (1994a), serious problems “have high potential for causing major delays or preventing the users from completing their task” (p. 154). Therefore, we considered major usability problems (3) and usability catastrophes (4) as serious problems.

4. Results

The usability problems that were found in each test are shown in Appendix B. These problems were given a unique number for identification purposes. Furthermore, the frequency and the impact of the usability problems are noted, which resulted in a severity score for each problem. Frequency and impact both range from 1 to 5, while Nielsen's severity scale (1994b) ranges from 0 to 4. As discussed, we chose to let impact weigh twice as much as frequency. To fit the severity scores on Nielsen's scale, we use the following formula:

severity = (frequency + impact * 2) / 3 – 1

Furthermore, we discussed that problems with a severity score of 3 or more were considered serious problems:

serious = severity ≥ 3
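Put together, the two formulas can be expressed as a short script; an illustrative sketch of the scoring rule defined above, not tooling used in the study.

```python
# Severity score on Nielsen's 0-4 scale: impact is weighted twice as
# heavily as frequency, with both factors ranging from 1 to 5.
def severity(frequency: int, impact: int) -> float:
    return (frequency + impact * 2) / 3 - 1

def is_serious(frequency: int, impact: int) -> bool:
    # Scores of 3 or more (major problems and catastrophes) are serious.
    return severity(frequency, impact) >= 3

print(severity(1, 1))  # lowest possible score: 0.0 (cosmetic end)
print(severity(5, 5))  # highest possible score: 4.0 (catastrophe)
```

The subtraction of 1 shifts the raw 1-5 range onto Nielsen's 0-4 scale, so the minimum input (frequency 1, impact 1) maps to 0 and the maximum (5, 5) maps to 4.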

For each test, the number of problems and the sum of their severity scores are shown in Table 1. In Table 2, these numbers are shown as percentages of the results of the baseline test. Here we can see that the total number of problems was reduced by 23.9%, and that the number of serious problems decreased by 69.2%. Furthermore, Figure 3 shows that the effect of the user-centered approach diminished over time. This effect is discussed in more detail in Section 5.2.

In the following sections, the problems and the solutions that we implemented are described. To describe the nature of these problems as well, we assigned each usability problem a category (see Appendix B). Problems are referred to by their identification number and/or their category.

Table 1: Problems and their severity

                 Problems   Severity   Serious problems*
Baseline test    46         104        26 (56.5%)
Second test      38         72         14 (36.8%)
Final test       35         56         8 (22.9%)

* Serious problems, with their share of the total problems in parentheses

Table 2: Relative problems and their severity (percentages of the baseline test)

                 Problems   Severity   Serious problems
Baseline test    100%       100%       100%
Second test      82.6%      69.2%      53.8%
Final test       76.1%      53.8%      30.8%
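As a sanity check, Table 2 can be reproduced from the counts in Table 1; a small illustrative script, not part of the study's tooling.

```python
# Each test's problem counts expressed as a percentage of the baseline
# test, matching Table 2.
baseline = {"problems": 46, "severity": 104, "serious": 26}
tests = {
    "Second test": {"problems": 38, "severity": 72, "serious": 14},
    "Final test":  {"problems": 35, "severity": 56, "serious": 8},
}
relative = {
    name: {k: round(100 * v / baseline[k], 1) for k, v in counts.items()}
    for name, counts in tests.items()
}
print(relative["Second test"])  # {'problems': 82.6, 'severity': 69.2, 'serious': 53.8}
print(relative["Final test"])   # {'problems': 76.1, 'severity': 53.8, 'serious': 30.8}
```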


4.1. The baseline test

After the first test, 46 problems were identified, which are shown in Appendix B. Of these problems, 26 were serious problems (see Appendix C). Most of the serious problems were assigned to the 'structuring' category. These problems were given a high impact score, because participants did not know how to overcome these problems.

It appeared that the biggest issue was that participants did not understand what the functions of Rows and Columns were. Several usability problems supported this thought:

• “Thought that Rows are the only elements to structure a page” (no. 13)

• “Added a Row instead of a Column” (no. 17)

• “Added a Column instead of a Row” (no. 18)

• “Did not understand to use Rows for the vertical placement of blocks” (no. 19)

• “Kept adding Rows inside Rows instead of below” (no. 35)

Additionally, the participants did not recognize the types of the blocks that they added (no. 12 & 14). This might have been reinforced by the weakness of the Toolbar: participants did not notice that there were different “Add” and “Remove” buttons for Rows, Columns, and Articles (no. 27 & 28). Even later in the test, participants struggled with Columns and Rows; the participants had to create the structure shown in Figure 4, but did not understand that they had to add Rows in the Column on the right side (no. 23). Furthermore, participants were unaware that Articles could only be added to Columns and not to Rows (no. 29).

Another problem was that the structuring component provided too little feedback on which block was selected and on the location where new blocks were added. The following problems confirm this:

• “Did not expect to add a block in the Column that was selected” (no. 30)

• “Added a Column before instead of after another Column, because the outer Row was selected” (no. 33)

• “Wrongly selected a child or parent block and therefore added a block at the wrong location” (no. 34)

Because participants added blocks at the wrong location, they also tried to rearrange them (no. 40-42). The fact that participants could not rearrange the blocks (no. 43) was considered a serious problem, because the participants had much difficulty with this. To solve this, the participants of the baseline test suggested implementing a dragging function.

Figure 3: Effect of the user-centered approach

Besides all the problems in the 'structuring' category, there were also problems with the main Navigation Bar ('navigation' category). This is the location where users can save their page and go to the page settings. However, participants were searching for these buttons in the structuring window (no. 1 & 2). Furthermore, a few participants accidentally clicked on 'Remove page' when they wanted to remove a block (no. 4).

Finally, the participants had problems with the saving of the page ('saving' category), as they were not informed that the page first had to be saved to see the results in a page renderer (no. 9). Sometimes, this resulted in data loss, which clearly is a serious problem (no. 8).

For the second version of our WMS, we chose to address the problems in the 'structuring' category first, because of their impact. Furthermore, the problems in the 'navigation' category were not difficult to address, so these were also taken into account. Due to time constraints, other problems were not addressed.

4.2. The second usability test

For the second version of our WMS (see Figure 5), many problems in the 'structuring' category and the 'navigation' category were addressed. The evaluation of the structuring component resulted in 38 usability problems, of which 14 were serious (see Appendix B). One of the biggest changes was that there was no longer a difference between Rows and Columns; they all became 'Blocks', which could be added in a more intuitive way. To add a new Block, participants had to hover over one of the sides of another Block and click on the 'Plus' button that appeared (see Figure 5). In this way, Blocks could be added horizontally or vertically. Additionally, we chose to use more contrasting colors to distinguish the Blocks better.

However, many problems in the 'structuring' category still occurred. One of these problems was that participants did not notice the 'Plus' buttons immediately (no. 17 & 18). This was because participants had to hover precisely over the center of the borders of the Blocks to show them.

Furthermore, the Toolbar of the first version was placed in the Blocks themselves (see Figure 5). Although this solved many problems, new problems occurred as well. One of these was that participants did not know that a Block had to be clicked to show the Toolbar (no. 12). Furthermore, participants had to guess which Toolbar icon represented which function. The following problems can be seen as a consequence:

• “Did not understand that the purpose of the arrows is to enlarge a Block” (no. 14)

• “Tried to shrink a Block, but a Block can be enlarged only” (no. 16)

• “Clicked on other buttons instead of the 'Article' button” (no. 20)

Figure 5: The structuring component of the second version of our WMS

For this reason, many participants suggested the use of tool-tips (no. 13). Another problem was that the Toolbar itself was confusing: because it had the same background color as the Blocks, participants thought it was a place where new Blocks could be added (no. 25).

Moreover, it was remarkable that participants still had problems with the vertical structuring of two Articles inside a column. This situation is shown in Figure 6. Participants did not understand that they had to place two columns inside the column on the right side. Instead, they tried to add a Block below the column on the right side (no. 28 & 33). At this stage of development, the only way to solve this problem was to remove the column. However, we chose not to do this, because in the end, users must manage this column as well (for example, changing the background color, margins, etc.).

Although there was not enough time to implement a dragging function, the problems with regard to rearranging Blocks ('rearranging' category) occurred less frequently and were therefore less severe. This was probably because building was quicker and easier, so participants rarely needed to rearrange Blocks (no. 35–37).

Problems in the 'navigation' category were addressed as well, for example the problems with the Navigation Bar. Because in the first test participants were looking in the structuring window for a 'Save' button and a 'Page settings' button, we chose to make the Navigation Bar part of the structuring window. However, this did not solve all the problems. For example, participants were still searching for the 'Save' button at the bottom of the screen (no. 1). These problems were considered less severe, however, because they occurred less frequently. New problems occurred as well, as labels were confusing:

 “Clicked on the title 'Edit page <page name>' instead of 'Settings'” (no. 3)

 “Did not expect that 'Page Grid' referred to the 'Edit page' window” (no. 5)

Finally, we made the 'Remove page' button less accessible to prevent participants from clicking on it by accident.

For the third version of our WMS, we tried to address several problems. First of all, we tried to solve the problems with regard to the confusing labels. Secondly, we tried to solve the problems in the 'saving' category, as they had not been addressed before. Finally, we wanted to implement the tool-tips that the participants suggested.

4.3. The final usability test

In the third version of our WMS, more subtle changes to the structuring component were implemented. The evaluation of the structuring component of this version resulted in 35 problems, of which 8 were serious (see Appendix B). The first change was the implementation of a save reminder. This function registered when changes were made. When participants wanted to leave the page without saving it, they got a warning in the form of a pop-up, which asked whether they wanted to continue without saving. Participants could choose to continue or to cancel. Because of the save reminder, participants understood that the page had to be saved before viewing it in the page renderer.
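The save-reminder behaviour described above can be modelled as a simple dirty-flag guard. The sketch below is illustrative only; the class and method names are our own and do not correspond to the WMS's actual code:

```python
class PageEditor:
    """Minimal sketch of the save-reminder logic (illustrative names)."""

    def __init__(self):
        self.dirty = False  # set when the page structure changes

    def edit(self):
        # Any change to the page structure marks it as unsaved.
        self.dirty = True

    def save(self):
        self.dirty = False

    def try_leave(self, confirm_leave):
        """Return True if the user may leave the page.

        confirm_leave() is only called when there are unsaved changes;
        it models the 'continue without saving?' pop-up.
        """
        if not self.dirty:
            return True
        return confirm_leave()


# Usage: leaving with unsaved changes triggers the pop-up.
editor = PageEditor()
editor.edit()
left = editor.try_leave(lambda: False)  # user clicks 'Cancel', stays on page
```

The essential design choice is that the confirmation dialog only appears when it is needed, so users who saved their work are never interrupted.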

However, problems in the 'saving' category still arose. In the first test session, the save reminder was not working for some reason (no. 8); this can be considered a bug. Furthermore, two participants initially paid no attention to the save reminder (no. 7). However, when these participants left the page, they immediately discovered that the page was not saved. Although in the third version of our WMS only three participants found the feedback on page saving insufficient, this was still considered a serious problem (no. 10). Additionally, one participant found it frustrating that the page had to be saved all the time (no. 9).

Another change was the addition of tool-tips in the Toolbar of a Block (see Figure 7). When participants hovered over a button, a tool-tip with a short description appeared. This resulted in fewer problems, because the tool-tips helped participants understand the function of the buttons. However, a new problem occurred: when there was too much text inside a tool-tip, the words were displayed below each other. As a result, participants confused the tool-tips with drop-down menus; they tried to click on the words of the tool-tips, because they thought these words were options (no. 28).
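One way to avoid the wrapping that was mistaken for a drop-down menu is to keep tool-tip labels short enough to fit on one line. The labels and length limit below are our own assumptions, not the WMS's actual strings:

```python
# Illustrative tool-tip labels for the Block Toolbar (assumed names).
TOOLTIPS = {
    "plus": "Add a Block",
    "arrow_left": "Enlarge Block to the left",
    "arrow_right": "Enlarge Block to the right",
    "article": "Add an Article",
    "remove": "Remove this Block",
}


def tooltip_ok(text, max_chars=30):
    """A tool-tip short enough to stay on one line cannot be
    mistaken for a multi-option drop-down menu."""
    return len(text) <= max_chars


# All of the labels above stay on a single line.
assert all(tooltip_ok(label) for label in TOOLTIPS.values())
```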

Furthermore, many labels were changed, because they were confusing in the second version of our WMS. For example, 'Edit page <page name>' was changed to 'Structure <page name>', because users would otherwise confuse the text with a button; the function of this label is to inform users that they are located on the structure page. Additionally, 'Page grid' was changed to 'Edit page' to provide clarity. Lastly, 'Save Article' was changed to 'Add Article', so that users would not confuse this button with the 'Save page' button.

Some of the problems were not addressed and therefore still occurred. The following problems were already discussed in Section 4.2:

• Participants did not know how to create the structure that is shown in Figure 6 (no. 12).
• Participants did not know that they had to click on a Block to show the Toolbar (no. 13).
• Participants had problems with finding the 'Plus' buttons of a Block (no. 17).
• Participants tried to shrink Blocks with the 'Arrow' buttons, but Blocks can only be expanded in a certain direction (no. 21).
• Participants confused the Toolbar with a Block, because the background color was the same (no. 19).

In contrast to the results of the second test, more participants tried to relocate a Block, which was not possible (no. 30). Therefore, this problem was considered a serious problem again. However, because structuring had become easier, problems in the 'rearranging' category were still less severe than in the first version of our WMS.

Finally, a problem that was not found before was that a participant found it strange that columns do not have the same height (no. 27). A solution for this might prevent the problem – which occurred in the second test – that participants thought that the 'Plus' button below two columns belonged to the largest column only (no. 18).

5. Discussion and conclusions

Using the user-centered approach, we have improved the structuring component iteratively to make recommendations for the final functionality of our WMS. First, these recommendations are discussed. After that, the effect of UCD on the improvement of the structuring component is examined. Finally, limitations and the contribution of this research are considered.

5.1. Recommendations for our WMS

In the first version of our WMS, many problems occurred while structuring the page. In this version, users had to divide their page into Rows and Columns (see Figure 2). Furthermore, each of these blocks had its own 'Add' and 'Remove' button. In the first evaluation of our WMS, it appeared that the structuring component had to become much more intuitive. Therefore, in the second version of our WMS, we resolved the ambiguity with regard to Rows and Columns by making them all 'Blocks' (see Figure 5). Users can add these Blocks next to other Blocks – both horizontally and vertically – inside other Blocks. Here, we suggest that it is important that the adding of horizontally-oriented Blocks is consistent with the adding of vertically-oriented Blocks. To add new Blocks, users have to click on one of the 'Plus' buttons, which appear when users hover over a side of a Block. Because users still have problems finding them, it is important that these 'Plus' buttons become more easily accessible.

Bad accessibility also applies to the Toolbar of the Blocks (see Figure 5); it takes a while before users know that they have to click on a Block to see its Toolbar. In the first version of our WMS, the Toolbar was separated from the Blocks (see Figure 2), but this proved illogical for users. Therefore, we suggest that all options that affect a Block must be accessible from the Block itself. Furthermore, we suggest the addition of tool-tips to make the function of these options clear (see Figure 7). The outcome of the final usability test has shown that this results in fewer problems.

Although the structuring component has become much more intuitive over the course of the tests, there is still room for improvement. This refers to the fact that participants still have problems with creating the structure that is shown in Figure 6. Users tried to divide the column on the right side by searching for a 'Plus' button below this column, but they must instead add a new Block inside that column. After that, this new Block can be divided in the vertical direction. This way, both the rows and the columns remain maintainable. We suggest three solutions for this:

1. Make the 'Plus' buttons always visible. This way users can see in which directions it is possible to add new Blocks.

2. Make users understand that a Block can be divided either horizontally or vertically. Once a Block is divided in a horizontal or a vertical direction, new Blocks cannot be added in the other direction.

3. Provide help to let users understand that they have to add Blocks inside Blocks to create complex structures such as the one shown in Figure 6.
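Solution 2 implies a simple invariant on the Block tree: a Block's first division fixes its orientation, and the other direction is only reachable by nesting a new Block. A minimal sketch of this rule (our own illustrative model, not the WMS implementation):

```python
class Block:
    """A Block divides either horizontally or vertically, never both."""

    def __init__(self):
        self.orientation = None  # fixed by the first division
        self.children = []

    def add_child(self, orientation):
        """Add a child Block; orientation is 'horizontal' or 'vertical'."""
        if self.orientation is None:
            self.orientation = orientation  # first division fixes it
        elif self.orientation != orientation:
            raise ValueError(
                "Block already divided %sly; nest a new Block instead"
                % self.orientation)
        child = Block()
        self.children.append(child)
        return child


# The structure of Figure 6: two columns side by side, where the
# right column is divided vertically via nesting, not mixing.
page = Block()
left = page.add_child("horizontal")
right = page.add_child("horizontal")
inner = right.add_child("vertical")  # allowed: 'right' had no orientation yet
```

Attempting `page.add_child("vertical")` after the horizontal division would raise an error, which is exactly the constraint that the interface would need to communicate to users.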

Moreover, our study has shown that users must have the possibility to relocate their Blocks. Perhaps this is only a consequence of the problems with regard to the structuring of the page. However, one can imagine that users do not want to completely rebuild complex structures in another Block. Instead, users should have the opportunity to simply drag Blocks – including the structures inside them – to the desired location. Therefore, we suggest an intuitive dragging function.

Furthermore, there are several problems with the 'Arrow' buttons of a Block. With these buttons, users can change the width of a Block by expanding it to the left or to the right. However, the first problem is that users think that Blocks can be moved with these buttons. Additionally, users try to shrink Blocks with the arrows, but a Block can only be expanded in a particular direction. Therefore, we suggest extending the dragging function by letting users drag the borders of Blocks as well. This is a much more intuitive way to resize Blocks.

Finally, we suggest several changes with regard to the saving of the page. We have now implemented a function that warns users that the page must be saved before leaving it. However, participants still find it illogical that the page has to be saved to see the results in the page renderer. We suggest two solutions:

1. Remove the page renderer. Just as with Wordpress, let users open their page in another tab. Users will understand that there is no live view support, so they will refresh the page after saving changes in the structuring component.

2. Or, even better: implement an automatic saving function with version control.

After each usability test, the users gave overall feedback by answering the additional questions. First of all, the feedback indicated that the users understood the division between content and structure. During the process of improving the structuring component, users became more positive. In the end, users indicated that the tool was accessible and easy to use. However, it appeared that users needed some help to get started because, as discussed, there were still some problems with regard to the structuring of the page. With these recommendations, we suggest that these and other difficulties can be overcome.
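The automatic saving function with version control suggested in Section 5.1 could, for example, record a snapshot of the page on every change, so that no explicit 'Save' button is needed and unwanted changes can be rolled back. This is a sketch under our own assumptions, not part of the WMS:

```python
class AutoSaver:
    """Sketch: save every change as a new version and allow rollback."""

    def __init__(self):
        self.versions = []  # each entry is a full page snapshot

    def record(self, page_state):
        # Called automatically on every change; no 'Save' button needed.
        self.versions.append(page_state)

    def current(self):
        return self.versions[-1] if self.versions else None

    def rollback(self, steps=1):
        """Drop the last `steps` versions and return the restored state."""
        del self.versions[-steps:]
        return self.current()


# Usage: two automatic snapshots, then an undo of the last change.
saver = AutoSaver()
saver.record({"blocks": 1})
saver.record({"blocks": 2})
restored = saver.rollback()  # back to the first snapshot
```

A real implementation would persist the snapshots server-side, but the user-facing benefit is the same: the 'unsaved page' problem category disappears entirely.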

5.2. The effect of user-centered design

As discussed before, problems were reduced by using a user-centered approach. In Table 1, for each test, the number of usability problems and their severity are compared to the results of the baseline test. Here we can see that the total number of problems was reduced by 23.9%, while the number of serious problems decreased by 69.2%. If we look at the course of these decreases (see Figure 3), we can notice trends. Between the tests, the effect of the user-centered approach became smaller. Furthermore, it seems that the number of severe problems will eventually approach zero, while the total number of usability problems will stagnate at around 30.
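These percentages follow directly from the problem counts in Appendices B and C: 46 problems (26 serious) in the baseline test against 35 problems (8 serious) in the final test:

```python
def reduction(baseline, final):
    """Relative decrease from baseline to final, as a percentage."""
    return (baseline - final) / baseline * 100


total_decrease = reduction(46, 35)    # all usability problems
serious_decrease = reduction(26, 8)   # serious problems only

print(round(total_decrease, 1))    # 23.9
print(round(serious_decrease, 1))  # 69.2
```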

5.2.1. UCD in practice

In our study, it became clear that UCD is an important method for improving a product. In many cases, the designers did not want to see beyond their own vision; they had a clear image of the functions of the structuring component in their minds, and automatically assumed that users would understand it as well. Sometimes, the designers would initially ignore the outcome of the usability tests, because they were too proud or simply disagreed with the users. In those cases, we had to intervene in the interest of this study. By adopting a user-centered approach, most serious problems were solved. However, as discussed before, the less severe problems seemed to stagnate. In our case, it seems impossible to solve all of these problems as well.

5.2.2. Interpretation

According to Norman (2005), this is to be expected. Norman confirms that by adopting a user-centered approach, good design can be ensured by overcoming the most severe problems. However, to overcome the less severe problems as well, designers must make the right decisions during the development process. This way, excellent design can be created. Making the right decisions, however, requires highly skilled designers. This means that designers must sometimes set aside the user-centered approach by ignoring the user, to prevent adaptations from eventually increasing complexity. Furthermore, when systems get too complex, it is usually impossible to prevent users from having to learn how to use parts of the system. This might result in usability problems, which are not necessarily bad. The overall feedback of our participants confirms this.

5.2.3. Suggestion

Summarizing, adopting a user-centered approach can prevent many problems with regard to tunnel vision and ensures the evolution of good design. However, to create excellent design, designers must make the right decisions during the development process, and must therefore sometimes ignore the user. We suggest waiving the user-centered approach only if designers possess the right skills to know exactly when ignoring users is permitted. Most of the time, designers do not possess these skills. In that case, it is safe to fully stick to the user-centered approach and settle for good design.

5.3. Limitations

Our research has some limitations which must be addressed. Most of these limitations are consequences of the limited time available for testing and improving the structuring component. Instead of trying to address all the usability problems, we tried to come up with solutions that decrease their severity as much as possible. According to Nielsen (1994b), another limitation is the low inter-rater reliability of severity ratings. However, Nielsen also states that there is almost always a significant correlation between any two severity raters. Therefore, Nielsen suggests that it is still possible to get reliable results by using several raters. Unfortunately, this was not possible due to time constraints. Finally, the usability problems could have been described in more depth. For example, Nielsen states that it is good to combine usability problems with screenshots.

5.4. Contribution

This paper is mainly descriptive, as recommendations for the final functionality of our WMS are made and the effect of UCD is described. Although no “fine discriminations among cases” (Emanuel, 2013) are made, this paper still makes a fair contribution. Firstly, it contributes to the Human-computer Interaction (HCI) discipline as it can be used as a case study on the application of the user-centered approach. Furthermore, an indication of the effect of UCD is given. Finally, the recommendations for our WMS can provide a solid base for research on its further development.

6. Acknowledgments

The author would like to thank his supervisor Jacobijn Sandberg for her assistance with the design of the research and the writing of this paper. Furthermore, he would like to thank Web First for making their WMS available and for their further collaboration. Finally, this research could not have been done without the 18 participants who helped with finding the problems.

7. References

• Bergstrom, J. R. (2013, April 2). Moderating usability tests. Retrieved from http://www.usability.gov/get-involved/blog/2013/04/moderating-usability-tests.html
• Emanuel, J. (2013). Usability testing in libraries: Methods, limitations, and implications. OCLC Systems & Services: International digital library perspectives, 29(4), 204-217.
• Farooq, A., Javed, F., Hussain, M., Abbas, T., & Hussain, A. (2012). Open source content management systems: A canvass. International Journal of Multidisciplinary Science and Engineering, 3(10).
• Gulliksen, J., Göransson, B., Boivie, I., Blomkvist, S., Persson, J., & Cajander, Å. (2003). Key principles for user-centred systems design. Behaviour and Information Technology, 22(6), 397-409.
• Kangas, E., & Kinnunen, T. (2005). Applying user-centered design to mobile application development. Communications of the ACM, 48(7), 55-59.
• Kjeldskov, J., Skov, M. B., & Stage, J. (2004, October). Instant data analysis: Conducting usability evaluations in a day. In Proceedings of the third Nordic conference on Human-computer interaction (pp. 233-240). ACM.
• Nielsen, J. (1994a, April). Enhancing the explanatory power of usability heuristics. In Proceedings of the SIGCHI conference on Human Factors in Computing Systems (pp. 152-158). ACM.
• Nielsen, J. (1994b). Heuristic evaluation. Usability inspection methods, 17(1), 25-62.
• Nielsen, J. (2000, March 19). Why you only need to test with 5 users. Retrieved from https://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/
• Norman, D. A. (2005). Human-centered design considered harmful. Interactions, 12(4), 14-19.
• Patel, S. K., Rathod, V. R., & Prajapati, J. B. (2011). Performance analysis of content management systems: Joomla, Drupal and WordPress. International Journal of Computer Applications, 21(4), 39-43.
• Running a usability test. (n.d.). Retrieved from http://www.usability.gov/how-to-and-tools/methods/running-usability-tests.html
• Someren, M. V., Barnard, Y. F., & Sandberg, J. A. (1994). The think aloud method: A practical approach to modelling cognitive processes. Academic Press.
• Usage statistics and market share of WordPress for websites. (2016, June 22). Retrieved from https://w3techs.com/technologies/details/cm-wordpress/all/all
• Vredenburg, K., Mao, J. Y., Smith, P. W., & Carey, T. (2002, April). A survey of user-centered design practice. In Proceedings of the SIGCHI conference on Human factors in computing systems (pp. 471-478). ACM.


Appendix A: Participants

Data

Test | Type | Name | Gender | Age | Study | Work
Baseline test | Pilot test | Job de Graaf | Male | 18 | None | Auxiliary worker at a DIY store
Baseline test | Normal test | Marinus Bergsma | Male | 17 | Mediadesign (vocational education) | Graphic designer
Baseline test | Normal test | Danny Konijn | Male | 25 | Logistics & Economics (higher professional education) | Formatting shipping documents; Entrepreneur
Baseline test | Normal test | Kim Hiemstra | Female | 19 | Media, Arts, Design & Architecture (university) | Online marketing
Baseline test | Normal test | Karin Dousi | Female | 37 | None | Police
Baseline test | Normal test | Julian van Ginkel | Male | 23 | Marketing and Communication (vocational education) | Project manager; Marketer
Second test | Pilot test | Dominique de Graaf | Female | 48 | None | Management supporter
Second test | Normal test | Rowan Dekker | Male | 22 | None | Graphic designer
Second test | Normal test | Yander Bierman | Male | 23 | None | Graphic designer
Second test | Normal test | Zoë Bood | Female | 21 | Health & Life Science (university) | Auxiliary worker at the bakery of a supermarket
Second test | Normal test | Maurice Vogel | Male | 19 | Built Environment (higher professional education) | Auxiliary worker at the butchery of a supermarket
Second test | Normal test | Pauline Ooijevaar | Female | 53 | None | Administrative worker
Final test | Pilot test | Cas Smit | Male | 21 | Engineering physics (higher professional education) | None
Final test | Normal test | Jesse van Heukelingen | Male | 22 | None | Catering manager
Final test | Normal test | René Noels | Male | 42 | None | Sales
Final test | Normal test | Astrid Vogel | Female | 52 | Tax consultancy | Owner of a trust office
Final test | Normal test | Hans Stroomer | Male | 55 | None | Retail entrepreneur
Final test | Normal test | Leonie van der Made | Female | 20 | Pedagogy (higher professional education) | Auxiliary worker in a chocolate shop

Descriptive Statistics

Statement | N | Minimum | Maximum | Mean | Std. Deviation
Age | 18 | 17 | 55 | 29.83 | 13.768
I use the PC much for work | 18 | 3 | 5 | 4.56 | 0.705
I use the PC much in my free time | 18 | 3 | 5 | 3.94 | 0.802
I have much experience with programs | 18 | 3 | 5 | 4.39 | 0.698
People often have to help me with software | 18 | 1 | 4 | 1.72 | 1.018
I quickly learn to use programs | 18 | 2 | 5 | 3.72 | 0.826
I visit many websites in a week | 18 | 2 | 5 | 3.61 | 0.850
I know where to find the right info on a website | 18 | 3 | 5 | 4.11 | 0.676
I am aware of quality differences between sites | 18 | 2 | 5 | 4.00 | 0.840
I often use website builders | 18 | 1 | 5 | 2.17 | 1.465
I often use CMS to create or maintain websites | 18 | 1 | 5 | 1.89 | 1.28
Valid N (listwise) | 18

Gender | Frequency | Percent
Female | 7 | 38.9
Male | 11 | 61.1
Total | 18 | 100.0


Appendix B: The usability problems

Baseline test

No. Usability problem Frequency Impact Severity Serious Category

1 Searched for a 'Save page' button in the Structure Window instead of in the Navigation Bar 5 3 3 YES Navigation
2 Searched for a 'Page settings' button in the Structure Window instead of in the Navigation Bar 4 3 2 NO Navigation

3 Expected a 'Return' or 'Save' button in the Page Settings 2 3 2 NO Navigation

4 Clicked on 'Remove page' instead of the 'Remove block' button 2 5 3 YES Navigation

5 Did not notice the 'Home' icon after a page was set as home 1 2 1 NO Navigation

6 Tried to add and remove blocks in the page result display 2 1 0 NO Navigation

7 Clicked on the 'Add Column' button because there was no 'Add page' button 3 4 3 YES Page saving

8 Did not save the page with as result that progress was lost 2 5 3 YES Page saving

9 Did not know the page has to be saved first to see the results of the website 5 4 3 YES Page saving

10 Thought that the 'Reload' button would update the page results 1 2 1 NO Page saving

11 Did not get enough feedback when the page was saved 5 4 3 YES Page saving

12 Thought that a Column is a selection menu for Articles, not a block 2 2 1 NO Structuring

13 Thought that Rows are the only elements to structure a page 3 4 3 YES Structuring

14 Did not know what type of blocks the borders represented 5 4 3 YES Structuring

15 Did not know how to start, because the first Row was not recognized as a block 4 4 3 YES Structuring

16 Did not recognize added Articles as one of the Articles 3 4 3 YES Structuring

17 Added a Row instead of a Column 5 4 3 YES Structuring

18 Added a Column instead of a Row 5 4 3 YES Structuring

19 Did not understand how to use Rows for the vertical placement of blocks 3 4 3 YES Structuring

20 Added two Articles below each other without using Rows 3 4 3 YES Structuring

21 Tried to use the 'Theme' module to add Articles to the right place 2 2 1 NO Structuring

22 Put only the second Article in a Row to display it vertically 3 4 3 YES Structuring

23 Did not have a clue how to add Articles vertically inside a Column by making use of Rows 5 4 3 YES Structuring
24 Expected that two horizontal Columns would result in a vertical layout 2 4 2 NO Structuring

25 Wanted to add a Row on top of the page, but this can’t be done 1 5 3 YES Structuring

26 Did not know how to show the 'Add' and the 'Remove' bar 1 2 1 NO Structuring

27 Used the wrong 'Remove' button for a particular type of block 5 4 3 YES Structuring

28 Did not notice the difference between 'Add Row' and 'Add Column' 4 4 3 YES Structuring

29 Did not understand that an Article can’t be added to a Row 3 4 3 YES Structuring

30 Did not expect to add a block in the Column that was selected 3 4 3 YES Structuring

31 Couldn't add a block, because there was no block selected 5 4 3 YES Structuring
32 Added a block in the last selected block, but there was no block selected 2 4 2 NO Structuring
33 Added a Column before instead of after another Column, because the outer Row was selected 3 4 3 YES Structuring

34 Wrongly selected a child or parent block and therefore added a block to the wrong place 5 4 3 YES Structuring

35 Kept adding Rows inside Rows, instead of below 3 3 2 NO Structuring

36 Used redundant Rows and Columns 1 3 1 NO Structuring

37 Accidentally clicked on an Article in the 'Select Article' menu 1 2 1 NO Structuring

38 Clicked on the wrong 'Remove' button so too many or too few blocks were removed 3 4 3 YES Structuring
39 Did not remove empty blocks because it was not clear that they affected the structure 1 3 1 NO Structuring

40 Tried to drag a Column next to another Column 4 2 2 NO Rearranging

41 Tried to use the 'Shrink' and 'Enlarge' buttons to relocate a block 4 3 2 NO Rearranging

42 Tried to use the keyboard to move a block 1 2 1 NO Rearranging

43 Could not move blocks 5 4 3 YES Rearranging

44 Clicked on the 'Previous' button of the browser to return, but the system can not handle this 1 2 1 NO Other
45 Tried to view the result of the page in a new window, but the system can not handle this 1 2 1 NO Other


Second test

No. Usability problem Frequency Impact Severity Serious Category

1 Searched for a 'Save page' button in the Structure Window, instead of in the Navigation Bar 2 3 2 NO Navigation

2 Did not know the 'New Page' window was already active 1 3 1 NO Navigation

3 Clicked on the title 'Edit page <page name>' instead of 'Settings' 3 1 1 NO Navigation

4 Expected a 'Return' or 'Save' button in the Page Settings 3 2 1 NO Navigation

5 Did not expect that 'Page Grid' referred to the 'Edit page' window 2 2 1 NO Navigation

6 Clicked on 'Full Screen' instead of 'View page' 3 1 1 NO Navigation

7 Searched for an 'Add page' button 1 2 1 NO Page saving

8 Did not save the page, with as result that progress was lost 1 5 3 YES Page saving

9 Did not know the page has to be saved first to see the results of the website 4 4 3 YES Page saving

10 Did not get enough feedback when the page was saved 5 4 3 YES Page saving

11 Wanted to hide the drop-down bar but did not know immediately how 2 1 0 NO Structuring

12 Did not know immediately that a Block must be clicked to show other icons 3 4 3 YES Structuring

13 Misses tool-tips in the Toolbar of a Block 3 4 3 YES Structuring

14 Did not understand that the purpose of the arrows is to enlarge a Block 2 4 2 NO Structuring
15 Thought that the arrows of a Block were the navigation of a carousel 2 3 2 NO Structuring

16 Tried to shrink a Block, but a Block can be enlarged only 5 3 3 YES Structuring

17 Did not notice one of the 'Plus' buttons immediately 3 4 3 YES Structuring

18 Thought that the 'Plus' button below two Columns belonged to the largest Column only 3 4 3 YES Structuring

19 Did not understand why there are two 'Plus' buttons between rows 1 2 1 NO Structuring

20 Clicked on other buttons instead of the 'Article' button 3 4 3 YES Structuring

21 Forgot to press on 'Save Article' 4 2 2 NO Structuring

22 Found 'Save page' and 'Save Article' confusing 3 4 3 YES Structuring

23 Tried to add another Article after another Article because the 'Single Article' menu was not hidden after the addition of an Article 1 2 1 NO Structuring

24 Did not recognize the first Block as a Block immediately 1 3 1 NO Structuring

25 Thought that the Toolbar of a Block is a place to work in 3 4 3 YES Structuring

26 Added a Block outside instead of inside a Block 1 3 1 NO Structuring

27 Searched for a 'Plus' button to add a second Article below another Article in a Column 5 4 3 YES Structuring

28 Did not know how to add two Blocks vertically inside a Column 5 4 3 YES Structuring

29 Removed too many Blocks 2 3 2 NO Structuring

30 Thought that the parent Block was a frame of its child Block 2 3 2 NO Structuring

31 Tried to drag the border of a Block to resize that Block 1 3 1 NO Structuring

32 Thought that the colors of Blocks did not represent equality well 2 3 2 NO Structuring

33 Did not understand why three Blocks had to be added to display two Blocks 1 3 1 NO Structuring

34 Couldn't add a row above a first row 1 5 3 YES Structuring

35 Tried to use the arrows of a Block to place a Block next to another Block 2 2 1 NO Rearranging

36 Tried to drag a Block below the right-oriented Article 1 2 1 NO Rearranging

37 Could not move Blocks 2 4 2 NO Rearranging


Final test

No. Usability problem Frequency Impact Severity Serious Category

1 Clicked on ‘Full Screen’ instead of ‘View page’ 1 1 0 NO Navigation

2 Did not know the ‘New Page’ window was already active 3 3 2 NO Navigation

3 Expected a ‘Return’ or ‘Save’ button in the page settings 5 2 2 NO Navigation

4 Searched for a ‘Save page’ button in the structure window, instead of in the Navigation Bar 1 3 1 NO Navigation

5 Confused the ‘New Page’ label with a button 1 1 0 NO Navigation

6 Suggested to show the result of the page in another screen 1 2 1 NO Navigation

7 Did not give attention to the ‘Save page warning’ with as result that the page was not saved 2 4 2 NO Page saving

8 The save reminder did not work for some reason 1 3 1 NO Page saving

9 Was frustrated that the page had to be saved all the time 1 3 1 NO Page saving

10 Did not get enough feedback when the page was saved 3 4 3 YES Page saving

11 Added a Block outside instead of inside a Block 2 3 2 NO Structuring

12 Did not know how to add two Blocks vertically inside a column 5 4 3 YES Structuring

13 Did not know that a Block must be clicked to show other icons 4 4 3 YES Structuring

14 Did not notice one of the ‘Plus’ buttons immediately 4 4 3 YES Structuring

15 Forgot to press on ‘Add article’ 1 2 1 NO Structuring

16 Removed too many Blocks 2 3 2 NO Structuring

17 Searched for a ‘Plus’ button to add two Blocks vertically inside a column 5 4 3 YES Structuring
18 Thought that the ‘Plus’ button below two columns belonged to the largest column only 1 4 2 NO Structuring

19 Thought that the toolbar of a Block is a place to work in 2 4 2 NO Structuring

20 Tried to make a Block smaller by dragging its borders 1 3 1 NO Structuring

21 Tried to shrink a Block, but a Block can be enlarged only 5 3 3 YES Structuring

22 Added an article to the left, but for some reason it appeared on the right 1 3 1 NO Structuring

23 Clicked once instead of twice on the ‘Remove’ button 1 3 1 NO Structuring

24 Did not know that the ‘Add Article’ button was a button 3 4 2 YES Structuring

25 Did not notice that a Block was placed inside a Block 1 3 1 NO Structuring

26 Placed a Block above an Article, because the ‘Insert Block’ button was not hidden 1 4 2 NO Structuring
27 Thought it is strange that the two columns do not have the same height 1 2 1 NO Structuring

28 Thought that the tooltips were drop-down menus 4 3 2 NO Structuring

29 Used a redundant Block 1 2 1 NO Structuring

30 Could not move a Block 4 4 3 YES Rearranging

31 Tried to drag a Block below a right-oriented Article 2 2 1 NO Rearranging
32 Tried to use the arrows of a Block to place a Block next to another Block 3 2 1 NO Rearranging
33 Clicked on the ‘Previous’ button of the browser to return from the settings 1 1 0 NO Other
34 The ‘Add article’ bar did not appear, because the ‘Style’ bar was active 1 3 1 NO Other


Appendix C: Usability data analysis

Baseline test

Statistics: Frequency, impact and rating

                 Frequency   Impact   Severity
N     Valid             46       46         46
      Missing            0        0          0
Mean                  2.93     3.37       2.26
Median                3.00     4.00       3.00
Std. Deviation       1.497     .997       .929
Minimum                  1        1          0
Maximum                  5        5          3
Sum                    135      155        104
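Descriptive statistics in this form can be reproduced from the raw ratings with a few lines of Python. A minimal sketch follows; the ratings used below are hypothetical example values, not the actual study data, and the sample standard deviation is assumed (SPSS's default).

```python
import statistics

def summarize(ratings):
    """Descriptive statistics in the style of the tables in this appendix."""
    return {
        "N": len(ratings),
        "Mean": round(statistics.mean(ratings), 2),
        "Median": statistics.median(ratings),
        "Std. Deviation": round(statistics.stdev(ratings), 3),  # sample SD, as in SPSS
        "Minimum": min(ratings),
        "Maximum": max(ratings),
        "Sum": sum(ratings),
    }

# hypothetical severity ratings for five problems
print(summarize([3, 2, 3, 1, 0]))
```

Note that the means in the tables can be checked against the Sum and N rows (e.g. 135 / 46 = 2.93 for the baseline frequency scores).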

Statistics: Serious problems

              Amount   Percent
Valid  NO         20      43.5
       YES        26      56.5
       Total      46     100.0

Crosstabulation: Frequency * Impact

                         Impact
                  1     2     3     4     5   Total
Frequency   1     0     8     2     0     1      11
            2     1     2     1     2     2       8
            3     0     0     1    10     0      11
            4     0     1     2     2     0       5
            5     0     0     1    10     0      11
Total             1    11     7    24     3      46
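A frequency-by-impact crosstabulation like the one above amounts to counting (frequency, impact) pairs across problems. A minimal sketch, again using hypothetical pairs rather than the study data:

```python
from collections import Counter

# hypothetical (frequency, impact) pairs, one per usability problem
pairs = [(1, 2), (1, 2), (3, 4), (5, 4), (2, 3)]

cells = Counter(pairs)                      # cell counts of the crosstab
row_totals = Counter(f for f, _ in pairs)   # totals per frequency value
col_totals = Counter(i for _, i in pairs)   # totals per impact value

for f in range(1, 6):
    row = [cells[(f, i)] for i in range(1, 6)]
    print(f, row, "total:", row_totals[f])
```

The grand total (the sum of all cells) must equal the number of problems, which is a useful consistency check on the tables in this appendix.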


Second test

Statistics: Frequency, impact and rating

                 Frequency   Impact   Severity
N     Valid             38       38         38
      Missing            0        0          0
Mean                  2.37     3.03       1.89
Median                2.00     3.00       2.00
Std. Deviation       1.282    1.102       .981
Minimum                  1        1          0
Maximum                  5        5          3
Sum                     90      115         72

Statistics: Serious problems

              Amount   Percent
Valid  NO         24      63.2
       YES        14      36.8
       Total      38     100.0

Crosstabulation: Frequency * Impact

                         Impact
                  1     2     3     4     5   Total
Frequency   1     1     4     5     0     2      12
            2     1     2     5     2     0      10
            3     2     1     0     7     0      10
            4     0     1     0     1     0       2
            5     0     0     1     3     0       4
Total             4     8    11    13     2      38


Final test

Statistics: Frequency, impact and rating

                 Frequency   Impact   Severity
N     Valid             35       35         35
      Missing            0        0          0
Mean                  2.20     2.89       1.60
Median                2.00     3.00       1.00
Std. Deviation        1.45     .993       .976
Minimum                  1        1          0
Maximum                  5        4          3
Sum                     77      101         56

Statistics: Serious problems

              Amount   Percent
Valid  NO         27      77.1
       YES         8      22.9
       Total      35     100.0

Crosstabulation: Frequency * Impact

                      Impact
                  1     2     3     4   Total
Frequency   1     3     4     8     2      17
            2     1     1     2     2       6
            3     0     1     1     2       4
            4     0     0     1     3       4
            5     0     1     1     2       4
Total             4     7    13    11      35
