
Researching usability problems in systems for digital assessment at the UvA

Universiteit van Amsterdam

A thesis submitted for the degree of

Master of Science

in

Information Studies: Human Centered Multimedia

Xander Koning

10439099

Supervisor University of Amsterdam: Dr. Maarten Marx

Second Reader University of Amsterdam: Dr. Natasa Brouwer


Researching usability problems in systems for digital assessment at the UvA

Master Thesis

Date: July 26, 2017

Xander Koning

University of Amsterdam

ABSTRACT

The University of Amsterdam has been working with digital assessment software for two years now. Two systems that the UvA will use from the academic year 2017-2018 onwards are SOWISO and TestVision Online. As many teachers and students will have to use these systems for the first time next year, the software needs to be intuitive and easy to use. Therefore, this Master Thesis will answer the following research question: 'Can the usability of the systems that are currently being used at the UvA for digital assessment be improved?'

To achieve this, usability issues that both existing users and new users of the systems experience will be identified. Teachers who worked with the systems are interviewed and evaluation data of students who took digital exams is analysed. A cognitive walkthrough will be conducted with TestVision Online to identify issues that first-time users of the system will encounter. Afterwards, it is researched whether the usability of the system TestVision Online can be improved for first-time users.

Finally, the found usability problems will be reported and it will be concluded that the usability of the systems SOWISO and TestVision Online can be improved.

Keywords

TestVision Online, SOWISO, digital assessment, usability testing, students, teachers

1. INTRODUCTION & PROBLEM STATEMENT

Students must be assessed individually. One of the ways to assess students individually is by letting them take written exams. With written exams, it can take up to a few weeks before a student finds out whether or not he or she passed.

To make life easier for both teachers and students, the UvA has been using digital exams for two years now. The
idea behind digital assessment is that students get their grades more quickly than before, as the teachers have all the answers digitally available and most of the questions can be checked for correctness automatically by the system. The systems also provide more detailed statistics about the results. Also, teachers can ask different types of questions. Students are assessed in a room where they are seated apart from each other and they are not allowed to use any (digital) tools that might give them unfair advantages.

The UvA has decided to focus on fewer systems for this kind of digital assessment, starting next academic year (2017-2018), than it did before. There are two systems that the FNWI faculty of the UvA wants to focus on: TestVision Online1 and SOWISO2. Also, Maple T.A.3 will be used at other faculties. TestVision Online focuses on closed-ended questions such as multiple choice and fill-in questions, but also supports open-ended/essay questions, whereas SOWISO is aimed at more mathematical questions where formulas and computations are involved (Toetssoftware bij digitale toetsen, 2016).

The software that the UvA has opted for requires time and effort for a teacher to get acquainted with and to be able to create a first exam. This is mainly because the interface of the systems is not intuitive at certain stages.

As a student, you may prefer to stick to written exams instead of digital exams. This is because a correct answer can be rejected by the assessment systems in some cases, while it could have been (partially) accepted by the teacher if it had been handed in on paper.

Research Questions

The aim of this Master Thesis is to find out whether these systems are capable of fitting the needs of both teachers and students and, if usability problems prevent this, to suggest possible improvements that can make these systems more user-friendly. As a result, the following research question is set:

Can the usability of the systems that are currently being used at the UvA for digital assessment be improved?

This main research question can be divided into the following three subquestions:

1 www.testvision.nl
2 https://sowiso.nl/
3 http://www.maplesoft.com/products/mapleta/


• What problems regarding usability and ease of use do existing users of the systems currently being used for digital assessment at the UvA experience?

• What problems regarding usability and ease of use will first-time users of the system TestVision Online experience?

• Can changes be made to the interface of the system TestVision Online to improve the usability and ease of use for first-time users?

By answering the above research questions, this Master Thesis identifies the usability issues that teachers and students (both first-time and existing users) experience when they use the systems. For first-time users it is also explored if and how these problems could be solved.

Digital Assessment at the UvA

The exams take place at the IWO building in Amsterdam-Zuidoost. To get an impression of what this room looks like, see figure 1. There is a group of student-assistants that support the teacher and students during the exam with all sorts of technical problems. They can see whether all computers are functioning correctly and fix problems when these occur. During the exam, the assistants can see exactly which student is doing what on his computer and who completed which question. The teacher is there to answer course-specific questions.

Figure 1: The room 'IWO Rood' where students take digital exams at the UvA. Every monitor has a privacy filter attached, so that each student can only see the content viewed on their own screen.

Students use a website where they are required to log in using their UvA-net ID. It is only possible to log in on this website from machines that are within the allowed IP address range, to prevent students from taking the test from home. Another important security aspect is that students can only visit the website where the digital exam is hosted and that everything except this application is locked. Students can open only those applications that are the minimum requirements to complete the exam.

2. RELATED LITERATURE

Some literature related to the thesis topic is listed in this section. Firstly, some articles that are related to the assessment system SOWISO are listed. Afterwards, articles that have been found about the advantages of digital assessment compared to paper-based assessment are discussed. Lastly, literature about usability of (digital assessment) interfaces is mentioned.

SOWISO

SOWISO differs from other systems offering interactive mathematical documents, because it integrates instruction, practice and assessment in the form of an interactive online module. It gives a student intelligent feedback when completing a task. This intelligent feedback is based on an analysis of the student's answer or action, and it goes beyond a simple reaction such as correct/incorrect (Heck, 2017).
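As an illustration of feedback that goes beyond a plain correct/incorrect verdict, the short Python sketch below checks a numeric answer and flags a likely sign error. It is a hypothetical example and does not reflect SOWISO's actual grading logic.

```python
# Minimal sketch (hypothetical, not SOWISO's actual grading logic) of
# feedback that goes beyond a plain correct/incorrect verdict.
def feedback(student_answer: float, correct_answer: float) -> str:
    """Return a short feedback message for a numeric answer."""
    if student_answer == correct_answer:
        return "Correct."
    if student_answer == -correct_answer:
        return "Incorrect: check the sign of your result."
    return "Incorrect: review your computation."

# Example: a student enters -12 where 12 is expected.
print(feedback(-12.0, 12.0))  # -> Incorrect: check the sign of your result.
```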

A case study where SOWISO was used showed that the study success in the Numerical Recipes course in the second year of the Computer Science bachelor programme at the University of Amsterdam was unexpectedly high in comparison to other courses (Heck, Brouwer, Amado, & Carreira, 2015).

SOWISO was also used, combined with proctoring, to substitute the yearly, centralized (in a digital assessment room) final exam during a second-year course in the Computer Science bachelor at the UvA. The teacher let students take summative assessments in regular lecture rooms where students could use their own laptops. By using online remote proctoring it became possible to prevent fraud, as all laptop screens were captured (Brouwer, Heck, & Smit, 2017).

Digital Assessment or Paper-based Assessment?

Threlfall et al. checked whether computer or paper assessments lead to more valid results. With computer questions it is possible to try out solutions, which results in students scoring better on the computer assessments. From this perspective, the computer version would be less valid as an assessment. However, with the paper assessment it is possible to draw, something that is not possible on a computer. From this perspective, the computer assessment would be the more valid one (Threlfall, Pool, Homer, & Swinnerton, 2007).

Kawazoe and Yoshitomi developed the website "MATH ON WEB", which consists of two e-assessment systems for university mathematics education: the Web-based Mathematics Learning System (WMLS) and the Web-based Assessment System of Mathematics (WASM). A questionnaire conducted among 56 students pointed out that 83.1% found WASM useful and 62.2% found WASM easy to use. 58.5% of them answered that they prefer a WASM exam over paper-based homework (Kawazoe & Yoshitomi, 2017).

Heck compiled a list of commonly accepted advantages and limitations of computer-assisted assessment. Advantages are that it facilitates students to monitor their own progress and promotes student self-assessment, that large groups can be assessed quickly, and that diagnostic reports and analyses can be generated. Also, cheating can be reduced by randomizing questions. An important limitation is that
construction of good objective questions and tests, plus adequate feedback to students, requires skill and practice, which makes it (initially) time-consuming. To develop this skill, assessors and invigilators will need training in assessment design, ICT skills and examination management. A high level of organization is required across all parties involved in assessment (e.g. academics, support staff, administrators) to make the system function properly (Heck, 2004).

In the version of Maple T.A. (1.5) evaluated by Heck in 2004, weak points were that question chaining is limited, as you can hardly define variables that can be used across test items. Besides this, partial credit for answers that are incorrect only in a small detail (such as a sign) is not supported; you can only say that a question is 'correct' or 'incorrect' (Heck, 2004).

Usability of Digital Assessment Interfaces

In many contemporary systems, there is a great opportunity to improve the user interface and thus the usability. Guidelines can help achieve this. Task sequences should be standardized, data display should be consistent and there should be minimal memory load on the user. Also, users should be divided into different groups by determining their skill level before or during their first interaction with the system (Shneiderman, 2010).

Numbas, a free and open-source e-assessment system developed at Newcastle University, improves usability for students by making the system accessible from any device, including mobile phones and tablets. Authors (teachers) can create simple questions by means of a graphical editor, without knowledge of programming or LaTeX. Questions can be run immediately inside the editor, so the author can see the question as a student would, which makes it possible to immediately discover rendering or marking errors. Template questions can be created, which can easily be adapted to new subjects without changing the underlying math (Perfect, 2015).

3. METHODOLOGY

In this section, the methods that will be used to answer the research questions are outlined. Firstly, it is explained how it will be researched how students and teachers experience digital assessment with the systems. Afterwards, the cognitive walkthrough method that is used to test usability for first-time users is explained thoroughly. Lastly, it is outlined how solutions for issues are created and how these will be evaluated.

3.1 Qualitative Research: Interviews With Teachers

Data regarding experiences from experts, in this case teachers who have created exams using the systems, will help to answer the first subquestion. To gather this qualitative data, two teachers from within the UvA will be interviewed: one who has created digital exams using TestVision Online and another who has done the same using SOWISO. By doing so, (usability) problems that teachers experience when creating a digital exam can be identified. The Open Coding qualitative analysis method (Bryman, 2015, p. 574) will be used, in order to summarize and structure the data that was gathered.

3.2 Quantitative Research: Student Evaluation Data Analysis

The expert in the field of digital assessment from within the UvA (Dr. Natasa Brouwer, Project leader ICT and education innovation) said that after every digital exam, students are asked a few questions to evaluate their experience with the digital exam. This data is available to be analysed. We have access to evaluation data of the academic years 2015-2016 (Appendix C) and 2016-2017 (Appendix D). In both years, students were asked 10 questions.

Only the evaluations of students who have worked with SOWISO and Remindo will be taken into account. There is no evaluation data of students who have taken exams with TestVision Online, as most of the digital exams in the last two academic years were done using the Remindo assessment software. Remindo and TestVision Online are two different systems. However, the expert in the field of digital assessment from within the UvA told us that Remindo and TestVision Online are similar when it comes to the student interface, so the evaluation data can be used for analysis.

The data from the two years will be compared to see whether students at the UvA have become more positive or negative towards digital assessment in general. The variety of answers that students entered while answering the open questions (questions 5, 8 and 10) will be analysed using the Open Coding method (Bryman, 2015, p. 574) to be able to list the most important answers.

Using the results that this analysis shows, together with the coded interview data, we should be able to answer the first subquestion: ’What problems regarding usability and ease of use do existing users of the systems currently being used for digital assessment at the UvA experience?’

3.3 Cognitive Walkthrough

To be able to figure out what problems the systems have and what changes could be made to the system TestVision Online, a usability test will be conducted. For this purpose, the cognitive walkthrough method will be used. The cognitive walkthrough is a usability evaluation method based on cognitive theory (Rieman, Franzke, & Redmiles, 1995). This method is suited for those specific cases in which the user has never used the system before, but has to complete a certain task, i.e. the user is required to do exploratory learning. A cognitive walkthrough evaluates the ease with which a typical user can successfully perform a task using a given interface design (Wharton, Rieman, Lewis, & Polson, 1994). Grounded in Lewis and Polson's CE+ theory of exploratory learning, the cognitive walkthrough method describes human-computer interaction with these four steps (Rieman et al., 1995):

1. The user sets a goal to be accomplished with the system.

2. The user searches the interface for currently available actions

3. The user selects the action that seems likely to make progress toward the goal.

4. The user performs the selected action and evaluates the system's feedback for evidence that progress is being made toward the current goal.


For the key task, cognitive walkthrough evaluators record the full sequence of actions necessary to do the task on the current version of the system. Then the evaluators walk through the system, applying the aforementioned four steps of Lewis and Polson, thereby simulating users' action selections and mental processes while doing the task (Blackmon, 2004).

Blackmon (2004) says that a cognitive walkthrough asks two questions at each step towards completing the task:

• Is it likely that these particular users will take the right action at this stage? (from now: LIKELY)

• If these particular users do the right action and get the feedback the system provides (if any), will they know they made a good choice and realize that their action brought them closer to accomplishing their goal? (from now: FEEDBACK)

To answer these questions, evaluators tell a believable success story or failure story about every action they performed that was required to do the key task (Blackmon, 2004).
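To make the recording of a walkthrough concrete, the following Python sketch (not part of the original method; the example answers are hypothetical and only loosely based on section 4.3) shows how each action of a task could be stored together with the evaluator's answers to the LIKELY and FEEDBACK questions, and how actions that are not likely to be taken can be collected as candidate usability issues.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class WalkthroughStep:
    """One action from the optimal action sequence, with the evaluator's
    answers to the LIKELY and FEEDBACK questions."""
    action: str
    likely: str    # answer to LIKELY, with a short success/failure story
    feedback: str  # answer to FEEDBACK, with a short success/failure story

# Hypothetical record of the first two actions of "Create a new question".
steps: List[WalkthroughStep] = [
    WalkthroughStep(
        action="Click on 'Questions'",
        likely="Yes: the 'Vragen' button is clearly visible",
        feedback="Yes: a screen listing existing questions is shown",
    ),
    WalkthroughStep(
        action="Click on the 'New' button",
        likely="No: the button is hidden in the bottom right corner",
        feedback="Yes: a new screen opens where the question can be edited",
    ),
]

# Actions whose LIKELY answer starts with "No" are candidate usability issues.
for step in steps:
    if step.likely.startswith("No"):
        print(f"Potential issue at action: {step.action}")
```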

After TestVision Online has been evaluated using the cognitive walkthrough method, we should be able to answer the second subquestion: 'What problems regarding usability and ease of use will first-time users of the system TestVision Online experience?'

3.4 Suggest Solutions and Evaluation of Solutions

The next step is to present solutions for the issues (i.e. actions that are not likely to be performed by the user, given the interface) that are found after conducting the cognitive walkthrough with the system TestVision Online. These solutions will be made in the following way. Each issue will be explained by using one or multiple screenshots. New screens will be generated by modifying these screenshots, using Adobe Photoshop4. These new screens should present an interface that works more intuitively. More specifically: the actions in the optimal action sequence that were considered not likely to be performed given the current interface of the software should become more likely to be performed with the proposed solution.

To verify whether the solutions will indeed result in an interface that works more intuitively, the solutions will be tested on digital assessment experts from within the UvA. The way this is done is inspired by the A/B Testing method (Kohavi & Longbotham, 2015). With this A/B Testing method, two different layout versions of the interface are displayed to the users. There is the control layout (the old version) and the treatment layout (the new version to be tested). Each layout version is shown to 50 percent of the users. This A/B Testing method is meant for systems/websites that have lots of users. A/B Testing collects user-observable behaviour data from all these users that will ultimately tell which interface is the better one.

As there are not many users to test the control and treatment layout on, digital assessment experts will be asked per issue which interface (control or treatment) they think is the better one. The judgement from these experts is valuable, as they have expertise and experience with digital assessment interfaces.

4 www.adobe.com/Photoshop
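As an illustration of how the per-issue expert judgements described above could be tallied, the sketch below counts, for each issue, how many experts prefer the control layout and how many prefer the treatment layout. The issue labels and votes are hypothetical and only serve to show the bookkeeping.

```python
from collections import Counter

# Hypothetical per-issue expert votes: each expert states whether they prefer
# the control layout (current interface) or the treatment layout (proposed
# solution). The issue labels and votes are made up for illustration.
expert_votes = {
    "issue 1": ["treatment", "treatment"],
    "issue 2": ["treatment", "control"],
}

for issue, votes in expert_votes.items():
    counts = Counter(votes)
    preferred = counts.most_common(1)[0][0]
    print(f"{issue}: control={counts['control']}, "
          f"treatment={counts['treatment']}, preferred={preferred}")
```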

After testing the proposed solutions with the experts, we should have sufficient data to answer the third and last subquestion: 'Can changes be made to the interface of the system TestVision Online to improve the usability and ease of use for first-time users?' The answers to the three subquestions combined will answer the main research question.

4. EXPERIMENTS

This section outlines the steps that were taken to achieve the results listed in the Results section.

4.1 Interviews With Teachers

One person who could help identify possible shortcomings of SOWISO is Leo Dorst. He has been teaching the course Linear Algebra to students of the Bachelor's programmes Computer Science and Artificial Intelligence for several years now, and experimented with a digital exam last academic year. He also asked students for some more detailed computations on paper. The full transcribed interview (11 questions) is included in Appendix A (translated from Dutch).

The second teacher who was interviewed is Jeroen Lemmens from the Roeterseiland faculty. He has used TestVision Online to create a digital exam for the course Entertainment-communication, part of the Bachelor's programme Communication Science. Thirteen questions were asked. The full transcribed interview is included in Appendix B (in Dutch). The outcomes of both interviews will be discussed in the Results section.

4.2 Student Evaluation Data Analysis

Results of the following five questions will be examined, as they are important in the context of this research:

• Question 4: How do you experience navigation through the digital assessment interface?

• Question 5: If unclear, what would you like to be changed?

• Question 7: Did you experience technical problems during the exam?

• Question 8: If yes, which ones?

• Question 10: Do you have any further comments/remarks on digital assessment?

Using the remarks/comments that students provided while answering questions 5 and 10, it becomes possible to identify problems that they experienced while navigating through the software. Question 4, which is answered on a 5-point Likert scale, can be used to see whether students experienced the navigation through the interface as clearer last year compared to the year before. Questions 7 and 8 will be analysed to see whether students had technical problems during the exam and what caused these problems. The results of the analysis are outlined in the Results section.
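As a rough illustration of this comparison, the sketch below computes the average Likert score for question 4 and the share of students reporting technical problems (question 7). The response lists are hypothetical stand-ins, not the actual evaluation data from Appendices C and D.

```python
# Hypothetical response lists; the real evaluation data is in Appendices C and D.
def mean_likert(scores):
    """Average of 5-point Likert answers (1='very unclear', 5='very clear')."""
    return sum(scores) / len(scores)

def technical_problem_share(answers):
    """Share of students answering 'yes' to question 7 (technical problems)."""
    return answers.count("yes") / len(answers)

q4_2015 = [4, 3, 4, 5, 3]            # question 4, academic year 2015-2016
q4_2016 = [4, 4, 5, 4, 4]            # question 4, academic year 2016-2017
q7_2016 = ["no", "no", "yes", "no"]  # question 7, academic year 2016-2017

print(f"Mean navigation clarity 2015-2016: {mean_likert(q4_2015):.1f}")
print(f"Mean navigation clarity 2016-2017: {mean_likert(q4_2016):.1f}")
print(f"Technical problems 2016-2017: {technical_problem_share(q7_2016):.1%}")
```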

4.3 Cognitive Walkthrough

As stated in the Methodology section, to identify usability problems for first-time users a cognitive walkthrough will be conducted. For both students and teachers, a task is chosen to be completed using the given interface. To complete the
task, some smaller tasks (i.e. actions) will be completed, and after each step the two questions LIKELY and FEEDBACK (as stated in section Methodology) will be answered.

TestVision Online (version 12.0.5463) will be tested using the cognitive walkthrough method. The system will be tested with both a teacher and a student account, thereby simulating tasks that both types of users would be likely to perform using this system.

For each of these tasks the optimal action sequence that was found with the provided interface will be listed and the questions LIKELY and FEEDBACK will be answered for each action.

Teacher user.

Teachers will use TestVision Online to create digital exams. To achieve this, the teacher has to perform smaller tasks. I picked some smaller tasks that are typical for a teacher to perform when he/she creates an exam. These are the following four tasks:

1. Create a new question

2. Create exams by adding questions

3. Modify/delete existing questions

4. View student results

The optimal action sequence for the task ’Create a new question’ is listed in this section for clarity. The optimal action sequences for the other teacher user tasks and the student user task are included in Appendix E.

Create a new question

1. Click on 'Questions'

LIKELY: Yes. It is clear that the button saying ’Vragen’ (figure 12) has to be clicked here.

FEEDBACK: Yes. A screen (figure 13) is shown where questions are listed and new ones can be added.

2. Click on the ’New’ button

LIKELY: No. While searching the screen, it will be noticed that this button is hidden in the bottom right corner.

FEEDBACK: Yes. A new screen comes up (figure 14) where the question can be edited.

3. Choose the question type

LIKELY: Partially. The user should first know that by default, the question type is 'multiple choice, single answer'. At the top, there is a drop-down menu (figure 15) where the type of the question can be selected.

FEEDBACK: Yes. When a different type is selected, the answer insertion field changes. For the differences between question types, see figures 16 - 21.

4. Fill in the question field.

LIKELY: Yes. The first thing that comes to mind is typing the question and afterwards providing possible answers.

FEEDBACK: Yes. When the user has typed the question and selects another input field on the page (the answer field), the question is still visible (figure 16).

5. Add answer possibilities through the answer field.

LIKELY: No. Since it is not clear that the grey heading can be clicked, it was not clear where a user should click to add answers (figure 14). After a while I realised that it was clickable. The plus-button can be clicked to add more answers for multiple choice questions (figure 17).

FEEDBACK: Yes. Answers remain there after clicking the plus-button (figure 18).

6. Select the right answer to the question.

LIKELY: Yes, for multiple choice questions (figure 18), one would like to define which answer/answers is/are correct. This can be done by clicking the dots that appear before each answer possibility.

FEEDBACK: Yes. When a dot is clicked, it remains black.

7. Click the Properties button to edit properties of the question

LIKELY: Potentially. If a teacher wants all questions to be awarded the same number of points, then there is no need to edit the properties of the question and the user will skip this action. Otherwise, when a user wants to assign taxonomies to a question or edit the maximum number of points, this can be done through this window (figure 22).

FEEDBACK: Yes. When the user is done making changes, he/she will understand that he/she still has to save the question.

8. Save the question by clicking on the save button

LIKELY: Yes. When a user has finished editing the question, he/she will search the interface for a way to save the question, to prevent losing the changes that were made.

FEEDBACK: Yes, the save button turns grey instead of black.

5. RESULTS

5.1 Interviews With Teachers

To summarize the results gathered from the interviews, after performing open coding on the raw interview data, figure 2 is included. The table shows matches and differences between SOWISO and TestVision Online in a structured way, according to what the two teachers who created an exam with the systems have said. Besides this table, the most important things the teachers said with respect to problems with the system they worked with are summarized and outlined below.

Interview Leo Dorst (SOWISO)

60 percent of the questions are digital, 40 percent of the questions are still on paper. This is due to the fact that for some questions the student is required to 'tell' (explain) something. This is better to have written on paper, as it takes less time to review than having to open these questions for every single student within SOWISO.

Students should not be penalized twice or more. When (parts of) answers need to be re-used in later questions, these (parts of) answers should also be linked to each other in SOWISO in a way that prevents double penalties.

For other questions that are asked using SOWISO, it might be worth points if a student chooses the right approach to compute an answer, even though the final answer may be incorrect. It would be convenient if SOWISO could recognize this automatically.

The context of questions is difficult to communicate, given the fact that only one question can be viewed at a time. This is why SOWISO is a good system to ask separate (independent) mathematical questions. A separate PDF document with all questions was provided during the second exam to solve the problem of not having an overview.


Figure 2: This table summarizes what the teachers Leo Dorst and Jeroen Lemmens have said about their experience with the systems SOWISO and TestVision Online.

Interview Jeroen Lemmens (TestVision Online)

Multiple choice, open and fill-in questions are the types of questions that can best be asked using TestVision Online. It is also possible to add video material to a question. Adding videos to questions was difficult. Adding a YouTube link was possible, but embedded videos give better quality. With the help of the digital test coordinator, these were integrated.

Going over exams can be improved. If there are 200 students and you stop at number 100, the next time you log in you have to start at number 1 and click on 'next student' 100 times.

Regarding students: during review sessions, students could see what their (partially) wrong answer was for a question, but they had to go to a previous screen to see how many points they got for that question. On the same screen, an answer model is also lacking. We had to provide a separate PDF document which contains these answer models.

Also, a button with a flag could make it more intuitive for students to mark questions during an exam, as they had to ask us how they could mark questions.

5.2 Student Evaluation Data Analysis

The results that were found after analysing the evaluation data from the years 2015-2016 and 2016-2017 are listed here.

2015-2016

180 students filled in the evaluation form in 2015-2016 (N=180). 81 percent of the students say that no technical problems occurred during the exam, with N=179.

Some of the problems that the other 19 percent of the students experienced are:

• The teacher had no rights to start the exam, which caused the exam to start 15 minutes later.


• When a question was displayed, powers or root symbols disappeared. When hovered over with the mouse arrow, they became visible again.

The average value for the question ”How do you experience navigation through the digital assessment interface?” is 3.8, using a 5-point Likert scale (1=’very unclear’ and 5=’very clear’), N=177.

The most important reasons why students found navigation unclear (Question 5) are listed here:

• Logging in is difficult, due to 2/3 redirections.

• The built-in calculator is annoying to use/too limited.

• Questions should be saved in real-time. Navigating through different questions becomes easier this way, as there is no need to confirm that the current question must be saved before going back to another question.

• After answering a question that was skipped earlier, it would be better if it is possible to finish the exam immediately afterwards, instead of having to save all questions that follow this question again.

• Programming-related questions are difficult, as you cannot run your program. Using IPython notebooks should be considered.

General remarks on digital assessment (Question 10) include:

• The text input windows are too small.

• Digital exams are unhandy as intermediate steps cannot be entered and you cannot see where a mistake was made.

• Math exams should not be digital, as how the answer is computed can be more important than the answer itself.

• You are not allowed to eat and drink during the exam anymore.

• It would be better if the built-in calculator could be used with every question.

• The exam questions should also be provided on paper, so that things can be underlined.

2016-2017

688 students filled in the evaluation form in 2016-2017 (N=688). 95.7 percent of the students say that no technical problems occurred during the exam, with N=681.

Some of the problems that the other 4.3 percent of the students experienced are:

• Some images did not load; it was required to refresh.

• It was not possible to log in.

Using a 5-point Likert scale (1='very unclear' and 5='very clear'), the average value for the question "How do you experience navigation through the digital assessment interface?" is 4.1, N=677.

The most important reasons why students found navigation unclear (Question 5) are listed here:

• Add a progress bar.

• A typo can cause an answer to a fill in question to be considered wrong.

• The introduction was not clear, and it was not clear that you could start logging in to start the exam.

• If possible, provide all questions also on paper. It would be better to see more questions at a time to get an overview (especially when questions are related).

• It is unclear how you have to log in on the computer.

• Periods and commas should both be accepted when entering decimal numbers.

• Add a 'previous question' button.

General remarks on digital assessment (Question 10) include:

• It would be better to have all questions also on paper, to be able to make notes and mark/underline things.

• It is not easy when mathematical formulas have to be entered; it takes a lot of time to enter special characters.

• Reading questions from screens for 3 hours causes headaches.

• Decimal numbers should not automatically be rounded; keep the number of entered decimals as entered.

Technical problems during exams have decreased over the past year. 19 percent of the students encountered technical problems in 2015-2016. In 2016-2017, this percentage was only 4.3 percent. In both years, students say that the navigation in the digital assessment systems is clear (3.8 in 2015-2016 and 4.1 in 2016-2017, using a 5-point Likert scale where 1='very unclear' and 5='very clear'). Thus in 2016-2017, students say that the navigation is slightly clearer than in 2015-2016.

5.3 Cognitive Walkthrough: Issues and Proposed Solutions

After conducting the cognitive walkthrough, some issues were found regarding the usability and ease of use of the interface. To create an overview, the issues found while doing each task are listed here, and a solution is proposed to solve each issue.

Teacher user

Create a new question.

Issue #1: When a teacher wants to create a new question, he/she will expect the button somewhere in the top part of the screen, but it is located in the bottom right corner.

Solution: To make the interface more intuitive for the teacher, the button to create a new question is placed in the upper left part of the screen, instead of in the bottom right corner. 'Edit' and 'Delete' buttons have also been added here, so the teacher no longer has to search for them by clicking on other buttons.


(a) Issue #1: The ’New’ button is difficult to find.

(b) Solution: Placement of the 'New' button on the top left part of the screen, along with 'Edit' and 'Delete' buttons.

Figure 3: A teacher has to search for buttons to create, edit or remove a question, so a solution is presented where the buttons to perform these operations are located in the bar at the top of the screen.

Issue #2: Users have to click on the grey header that says 'Alternatieven' (figure 4a) to make it orange and have a 'plus' button show up (along with some other buttons that change the way answer possibilities are placed). Teachers will not see that the header is clickable, given its grey color and the lack of these buttons.

Solution: When the header is made orange from the start, like the one where the stem of the question can be entered, teachers will realize that they have to click here. Also, the 'plus' button is there from the start, which makes it clear that more answer possibilities can be added via the header.

(a) Issue #2: the answer possibilities input block header is grey and there is no ’plus’ button.

(b) Solution: the header is orange before it is clicked and extra buttons are added to the header.

Figure 4: In the current situation the answer input block has a grey header that is not obviously clickable. To solve this issue, the header is made orange and the 'plus' button has been added. Firstly, this makes it clearer that the header can be clicked. Secondly, the 'plus' button also makes it clearer that the header must be clicked to start adding answer possibilities.

Create exams by adding questions.

As the screen that lists exams looks similar to figure 3a and has the 'New', 'Edit' and 'Delete' buttons at the same locations, a solution similar to 3b can also be proposed to solve this issue (figure 33).

Issue #3: On the page to edit an exam, questions have to be put in a question set first. The button that brings the user to the screen where questions can be put in the question set has text on it that can be hard to interpret for new users, as they will not know from where the questions are selected or for what purpose.

Solution: Figure 5b presents a solution where the position and the text on the button have been changed. 'Add/Remove' should be easier to understand for new users. The location of the button, next to the heading 'Question set', will tell exactly what these operations are applicable to.


(a) Issue #3: the button saying 'Select Questions' is confusing.

(b) Solution: a button at the top of the list of questions in the Question Set saying ’Add/Remove’

Figure 5: The button to get to the screen where the question set can be changed contains text whose function might be difficult to guess. Therefore, the text is changed to 'Add/Remove' and the button is moved to the top, placed next to the heading 'Question set'.

Issue #4: When questions have to be selected to put them in the question set to be used to create the exam, questions can be moved from a box at the left side to a box on the right side. The box on the right side has the header ’Question set’, which will make the user think that all questions in this box are part of the question set. This is not the case. The user has to make a selection–again–by ticking the checkboxes of the questions he/she wants to be included in the question set and click on the ’Ready’ button to include these questions in the question set.

Solution: Questions in the right box of the screen are automatically part of the question set. To achieve this, it is necessary that the checkboxes are removed, as in the old situation these were required to tick to put the questions in the question set.

(a) Issue #4: When questions have been moved to the right part of the screen, they have to be selected another time to really put them in the question set.

(b) Solution: the checkboxes have been removed and questions on the right are part of the question set.

Figure 6: To solve the issue of being forced to make a selection of questions again, the checkboxes have been removed from the box on the right part of the screen, and as a result the arrow in the upper right corner has also gone. Questions in the right box are now part of the question set.

Issue #5: An exam consists of questions. Each question has a maximum number of points, a taxonomy and more properties. From within the exam edit screen, it is not possible to edit the properties of a single question. The user has to go back to the question itself via the main menu to achieve this.

Solution: To make it possible to change properties of single questions only for the current exam from within the exam edit screen, a solution is proposed where ’Pen’ buttons are added next to each single question (figure 7b).


(a) Issue #5: it is not possible to change properties of questions from within the exam edit screen.

(b) Solution: a ’pen’ button is added next to each question of the exam.

(c) Menu that should show up when a ’pen’ button next to a question is clicked on.

Figure 7: As it is not possible in the current situation to edit properties of a single question that is included in an exam, ’pen’ buttons are added. These buttons make it possible to edit the properties of single questions from within the exam edit screen.

Modify/delete existing questions.

Issue #6: To delete questions, a user has to press the 'gear' button first to get to a menu with buttons for actions to be performed on selected questions. The user should select the questions first and press the 'gear' button afterwards.

Solution: A button to delete questions is added in the top bar. This way, the user can press the button first and afterwards select one or more questions to be deleted from the list. After the selection has been made and confirmed, a dialogue pops up to verify if the user really wants to delete the selected questions.

(a) Issue #6: the buttons to edit or delete questions are in a separate menu that is not intuitive to find.

(b) Solution: the 'Edit' and 'Delete' buttons are placed in the top bar, alongside a 'New' button.

(c) Dialogue that should pop up to verify whether one or more selected questions really have to be deleted.

Figure 8: As the ’gear’ button has to be clicked (see figure 13) to get to the menu with buttons for all sorts of actions to be performed on questions, a solution is presented where buttons for the actions Edit and Delete are placed in the bar on the top. A dialogue (figure 8c) is displayed when a user has pressed the ’Delete’ button and selected the questions to be deleted.

View student results.

Issue #7: The teacher cannot see both the student's answer and the correct answer at the same time. It is required to click buttons to switch between the two answers.

Solution: Figure 9b presents a solution where the student's (partially correct) answer is displayed on top and the correct answer is displayed below the student's answer. This way it is no longer needed to click buttons to be able to read and compare the different answers.

(a) Issue #7: either the student’s answer or the correct answer can be viewed.

(b) Solution: present both answers on the same screen.

Figure 9: A teacher has to click buttons to switch between answers, so a solution is presented where both answers can be viewed at the same time to make it easier to compare answers.

Student user

Take an exam.

No disturbing issues were found when taking example exams using the student user test account. Navigation is intuitive, as a minimal interface is presented to the user. The different colors used in the navigation bar that tell a user whether a question is answered or not are helpful. The same goes for the possibility to mark questions.

Expert feedback on solutions

Elgin Blankwater MSc, Edutech Coordinator at the UvA, assists teachers when they run into trouble while creating exams with TestVision Online. He has given his opinion on the proposed changes to the interface. See figure 10.

Figure 10: Expert feedback on solutions (Elgin Blankwater).

Figure 11: Expert feedback on solutions (Natasa Brouwer).


Dr. Natasa Brouwer, Project leader ICT and education innovation at the UvA, has also looked at the proposed solutions to the found issues and has given her opinion (figure 11).

6. CONCLUSION

To conclude this research on usability problems in systems for digital assessment at the UvA, we will answer the research questions here.

The first subquestion, 'What problems regarding usability and ease of use do existing users of the systems currently being used for digital assessment at the UvA experience?', is answered first. Existing users of the systems are students and teachers. They both experience problems (sections 5.1 and 5.2). Students state that it is difficult to enter mathematical symbols and formulas on a computer, and that an additionally provided paper version of the exam would be good to have, to be able to mark things and have an overview of all questions. Yet, in general, students who took digital exams during the past two years at the UvA have a positive attitude towards digital testing. Teachers are also positive about digital assessment, as the time to go over exams is reduced and exam review sessions can be done more efficiently. However, review sessions could be held even more efficiently if the software made it possible to show answer models to the students.

After conducting the cognitive walkthrough for the system TestVision Online, some usability problems (issues 1-7 in section 5.3) were discovered. These are all issues for which the current action in the optimal action sequence was found to be not likely to be taken, given the interface. To put it differently, the memory load on the user could be lower. This memory load should be minimized to improve user interfaces (Shneiderman, 2010). The found issues are the answer to the second subquestion: 'What problems regarding usability and ease of use will first-time users of the system TestVision Online experience?'

Both experts from within the UvA agreed with almost all changes to the interface that were proposed to solve the issues found during the cognitive walkthrough. It was argued that issues 5 and 6 were not real issues/usability problems. Both experts proposed a different solution to solve issue 1. Yet, they were positive about most of the solutions. The third subquestion, 'Can changes be made to the interface of the system TestVision Online to improve the usability and ease of use for first-time users?', can therefore be answered with "Yes".

Now that the three subquestions have been answered, we can answer the main research question: ’Can the usability of the systems that are currently being used at the UvA for digital assessment be improved?’.

At first it seemed like there was a lot that could be improved in both systems. After interviewing a teacher who used SOWISO to assess his students, it turned out that he did not create the digital exam on his own. He receives help from other people who are hired by the UvA and are experts on SOWISO. Creating questions within SOWISO requires specific knowledge that these people possess, and it can take up to 40 hours to develop one exam. Most teachers will not have the required knowledge and time to develop such an exam. Students could get used to the interface before taking the exam, as they practised every week with the homework exercises.

A cognitive walkthrough was exclusively conducted with TestVision Online, to identify issues for first-time users. The answer to the third subquestion shows that solutions to these issues can improve the usability of the TestVision Online interface, for first-time (teacher) users. For SOWISO we provided no specific issues and solutions, but the fact that a teacher is not able to create an exam on his own with this system indicates that the usability of the software can be improved. Concluding, we can say that the usability of the systems that are currently being used at the UvA for digital assessment can be improved.

7. DISCUSSION

In this section, parts of the research are reflected upon and it is explained why certain choices were made. Also, if other ways of researching were or will be possible in the future, these are mentioned.

More existing users are available next year

Students who filled in the evaluation were mainly students who took exams where the assessment system Remindo was used. For just one course for which evaluation data is available (Basiswiskunde in de Psychobiologie in 2015-2016), SOWISO was used. No evaluation data is available from students who took an exam where TestVision Online was used. At the end of next year, evaluation data of students who took an exam in TestVision Online will be available, as Remindo will then be replaced by TestVision Online. Also, more evaluation data regarding students who worked with SOWISO will be available at that time.

Teachers from within the UvA that have worked with the TestVision Online interface are scarce, as the UvA will start using it for exams starting next year. It might be possible to evaluate the interface among more existing users at the end of next year, as then there will be more teachers that have worked with the TestVision Online interface to complete the tasks as stated in this thesis.

Cognitive Walkthrough

The cognitive walkthrough is a method to evaluate an interface for first-time users. It evaluates how likely it is that someone who has never worked with the interface before will find certain actions, given the interface. Other usability testing methods exist for evaluating existing users.

Apart from this, there was just one evaluator for the cognitive walkthrough. If the walkthrough had been carried out by several more evaluators, the issues that were found could have been shared and discussed with each other. The cognitive walkthrough could also have been conducted for the system SOWISO. It was decided not to do so, as the teacher who was interviewed and who used SOWISO to assess his students (Leo Dorst) said that creating questions and preparing an exam takes up to 40 hours and that he himself does not have to do this. He does not possess the domain-specific knowledge that is required to do so and cannot spend 40 hours or more on creating one single exam, as he is primarily a researcher, not a teacher. The same goes for other teachers. This is why it was chosen to only test the usability for first-time users of TestVision Online using the cognitive walkthrough method, as this is a system of which it is expected that teachers use it to create (large parts of) the exams themselves.


Testing Proposed Solutions

An alternative way in which the suggested solutions could have been tested (instead of asking the opinion of digital assessment experts) would be by means of a clickable prototype. A prototyping tool like InVision5 or Marvel6 can be used to create such a prototype. The prototype can then be distributed to a group of test users who have to perform tasks with it. As it takes time and effort to create such a prototype (many different connections between screens have to be made to get as close as possible to a real version of the system), it was chosen not to use this testing method. It is also questionable whether the TestVision Online developers would benefit more from such a prototype than from the (static) solution screens, if they want to make changes to the source code to integrate the proposed solutions.

Costs for the UvA

The teacher who was interviewed and who created an exam with the system SOWISO (Leo Dorst) put something forward regarding the costs that digital testing entails for the UvA. From his perspective, it now takes just 1 day to go over all exams instead of 3 days. He says that preparing an exam by two SOWISO experts (who have to assist him) can take up to a total of 40 hours. Afterwards, the exam has to be tested for bugs by 2 testers (2-3 hours each). During the exam, 3 student-assistants have to be hired to assist the teacher when technical problems occur (2-3 hours each). It is doubtful, according to Leo, whether the fact that the time to go over exams is reduced by a few days is important enough to justify the amount of money that is spent by the UvA to hire all the people that have just been mentioned.

On the other hand, Dr. Natasa Brouwer, Project leader ICT and education innovation at the UvA, has said that it is valuable to hire these people, who spend 20-30 hours on creating (frameworks for) questions. The fact that the questions can be re-used, given their pseudo-random nature (in the case of SOWISO), justifies the hiring of the people who create the questions to be asked in the exam. When it comes to homework questions, some questions could be used in multiple courses, during multiple years, possibly at multiple universities, so she thinks it is worth the investment.

References

Blackmon, M. (2004). Cognitive walkthrough. Encyclopedia of Human-Computer Interaction, 2, 104–107.

Brouwer, N., Heck, A., & Smit, G. (2017). Proctoring to improve teaching practice. MSOR Connections, 15(2), 25–33.

Bryman, A. (2015). Social research methods. Oxford University Press.

Heck, A. (2004). Assessment with Maple T.A.: creation of test items. AMSTEL Institute, UvA. Available online from Adept Scientific via: http://www.adeptscience.co.uk/products/mathsim/mapleta/MapleTA whitepaper.pdf [Accessed 19 June 2008].

Heck, A. (2017). Using SOWISO to realize interactive mathematical documents for learning, practising, and assessing mathematics. MSOR Connections, 15(2), 6–16.

5 https://www.invisionapp.com/
6 https://marvelapp.com/

Heck, A., Brouwer, N., Amado, N., & Carreira, S. (2015). Digital assessment-driven examples-based mathematics for computer science students.

Kawazoe, M., & Yoshitomi, K. (2017). E-learning/e-assessment systems based on webMathematica for university mathematics education. MSOR Connections, 15(2), 17–24.

Kohavi, R., & Longbotham, R. (2015). Online controlled experiments and A/B tests. Encyclopedia of Machine Learning and Data Mining, 1–11.

Perfect, C. (2015). A demonstration of Numbas, an e-assessment system for mathematical disciplines. In CAA Conference.

Rieman, J., Franzke, M., & Redmiles, D. (1995). Usability evaluation with the cognitive walkthrough. In Conference companion on human factors in computing systems (pp. 387–388).

Shneiderman, B. (2010). Designing the user interface: strategies for effective human-computer interaction. Pearson Education India.

Threlfall, J., Pool, P., Homer, M., & Swinnerton, B. (2007). Implicit aspects of paper and pencil mathematics assessment that come to light through the use of the computer. Educational Studies in Mathematics, 66(3), 335.

Toetssoftware bij digitale toetsen. (2016). http://toetsing.uva.nl/toetscyclus/afnemen/afnemen.html#anker-toetssoftware-bij-digitale-toetsen (Accessed on: 11-03-2017).

Wharton, C., Rieman, J., Lewis, C., & Polson, P. (1994). The cognitive walkthrough method: A practitioner's guide. In Usability inspection methods (pp. 105–140).


APPENDIX

A. INTERVIEW LEO DORST (TRANSLATED FROM DUTCH)

-31/05/2017

What was your main reason for switching to digital assessment?

The increasing number of students, so the scalability.

Is SOWISO the only tool you have used for the digital exams?

Yes. The exam was, by the way, a combination of questions in SOWISO and a written part: roughly 60 percent SOWISO, 40 percent on paper.

Is SOWISO a good system for assessing linear algebra?

Not entirely, which is why part of the exam is still written. There are simply things that are easier to do on paper. SOWISO is well suited for asking separate (mathematical) questions. For my type of questions, where the context is important, it was necessary to manually place that context above every follow-up question, because the student can only see one question at a time. You cannot scroll. That is why, for the second exam, I also made a PDF with all questions listed one below the other, so that students could keep it next to the exam to maintain an overview and better see the connection between the questions. The students found that much more pleasant. Because my way of asking questions is like this: 2 to 3 themes, each with 8-10 subquestions. With some puzzling, I managed reasonably well to get this into SOWISO.

Is it possible within SOWISO to insert mathematical symbols into a question? Do you find this easy to use?

Yes. You can have mathematically equivalent but differently written answers recognized. The only thing you cannot easily do is pick out the people who only made a small calculation error; they simply have the answer wrong. You can try to anticipate what types of errors will be made, but this has to be done in advance. For example, for a 3x3 matrix you can award 1/3 of the points per column, so that a single calculation error does not immediately make the whole answer wrong. But this has to be done before the exam is taken; afterwards you cannot change the way a question is graded, given the fact that the SOWISO questions are graded directly by SOWISO.

Would you like certain functionality added to the SOWISO system that would make it easier to use for (teachers like) you?

I have no idea how difficult it is to still award credit for partial answers that you would actually consider correct, or for equivalent answers, as well as for follow-through errors. For this, for example, variable x from subquestion a would have to be linked to variable y from question e, if this y should have the same value as x according to the correct answer, in such a way that when x has been calculated incorrectly but the student sees that y=x, the incorrect calculation of x is not penalized twice. How easy this is within SOWISO, I do not know. Also, because all possible errors (and partially correct answers) have to be defined in advance, the people who create the exams for me in SOWISO have to be approached and hired further in advance. These people have to be paid by the UvA. Whether all of this is worth it, in terms of costs for the UvA, so that I have 1 or 2 days less grading work, I am not entirely convinced myself.

Een technisch dingetje is dat SOWISO matrices en vectoren als twee verschillende dingen ziet. Dit zou wat mij betreft moeten worden gezien als dezelfde dingen want vectoren zijn matrices.

As a teacher, do you think you can assess more efficiently with a tool like SOWISO?

Yes. However, suppose the results this year are worse: that could be due to many things. But in general I think so. More precision is demanded of the students; SOWISO grades differently than I do. Questions can be marked partially correct by SOWISO if the right method is chosen to set up the calculation, and more points are then awarded if the numerical answer is actually correct. For this it is also necessary to define this in answer models beforehand, which takes time. It costs more time, but more elaborate answer models are not a disadvantage. For some questions this problem can be solved by splitting a question into two sub-questions. I only discovered this at the second partial exam, so it does not go right in one go. For students, the exam review sessions are more efficient because they can already look at the answers they gave per question before coming to the review; it gives students a bit more control.

Has using SOWISO saved you time when grading?

Yes. Where grading used to take 3 days, it now takes 1 day. I also consider this fair towards the students, because you are in principle asking them to put in more effort. But it also brings order to your answers. As a teacher I can now also see more easily whether a particular question was done poorly, and can therefore choose to turn such a question into a bonus question. However, I did not do this the first two times.

With SOWISO, can you assess the learning objectives better or worse?

Sometimes it is already worth something if a student sees that, for a particular question, the dot product has to be computed, for example. If the answer then turns out to be wrong, it would be desirable for SOWISO to also award points for such partial answers. I try to avoid this problem as much as possible, however, by splitting my questions into smaller questions.

That is also why I advise my students to work out all questions on paper and hand that in, including the SOWISO questions, so that at the exam review they can show what their calculation or answer was in case they made a typo when entering it into SOWISO. I first let SOWISO do the grading and grade the paper questions myself, but if students object at the review to a SOWISO question that was marked wrong, I look up their written working of that SOWISO question. Nevertheless, many students do not work out the SOWISO questions on paper, simply because they do not listen.

Are there disadvantages compared to assessment on paper?

About 40 percent of the questions are still on paper. Some questions simply had to be on paper because the student has to 'explain' something. It is more pleasant to have those on paper than to open that question in SOWISO for every student, which would be necessary because explanation questions have to be read by a human; SOWISO cannot grade these. For both students and teachers, asking such questions on paper is easier. There are hardly any multiple choice questions.

Did the students also receive the 'immediate personalized feedback' during the digital exam?

Not during the exam, but they did during the weekly SOWISO homework.

How did the students like taking the exams with SOWISO? Do you know anything about that?

The surveys did not indicate that the students were dissatisfied. The threshold for them at the exam was also not high, because they had already used the SOWISO tool for the weekly homework, so they knew how to work with SOWISO and how to enter the mathematical symbols.

B. INTERVIEW JEROEN LEMMENS - 27/06/2017

For what kind of courses have you created exams with TestVision Online?

In April 2017 I created an exam with TestVision Online for the course Entertainmentcommunicatie, part of the Bachelor Communicatiewetenschap.

What was your main reason for switching to digital assessment?

The ease of grading plays a big role. We now get typed text on a computer screen, which is easier to read than the hundreds of different student handwritings. It is also now possible to show the students a video during an exam and then ask questions about it. The fact that this option exists also played a part.

Is TestVision Online the only tool you have used for the digital exams?

No, because last year we used the tool Remindo for the digital exam for Entertainmentcommunicatie.

Is TestVision Online a good system?

Certainly, it does what it needs to do. We have assessed 200 students twice and there were no technical problems during the exam caused by TestVision Online. You might perhaps expect those. What I personally find great is that TestVision Online makes it possible to play GIFs, unlike Remindo.

How long, or how many exams, did it take before you fully mastered creating questions with TestVision Online?

I do not know whether I have fully mastered it. I can create multiple choice questions and open questions just fine. I did not find adding video fragments very easy. Inserting YouTube videos did work, but embedded video gives better image quality. With the help of Elgin Blankwater this was possible; he did it for me. How I would have to add the MP4 files myself is not clear to me. Adding video was somewhat easier in Remindo.

For which types of questions is TestVision Online best suited?

Multiple choice questions and fill-in questions.

Is it possible within TestVision Online to insert mathematical symbols into a question? Do you find this easy to use?

I have not actually used mathematical symbols. A few times it was necessary to enter, for example, an accent aigu. Simply copying it from a Microsoft Word document worked for me.

Would you like to see certain functionality added to TestVision Online that would make it easier to use for (teachers like) you?

Yes, a few things could have been better or easier. Via the 'Afnames' (sittings) tab, and then the results of a student for a particular sitting, I can view and, if necessary, adjust a student's results. Why is this not possible via the 'Beoordelen' (grading) tab? This is a bit confusing, because you would expect to have to do it through that tab. Students could mark/flag questions during the exam, but this did not work intuitively for them; we first had to tell them exactly how to add a mark. A button with, for example, a little flag would be desirable.

The '5-minute warning' before the end of an exam could also be a bit clearer. Some students claimed not to have noticed that this warning had been shown to them. At the next exam I will announce it out loud, the old-fashioned way.

The exam review for students could also be improved in a number of ways. Students could only see what they had answered on a question that was marked (partly) wrong, but had to go back one screen to see the number of points obtained for that question. There is also no answer model for the questions available to students in this screen, so we had to provide the students with a separate PDF containing the answer models.

Grading could be better. For one exam we had 200 students. If you have graded 100 of them and you are away from your computer for a while, you are automatically logged out. If you then log back in and want to continue where you left off, you have to click through from number 1 all the way to number 100. I find this cumbersome.

As a teacher, do you think you can assess more efficiently with a tool like TestVision Online?

Yes, of course! It is faster, since no sheets of paper have to be handed in anymore.

Has using TestVision Online saved you time when grading?

Certainly.

With TestVision Online, can you assess the learning objectives better or worse?

Better, since we can now ask questions about videos.

Are there disadvantages compared to assessment on paper?

No, I cannot think of any.

How did the students like taking the exams with TestVision Online? Do you know anything about that?

I think they all liked it just fine. There were, of course, a few exceptions. Students claimed that the screen suddenly went blank while they were typing an answer just before the end, and gave that as the reason for not being able to answer a question. Whether this really happened, I doubt.

For questions where a video had to be watched, the students did not realise that the volume control had been turned down to the minimum, so they heard nothing and assistance from the workstation support staff was needed. This has nothing to do with the TestVision Online system, however. Of course the exam was also too 'difficult' for the students, in terms of difficulty level, but you will always have that.


C. STUDENT EVALUATION DATA 2015-2016


EvaSys evaluation, 10.06.2016. Survey: Ad hoc evaluaties, Digitaal Toetsen. No. of responses = 180.

Legend: n = no. of responses, av. = mean, md = median, dev. = std. dev., ab. = abstention. For scale questions the relative frequencies of the answers 1 to 5 are listed.

1. Digitaal Toetsen

1.1) Which digital exam have you just taken? (n=180)
18 April: Basiswiskunde in de Psychobiologie: 41.7%
21 April: Besturingssystemen: 20.6%
21 April: Collectieve Intelligentie: 23.3%
28 April: Methoden van Onderzoek en Statistiek: 4.4%
4 May: Biosystems Data Analysis: 0%
19 May: Advanced Statistics for Analytical Chemistry: 0%
24 May: Onderzoeksmethoden en Analyse van Wetenschappelijk Onderzoek: 2.2%
26 May: Besturingssystemen: 1.7%
26 May: Collectieve Intelligentie: 5.6%
31 May: Virtual Globe: 0.6%

1.2) How did you experience working with this test environment? (scale poles: very pleasant / very unpleasant; n=180, av.=3.5, md=4, dev.=1.0; answers 1 to 5: 3.9%, 14.4%, 28.3%, 39.4%, 13.9%)

1.3) How did you experience the digital assessment process as a whole (new rules in the exam halls, the computers themselves, logging in, etc.)? (scale poles: very pleasant / very unpleasant; n=179, av.=3.5, md=4, dev.=0.9; answers 1 to 5: 2.2%, 12.3%, 29.6%, 44.7%, 11.2%)

1.4) How do you experience navigating through the digital test environment? (scale poles: very clear / very unclear; n=177, av.=3.8, md=4, dev.=0.9; answers 1 to 5: 1.7%, 7.3%, 20.9%, 48.6%, 21.5%)

1.6) Did you already have experience with digital assessment? If so, which test environment was used? (n=180) No: 18.3%; Yes, Remindo: 53.3%; Yes, SOWISO: 21.1%; Yes, Blackboard: 27.8%; Yes, QMP: 0%

1.7) Did you experience technical problems during the exam? (n=179) Yes: 19%; No: 81%

1.9) Should more courses be assessed digitally? (scale poles: strongly agree / strongly disagree; n=179, av.=3.0, md=3, dev.=1.3; answers 1 to 5: 16.2%, 20.7%, 26.3%, 24.6%, 12.3%)

Profile (subunit: FNWI-bachelor; survey: Ad hoc evaluaties, Digitaal Toetsen; values used in the profile line: mean). This page repeats the means reported above: 1.2) working with the test environment, av.=3.5; 1.3) the digital assessment process as a whole, av.=3.5; 1.4) navigating through the digital test environment, av.=3.8; and 1.9) whether more courses should be assessed digitally.

Comments Report

1. Digitaal Toetsen

1.5) If unclear, what would you like to see differently?

- (2 counts)
- If you press the backspace key while not having a text cursor active, the exam disappears, which causes stress.
- Certain multiple-choice questions were set up as fill-in questions where you had to type in the correct option instead of clicking it. It was also unclear to some people that logging in with a student ID had to be done via Remindo.
- That the intermediate steps are also graded, so that you can still score points for the parts you did do correctly.
- That the screen does not move automatically.
- The login procedure is a bit cumbersome (2 or 3 redirects); preferably one start page for the exam. The choice between SurfContext and your own username could use some explanation.
- The pickiness of SOWISO is something you have to learn to work with. However, even for repeat users it still causes an exercise to be rejected over a multiplication sign and the like, which can sometimes be quite frustrating.
- The calculator available in SOWISO is rather limited in terms of input, which actually makes a graphical calculator better. It would be nice if it were somewhat more extensive.
- I find the calculator that has to be used very unpleasant.
- Just put the questions on paper instead of on a screen.
- A test environment suitable for Python, so that we no longer have to learn these things by heart. And I think we should always be able to give an explanation somewhere.
- There was an exercise where you could shift and move blocks. At first I thought I only had to shift them vertically, but it turned out everything had to move to the right. This could be indicated more clearly. Fortunately it became clear when a warning appeared upon closing the exam.
- Extra confirmation when finishing, so that you do not do this by accident.
- No computer; invigilators who are friendlier.
- Just one screen per question, not that splitting into headings and then saying there are only 4 questions when I start the exam. Not 1, 2ab, 3, 4, but simply 1 through 12 on separate pages.
- Use Google Chrome.
- Preferably no technical problems at the start, which cause noise in the room.
- It is annoying not to be allowed to eat or drink for 3 hours. Moreover, you still have to write out all your working on paper, so taking an exam digitally feels pointless if it means you cannot eat or drink and digital problems can occur.
- Navigating to previous steps could still be improved. If, for example, you are at the last question and want to go back to an earlier question, you first have to save and are then sent to a new page; only then can you go back. It would be more pleasant if answers were saved automatically, which would also make navigating easier.
- It is a pity that no water is allowed in the computer room. Understandable, but with a long exam I find this difficult.
- You have no idea whether you are close to an answer or not. It is often simply wrong, or right.
- Maybe one start HTML page that links through to all the others, to prevent confusion.
- Maybe feedback once you have entered an answer.
- After answering a question I had previously skipped, I would rather get a finish button right away instead of having to save all subsequent answers again.
- No.
- No more online exams, or make sure that multiple correct answers are accepted as correct.
- Nothing, totally fine.
