DiCE

Digital module evaluations

DiCE is a product that focuses on optimizing the process of educational evaluations and is aimed at obtaining valid results about the quality of education. These results are useful to the lecturer, who can use them to improve their teaching, to internal quality assurance (study program and examination committee), and to management (steering of educational quality, use of the information for human resources). DiCE offers a solution to problems such as:

  • No standard questionnaires, no benchmarks, no management information
  • Standard questionnaires that are barely adaptable and therefore not suited to the education being evaluated
  • No overview of all running educational evaluations
  • Hardly any involvement of lecturers in the evaluation process
  • No, or only laborious, feedback of evaluation results to students
  • Reporting only available after weeks

DiCE makes use of the possibilities offered by digital evaluations. DiCE provides tailored support to all those involved in the evaluation process, creating a streamlined solution that supports the process to the utmost.

The emphasis is on improving efficiency and effectiveness: the evaluations gain speed and added value, the processing of results is accelerated, and the governance of educational quality is thereby optimized.

DiCE offers process support for each of the four basic steps in educational (course) evaluations:

  • Gathering information
  • Running the evaluation
  • Collecting and enriching results
  • Reporting and Analyses

Gathering information

The evaluation process starts with the delivery of course information to DiCE from the administration's student information system (SIS). Based on the received data, the coordinator coordinates the process and, together with the course lecturers, selects the right question blocks and adds any course-specific questions to the evaluations.

Running the evaluation

Preparations are checked, released and automatically published. The evaluation is opened and closed automatically according to dates from SIS or from the coordinator in DiCE. Students can be informed in various ways that an evaluation has opened: a DiCE widget, an e-mail, a link in the VLE or one provided by the lecturer, and so on. These communication tools are designed to maximize response. Needless to say, students can evaluate at any time (24/7), at any location and on any (mobile) device available.

Collecting and enriching results

Once the evaluation is closed, the results are collected and, based on reporting models, made directly available to the various parties involved: the lecturer, the coordinator and the student. The lecturer is given the opportunity to comment (feedback) on the results and can indicate how the outcomes will be used the next time the course runs. Immediately making the report available to the student, possibly supplemented with comments from the lecturer, is an additional means of communication that increases the response.

Institution-wide reporting and analyses

The report provided by DiCE focuses primarily on the individual results of a conducted course evaluation. Processing of historical results and the calculation of a standard deviation are part of the reports. The reports can be customized for each target group by the DiCE administrator. The results (including the lecturer feedback) can be transmitted to a data warehouse (DWH). The DWH provides facilities for comprehensive analysis of grouped results: per faculty, per course or institution-wide. DiCE thus provides the basis for institution-wide quality assurance of education.

Roles and (access) rights


The process steps have different managers and stakeholders. These are specified in four different roles in DiCE: the coordinator, the lecturer, the student and the (functional) administrator.
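As a minimal sketch, the four roles could be mapped to the actions described in this document roughly as follows. This is an illustration only: the action names are assumptions derived from the process steps in the text, not DiCE's actual permission model.

```python
from enum import Enum, auto

class Role(Enum):
    """The four DiCE roles named in the text."""
    COORDINATOR = auto()
    LECTURER = auto()
    STUDENT = auto()
    ADMINISTRATOR = auto()

# Hypothetical mapping of roles to actions; the action names are
# illustrative labels for the process steps described in this document.
PERMISSIONS = {
    Role.COORDINATOR: {"configure", "validate", "publish", "report", "inform_lecturer"},
    Role.LECTURER: {"configure", "preview", "publish", "report", "give_feedback"},
    Role.STUDENT: {"fill_in", "view_report"},
    Role.ADMINISTRATOR: {"customize_reports", "manage_templates"},
}

def may(role: Role, action: str) -> bool:
    """Return True if the given role is allowed to perform the action."""
    return action in PERMISSIONS.get(role, set())
```

For example, `may(Role.STUDENT, "fill_in")` is true while `may(Role.STUDENT, "publish")` is not.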

The coordinator is responsible for the evaluation process for all or part of the institution and is optimally supported by DiCE to run that process as effectively and efficiently as possible. The basic course information, including lecturers and enrolment, is automatically supplied by the administrative educational system and is DiCE's starting point for the process.

The coordination process

The process steps that fall within the coordinator's field of view depend on the institution's method. The process in DiCE is configurable; the following basic steps are distinguished in the coordinator's domain:

  • Indicate a non-standard way of evaluation
  • Assign an evaluation to a lecturer
  • Configure the evaluation
  • Validate the evaluation configuration
  • Publish the evaluation
  • Summarize the evaluation configuration process
  • Trigger / inform the lecturer
  • Report evaluation results
  • Overview the feedback process of the evaluation results

Steer the evaluation process


The process begins with a list of the evaluations to be prepared within the segment the coordinator is responsible for, such as a faculty. This overview shows all evaluations supplied by the educational system. If the relevant marker is supplied, automatic processing of the data filters out the courses marked as 'subject to evaluation'.

The coordinator can indicate in this list that an evaluation is not necessary, or that a specific evaluation will not be digital but administered on paper and should follow an alternative publishing process.

Assign an evaluation to a lecturer

Depending on the chosen method, the coordinator can release an evaluation for the lecturer (who can configure the evaluation), or the evaluation is immediately available as a configurable evaluation for the lecturer of the course.

Configure evaluation

The coordinator can configure an evaluation. Configuring means that it is possible to add extra question blocks or individual questions to the standard evaluation template.

Validate the evaluation configuration


The evaluation overview indicates whether an evaluation is in its initial state (100% based on the default template) or deviates from the default, meaning that additional questions or question blocks have been added. Changes to the evaluation can be checked visually in a preview from the overview. Depending on the method used, in case of a negative validation result the evaluation is returned to the lecturer or corrected by the coordinator.

Publish the evaluation (manual and automatic)

The coordinator may decide to publish approved evaluations. Publishing means that the questionnaire is created and that the correct activation and termination times are registered; the questionnaire will automatically start and end at these specified moments. Evaluations that are not published by a configurable number of days before the activation date are published automatically based on default template rules. A default template for a faculty can be, for example, a selection of standard question blocks, supplemented with an automatic selection of the blocks for lectures and tutorials if lecturers are assigned to them.
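The automatic-publication rule amounts to a small date check. The sketch below is an assumption-laden illustration: the function name and the three-day default are made up here, while the actual threshold is, as the text says, configurable.

```python
from datetime import date, timedelta

AUTO_PUBLISH_DAYS = 3  # configurable threshold; the value 3 is illustrative

def needs_auto_publish(activation: date, published: bool, today: date,
                       days_before: int = AUTO_PUBLISH_DAYS) -> bool:
    """True when an unpublished evaluation has reached the cut-off date and
    should therefore be published automatically from the default template."""
    return not published and today >= activation - timedelta(days=days_before)
```

With an activation date of 10 May and a three-day threshold, an unpublished evaluation triggers auto-publication from 7 May onwards, while one already published never does.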

Summarize evaluation results

Once a digital evaluation is complete (the termination date has passed), reports are immediately generated for the different target groups. The coordinator has access to a specific report and to an overview of all evaluation reports.

Summary of the evaluation


The coordinator has continuous insight into the progress of the process: on the one hand through the list of evaluations to prepare for, and on the other hand through the dashboard. The dashboard shows at a glance the number of evaluations already published, those not conducted digitally, and those not yet published and therefore requiring attention. After the evaluations are completed, the dashboard provides insight into the feedback lecturers have given on the evaluation results.

Trigger / inform lecturer

Insight into the process also requires support in guiding the stakeholders. From the overview a coordinator can directly send an automated message to the lecturer responsible for a course whose evaluation is still “open”.

The lecturer is responsible for the configuration of the evaluations of their own courses and is optimally supported by DiCE in configuring them as effectively and efficiently as possible. Ease of use and simplicity are leading values in the lecturer support, so that lecturers can focus on their primary tasks.

The process

The process steps that fall within the teacher's field of view depend on the institution's method. In the extreme case, the 'Teacher' role plays no part in the evaluation process at all. The process in DiCE is configurable; the following basic steps are distinguished in the teacher's domain:

  • Managing and configuring evaluations of their own courses
  • Checking the evaluation (preview)
  • Publishing the evaluation (manual)
  • Reporting evaluation results
  • Providing feedback on evaluation results

Managing and configuring evaluations

The primary overview for the lecturer lists the evaluations to be prepared for their own courses. It shows all evaluations that are supplied by the educational system and marked by the coordinator as 'subject to evaluation'. In this list the teacher can indicate whether an evaluation is agreed upon or can be conducted with the default settings; the evaluation is then returned to the coordinator for publication.

Configure an evaluation


The teacher can configure an evaluation. Configuring means that it is possible to add extra question blocks or individual questions to the standard evaluation template.

Validate the evaluation (preview)

From the overview evaluations can be checked visually through a preview before returning them to the coordinator or publishing them.

Publishing the evaluation (manual)

Depending on the chosen method, the teacher can publish the evaluations. Publishing means that the questionnaire is created and that the correct activation and termination times are registered. The questionnaire will automatically start and end at these specified moments.

Reporting evaluation results

Once a digital evaluation is complete (the termination date has passed), reports are immediately generated for the different target groups. The teacher has their own specific report and direct access to the reports via the overview of the evaluation results.

Providing feedback on evaluation results


The teacher can provide feedback on the evaluation results via the overview of the evaluation results. This feedback can be used in the report to the student or, for example, in the historical reports for prospective students.

Engaging and motivating students

The role of the student in the evaluation process differs greatly from the other roles. Where the other roles emphasize (process) support, ease of use and simplicity, the student role focuses on optimizing engagement. A known challenge for digital processes is achieving as high a response rate as possible. DiCE provides many tools to encourage students to fill out evaluations.

Response optimization

The essence of response optimization is finding the middle ground between 'informing and inviting' and 'harassment'. This middle ground is different for every situation. DiCE provides a wide range of possibilities to shape the optimization and walk that middle ground. Some of these possibilities are:

  • Feedback on evaluation results
  • Enhanced evaluation results with teacher feedback
  • Invites by e-mail
  • Reminders by e-mail
  • Integration with portal
  • Anytime and anywhere access to the evaluation

Feedback of evaluation results

According to the principle of 'quid pro quo', the student is rewarded for completing the evaluation with feedback on the results. DiCE offers a specific report for students.

Enhanced evaluation results with teacher feedback


The reward in the form of the evaluation results can be increased if students gain insight into what is done with them. The feedback given by the teacher, as an addition or extension to the standard report, is provided by default.

Invites by e-mail

Students can actively be informed via personal e-mail that an evaluation is ready to be filled in.

Reminders by e-mail

In addition to inviting students via e-mail to fill in an evaluation it is also possible to send follow-up reminder messages via email.
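Invites and reminders together could be planned along the following lines. This is a hedged sketch: the function name and the three-day interval are assumptions for illustration, not actual DiCE settings.

```python
from datetime import datetime, timedelta

def mail_schedule(opens: datetime, closes: datetime,
                  reminder_every_days: int = 3) -> list:
    """Plan one invite at the opening moment and follow-up reminders at a
    fixed interval until the evaluation closes. The 3-day interval is an
    illustrative default, not a DiCE setting."""
    moments = [("invite", opens)]
    t = opens + timedelta(days=reminder_every_days)
    while t < closes:
        moments.append(("reminder", t))
        t += timedelta(days=reminder_every_days)
    return moments
```

For a nine-day evaluation window this yields one invite and two reminders; shortening the interval shifts the balance toward the 'harassment' end of the middle ground described above.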

Integration with student portal

Besides the more active (and perhaps, to the student, aggressive) approach via e-mail, numerous integration opportunities are offered with the institution's portal. Among other things, overviews are provided of evaluations that are due, and reports are shown in the DiCE design or adapted to the layout of the portal.

Anywhere, anytime, any device

The evaluations are digitally available in the evaluation environment, which is available 24/7 and also provides extensive support for both PCs and mobile devices.

Eric Meijer

Marcel Noordzij

Steven Losekoot

Jeroen van Schagen